Nov 29 05:27:55 crc systemd[1]: Starting Kubernetes Kubelet...
Nov 29 05:27:55 crc restorecon[4566]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 29 05:27:55 crc restorecon[4566]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc 
restorecon[4566]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 29 05:27:55 crc 
restorecon[4566]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 29 
05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 29 05:27:55 crc 
restorecon[4566]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 29 05:27:55 crc 
restorecon[4566]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55
crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 29 
05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 29 05:27:55 crc 
restorecon[4566]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc 
restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 05:27:55 crc restorecon[4566]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 
crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc 
restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 29 05:27:55 crc 
restorecon[4566]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 29 05:27:55 crc restorecon[4566]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 29 05:27:55 crc restorecon[4566]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 29 05:27:55 crc restorecon[4566]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Nov 29 05:27:55 crc kubenswrapper[4594]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Nov 29 05:27:55 crc kubenswrapper[4594]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Nov 29 05:27:55 crc kubenswrapper[4594]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Nov 29 05:27:55 crc kubenswrapper[4594]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Nov 29 05:27:55 crc kubenswrapper[4594]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Nov 29 05:27:55 crc kubenswrapper[4594]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.950388 4594 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.953132 4594 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.953152 4594 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.953158 4594 feature_gate.go:330] unrecognized feature gate: PinnedImages Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.953161 4594 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.953164 4594 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.953169 4594 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.953172 4594 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.953176 4594 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.953181 4594 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.953186 4594 
feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.953190 4594 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.953196 4594 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.953201 4594 feature_gate.go:330] unrecognized feature gate: Example Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.953205 4594 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.953208 4594 feature_gate.go:330] unrecognized feature gate: OVNObservability Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.953211 4594 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.953215 4594 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.953218 4594 feature_gate.go:330] unrecognized feature gate: InsightsConfig Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.953221 4594 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.953225 4594 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.953228 4594 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.953231 4594 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.953236 4594 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.953241 4594 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.953245 4594 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.953249 4594 feature_gate.go:330] unrecognized feature gate: SignatureStores Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.953265 4594 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.953269 4594 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.953273 4594 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.953277 4594 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.953288 4594 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.953292 4594 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.953296 4594 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.953300 4594 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.953305 4594 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.953308 4594 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.953312 4594 feature_gate.go:330] unrecognized feature gate: GatewayAPI Nov 29 05:27:55 crc 
kubenswrapper[4594]: W1129 05:27:55.953316 4594 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.953320 4594 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.953323 4594 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.953327 4594 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.953330 4594 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.953333 4594 feature_gate.go:330] unrecognized feature gate: PlatformOperators Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.953336 4594 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.953340 4594 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.953343 4594 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.953346 4594 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.953350 4594 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.953354 4594 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.953358 4594 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.953361 4594 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.953365 4594 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.953370 4594 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.953373 4594 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.953376 4594 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.953380 4594 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.953383 4594 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.953386 4594 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.953389 4594 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.953392 4594 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.953396 4594 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.953399 4594 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.953401 
4594 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.953407 4594 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.953411 4594 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.953414 4594 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.953417 4594 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.953419 4594 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.953423 4594 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.953427 4594 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.953431 4594 feature_gate.go:330] unrecognized feature gate: NewOLM Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954056 4594 flags.go:64] FLAG: --address="0.0.0.0" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954070 4594 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954079 4594 flags.go:64] FLAG: --anonymous-auth="true" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954085 4594 flags.go:64] FLAG: --application-metrics-count-limit="100" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954091 4594 flags.go:64] FLAG: --authentication-token-webhook="false" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954095 4594 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Nov 29 05:27:55 crc 
kubenswrapper[4594]: I1129 05:27:55.954100 4594 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954106 4594 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954110 4594 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954114 4594 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954119 4594 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954123 4594 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954127 4594 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954131 4594 flags.go:64] FLAG: --cgroup-root="" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954134 4594 flags.go:64] FLAG: --cgroups-per-qos="true" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954138 4594 flags.go:64] FLAG: --client-ca-file="" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954141 4594 flags.go:64] FLAG: --cloud-config="" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954144 4594 flags.go:64] FLAG: --cloud-provider="" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954147 4594 flags.go:64] FLAG: --cluster-dns="[]" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954152 4594 flags.go:64] FLAG: --cluster-domain="" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954155 4594 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954159 4594 flags.go:64] FLAG: --config-dir="" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954162 4594 flags.go:64] FLAG: 
--container-hints="/etc/cadvisor/container_hints.json" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954166 4594 flags.go:64] FLAG: --container-log-max-files="5" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954172 4594 flags.go:64] FLAG: --container-log-max-size="10Mi" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954176 4594 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954179 4594 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954183 4594 flags.go:64] FLAG: --containerd-namespace="k8s.io" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954187 4594 flags.go:64] FLAG: --contention-profiling="false" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954190 4594 flags.go:64] FLAG: --cpu-cfs-quota="true" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954194 4594 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954198 4594 flags.go:64] FLAG: --cpu-manager-policy="none" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954201 4594 flags.go:64] FLAG: --cpu-manager-policy-options="" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954206 4594 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954209 4594 flags.go:64] FLAG: --enable-controller-attach-detach="true" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954213 4594 flags.go:64] FLAG: --enable-debugging-handlers="true" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954216 4594 flags.go:64] FLAG: --enable-load-reader="false" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954220 4594 flags.go:64] FLAG: --enable-server="true" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954223 4594 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Nov 29 
05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954229 4594 flags.go:64] FLAG: --event-burst="100" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954233 4594 flags.go:64] FLAG: --event-qps="50" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954236 4594 flags.go:64] FLAG: --event-storage-age-limit="default=0" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954242 4594 flags.go:64] FLAG: --event-storage-event-limit="default=0" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954262 4594 flags.go:64] FLAG: --eviction-hard="" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954269 4594 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954273 4594 flags.go:64] FLAG: --eviction-minimum-reclaim="" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954277 4594 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954280 4594 flags.go:64] FLAG: --eviction-soft="" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954284 4594 flags.go:64] FLAG: --eviction-soft-grace-period="" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954288 4594 flags.go:64] FLAG: --exit-on-lock-contention="false" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954291 4594 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954296 4594 flags.go:64] FLAG: --experimental-mounter-path="" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954300 4594 flags.go:64] FLAG: --fail-cgroupv1="false" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954303 4594 flags.go:64] FLAG: --fail-swap-on="true" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954308 4594 flags.go:64] FLAG: --feature-gates="" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954313 4594 flags.go:64] FLAG: --file-check-frequency="20s" Nov 29 05:27:55 crc 
kubenswrapper[4594]: I1129 05:27:55.954318 4594 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954322 4594 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954327 4594 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954332 4594 flags.go:64] FLAG: --healthz-port="10248" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954336 4594 flags.go:64] FLAG: --help="false" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954340 4594 flags.go:64] FLAG: --hostname-override="" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954344 4594 flags.go:64] FLAG: --housekeeping-interval="10s" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954348 4594 flags.go:64] FLAG: --http-check-frequency="20s" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954352 4594 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954356 4594 flags.go:64] FLAG: --image-credential-provider-config="" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954359 4594 flags.go:64] FLAG: --image-gc-high-threshold="85" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954363 4594 flags.go:64] FLAG: --image-gc-low-threshold="80" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954367 4594 flags.go:64] FLAG: --image-service-endpoint="" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954370 4594 flags.go:64] FLAG: --kernel-memcg-notification="false" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954374 4594 flags.go:64] FLAG: --kube-api-burst="100" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954378 4594 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954382 4594 flags.go:64] FLAG: --kube-api-qps="50" Nov 
29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954385 4594 flags.go:64] FLAG: --kube-reserved="" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954389 4594 flags.go:64] FLAG: --kube-reserved-cgroup="" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954393 4594 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954396 4594 flags.go:64] FLAG: --kubelet-cgroups="" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954400 4594 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954403 4594 flags.go:64] FLAG: --lock-file="" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954407 4594 flags.go:64] FLAG: --log-cadvisor-usage="false" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954411 4594 flags.go:64] FLAG: --log-flush-frequency="5s" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954415 4594 flags.go:64] FLAG: --log-json-info-buffer-size="0" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954420 4594 flags.go:64] FLAG: --log-json-split-stream="false" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954425 4594 flags.go:64] FLAG: --log-text-info-buffer-size="0" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954429 4594 flags.go:64] FLAG: --log-text-split-stream="false" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954432 4594 flags.go:64] FLAG: --logging-format="text" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954436 4594 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954440 4594 flags.go:64] FLAG: --make-iptables-util-chains="true" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954443 4594 flags.go:64] FLAG: --manifest-url="" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954446 4594 flags.go:64] FLAG: --manifest-url-header="" Nov 29 
05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954452 4594 flags.go:64] FLAG: --max-housekeeping-interval="15s" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954455 4594 flags.go:64] FLAG: --max-open-files="1000000" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954460 4594 flags.go:64] FLAG: --max-pods="110" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954463 4594 flags.go:64] FLAG: --maximum-dead-containers="-1" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954466 4594 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954470 4594 flags.go:64] FLAG: --memory-manager-policy="None" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954473 4594 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954477 4594 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954480 4594 flags.go:64] FLAG: --node-ip="192.168.126.11" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954484 4594 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954493 4594 flags.go:64] FLAG: --node-status-max-images="50" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954497 4594 flags.go:64] FLAG: --node-status-update-frequency="10s" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954501 4594 flags.go:64] FLAG: --oom-score-adj="-999" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954504 4594 flags.go:64] FLAG: --pod-cidr="" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954507 4594 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Nov 29 05:27:55 crc kubenswrapper[4594]: 
I1129 05:27:55.954514 4594 flags.go:64] FLAG: --pod-manifest-path="" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954518 4594 flags.go:64] FLAG: --pod-max-pids="-1" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954522 4594 flags.go:64] FLAG: --pods-per-core="0" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954525 4594 flags.go:64] FLAG: --port="10250" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954529 4594 flags.go:64] FLAG: --protect-kernel-defaults="false" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954532 4594 flags.go:64] FLAG: --provider-id="" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954536 4594 flags.go:64] FLAG: --qos-reserved="" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954540 4594 flags.go:64] FLAG: --read-only-port="10255" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954543 4594 flags.go:64] FLAG: --register-node="true" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954546 4594 flags.go:64] FLAG: --register-schedulable="true" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954551 4594 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954559 4594 flags.go:64] FLAG: --registry-burst="10" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954562 4594 flags.go:64] FLAG: --registry-qps="5" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954566 4594 flags.go:64] FLAG: --reserved-cpus="" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954570 4594 flags.go:64] FLAG: --reserved-memory="" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954574 4594 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954578 4594 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954582 4594 flags.go:64] FLAG: --rotate-certificates="false" Nov 29 
05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954585 4594 flags.go:64] FLAG: --rotate-server-certificates="false" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954589 4594 flags.go:64] FLAG: --runonce="false" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954592 4594 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954596 4594 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954599 4594 flags.go:64] FLAG: --seccomp-default="false" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954603 4594 flags.go:64] FLAG: --serialize-image-pulls="true" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954606 4594 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954609 4594 flags.go:64] FLAG: --storage-driver-db="cadvisor" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954613 4594 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954616 4594 flags.go:64] FLAG: --storage-driver-password="root" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954620 4594 flags.go:64] FLAG: --storage-driver-secure="false" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954623 4594 flags.go:64] FLAG: --storage-driver-table="stats" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954626 4594 flags.go:64] FLAG: --storage-driver-user="root" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954631 4594 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954634 4594 flags.go:64] FLAG: --sync-frequency="1m0s" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954638 4594 flags.go:64] FLAG: --system-cgroups="" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954642 4594 flags.go:64] FLAG: 
--system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954648 4594 flags.go:64] FLAG: --system-reserved-cgroup="" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954653 4594 flags.go:64] FLAG: --tls-cert-file="" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954656 4594 flags.go:64] FLAG: --tls-cipher-suites="[]" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954661 4594 flags.go:64] FLAG: --tls-min-version="" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954665 4594 flags.go:64] FLAG: --tls-private-key-file="" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954669 4594 flags.go:64] FLAG: --topology-manager-policy="none" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954672 4594 flags.go:64] FLAG: --topology-manager-policy-options="" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954676 4594 flags.go:64] FLAG: --topology-manager-scope="container" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954680 4594 flags.go:64] FLAG: --v="2" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954686 4594 flags.go:64] FLAG: --version="false" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954692 4594 flags.go:64] FLAG: --vmodule="" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954697 4594 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.954702 4594 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.954792 4594 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.954797 4594 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.954801 4594 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Nov 29 05:27:55 
crc kubenswrapper[4594]: W1129 05:27:55.954805 4594 feature_gate.go:330] unrecognized feature gate: Example Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.954808 4594 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.954812 4594 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.954815 4594 feature_gate.go:330] unrecognized feature gate: OVNObservability Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.954819 4594 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.954822 4594 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.954826 4594 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.954829 4594 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.954833 4594 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.954836 4594 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.954839 4594 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.954843 4594 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.954847 4594 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.954851 4594 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.954855 4594 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.954861 4594 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.954865 4594 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.954868 4594 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.954873 4594 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.954877 4594 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.954881 4594 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.954884 4594 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.954887 4594 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.954892 4594 feature_gate.go:330] unrecognized feature gate: SignatureStores Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.954896 4594 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.954900 4594 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.954903 4594 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.954907 4594 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.954910 4594 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.954913 4594 feature_gate.go:330] unrecognized feature gate: InsightsConfig Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.954917 4594 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.954920 4594 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.954924 4594 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.954927 4594 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.954930 4594 feature_gate.go:330] unrecognized feature gate: PinnedImages Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.954934 4594 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.954937 4594 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.954940 4594 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.954944 4594 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.954948 4594 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.954951 4594 feature_gate.go:330] 
unrecognized feature gate: PrivateHostedZoneAWS
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.954954 4594 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.954958 4594 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.954961 4594 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.954964 4594 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.954967 4594 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.954970 4594 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.954973 4594 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.954977 4594 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.954980 4594 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.954983 4594 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.954986 4594 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.954989 4594 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.954992 4594 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.954995 4594 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.954998 4594 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.955001 4594 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.955004 4594 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.955007 4594 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.955010 4594 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.955013 4594 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.955016 4594 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.955019 4594 feature_gate.go:330] unrecognized feature gate: NewOLM
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.955022 4594 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.955025 4594 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.955028 4594 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.955030 4594 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.955034 4594 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.955045 4594 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.962148 4594 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.962181 4594 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962244 4594 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962288 4594 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962296 4594 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962300 4594 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962305 4594 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962311 4594 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962316 4594 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962320 4594 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962324 4594 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962327 4594 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962331 4594 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962334 4594 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962338 4594 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962341 4594 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962345 4594 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962348 4594 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962351 4594 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962355 4594 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962359 4594 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962362 4594 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962366 4594 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962369 4594 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962372 4594 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962375 4594 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962378 4594 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962383 4594 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962386 4594 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962390 4594 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962393 4594 feature_gate.go:330] unrecognized feature gate: SignatureStores
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962396 4594 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962400 4594 feature_gate.go:330] unrecognized feature gate: NewOLM
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962403 4594 feature_gate.go:330] unrecognized feature gate: PinnedImages
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962406 4594 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962409 4594 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962414 4594 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962418 4594 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962423 4594 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962429 4594 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962435 4594 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962441 4594 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962446 4594 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962451 4594 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962456 4594 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962461 4594 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962466 4594 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962471 4594 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962475 4594 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962478 4594 feature_gate.go:330] unrecognized feature gate: Example
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962483 4594 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962487 4594 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962491 4594 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962494 4594 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962498 4594 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962502 4594 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962507 4594 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962511 4594 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962514 4594 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962518 4594 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962522 4594 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962526 4594 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962529 4594 feature_gate.go:330] unrecognized feature gate: OVNObservability
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962533 4594 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962537 4594 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962540 4594 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962543 4594 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962546 4594 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962549 4594 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962553 4594 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962556 4594 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962559 4594 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962562 4594 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.962570 4594 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962701 4594 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962707 4594 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962711 4594 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962715 4594 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962718 4594 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962721 4594 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962725 4594 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962730 4594 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962734 4594 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962737 4594 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962741 4594 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962745 4594 feature_gate.go:330] unrecognized feature gate: NewOLM
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962748 4594 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962751 4594 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962755 4594 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962758 4594 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962761 4594 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962764 4594 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962775 4594 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962778 4594 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962782 4594 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962786 4594 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962789 4594 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962793 4594 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962796 4594 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962800 4594 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962803 4594 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962806 4594 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962809 4594 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962812 4594 feature_gate.go:330] unrecognized feature gate: SignatureStores
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962815 4594 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962818 4594 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962821 4594 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962823 4594 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962827 4594 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962831 4594 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962835 4594 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962839 4594 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962842 4594 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962847 4594 feature_gate.go:330] unrecognized feature gate: PinnedImages
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962852 4594 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962855 4594 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962860 4594 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962864 4594 feature_gate.go:330] unrecognized feature gate: OVNObservability
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962868 4594 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962871 4594 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962875 4594 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962879 4594 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962883 4594 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962887 4594 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962890 4594 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962894 4594 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962897 4594 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962900 4594 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962903 4594 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962906 4594 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962909 4594 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962912 4594 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962915 4594 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962918 4594 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962921 4594 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962924 4594 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962927 4594 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962930 4594 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962933 4594 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962936 4594 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962940 4594 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962943 4594 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962946 4594 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962949 4594 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Nov 29 05:27:55 crc kubenswrapper[4594]: W1129 05:27:55.962952 4594 feature_gate.go:330] unrecognized feature gate: Example
Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.962956 4594 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.963140 4594 server.go:940] "Client rotation is on, will bootstrap in background"
Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.967580 4594 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary"
Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.967668 4594 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.968531 4594 server.go:997] "Starting client certificate rotation"
Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.968561 4594 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.969147 4594 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-12-12 08:29:09.65523137 +0000 UTC
Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.969236 4594 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 315h1m13.68599788s for next certificate rotation
Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.980544 4594 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.981948 4594 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Nov 29 05:27:55 crc kubenswrapper[4594]: I1129 05:27:55.997479 4594 log.go:25] "Validated CRI v1 runtime API"
Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.015859 4594 log.go:25] "Validated CRI v1 image API"
Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.017369 4594 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.021169 4594 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-11-29-05-24-22-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.021208 4594 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:49 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/containers/storage/overlay-containers/75d81934760b26101869fbd8e4b5954c62b019c1cc3e5a0c9f82ed8de46b3b22/userdata/shm:{mountpoint:/var/lib/containers/storage/overlay-containers/75d81934760b26101869fbd8e4b5954c62b019c1cc3e5a0c9f82ed8de46b3b22/userdata/shm major:0 minor:42 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:50 fsType:tmpfs blockSize:0} overlay_0-43:{mountpoint:/var/lib/containers/storage/overlay/94b752e0a51c0134b00ddef6dc7a933a9d7c1d9bdc88a18dae4192a0d557d623/merged major:0 minor:43 fsType:overlay blockSize:0}]
Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.033510 4594 manager.go:217] Machine: {Timestamp:2025-11-29 05:27:56.032148831 +0000 UTC m=+0.272658071 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2445404 MemoryCapacity:33654120448 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:64455698-ce5a-4229-a093-dc9c12114354 BootID:278d67da-c900-4140-8f95-1590a0940f46 Filesystems:[{Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:50 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/var/lib/containers/storage/overlay-containers/75d81934760b26101869fbd8e4b5954c62b019c1cc3e5a0c9f82ed8de46b3b22/userdata/shm DeviceMajor:0 DeviceMinor:42 Capacity:65536000 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:49 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827060224 Type:vfs Inodes:1048576 HasInodes:true} {Device:overlay_0-43 DeviceMajor:0 DeviceMinor:43 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:6a:09:1b Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:enp3s0 MacAddress:fa:16:3e:6a:09:1b Speed:-1 Mtu:1500} {Name:enp7s0 MacAddress:fa:16:3e:10:13:be Speed:-1 Mtu:1440} {Name:enp7s0.20 MacAddress:52:54:00:9b:78:9c Speed:-1 Mtu:1436} {Name:enp7s0.21 MacAddress:52:54:00:62:60:e9 Speed:-1 Mtu:1436} {Name:enp7s0.22 MacAddress:52:54:00:8d:75:77 Speed:-1 Mtu:1436} {Name:eth10 MacAddress:16:94:cc:a3:76:85 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:2e:a1:9a:8e:2b:fc Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654120448 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:65536 Type:Data Level:1} {Id:0 Size:65536 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:65536 Type:Data Level:1} {Id:1 Size:65536 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:65536 Type:Data Level:1} {Id:10 Size:65536 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:65536 Type:Data Level:1} {Id:11 Size:65536 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:65536 Type:Data Level:1} {Id:2 Size:65536 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:65536 Type:Data Level:1} {Id:3 Size:65536 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:65536 Type:Data Level:1} {Id:4 Size:65536 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:65536 Type:Data Level:1} {Id:5 Size:65536 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:65536 Type:Data Level:1} {Id:6 Size:65536 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:65536 Type:Data Level:1} {Id:7 Size:65536 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:65536 Type:Data Level:1} {Id:8 Size:65536 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:65536 Type:Data Level:1} {Id:9 Size:65536 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.033678 4594 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.033758 4594 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.034410 4594 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.034574 4594 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.034612 4594 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.034816 4594 topology_manager.go:138] "Creating topology manager with none policy"
Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.034826 4594 container_manager_linux.go:303] "Creating device plugin manager"
Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.035153 4594 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.035183 4594 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.035310 4594 state_mem.go:36] "Initialized new in-memory state store"
Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.035395 4594 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.036959 4594 kubelet.go:418] "Attempting to sync node with API server"
Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.036980 4594 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.037002 4594 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.037014 4594 kubelet.go:324] "Adding apiserver pod source"
Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.037026 4594 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.039615 4594 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.040380 4594 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Nov 29 05:27:56 crc kubenswrapper[4594]: W1129 05:27:56.041590 4594 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.25.120:6443: connect: connection refused Nov 29 05:27:56 crc kubenswrapper[4594]: W1129 05:27:56.041613 4594 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 192.168.25.120:6443: connect: connection refused Nov 29 05:27:56 crc kubenswrapper[4594]: E1129 05:27:56.041689 4594 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 192.168.25.120:6443: connect: connection refused" logger="UnhandledError" Nov 29 05:27:56 crc kubenswrapper[4594]: E1129 05:27:56.041707 4594 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.25.120:6443: connect: connection refused" logger="UnhandledError" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.041731 4594 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.042819 4594 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.042843 4594 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 
05:27:56.042851 4594 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.042858 4594 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.042875 4594 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.042882 4594 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.042889 4594 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.042901 4594 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.042909 4594 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.042916 4594 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.042948 4594 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.042954 4594 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.043311 4594 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.043774 4594 server.go:1280] "Started kubelet" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.044693 4594 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.045154 4594 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Nov 29 05:27:56 crc systemd[1]: Started Kubernetes Kubelet. 
Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.045974 4594 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.046047 4594 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 192.168.25.120:6443: connect: connection refused Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.046979 4594 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.047072 4594 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.047066 4594 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 09:46:08.847657481 +0000 UTC Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.047118 4594 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 868h18m12.800548305s for next certificate rotation Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.047148 4594 volume_manager.go:287] "The desired_state_of_world populator starts" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.047158 4594 volume_manager.go:289] "Starting Kubelet Volume Manager" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.047284 4594 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Nov 29 05:27:56 crc kubenswrapper[4594]: E1129 05:27:56.047368 4594 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Nov 29 05:27:56 crc kubenswrapper[4594]: W1129 05:27:56.048016 4594 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.25.120:6443: connect: connection refused Nov 29 05:27:56 crc kubenswrapper[4594]: E1129 05:27:56.048100 4594 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.25.120:6443: connect: connection refused" logger="UnhandledError" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.048210 4594 factory.go:55] Registering systemd factory Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.048314 4594 factory.go:221] Registration of the systemd container factory successfully Nov 29 05:27:56 crc kubenswrapper[4594]: E1129 05:27:56.048319 4594 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.120:6443: connect: connection refused" interval="200ms" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.048795 4594 factory.go:153] Registering CRI-O factory Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.048825 4594 factory.go:221] Registration of the crio container factory successfully Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.048899 4594 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.048941 4594 factory.go:103] Registering Raw factory Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.048963 4594 manager.go:1196] Started watching for new ooms in manager Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.049592 
4594 manager.go:319] Starting recovery of all containers Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.052777 4594 server.go:460] "Adding debug handlers to kubelet server" Nov 29 05:27:56 crc kubenswrapper[4594]: E1129 05:27:56.049604 4594 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 192.168.25.120:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187c63106263f29a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-11-29 05:27:56.043735706 +0000 UTC m=+0.284244916,LastTimestamp:2025-11-29 05:27:56.043735706 +0000 UTC m=+0.284244916,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.062495 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.062545 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.062557 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" 
seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.062568 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.062580 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.062588 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.062597 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.062605 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.062616 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.062624 4594 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.062634 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.062644 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.062654 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.062668 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.062679 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.062689 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.062700 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.062710 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.062718 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.062726 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.062734 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.062743 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" 
volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.062752 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.062792 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.062804 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.062815 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.062827 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.062837 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" 
seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.062846 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.062855 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.062863 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.062873 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.062883 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.062893 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 
05:27:56.062902 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.062910 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.062920 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.062929 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.062942 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.062953 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.062965 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.062976 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.062986 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.062994 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.063005 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.063014 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.063024 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.063032 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.063042 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.063051 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.063061 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.063070 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.063081 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" 
volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.063090 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.063100 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.063112 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.063121 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.063130 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.063138 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" 
seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.063146 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.063155 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.063163 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.063172 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.063180 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.063192 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Nov 29 05:27:56 crc 
kubenswrapper[4594]: I1129 05:27:56.063202 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.063211 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.063222 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.063230 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.063240 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.063248 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.063274 4594 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.063282 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.063290 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.065267 4594 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.065335 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.065363 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 
05:27:56.065392 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.065407 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.065432 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.065447 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.065459 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.065478 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.065492 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.065512 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.065523 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.065534 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.065552 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.065565 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.065583 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" 
volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.065597 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.065755 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.067710 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.067749 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.067786 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.067801 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" 
seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.067821 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.067847 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.067864 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.067912 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.067928 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.067943 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 
05:27:56.067963 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.067978 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.067999 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.068039 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.068072 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.068097 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.068120 4594 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.068143 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.068173 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.068191 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.068213 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.068236 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.068274 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.068295 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.068311 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.068327 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.068349 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.068366 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.068393 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" 
volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.068410 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.068429 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.068457 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.068477 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.068505 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.068527 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.068545 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.068567 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.068587 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.068611 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.068629 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.068648 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" 
seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.068672 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.068691 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.068717 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.068733 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.068747 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.068779 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Nov 29 05:27:56 crc 
kubenswrapper[4594]: I1129 05:27:56.068796 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.068816 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.068831 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.068847 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.068868 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.068883 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.068907 4594 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.068923 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.068937 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.068959 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.068975 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.068996 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.069008 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.069023 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.069044 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.069167 4594 manager.go:324] Recovery completed Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.069471 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.069494 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.069562 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.069620 4594 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.069640 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.069663 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.069678 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.069707 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.069727 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.069740 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.069756 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.069779 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.069793 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.069810 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.069823 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.069844 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.069863 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.069887 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.069906 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.069923 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.069940 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.069954 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.069968 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.069983 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.069997 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.070012 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.070023 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.070036 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" 
volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.070068 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.070082 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.070108 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.070118 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.070129 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.070145 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" 
seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.070158 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.070176 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.070186 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.070195 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.070209 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.070221 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 
05:27:56.070233 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.070243 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.070268 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.070284 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.070297 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.070312 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.070323 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.070335 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.070349 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.070359 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.070371 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.070382 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.070397 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" 
volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.070410 4594 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.070420 4594 reconstruct.go:97] "Volume reconstruction finished" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.070430 4594 reconciler.go:26] "Reconciler: start to sync state" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.080171 4594 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.080726 4594 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.081799 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.081832 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.081843 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.081946 4594 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.082276 4594 status_manager.go:217] "Starting to sync pod status with apiserver" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.082312 4594 kubelet.go:2335] "Starting kubelet main sync loop" Nov 29 05:27:56 crc kubenswrapper[4594]: E1129 05:27:56.082346 4594 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.082813 4594 cpu_manager.go:225] "Starting CPU manager" policy="none" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.082828 4594 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.082845 4594 state_mem.go:36] "Initialized new in-memory state store" Nov 29 05:27:56 crc kubenswrapper[4594]: W1129 05:27:56.083530 4594 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.25.120:6443: connect: connection refused Nov 29 05:27:56 crc kubenswrapper[4594]: E1129 05:27:56.083601 4594 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.25.120:6443: connect: connection refused" logger="UnhandledError" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.088596 4594 policy_none.go:49] "None policy: Start" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.089175 4594 memory_manager.go:170] "Starting memorymanager" policy="None" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.089201 4594 state_mem.go:35] "Initializing new in-memory state store" Nov 29 05:27:56 crc 
kubenswrapper[4594]: I1129 05:27:56.128935 4594 manager.go:334] "Starting Device Plugin manager" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.128971 4594 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.128982 4594 server.go:79] "Starting device plugin registration server" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.129251 4594 eviction_manager.go:189] "Eviction manager: starting control loop" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.129286 4594 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.129457 4594 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.129538 4594 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.129547 4594 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Nov 29 05:27:56 crc kubenswrapper[4594]: E1129 05:27:56.135144 4594 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.183222 4594 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.183346 4594 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.184361 4594 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.184400 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.184429 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.184699 4594 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.185044 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.185116 4594 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.185788 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.185823 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.185834 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.185983 4594 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.186074 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.186106 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.186116 
4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.186168 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.186203 4594 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.186598 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.186644 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.186656 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.186866 4594 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.186917 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.187177 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.187188 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.186931 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.187241 4594 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.187846 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.187868 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.187877 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.188132 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.188154 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.188163 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.188266 4594 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.188439 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.188485 4594 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.188886 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.188908 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.188917 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.189028 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.189052 4594 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.189288 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.189319 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.189338 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.189671 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.189698 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" 
Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.189711 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.230211 4594 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.230869 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.230912 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.230927 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.230957 4594 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 29 05:27:56 crc kubenswrapper[4594]: E1129 05:27:56.231460 4594 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 192.168.25.120:6443: connect: connection refused" node="crc" Nov 29 05:27:56 crc kubenswrapper[4594]: E1129 05:27:56.248874 4594 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.120:6443: connect: connection refused" interval="400ms" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.272995 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.273033 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.273061 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.273092 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.273113 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.273134 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 29 05:27:56 crc kubenswrapper[4594]: 
I1129 05:27:56.273155 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.273209 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.273230 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.273248 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.273284 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.273347 4594 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.273382 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.273410 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.273429 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.374547 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.374584 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.374616 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.374670 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.374674 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.374701 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.374719 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: 
\"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.374722 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.374767 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.374796 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.374803 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.374822 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.374849 4594 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.374876 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.374902 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.374921 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.374920 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.374958 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: 
\"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.374963 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.374828 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.374980 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.374992 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.375046 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.375049 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" 
(UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.375096 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.375063 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.375100 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.375122 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.375164 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.375594 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.432472 4594 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.434120 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.434168 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.434204 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.434227 4594 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 29 05:27:56 crc kubenswrapper[4594]: E1129 05:27:56.434690 4594 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 192.168.25.120:6443: connect: connection refused" node="crc" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.510223 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.515067 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.534855 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 29 05:27:56 crc kubenswrapper[4594]: W1129 05:27:56.544435 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-d569a61061da4012e46827ac48b6cf29452d3f566990aa118069890be85799fe WatchSource:0}: Error finding container d569a61061da4012e46827ac48b6cf29452d3f566990aa118069890be85799fe: Status 404 returned error can't find the container with id d569a61061da4012e46827ac48b6cf29452d3f566990aa118069890be85799fe Nov 29 05:27:56 crc kubenswrapper[4594]: W1129 05:27:56.552151 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-08e8c446effd3f30068e03ceb62e3f521ae3f397b1507e06fec6e7a249d526ef WatchSource:0}: Error finding container 08e8c446effd3f30068e03ceb62e3f521ae3f397b1507e06fec6e7a249d526ef: Status 404 returned error can't find the container with id 08e8c446effd3f30068e03ceb62e3f521ae3f397b1507e06fec6e7a249d526ef Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.560711 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.565885 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 29 05:27:56 crc kubenswrapper[4594]: W1129 05:27:56.573162 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-214af62aea853022ebc561c260cec1dfdc4fb5dbce66212bea4b4298c91d0db4 WatchSource:0}: Error finding container 214af62aea853022ebc561c260cec1dfdc4fb5dbce66212bea4b4298c91d0db4: Status 404 returned error can't find the container with id 214af62aea853022ebc561c260cec1dfdc4fb5dbce66212bea4b4298c91d0db4 Nov 29 05:27:56 crc kubenswrapper[4594]: W1129 05:27:56.576907 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-82517639410faa2823e6796a4b3bc16c878d80cb80ed46c7c5b37c32b38d77a4 WatchSource:0}: Error finding container 82517639410faa2823e6796a4b3bc16c878d80cb80ed46c7c5b37c32b38d77a4: Status 404 returned error can't find the container with id 82517639410faa2823e6796a4b3bc16c878d80cb80ed46c7c5b37c32b38d77a4 Nov 29 05:27:56 crc kubenswrapper[4594]: E1129 05:27:56.650471 4594 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.120:6443: connect: connection refused" interval="800ms" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.835218 4594 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.836617 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.836658 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.836671 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:27:56 crc kubenswrapper[4594]: I1129 05:27:56.836712 4594 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 29 05:27:56 crc kubenswrapper[4594]: E1129 05:27:56.837103 4594 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 192.168.25.120:6443: connect: connection refused" node="crc" Nov 29 05:27:56 crc kubenswrapper[4594]: W1129 05:27:56.960001 4594 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.25.120:6443: connect: connection refused Nov 29 05:27:56 crc kubenswrapper[4594]: E1129 05:27:56.960307 4594 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.25.120:6443: connect: connection refused" logger="UnhandledError" Nov 29 05:27:57 crc kubenswrapper[4594]: W1129 05:27:57.038629 4594 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.25.120:6443: connect: connection refused Nov 29 05:27:57 crc kubenswrapper[4594]: E1129 05:27:57.038721 4594 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 
192.168.25.120:6443: connect: connection refused" logger="UnhandledError" Nov 29 05:27:57 crc kubenswrapper[4594]: I1129 05:27:57.048426 4594 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 192.168.25.120:6443: connect: connection refused Nov 29 05:27:57 crc kubenswrapper[4594]: I1129 05:27:57.087269 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a960cfe07ae731cc2d07957dba63be2897328d662dd2672bd61344c8baac0228"} Nov 29 05:27:57 crc kubenswrapper[4594]: I1129 05:27:57.087471 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"08e8c446effd3f30068e03ceb62e3f521ae3f397b1507e06fec6e7a249d526ef"} Nov 29 05:27:57 crc kubenswrapper[4594]: I1129 05:27:57.088999 4594 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699" exitCode=0 Nov 29 05:27:57 crc kubenswrapper[4594]: I1129 05:27:57.089026 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699"} Nov 29 05:27:57 crc kubenswrapper[4594]: I1129 05:27:57.089051 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d569a61061da4012e46827ac48b6cf29452d3f566990aa118069890be85799fe"} Nov 29 05:27:57 crc kubenswrapper[4594]: I1129 05:27:57.089427 4594 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 05:27:57 crc kubenswrapper[4594]: I1129 05:27:57.090416 4594 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="ce8859694b66ee60760305d7b1511dfadff6b9e5e3cd3007fd5e24e6c3f76497" exitCode=0 Nov 29 05:27:57 crc kubenswrapper[4594]: I1129 05:27:57.090485 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"ce8859694b66ee60760305d7b1511dfadff6b9e5e3cd3007fd5e24e6c3f76497"} Nov 29 05:27:57 crc kubenswrapper[4594]: I1129 05:27:57.090523 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"b5981fd82475d6f104d2047e3b265531acc67498a3f7e174746d4163f047b156"} Nov 29 05:27:57 crc kubenswrapper[4594]: I1129 05:27:57.090652 4594 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 05:27:57 crc kubenswrapper[4594]: I1129 05:27:57.091091 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:27:57 crc kubenswrapper[4594]: I1129 05:27:57.091149 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:27:57 crc kubenswrapper[4594]: I1129 05:27:57.091163 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:27:57 crc kubenswrapper[4594]: I1129 05:27:57.091348 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:27:57 crc kubenswrapper[4594]: I1129 05:27:57.091376 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:27:57 crc 
kubenswrapper[4594]: I1129 05:27:57.091386 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:27:57 crc kubenswrapper[4594]: I1129 05:27:57.092488 4594 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="0d13fc8ae58af027b0d7a9041653871eb5fcf6b23b9c1dd9a38c3bf70333b3b6" exitCode=0 Nov 29 05:27:57 crc kubenswrapper[4594]: I1129 05:27:57.092582 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"0d13fc8ae58af027b0d7a9041653871eb5fcf6b23b9c1dd9a38c3bf70333b3b6"} Nov 29 05:27:57 crc kubenswrapper[4594]: I1129 05:27:57.092653 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"82517639410faa2823e6796a4b3bc16c878d80cb80ed46c7c5b37c32b38d77a4"} Nov 29 05:27:57 crc kubenswrapper[4594]: I1129 05:27:57.092754 4594 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 05:27:57 crc kubenswrapper[4594]: I1129 05:27:57.093823 4594 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 05:27:57 crc kubenswrapper[4594]: I1129 05:27:57.096178 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:27:57 crc kubenswrapper[4594]: I1129 05:27:57.096233 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:27:57 crc kubenswrapper[4594]: I1129 05:27:57.096270 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:27:57 crc kubenswrapper[4594]: I1129 05:27:57.096369 4594 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:27:57 crc kubenswrapper[4594]: I1129 05:27:57.096409 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:27:57 crc kubenswrapper[4594]: I1129 05:27:57.096421 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:27:57 crc kubenswrapper[4594]: I1129 05:27:57.097854 4594 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="d72ff5cb72ae536180cf8ebbc6af9c3a01bc453e5d917c8c7060fbd987de73c7" exitCode=0 Nov 29 05:27:57 crc kubenswrapper[4594]: I1129 05:27:57.097903 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"d72ff5cb72ae536180cf8ebbc6af9c3a01bc453e5d917c8c7060fbd987de73c7"} Nov 29 05:27:57 crc kubenswrapper[4594]: I1129 05:27:57.097936 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"214af62aea853022ebc561c260cec1dfdc4fb5dbce66212bea4b4298c91d0db4"} Nov 29 05:27:57 crc kubenswrapper[4594]: I1129 05:27:57.098038 4594 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 05:27:57 crc kubenswrapper[4594]: I1129 05:27:57.099094 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:27:57 crc kubenswrapper[4594]: I1129 05:27:57.099120 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:27:57 crc kubenswrapper[4594]: I1129 05:27:57.099131 4594 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Nov 29 05:27:57 crc kubenswrapper[4594]: E1129 05:27:57.451219 4594 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.120:6443: connect: connection refused" interval="1.6s" Nov 29 05:27:57 crc kubenswrapper[4594]: W1129 05:27:57.495053 4594 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.25.120:6443: connect: connection refused Nov 29 05:27:57 crc kubenswrapper[4594]: E1129 05:27:57.495136 4594 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.25.120:6443: connect: connection refused" logger="UnhandledError" Nov 29 05:27:57 crc kubenswrapper[4594]: W1129 05:27:57.533822 4594 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 192.168.25.120:6443: connect: connection refused Nov 29 05:27:57 crc kubenswrapper[4594]: E1129 05:27:57.533895 4594 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 192.168.25.120:6443: connect: connection refused" logger="UnhandledError" Nov 29 05:27:57 crc kubenswrapper[4594]: I1129 05:27:57.638012 4594 kubelet_node_status.go:401] "Setting node annotation to enable volume 
controller attach/detach" Nov 29 05:27:57 crc kubenswrapper[4594]: I1129 05:27:57.640568 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:27:57 crc kubenswrapper[4594]: I1129 05:27:57.640597 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:27:57 crc kubenswrapper[4594]: I1129 05:27:57.640607 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:27:57 crc kubenswrapper[4594]: I1129 05:27:57.640642 4594 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 29 05:27:57 crc kubenswrapper[4594]: E1129 05:27:57.641163 4594 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 192.168.25.120:6443: connect: connection refused" node="crc" Nov 29 05:27:58 crc kubenswrapper[4594]: I1129 05:27:58.102600 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"261ab2af67dd877cead51f204629b59233865717d4805c3d4d80c1a5d51e2021"} Nov 29 05:27:58 crc kubenswrapper[4594]: I1129 05:27:58.102645 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b416f8ad36a09f58e68b1e23784b344534ccfb013a3272c0754602431a3a3737"} Nov 29 05:27:58 crc kubenswrapper[4594]: I1129 05:27:58.102656 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"bf23cd4a300a49fabdf8fcaacfda760563c8e53cf70de7a4c07be717fd76b660"} Nov 29 05:27:58 crc kubenswrapper[4594]: I1129 
05:27:58.102655 4594 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 05:27:58 crc kubenswrapper[4594]: I1129 05:27:58.103354 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:27:58 crc kubenswrapper[4594]: I1129 05:27:58.103373 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:27:58 crc kubenswrapper[4594]: I1129 05:27:58.103380 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:27:58 crc kubenswrapper[4594]: I1129 05:27:58.106744 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"3ba49c3a694bc245ce18b54b74156e6e7aaccd4fcf707264a278f86f52943392"} Nov 29 05:27:58 crc kubenswrapper[4594]: I1129 05:27:58.106772 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7e070e0ea87ae0cfa8fcc6aadd13e7ae3c2ab3eb1ea3daf6178c9d35796ba3ad"} Nov 29 05:27:58 crc kubenswrapper[4594]: I1129 05:27:58.106785 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7b8d3be68869fef8b377ce1d84e3edf447496a867c40e88d013ef0d1c5629de2"} Nov 29 05:27:58 crc kubenswrapper[4594]: I1129 05:27:58.106795 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6655c8a5c56e4d2b3e8fbd88308d04ded2b41f7243bf90cb2dbbd6ad93f3153f"} Nov 29 05:27:58 crc kubenswrapper[4594]: I1129 05:27:58.106804 4594 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"583ca6d59501822022d5d15a450d4403592d248bce7be4b22b1c7baef4e93000"} Nov 29 05:27:58 crc kubenswrapper[4594]: I1129 05:27:58.106880 4594 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 05:27:58 crc kubenswrapper[4594]: I1129 05:27:58.107607 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:27:58 crc kubenswrapper[4594]: I1129 05:27:58.107628 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:27:58 crc kubenswrapper[4594]: I1129 05:27:58.107637 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:27:58 crc kubenswrapper[4594]: I1129 05:27:58.109421 4594 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="b0d8c0d34a6ae17df16c42fdae6834950ba98b2b3506b0793d66245524f8d4b7" exitCode=0 Nov 29 05:27:58 crc kubenswrapper[4594]: I1129 05:27:58.109467 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"b0d8c0d34a6ae17df16c42fdae6834950ba98b2b3506b0793d66245524f8d4b7"} Nov 29 05:27:58 crc kubenswrapper[4594]: I1129 05:27:58.109540 4594 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 05:27:58 crc kubenswrapper[4594]: I1129 05:27:58.110160 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:27:58 crc kubenswrapper[4594]: I1129 05:27:58.110185 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:27:58 crc kubenswrapper[4594]: 
I1129 05:27:58.110194 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:27:58 crc kubenswrapper[4594]: I1129 05:27:58.112456 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"9134ef790ac41937f72072758c96622f8e0acd01a4d545c497933f0b2cf17ef2"} Nov 29 05:27:58 crc kubenswrapper[4594]: I1129 05:27:58.112553 4594 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 05:27:58 crc kubenswrapper[4594]: I1129 05:27:58.113196 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:27:58 crc kubenswrapper[4594]: I1129 05:27:58.113223 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:27:58 crc kubenswrapper[4594]: I1129 05:27:58.113233 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:27:58 crc kubenswrapper[4594]: I1129 05:27:58.115296 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"fe6aa22ca88502e90dc3ea058d420a497838102af8aa1afc78edc4a52bed8a6b"} Nov 29 05:27:58 crc kubenswrapper[4594]: I1129 05:27:58.115324 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"2c7130e6d58c27093d1efa4f7260aba78681945df2d956da7878075ba5eb1d05"} Nov 29 05:27:58 crc kubenswrapper[4594]: I1129 05:27:58.115335 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"95cdf0c76e877968d93b7b86fcdf319cfd2072ceb45903b275dfbe38db4a4e92"} Nov 29 05:27:58 crc kubenswrapper[4594]: I1129 05:27:58.115400 4594 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 05:27:58 crc kubenswrapper[4594]: I1129 05:27:58.116024 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:27:58 crc kubenswrapper[4594]: I1129 05:27:58.116053 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:27:58 crc kubenswrapper[4594]: I1129 05:27:58.116064 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:27:59 crc kubenswrapper[4594]: I1129 05:27:59.123360 4594 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="9127badd8790425a0d7c5f518a40c6befac17a2c4caed4099d2c39f4172687b9" exitCode=0 Nov 29 05:27:59 crc kubenswrapper[4594]: I1129 05:27:59.123442 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"9127badd8790425a0d7c5f518a40c6befac17a2c4caed4099d2c39f4172687b9"} Nov 29 05:27:59 crc kubenswrapper[4594]: I1129 05:27:59.123464 4594 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 05:27:59 crc kubenswrapper[4594]: I1129 05:27:59.123569 4594 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 05:27:59 crc kubenswrapper[4594]: I1129 05:27:59.124209 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:27:59 crc kubenswrapper[4594]: I1129 05:27:59.124246 4594 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:27:59 crc kubenswrapper[4594]: I1129 05:27:59.124270 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:27:59 crc kubenswrapper[4594]: I1129 05:27:59.124943 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:27:59 crc kubenswrapper[4594]: I1129 05:27:59.124972 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:27:59 crc kubenswrapper[4594]: I1129 05:27:59.124984 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:27:59 crc kubenswrapper[4594]: I1129 05:27:59.241946 4594 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 05:27:59 crc kubenswrapper[4594]: I1129 05:27:59.242907 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:27:59 crc kubenswrapper[4594]: I1129 05:27:59.242964 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:27:59 crc kubenswrapper[4594]: I1129 05:27:59.242976 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:27:59 crc kubenswrapper[4594]: I1129 05:27:59.243018 4594 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 29 05:28:00 crc kubenswrapper[4594]: I1129 05:28:00.070031 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 05:28:00 crc kubenswrapper[4594]: I1129 05:28:00.070218 4594 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 05:28:00 crc kubenswrapper[4594]: I1129 05:28:00.071390 4594 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:00 crc kubenswrapper[4594]: I1129 05:28:00.071433 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:00 crc kubenswrapper[4594]: I1129 05:28:00.071441 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:00 crc kubenswrapper[4594]: I1129 05:28:00.130173 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"af8274271e0af67ded2a34de10632e52c1d6c9c4056340cf1f9a66a83085d940"} Nov 29 05:28:00 crc kubenswrapper[4594]: I1129 05:28:00.130213 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"1bc053726d5990592c6a87dcc7d94a7a8fb593d15260322d65f03ac83c5e37a2"} Nov 29 05:28:00 crc kubenswrapper[4594]: I1129 05:28:00.130224 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"285db16ba35e1c148c060a5e6623a4d2802fb3e6be8ddfab77a40e33909ecc3a"} Nov 29 05:28:00 crc kubenswrapper[4594]: I1129 05:28:00.130233 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"dd75bab9235179d876f0a53d6952b71e6b702930b3a204cfe9895cf9fec29f2a"} Nov 29 05:28:00 crc kubenswrapper[4594]: I1129 05:28:00.130240 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"1d55facfb116631f2d7c219ec1007cc10a85c3407d58ada31d42ed762d481a2f"} Nov 29 05:28:00 crc kubenswrapper[4594]: I1129 05:28:00.130366 4594 kubelet_node_status.go:401] 
"Setting node annotation to enable volume controller attach/detach" Nov 29 05:28:00 crc kubenswrapper[4594]: I1129 05:28:00.131113 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:00 crc kubenswrapper[4594]: I1129 05:28:00.131172 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:00 crc kubenswrapper[4594]: I1129 05:28:00.131185 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:00 crc kubenswrapper[4594]: I1129 05:28:00.507999 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 05:28:00 crc kubenswrapper[4594]: I1129 05:28:00.508115 4594 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 05:28:00 crc kubenswrapper[4594]: I1129 05:28:00.508766 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:00 crc kubenswrapper[4594]: I1129 05:28:00.508801 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:00 crc kubenswrapper[4594]: I1129 05:28:00.508812 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:00 crc kubenswrapper[4594]: I1129 05:28:00.565809 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Nov 29 05:28:00 crc kubenswrapper[4594]: I1129 05:28:00.816902 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 29 05:28:00 crc kubenswrapper[4594]: I1129 05:28:00.817296 4594 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 05:28:00 
crc kubenswrapper[4594]: I1129 05:28:00.818630 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:00 crc kubenswrapper[4594]: I1129 05:28:00.818671 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:00 crc kubenswrapper[4594]: I1129 05:28:00.818683 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:00 crc kubenswrapper[4594]: I1129 05:28:00.888190 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Nov 29 05:28:01 crc kubenswrapper[4594]: I1129 05:28:01.132505 4594 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 05:28:01 crc kubenswrapper[4594]: I1129 05:28:01.133473 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:01 crc kubenswrapper[4594]: I1129 05:28:01.133522 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:01 crc kubenswrapper[4594]: I1129 05:28:01.133535 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:02 crc kubenswrapper[4594]: I1129 05:28:02.134923 4594 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 05:28:02 crc kubenswrapper[4594]: I1129 05:28:02.135722 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:02 crc kubenswrapper[4594]: I1129 05:28:02.135762 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:02 crc kubenswrapper[4594]: I1129 05:28:02.135771 4594 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Nov 29 05:28:02 crc kubenswrapper[4594]: I1129 05:28:02.985674 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 05:28:02 crc kubenswrapper[4594]: I1129 05:28:02.985806 4594 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 05:28:02 crc kubenswrapper[4594]: I1129 05:28:02.986628 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:02 crc kubenswrapper[4594]: I1129 05:28:02.986670 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:02 crc kubenswrapper[4594]: I1129 05:28:02.986681 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:03 crc kubenswrapper[4594]: I1129 05:28:03.817608 4594 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Nov 29 05:28:03 crc kubenswrapper[4594]: I1129 05:28:03.817660 4594 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Nov 29 05:28:04 crc kubenswrapper[4594]: I1129 05:28:04.035668 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 29 05:28:04 crc 
kubenswrapper[4594]: I1129 05:28:04.035862 4594 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 05:28:04 crc kubenswrapper[4594]: I1129 05:28:04.036904 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:04 crc kubenswrapper[4594]: I1129 05:28:04.036953 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:04 crc kubenswrapper[4594]: I1129 05:28:04.036966 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:04 crc kubenswrapper[4594]: I1129 05:28:04.381592 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 29 05:28:04 crc kubenswrapper[4594]: I1129 05:28:04.381714 4594 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 05:28:04 crc kubenswrapper[4594]: I1129 05:28:04.382721 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:04 crc kubenswrapper[4594]: I1129 05:28:04.382757 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:04 crc kubenswrapper[4594]: I1129 05:28:04.382767 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:04 crc kubenswrapper[4594]: I1129 05:28:04.385367 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 29 05:28:05 crc kubenswrapper[4594]: I1129 05:28:05.142192 4594 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 05:28:05 crc kubenswrapper[4594]: I1129 05:28:05.142309 4594 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 29 05:28:05 crc kubenswrapper[4594]: I1129 05:28:05.143178 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:05 crc kubenswrapper[4594]: I1129 05:28:05.143217 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:05 crc kubenswrapper[4594]: I1129 05:28:05.143228 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:05 crc kubenswrapper[4594]: I1129 05:28:05.961663 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 29 05:28:06 crc kubenswrapper[4594]: E1129 05:28:06.135307 4594 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Nov 29 05:28:06 crc kubenswrapper[4594]: I1129 05:28:06.144243 4594 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 05:28:06 crc kubenswrapper[4594]: I1129 05:28:06.145562 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:06 crc kubenswrapper[4594]: I1129 05:28:06.145621 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:06 crc kubenswrapper[4594]: I1129 05:28:06.145632 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:06 crc kubenswrapper[4594]: I1129 05:28:06.147419 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 29 05:28:07 crc kubenswrapper[4594]: I1129 05:28:07.146435 4594 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 05:28:07 crc kubenswrapper[4594]: I1129 05:28:07.147285 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:07 crc kubenswrapper[4594]: I1129 05:28:07.147315 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:07 crc kubenswrapper[4594]: I1129 05:28:07.147329 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:08 crc kubenswrapper[4594]: I1129 05:28:08.048245 4594 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Nov 29 05:28:08 crc kubenswrapper[4594]: I1129 05:28:08.148638 4594 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 05:28:08 crc kubenswrapper[4594]: I1129 05:28:08.149509 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:08 crc kubenswrapper[4594]: I1129 05:28:08.149563 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:08 crc kubenswrapper[4594]: I1129 05:28:08.149591 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:08 crc kubenswrapper[4594]: I1129 05:28:08.307166 4594 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path 
\"/livez\"","reason":"Forbidden","details":{},"code":403} Nov 29 05:28:08 crc kubenswrapper[4594]: I1129 05:28:08.307211 4594 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Nov 29 05:28:08 crc kubenswrapper[4594]: I1129 05:28:08.312035 4594 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Nov 29 05:28:08 crc kubenswrapper[4594]: I1129 05:28:08.312094 4594 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Nov 29 05:28:10 crc kubenswrapper[4594]: I1129 05:28:10.070640 4594 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Nov 29 05:28:10 crc kubenswrapper[4594]: I1129 05:28:10.070701 4594 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Nov 29 05:28:10 crc kubenswrapper[4594]: I1129 05:28:10.912821 4594 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Nov 29 05:28:10 crc kubenswrapper[4594]: I1129 05:28:10.913080 4594 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 05:28:10 crc kubenswrapper[4594]: I1129 05:28:10.913841 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:10 crc kubenswrapper[4594]: I1129 05:28:10.913878 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:10 crc kubenswrapper[4594]: I1129 05:28:10.913888 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:10 crc kubenswrapper[4594]: I1129 05:28:10.922317 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Nov 29 05:28:11 crc kubenswrapper[4594]: I1129 05:28:11.155597 4594 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 05:28:11 crc kubenswrapper[4594]: I1129 05:28:11.157210 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:11 crc kubenswrapper[4594]: I1129 05:28:11.157240 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:11 crc kubenswrapper[4594]: I1129 05:28:11.157280 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:12 crc kubenswrapper[4594]: I1129 05:28:12.950669 4594 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Nov 29 05:28:12 crc 
kubenswrapper[4594]: I1129 05:28:12.950784 4594 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Nov 29 05:28:12 crc kubenswrapper[4594]: I1129 05:28:12.989897 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 05:28:12 crc kubenswrapper[4594]: I1129 05:28:12.990061 4594 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 05:28:12 crc kubenswrapper[4594]: I1129 05:28:12.990390 4594 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Nov 29 05:28:12 crc kubenswrapper[4594]: I1129 05:28:12.990465 4594 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Nov 29 05:28:12 crc kubenswrapper[4594]: I1129 05:28:12.990984 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:12 crc kubenswrapper[4594]: I1129 05:28:12.991036 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:12 crc kubenswrapper[4594]: I1129 05:28:12.991052 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" 
Nov 29 05:28:12 crc kubenswrapper[4594]: I1129 05:28:12.993998 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 05:28:13 crc kubenswrapper[4594]: I1129 05:28:13.159966 4594 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 05:28:13 crc kubenswrapper[4594]: I1129 05:28:13.160313 4594 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Nov 29 05:28:13 crc kubenswrapper[4594]: I1129 05:28:13.160359 4594 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Nov 29 05:28:13 crc kubenswrapper[4594]: I1129 05:28:13.160829 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:13 crc kubenswrapper[4594]: I1129 05:28:13.160867 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:13 crc kubenswrapper[4594]: I1129 05:28:13.160878 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:13 crc kubenswrapper[4594]: E1129 05:28:13.301638 4594 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="3.2s" Nov 29 05:28:13 crc kubenswrapper[4594]: I1129 05:28:13.303603 4594 
trace.go:236] Trace[1826033694]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (29-Nov-2025 05:27:59.092) (total time: 14210ms): Nov 29 05:28:13 crc kubenswrapper[4594]: Trace[1826033694]: ---"Objects listed" error: 14210ms (05:28:13.303) Nov 29 05:28:13 crc kubenswrapper[4594]: Trace[1826033694]: [14.210785548s] [14.210785548s] END Nov 29 05:28:13 crc kubenswrapper[4594]: I1129 05:28:13.303626 4594 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Nov 29 05:28:13 crc kubenswrapper[4594]: I1129 05:28:13.303605 4594 trace.go:236] Trace[647321885]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (29-Nov-2025 05:28:00.059) (total time: 13244ms): Nov 29 05:28:13 crc kubenswrapper[4594]: Trace[647321885]: ---"Objects listed" error: 13244ms (05:28:13.303) Nov 29 05:28:13 crc kubenswrapper[4594]: Trace[647321885]: [13.244325328s] [13.244325328s] END Nov 29 05:28:13 crc kubenswrapper[4594]: I1129 05:28:13.303654 4594 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Nov 29 05:28:13 crc kubenswrapper[4594]: I1129 05:28:13.304731 4594 trace.go:236] Trace[685276848]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (29-Nov-2025 05:27:59.254) (total time: 14050ms): Nov 29 05:28:13 crc kubenswrapper[4594]: Trace[685276848]: ---"Objects listed" error: 14050ms (05:28:13.304) Nov 29 05:28:13 crc kubenswrapper[4594]: Trace[685276848]: [14.050487201s] [14.050487201s] END Nov 29 05:28:13 crc kubenswrapper[4594]: I1129 05:28:13.304796 4594 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Nov 29 05:28:13 crc kubenswrapper[4594]: I1129 05:28:13.308050 4594 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Nov 29 05:28:13 crc kubenswrapper[4594]: E1129 05:28:13.309408 4594 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes 
\"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Nov 29 05:28:13 crc kubenswrapper[4594]: I1129 05:28:13.310516 4594 trace.go:236] Trace[218053272]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (29-Nov-2025 05:27:59.401) (total time: 13908ms): Nov 29 05:28:13 crc kubenswrapper[4594]: Trace[218053272]: ---"Objects listed" error: 13908ms (05:28:13.310) Nov 29 05:28:13 crc kubenswrapper[4594]: Trace[218053272]: [13.9088636s] [13.9088636s] END Nov 29 05:28:13 crc kubenswrapper[4594]: I1129 05:28:13.310657 4594 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Nov 29 05:28:13 crc kubenswrapper[4594]: I1129 05:28:13.817714 4594 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Nov 29 05:28:13 crc kubenswrapper[4594]: I1129 05:28:13.817783 4594 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.044831 4594 apiserver.go:52] "Watching apiserver" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.047419 4594 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.047751 4594 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"] Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.048101 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.048169 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 29 05:28:14 crc kubenswrapper[4594]: E1129 05:28:14.048182 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.048434 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.048544 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 29 05:28:14 crc kubenswrapper[4594]: E1129 05:28:14.048606 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.048645 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.048664 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 05:28:14 crc kubenswrapper[4594]: E1129 05:28:14.048721 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.050414 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.050415 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.050456 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.051278 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.051315 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.051368 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.051423 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.051924 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.051925 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.074419 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.082992 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.091055 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.098174 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.105011 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.112553 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.119869 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.135119 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.147869 4594 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.163226 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.165150 4594 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="3ba49c3a694bc245ce18b54b74156e6e7aaccd4fcf707264a278f86f52943392" exitCode=255 Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.165188 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"3ba49c3a694bc245ce18b54b74156e6e7aaccd4fcf707264a278f86f52943392"} Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.173622 4594 scope.go:117] "RemoveContainer" containerID="3ba49c3a694bc245ce18b54b74156e6e7aaccd4fcf707264a278f86f52943392" Nov 29 
05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.173752 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.174525 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.183881 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.191479 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.204445 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.211643 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.212932 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.212990 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.213011 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.213036 4594 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.213054 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.213071 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.213087 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.213102 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.213121 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod 
\"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.213135 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.213937 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.214204 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.214366 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.214445 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.215358 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.215520 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.215665 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.216293 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.216668 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.217307 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.217342 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.217368 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 29 
05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.217387 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.217480 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.217619 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.217697 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.217732 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.217972 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.218010 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.218037 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.218062 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.218122 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.217701 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). 
InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.218156 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.217814 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.217941 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.218197 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.217984 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.218229 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.218271 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.218243 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.218354 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.218381 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.218398 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.218417 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.218439 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.218498 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.218534 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.218535 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.218558 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.218578 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.218703 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.219001 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.219015 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.219354 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.219394 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.219410 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.219605 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.219667 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.219694 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.219723 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.219746 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.219780 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.219839 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.219863 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.219916 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.219950 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Nov 
29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.220549 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.220580 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.220601 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.220629 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.220697 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.220715 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.220730 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.220747 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.220765 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.220783 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.220800 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.220816 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.220831 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.220850 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.220867 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.220885 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.220901 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.220918 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.220935 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.220950 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.220967 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.220981 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod 
\"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.220995 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.221012 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.221036 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.221052 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.221069 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.221086 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.221103 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.221117 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.221131 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.221145 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.221163 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Nov 29 05:28:14 crc 
kubenswrapper[4594]: I1129 05:28:14.221177 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.221193 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.221209 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.221225 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.221242 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.221272 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: 
\"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.221398 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.221444 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.221477 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.221495 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.221510 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Nov 29 05:28:14 
crc kubenswrapper[4594]: I1129 05:28:14.221525 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.221539 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.221554 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.221577 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.221594 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.221610 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.221627 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.221644 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.221680 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.221696 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.221711 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 
05:28:14.221729 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.221745 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.221764 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.221781 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.221817 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.221888 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: 
\"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.221907 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.221926 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.221964 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.221978 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.221999 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.222016 4594 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.222040 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.222057 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.222074 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.222092 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.222113 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: 
\"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.222131 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.222147 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.222163 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.222184 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.222201 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Nov 29 05:28:14 crc 
kubenswrapper[4594]: I1129 05:28:14.222218 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.222234 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.222264 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.222281 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.222299 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.222314 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: 
\"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.222330 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.222346 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.222363 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.222379 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.222399 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 
05:28:14.222414 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.222433 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.222450 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.222467 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.222485 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.222503 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod 
\"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.222520 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.222537 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.222552 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.222568 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.222604 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.222622 
4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.222638 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.222661 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.222679 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.222696 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.222712 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") 
pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.222728 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.222743 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.222760 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.222777 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.222763 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.222793 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.222826 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.222847 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.222872 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.222892 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.222910 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.222930 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.222947 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.222963 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.222980 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.222997 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: 
\"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.223013 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.223900 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.223942 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.223970 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.223997 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.224018 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.224046 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.224064 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.224087 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.224105 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.224123 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 29 05:28:14 
crc kubenswrapper[4594]: I1129 05:28:14.224141 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.224159 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.224175 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.224191 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.224207 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.224223 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod 
\"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.224244 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.224277 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.224294 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.224312 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.224329 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.224345 4594 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.224363 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.224382 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.224402 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.224425 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.224459 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.224481 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.224581 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.224604 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.224656 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.224685 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" 
(UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.224713 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.224737 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.224758 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.224779 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.224798 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.224818 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.224836 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.224855 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.224874 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 29 05:28:14 crc kubenswrapper[4594]: 
I1129 05:28:14.224892 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.224910 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.224927 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.224999 4594 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.225011 4594 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.225029 4594 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Nov 
29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.225041 4594 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.225052 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.225061 4594 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.225070 4594 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.225082 4594 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.225092 4594 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.225101 4594 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.225110 4594 reconciler_common.go:293] "Volume 
detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.225120 4594 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.225130 4594 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.225140 4594 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.225149 4594 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.225159 4594 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.225170 4594 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.225179 4594 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.225188 4594 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.225198 4594 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.225208 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.225218 4594 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.225228 4594 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.219398 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.219564 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.219691 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.219771 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.219831 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.231493 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.220564 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.220714 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.220732 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.220757 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.219934 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.220841 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.220856 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.221680 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.221723 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.222110 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.222165 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.222264 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.222439 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.222590 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.222741 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.222775 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.223001 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.223311 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.223693 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.223907 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.223953 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.225739 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.225900 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.226097 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.226156 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.226276 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.226294 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.226465 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.226486 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.226586 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.226786 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.227005 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.227344 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: E1129 05:28:14.228887 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 05:28:14.728871436 +0000 UTC m=+18.969380657 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.230184 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.230242 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.230581 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.230888 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.231312 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.231356 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.231438 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.231433 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.231848 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.231896 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.232039 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.232397 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.232495 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: E1129 05:28:14.232646 4594 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 29 05:28:14 crc kubenswrapper[4594]: E1129 05:28:14.232701 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-29 05:28:14.732685427 +0000 UTC m=+18.973194647 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.232817 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.233097 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.233105 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.233134 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: E1129 05:28:14.233171 4594 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 29 05:28:14 crc kubenswrapper[4594]: E1129 05:28:14.233212 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-29 05:28:14.733199149 +0000 UTC m=+18.973708369 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.233225 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.233320 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.233506 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.233551 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.233581 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.233639 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.233818 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.233834 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.234160 4594 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.234176 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.234314 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.234249 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.234330 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.234719 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.234732 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.234911 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.235244 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.235384 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.235435 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.235663 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.235847 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.235890 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.235991 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.236265 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 29 05:28:14 crc kubenswrapper[4594]: E1129 05:28:14.243325 4594 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Nov 29 05:28:14 crc kubenswrapper[4594]: E1129 05:28:14.243349 4594 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Nov 29 05:28:14 crc kubenswrapper[4594]: E1129 05:28:14.243363 4594 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Nov 29 05:28:14 crc kubenswrapper[4594]: E1129 05:28:14.243407 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-29 05:28:14.74339312 +0000 UTC m=+18.983902341 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.245744 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.245922 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.246041 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 29 05:28:14 crc kubenswrapper[4594]: E1129 05:28:14.246504 4594 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Nov 29 05:28:14 crc kubenswrapper[4594]: E1129 05:28:14.246526 4594 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Nov 29 05:28:14 crc kubenswrapper[4594]: E1129 05:28:14.246536 4594 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Nov 29 05:28:14 crc kubenswrapper[4594]: E1129 05:28:14.246581 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-29 05:28:14.746565659 +0000 UTC m=+18.987074878 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.246729 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.246832 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.247359 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.246953 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.247545 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.247609 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.247630 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.247895 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.248107 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.248162 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.248836 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.248859 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.249151 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.250473 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.250496 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.250372 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.250648 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.250670 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.250977 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.251032 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.251124 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.251160 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.251476 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.251508 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.251616 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.251629 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.251630 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.251782 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.252105 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.252154 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.252227 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.252027 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.252802 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.252813 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.252934 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.252996 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.253460 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.253475 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.253647 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.253666 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.253682 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.253690 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.253713 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.253803 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.253960 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.254062 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.254105 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.254240 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.254265 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.254324 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.254426 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.253968 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.254829 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.254815 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.255146 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.255191 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.255205 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.255274 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.255241 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.255466 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.255549 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.255630 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.256300 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert".
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.256490 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.256672 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.256684 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.256763 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.257087 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.257445 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.257498 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.257190 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.257190 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.257580 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.258049 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.258142 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.258146 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.258284 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.258294 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.258354 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.258360 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.258439 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.258467 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.258487 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.258564 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.258608 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.259592 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.259977 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.260627 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.260679 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.261403 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.261464 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.261661 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.262685 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.272996 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.280710 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.281082 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.290151 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.326509 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.326728 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.326665 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod 
\"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.326911 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.327218 4594 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.327327 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.327410 4594 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.327481 4594 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.327533 4594 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.327605 4594 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.327654 4594 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.327727 4594 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.327805 4594 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.327893 4594 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.327947 4594 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.328028 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.328098 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: 
\"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.328172 4594 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.328223 4594 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.328299 4594 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.328365 4594 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.328438 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.328494 4594 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.328568 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 
crc kubenswrapper[4594]: I1129 05:28:14.328642 4594 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.328691 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.328759 4594 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.328830 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.328912 4594 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.328969 4594 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.329067 4594 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.329151 4594 reconciler_common.go:293] 
"Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.329204 4594 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.329298 4594 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.329370 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.329450 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.329501 4594 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.329567 4594 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.329640 4594 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") 
on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.329690 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.329763 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.329815 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.329903 4594 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.329983 4594 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.330044 4594 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.330117 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" 
DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.330173 4594 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.330223 4594 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.330287 4594 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.330343 4594 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.330393 4594 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.330675 4594 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.330731 4594 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.330788 4594 reconciler_common.go:293] "Volume detached for 
volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.330833 4594 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.330877 4594 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.330931 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.330983 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.331974 4594 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.332709 4594 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.332736 4594 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.332749 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.332759 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.332770 4594 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.332811 4594 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.332821 4594 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.332833 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.332844 4594 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node 
\"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.332855 4594 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.332868 4594 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.332879 4594 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.332890 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.332900 4594 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.332913 4594 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.332922 4594 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.332930 4594 reconciler_common.go:293] "Volume 
detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.332939 4594 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.332948 4594 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.332960 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.332973 4594 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.332982 4594 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.332993 4594 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.333003 4594 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.333012 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.333037 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.333047 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.333061 4594 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.333072 4594 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.333080 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.333090 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: 
\"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.333100 4594 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.333109 4594 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.333118 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.333133 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.333144 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.333153 4594 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.333162 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.333172 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.333182 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.333191 4594 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.333200 4594 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.333211 4594 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.333221 4594 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.333230 4594 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node 
\"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.333238 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.333247 4594 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.333270 4594 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.333279 4594 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.333288 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.333299 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.333311 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: 
I1129 05:28:14.333319 4594 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.333330 4594 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.333339 4594 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.333349 4594 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.333358 4594 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.333366 4594 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.333374 4594 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.333383 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: 
\"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.333390 4594 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.333399 4594 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.333407 4594 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.333416 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.333424 4594 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.333432 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.333441 4594 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.333450 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.333461 4594 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.333470 4594 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.333478 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.333487 4594 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.333495 4594 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.333505 4594 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.333514 4594 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.333523 4594 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.333531 4594 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.333541 4594 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.333550 4594 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.333559 4594 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.333567 4594 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node 
\"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.333575 4594 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.333586 4594 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.333595 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.333605 4594 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.333613 4594 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.333622 4594 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.333630 4594 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.333638 4594 reconciler_common.go:293] "Volume 
detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.333647 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.333656 4594 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.333664 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.333673 4594 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.333683 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.333691 4594 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.333699 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: 
\"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.333707 4594 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.333715 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.333724 4594 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.333732 4594 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.333740 4594 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.333748 4594 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.333756 4594 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.333764 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.333773 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.333781 4594 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.333789 4594 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.333798 4594 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.333805 4594 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.333813 4594 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath 
\"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.333824 4594 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.333831 4594 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.333839 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.333847 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.333855 4594 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.333864 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.360594 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.366689 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.371364 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 29 05:28:14 crc kubenswrapper[4594]: W1129 05:28:14.376961 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-8860a5cd3045d0cfcfeedf052c68e3f7f19f7c72d9d07ea65d138da2b60010c0 WatchSource:0}: Error finding container 8860a5cd3045d0cfcfeedf052c68e3f7f19f7c72d9d07ea65d138da2b60010c0: Status 404 returned error can't find the container with id 8860a5cd3045d0cfcfeedf052c68e3f7f19f7c72d9d07ea65d138da2b60010c0 Nov 29 05:28:14 crc kubenswrapper[4594]: W1129 05:28:14.381758 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-e6c730c7ea13df950c6879451502c090235a652dd8a8a787f4edeeb816a7656c WatchSource:0}: Error finding container e6c730c7ea13df950c6879451502c090235a652dd8a8a787f4edeeb816a7656c: Status 404 returned error can't find the container with id e6c730c7ea13df950c6879451502c090235a652dd8a8a787f4edeeb816a7656c Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.738117 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.738218 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.738249 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 05:28:14 crc kubenswrapper[4594]: E1129 05:28:14.738308 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 05:28:15.738282241 +0000 UTC m=+19.978791462 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 05:28:14 crc kubenswrapper[4594]: E1129 05:28:14.738368 4594 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 29 05:28:14 crc kubenswrapper[4594]: E1129 05:28:14.738391 4594 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 29 05:28:14 crc kubenswrapper[4594]: E1129 05:28:14.738452 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-29 05:28:15.738438444 +0000 UTC m=+19.978947665 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 29 05:28:14 crc kubenswrapper[4594]: E1129 05:28:14.738472 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-11-29 05:28:15.738466026 +0000 UTC m=+19.978975246 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.838613 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 05:28:14 crc kubenswrapper[4594]: E1129 05:28:14.838673 4594 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 29 05:28:14 crc kubenswrapper[4594]: E1129 05:28:14.838691 4594 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 29 05:28:14 crc kubenswrapper[4594]: I1129 05:28:14.838695 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 05:28:14 crc kubenswrapper[4594]: E1129 05:28:14.838702 4594 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 29 05:28:14 crc kubenswrapper[4594]: E1129 05:28:14.838740 4594 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 29 05:28:14 crc kubenswrapper[4594]: E1129 05:28:14.838845 4594 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 29 05:28:14 crc kubenswrapper[4594]: E1129 05:28:14.838865 4594 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 29 05:28:14 crc kubenswrapper[4594]: E1129 05:28:14.838809 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-29 05:28:15.838791739 +0000 UTC m=+20.079300969 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 29 05:28:14 crc kubenswrapper[4594]: E1129 05:28:14.838938 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-29 05:28:15.838922835 +0000 UTC m=+20.079432055 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 29 05:28:15 crc kubenswrapper[4594]: I1129 05:28:15.169687 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Nov 29 05:28:15 crc kubenswrapper[4594]: I1129 05:28:15.171708 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a87a0320d47b3753baeb865c7ef9e024c65a96555fdd71c0f2a23c8c0b80d351"} Nov 29 05:28:15 crc kubenswrapper[4594]: I1129 05:28:15.171853 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 05:28:15 crc kubenswrapper[4594]: I1129 
05:28:15.172949 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"e6c730c7ea13df950c6879451502c090235a652dd8a8a787f4edeeb816a7656c"} Nov 29 05:28:15 crc kubenswrapper[4594]: I1129 05:28:15.174778 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"c84b578286ef0da9c863bedb4ebec224e34007c7f59cead26dac786d7e4b5e80"} Nov 29 05:28:15 crc kubenswrapper[4594]: I1129 05:28:15.174804 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"d1975ee2ce0af02676f9925944a31b218cb17cdf4dfb85c6edbfd124ac58fd22"} Nov 29 05:28:15 crc kubenswrapper[4594]: I1129 05:28:15.174814 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"8860a5cd3045d0cfcfeedf052c68e3f7f19f7c72d9d07ea65d138da2b60010c0"} Nov 29 05:28:15 crc kubenswrapper[4594]: I1129 05:28:15.176059 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"ba18db1b4391204c0fe395488d2726d914dc28c88b9cf449d83c9c181a7d04a2"} Nov 29 05:28:15 crc kubenswrapper[4594]: I1129 05:28:15.176084 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"fbda35ed284cfde4b5fc12fbeae6d5f99948e191790e01e2d2bfce36a381afc0"} Nov 29 05:28:15 crc 
kubenswrapper[4594]: I1129 05:28:15.185718 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:15Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:15 crc kubenswrapper[4594]: I1129 05:28:15.196459 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:15Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:15 crc kubenswrapper[4594]: I1129 05:28:15.228363 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1629901e-7133-4eb0-b634-5e458da9f205\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://583ca6d59501822022d5d15a450d4403592d248bce7be4b22b1c7baef4e93000\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b8d3be68869fef8b377ce1d84e3edf447496a867c40e88d013ef0d1c5629de2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6655c8a5c56e4d2b3e8fbd88308d04ded2b41f7243bf90cb2dbbd6ad93f3153f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a87a0320d47b3753baeb865c7ef9e024c65a96555fdd71c0f2a23c8c0b80d351\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ba49c3a694bc245ce18b54b74156e6e7aaccd4fcf707264a278f86f52943392\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T05:28:13Z\\\",\\\"message\\\":\\\"le observer\\\\nW1129 05:28:13.316789 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1129 05:28:13.316951 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1129 05:28:13.320123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1464513827/tls.crt::/tmp/serving-cert-1464513827/tls.key\\\\\\\"\\\\nI1129 05:28:13.540439 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1129 05:28:13.545990 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1129 05:28:13.546014 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1129 05:28:13.546050 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1129 05:28:13.546055 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1129 05:28:13.551788 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1129 05:28:13.551806 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 05:28:13.551809 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 05:28:13.551814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1129 05:28:13.551817 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1129 05:28:13.551820 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1129 05:28:13.551823 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1129 05:28:13.551915 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1129 05:28:13.553356 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e070e0ea87ae0cfa8fcc6aadd13e7ae3c2ab3eb1ea3daf6178c9d35796ba3ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-29T05:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:27:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:15Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:15 crc kubenswrapper[4594]: I1129 05:28:15.244296 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:15Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:15 crc kubenswrapper[4594]: I1129 05:28:15.263484 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:15Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:15 crc kubenswrapper[4594]: I1129 05:28:15.272798 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:15Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:15 crc kubenswrapper[4594]: I1129 05:28:15.284069 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:15Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:15 crc kubenswrapper[4594]: I1129 05:28:15.293928 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:15Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:15 crc kubenswrapper[4594]: I1129 05:28:15.306511 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:15Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:15 crc kubenswrapper[4594]: I1129 05:28:15.321494 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c84b578286ef0da9c863bedb4ebec224e34007c7f59cead26dac786d7e4b5e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1975ee2ce0af02676f9925944a31b218cb17cdf4dfb85c6edbfd124ac58fd22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:15Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:15 crc kubenswrapper[4594]: I1129 05:28:15.335131 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1629901e-7133-4eb0-b634-5e458da9f205\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://583ca6d59501822022d5d15a450d4403592d248bce7be4b22b1c7baef4e93000\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b8d3be68869fef8b377ce1d84e3edf447496a867c40e88d013ef0d1c5629de2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6655c8a5c56e4d2b3e8fbd88308d04ded2b41f7243bf90cb2dbbd6ad93f3153f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a87a0320d47b3753baeb865c7ef9e024c65a96555fdd71c0f2a23c8c0b80d351\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ba49c3a694bc245ce18b54b74156e6e7aaccd4fcf707264a278f86f52943392\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T05:28:13Z\\\",\\\"message\\\":\\\"le observer\\\\nW1129 05:28:13.316789 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1129 05:28:13.316951 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1129 05:28:13.320123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1464513827/tls.crt::/tmp/serving-cert-1464513827/tls.key\\\\\\\"\\\\nI1129 05:28:13.540439 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1129 05:28:13.545990 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1129 05:28:13.546014 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1129 05:28:13.546050 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1129 05:28:13.546055 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1129 05:28:13.551788 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1129 05:28:13.551806 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 05:28:13.551809 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 05:28:13.551814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1129 05:28:13.551817 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1129 05:28:13.551820 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1129 05:28:13.551823 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1129 05:28:13.551915 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1129 05:28:13.553356 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e070e0ea87ae0cfa8fcc6aadd13e7ae3c2ab3eb1ea3daf6178c9d35796ba3ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-29T05:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:27:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:15Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:15 crc kubenswrapper[4594]: I1129 05:28:15.345381 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:15Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:15 crc kubenswrapper[4594]: I1129 05:28:15.356233 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:15Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:15 crc kubenswrapper[4594]: I1129 05:28:15.366907 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba18db1b4391204c0fe395488d2726d914dc28c88b9cf449d83c9c181a7d04a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:15Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:15 crc kubenswrapper[4594]: I1129 05:28:15.743606 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 05:28:15 crc kubenswrapper[4594]: I1129 05:28:15.743704 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 05:28:15 crc kubenswrapper[4594]: I1129 05:28:15.743731 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 05:28:15 crc kubenswrapper[4594]: E1129 05:28:15.743776 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 05:28:17.743754336 +0000 UTC m=+21.984263556 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 05:28:15 crc kubenswrapper[4594]: E1129 05:28:15.743847 4594 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 29 05:28:15 crc kubenswrapper[4594]: E1129 05:28:15.743901 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-29 05:28:17.743888828 +0000 UTC m=+21.984398049 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 29 05:28:15 crc kubenswrapper[4594]: E1129 05:28:15.743921 4594 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 29 05:28:15 crc kubenswrapper[4594]: E1129 05:28:15.744104 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-11-29 05:28:17.744074196 +0000 UTC m=+21.984583416 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 29 05:28:15 crc kubenswrapper[4594]: I1129 05:28:15.845085 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 05:28:15 crc kubenswrapper[4594]: I1129 05:28:15.845146 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 05:28:15 crc kubenswrapper[4594]: E1129 05:28:15.845310 4594 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 29 05:28:15 crc kubenswrapper[4594]: E1129 05:28:15.845333 4594 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 29 05:28:15 crc kubenswrapper[4594]: E1129 05:28:15.845328 4594 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 
29 05:28:15 crc kubenswrapper[4594]: E1129 05:28:15.845364 4594 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 29 05:28:15 crc kubenswrapper[4594]: E1129 05:28:15.845410 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-29 05:28:17.845398723 +0000 UTC m=+22.085907943 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 29 05:28:15 crc kubenswrapper[4594]: E1129 05:28:15.845364 4594 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 29 05:28:15 crc kubenswrapper[4594]: E1129 05:28:15.845465 4594 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 29 05:28:15 crc kubenswrapper[4594]: E1129 05:28:15.845526 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl 
podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-29 05:28:17.845510572 +0000 UTC m=+22.086019782 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.082714 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.082743 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 05:28:16 crc kubenswrapper[4594]: E1129 05:28:16.082859 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.082911 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 05:28:16 crc kubenswrapper[4594]: E1129 05:28:16.083035 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 05:28:16 crc kubenswrapper[4594]: E1129 05:28:16.083158 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.086334 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.087000 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.088101 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.088713 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" 
path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.089643 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.090157 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.090743 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.091663 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.092285 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.093160 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.093659 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.094688 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.095184 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.095357 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:16Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.095817 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.096673 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.097196 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.098085 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" 
path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.098528 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.099083 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.099985 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.100473 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.101367 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.101803 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.102888 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.103490 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" 
path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.104287 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.105487 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.106127 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.107279 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.107626 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba18db1b4391204c0fe395488d2726d914dc28c88b9cf449d83c9c181a7d04a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:16Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.107977 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.108852 4594 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.109023 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.110589 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.111482 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.111908 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.113419 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.114061 4594 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.114903 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.115511 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.116524 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.116988 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.117030 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:16Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.117976 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" 
path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.118599 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.119525 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.120049 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.120899 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.121840 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.122860 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.123377 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.124156 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" 
path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.124726 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.125573 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.126126 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.126610 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.128304 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:16Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.139847 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:16Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.149994 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c84b578286ef0da9c863bedb4ebec224e34007c7f59cead26dac786d7e4b5e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1975ee2ce0af02676f9925944a31b218cb17cdf4dfb85c6edbfd124ac58fd22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:16Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.161118 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1629901e-7133-4eb0-b634-5e458da9f205\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://583ca6d59501822022d5d15a450d4403592d248bce7be4b22b1c7baef4e93000\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b8d3be68869fef8b377ce1d84e3edf447496a867c40e88d013ef0d1c5629de2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6655c8a5c56e4d2b3e8fbd88308d04ded2b41f7243bf90cb2dbbd6ad93f3153f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a87a0320d47b3753baeb865c7ef9e024c65a96555fdd71c0f2a23c8c0b80d351\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ba49c3a694bc245ce18b54b74156e6e7aaccd4fcf707264a278f86f52943392\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T05:28:13Z\\\",\\\"message\\\":\\\"le observer\\\\nW1129 05:28:13.316789 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1129 05:28:13.316951 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1129 05:28:13.320123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1464513827/tls.crt::/tmp/serving-cert-1464513827/tls.key\\\\\\\"\\\\nI1129 05:28:13.540439 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1129 05:28:13.545990 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1129 05:28:13.546014 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1129 05:28:13.546050 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1129 05:28:13.546055 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1129 05:28:13.551788 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1129 05:28:13.551806 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 05:28:13.551809 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 05:28:13.551814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1129 05:28:13.551817 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1129 05:28:13.551820 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1129 05:28:13.551823 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1129 05:28:13.551915 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1129 05:28:13.553356 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e070e0ea87ae0cfa8fcc6aadd13e7ae3c2ab3eb1ea3daf6178c9d35796ba3ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-29T05:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:27:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:16Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.509639 4594 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.511272 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.511320 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.511332 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.511446 4594 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.518604 4594 kubelet_node_status.go:115] "Node was previously registered" node="crc" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.518878 4594 kubelet_node_status.go:79] "Successfully registered node" node="crc" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.520063 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:16 crc 
kubenswrapper[4594]: I1129 05:28:16.520095 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.520119 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.520133 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.520144 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:16Z","lastTransitionTime":"2025-11-29T05:28:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:16 crc kubenswrapper[4594]: E1129 05:28:16.537384 4594 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:28:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:28:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:28:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:28:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"278d67da-c900-4140-8f95-1590a0940f46\\\",\\\"systemUUID\\\":\\\"64455698-ce5a-4229-a093-dc9c12114354\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:16Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.540352 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.540380 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.540388 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.540400 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.540410 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:16Z","lastTransitionTime":"2025-11-29T05:28:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:16 crc kubenswrapper[4594]: E1129 05:28:16.549489 4594 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:28:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:28:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:28:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:28:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"278d67da-c900-4140-8f95-1590a0940f46\\\",\\\"systemUUID\\\":\\\"64455698-ce5a-4229-a093-dc9c12114354\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:16Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.551716 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.551807 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.551881 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.551953 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.552020 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:16Z","lastTransitionTime":"2025-11-29T05:28:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:16 crc kubenswrapper[4594]: E1129 05:28:16.560629 4594 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:28:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:28:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:28:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:28:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"278d67da-c900-4140-8f95-1590a0940f46\\\",\\\"systemUUID\\\":\\\"64455698-ce5a-4229-a093-dc9c12114354\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:16Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.563416 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.563443 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.563452 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.563465 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.563477 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:16Z","lastTransitionTime":"2025-11-29T05:28:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:16 crc kubenswrapper[4594]: E1129 05:28:16.572131 4594 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:28:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:28:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:28:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:28:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"278d67da-c900-4140-8f95-1590a0940f46\\\",\\\"systemUUID\\\":\\\"64455698-ce5a-4229-a093-dc9c12114354\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:16Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.574727 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.574822 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.574886 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.575017 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.575083 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:16Z","lastTransitionTime":"2025-11-29T05:28:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:16 crc kubenswrapper[4594]: E1129 05:28:16.583338 4594 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:28:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:28:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:28:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:28:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"278d67da-c900-4140-8f95-1590a0940f46\\\",\\\"systemUUID\\\":\\\"64455698-ce5a-4229-a093-dc9c12114354\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:16Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:16 crc kubenswrapper[4594]: E1129 05:28:16.583444 4594 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.584419 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.584512 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.584576 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.584636 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.584687 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:16Z","lastTransitionTime":"2025-11-29T05:28:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.686084 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.686112 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.686122 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.686133 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.686141 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:16Z","lastTransitionTime":"2025-11-29T05:28:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.788306 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.788349 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.788359 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.788375 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.788385 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:16Z","lastTransitionTime":"2025-11-29T05:28:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.891159 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.891237 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.891248 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.891282 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.891299 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:16Z","lastTransitionTime":"2025-11-29T05:28:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.993549 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.993594 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.993603 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.993619 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:16 crc kubenswrapper[4594]: I1129 05:28:16.993629 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:16Z","lastTransitionTime":"2025-11-29T05:28:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.096212 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.096271 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.096281 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.096298 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.096314 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:17Z","lastTransitionTime":"2025-11-29T05:28:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.181643 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"867c23355f828dc4cc15d632f90c52174e9b9be07f3ec24f05c904709620de6e"} Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.195554 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba18db1b4391204c0fe395488d2726d914dc28c88b9cf449d83c9c181a7d04a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:17Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.198198 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.198243 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.198272 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.198288 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.198297 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:17Z","lastTransitionTime":"2025-11-29T05:28:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.209103 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://867c23355f828dc4cc15d632f90c52174e9b9be07f3ec24f05c904709620de6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:17Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.218925 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:17Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.229475 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1629901e-7133-4eb0-b634-5e458da9f205\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://583ca6d59501822022d5d15a450d4403592d248bce7be4b22b1c7baef4e93000\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b8d3be68869fef8b377ce1d84e3edf447496a867c40e88d013ef0d1c5629de2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6655c8a5c56e4d2b3e8fbd88308d04ded2b41f7243bf90cb2dbbd6ad93f3153f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a87a0320d47b3753baeb865c7ef9e024c65a96555fdd71c0f2a23c8c0b80d351\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ba49c3a694bc245ce18b54b74156e6e7aaccd4fcf707264a278f86f52943392\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T05:28:13Z\\\",\\\"message\\\":\\\"le observer\\\\nW1129 05:28:13.316789 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1129 05:28:13.316951 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1129 05:28:13.320123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1464513827/tls.crt::/tmp/serving-cert-1464513827/tls.key\\\\\\\"\\\\nI1129 05:28:13.540439 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1129 05:28:13.545990 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1129 05:28:13.546014 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1129 05:28:13.546050 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1129 05:28:13.546055 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1129 05:28:13.551788 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1129 05:28:13.551806 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 05:28:13.551809 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 05:28:13.551814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1129 05:28:13.551817 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1129 05:28:13.551820 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1129 05:28:13.551823 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1129 05:28:13.551915 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1129 05:28:13.553356 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e070e0ea87ae0cfa8fcc6aadd13e7ae3c2ab3eb1ea3daf6178c9d35796ba3ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-29T05:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:27:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:17Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.239954 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:17Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.253129 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:17Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.265367 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c84b578286ef0da9c863bedb4ebec224e34007c7f59cead26dac786d7e4b5e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1975ee2ce0af02676f9925944a31b218cb17cdf4dfb85c6edbfd124ac58fd22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:17Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.301247 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.301304 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.301313 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.301332 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.301341 4594 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:17Z","lastTransitionTime":"2025-11-29T05:28:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.364291 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-87h4n"] Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.364666 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-87h4n" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.364806 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-7plzz"] Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.365028 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-7plzz" Nov 29 05:28:17 crc kubenswrapper[4594]: W1129 05:28:17.366639 4594 reflector.go:561] object-"openshift-dns"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-dns": no relationship found between node 'crc' and this object Nov 29 05:28:17 crc kubenswrapper[4594]: E1129 05:28:17.366677 4594 reflector.go:158] "Unhandled Error" err="object-\"openshift-dns\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-dns\": no relationship found between node 'crc' and this object" logger="UnhandledError" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.367412 4594 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.367470 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.367967 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Nov 29 05:28:17 crc kubenswrapper[4594]: W1129 05:28:17.368506 4594 reflector.go:561] object-"openshift-multus"/"cni-copy-resources": failed to list *v1.ConfigMap: configmaps "cni-copy-resources" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-multus": no relationship found between node 'crc' and this object Nov 29 05:28:17 crc kubenswrapper[4594]: E1129 05:28:17.368544 4594 reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"cni-copy-resources\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"cni-copy-resources\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-multus\": no relationship found between node 'crc' and this object" logger="UnhandledError" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.368655 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.368804 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.368948 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.387455 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba18db1b4391204c0fe395488d2726d914dc28c88b9cf449d83c9c181a7d04a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:17Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.398860 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://867c23355f828dc4cc15d632f90c52174e9b9be07f3ec24f05c904709620de6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:17Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.403018 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.403045 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.403054 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.403074 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.403146 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:17Z","lastTransitionTime":"2025-11-29T05:28:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.408296 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:17Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.418030 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c84b578286ef0da9c863bedb4ebec224e34007c7f59cead26dac786d7e4b5e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1975ee2ce0af02676f9925944a31b218cb17cdf4dfb85c6edbfd124ac58fd22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:17Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.425871 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-87h4n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"080d2d67-a474-4974-94f3-81c5007e0a57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdnbj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-87h4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:17Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.435750 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1629901e-7133-4eb0-b634-5e458da9f205\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://583ca6d59501822022d5d15a450d4403592d248bce7be4b22b1c7baef4e93000\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b8d3be68869fef8b377ce1d84e3edf447496a867c40e88d013ef0d1c5629de2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6655c8a5c56e4d2b3e8fbd88308d04ded2b41f7243bf90cb2dbbd6ad93f3153f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a87a0320d47b3753baeb865c7ef9e024c65a96555fdd71c0f2a23c8c0b80d351\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ba49c3a694bc245ce18b54b74156e6e7aaccd4fcf707264a278f86f52943392\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T05:28:13Z\\\",\\\"message\\\":\\\"le observer\\\\nW1129 05:28:13.316789 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1129 05:28:13.316951 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1129 05:28:13.320123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1464513827/tls.crt::/tmp/serving-cert-1464513827/tls.key\\\\\\\"\\\\nI1129 05:28:13.540439 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1129 05:28:13.545990 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1129 05:28:13.546014 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1129 05:28:13.546050 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1129 05:28:13.546055 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1129 05:28:13.551788 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1129 05:28:13.551806 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 05:28:13.551809 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 05:28:13.551814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1129 05:28:13.551817 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1129 05:28:13.551820 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1129 05:28:13.551823 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1129 05:28:13.551915 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1129 05:28:13.553356 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e070e0ea87ae0cfa8fcc6aadd13e7ae3c2ab3eb1ea3daf6178c9d35796ba3ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:27:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:17Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.444500 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:17Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.453972 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:17Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.458987 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdnbj\" (UniqueName: \"kubernetes.io/projected/080d2d67-a474-4974-94f3-81c5007e0a57-kube-api-access-gdnbj\") pod \"node-resolver-87h4n\" (UID: \"080d2d67-a474-4974-94f3-81c5007e0a57\") " pod="openshift-dns/node-resolver-87h4n" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.459043 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/080d2d67-a474-4974-94f3-81c5007e0a57-hosts-file\") pod \"node-resolver-87h4n\" (UID: \"080d2d67-a474-4974-94f3-81c5007e0a57\") " pod="openshift-dns/node-resolver-87h4n" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.464720 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:17Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.482353 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7plzz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5052790-d231-4f97-802c-c7de3cd72561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qglsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7plzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:17Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.496542 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba18db1b4391204c0fe395488d2726d914dc28c88b9cf449d83c9c181a7d04a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d6
08d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:17Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.505462 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://867c23355f828dc4cc15d632f90c52174e9b9be07f3ec24f05c904709620de6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-29T05:28:17Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.505821 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.505853 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.505862 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.505877 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.505886 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:17Z","lastTransitionTime":"2025-11-29T05:28:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.515142 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:17Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.524394 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:17Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.536047 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c84b578286ef0da9c863bedb4ebec224e34007c7f59cead26dac786d7e4b5e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1975ee2ce0af02676f9925944a31b218cb17cdf4dfb85c6edbfd124ac58fd22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:17Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.545001 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-87h4n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"080d2d67-a474-4974-94f3-81c5007e0a57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdnbj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-87h4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:17Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.556445 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1629901e-7133-4eb0-b634-5e458da9f205\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://583ca6d59501822022d5d15a450d4403592d248bce7be4b22b1c7baef4e93000\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b8d3be68869fef8b377ce1d84e3edf447496a867c40e88d013ef0d1c5629de2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6655c8a5c56e4d2b3e8fbd88308d04ded2b41f7243bf90cb2dbbd6ad93f3153f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a87a0320d47b3753baeb865c7ef9e024c65a96555fdd71c0f2a23c8c0b80d351\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ba49c3a694bc245ce18b54b74156e6e7aaccd4fcf707264a278f86f52943392\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T05:28:13Z\\\",\\\"message\\\":\\\"le observer\\\\nW1129 05:28:13.316789 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1129 05:28:13.316951 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1129 05:28:13.320123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1464513827/tls.crt::/tmp/serving-cert-1464513827/tls.key\\\\\\\"\\\\nI1129 05:28:13.540439 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1129 05:28:13.545990 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1129 05:28:13.546014 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1129 05:28:13.546050 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1129 05:28:13.546055 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1129 05:28:13.551788 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1129 05:28:13.551806 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 05:28:13.551809 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 05:28:13.551814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1129 05:28:13.551817 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1129 05:28:13.551820 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1129 05:28:13.551823 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1129 05:28:13.551915 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1129 05:28:13.553356 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e070e0ea87ae0cfa8fcc6aadd13e7ae3c2ab3eb1ea3daf6178c9d35796ba3ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:27:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:17Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.559659 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e5052790-d231-4f97-802c-c7de3cd72561-system-cni-dir\") pod \"multus-7plzz\" (UID: \"e5052790-d231-4f97-802c-c7de3cd72561\") " pod="openshift-multus/multus-7plzz" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.559693 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e5052790-d231-4f97-802c-c7de3cd72561-os-release\") pod \"multus-7plzz\" (UID: \"e5052790-d231-4f97-802c-c7de3cd72561\") " pod="openshift-multus/multus-7plzz" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.559714 4594 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e5052790-d231-4f97-802c-c7de3cd72561-multus-socket-dir-parent\") pod \"multus-7plzz\" (UID: \"e5052790-d231-4f97-802c-c7de3cd72561\") " pod="openshift-multus/multus-7plzz" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.559741 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdnbj\" (UniqueName: \"kubernetes.io/projected/080d2d67-a474-4974-94f3-81c5007e0a57-kube-api-access-gdnbj\") pod \"node-resolver-87h4n\" (UID: \"080d2d67-a474-4974-94f3-81c5007e0a57\") " pod="openshift-dns/node-resolver-87h4n" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.559760 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e5052790-d231-4f97-802c-c7de3cd72561-host-var-lib-cni-bin\") pod \"multus-7plzz\" (UID: \"e5052790-d231-4f97-802c-c7de3cd72561\") " pod="openshift-multus/multus-7plzz" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.559909 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e5052790-d231-4f97-802c-c7de3cd72561-host-var-lib-kubelet\") pod \"multus-7plzz\" (UID: \"e5052790-d231-4f97-802c-c7de3cd72561\") " pod="openshift-multus/multus-7plzz" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.559947 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e5052790-d231-4f97-802c-c7de3cd72561-hostroot\") pod \"multus-7plzz\" (UID: \"e5052790-d231-4f97-802c-c7de3cd72561\") " pod="openshift-multus/multus-7plzz" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.559982 4594 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e5052790-d231-4f97-802c-c7de3cd72561-host-var-lib-cni-multus\") pod \"multus-7plzz\" (UID: \"e5052790-d231-4f97-802c-c7de3cd72561\") " pod="openshift-multus/multus-7plzz" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.559999 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e5052790-d231-4f97-802c-c7de3cd72561-host-run-multus-certs\") pod \"multus-7plzz\" (UID: \"e5052790-d231-4f97-802c-c7de3cd72561\") " pod="openshift-multus/multus-7plzz" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.560052 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/080d2d67-a474-4974-94f3-81c5007e0a57-hosts-file\") pod \"node-resolver-87h4n\" (UID: \"080d2d67-a474-4974-94f3-81c5007e0a57\") " pod="openshift-dns/node-resolver-87h4n" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.560073 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e5052790-d231-4f97-802c-c7de3cd72561-multus-daemon-config\") pod \"multus-7plzz\" (UID: \"e5052790-d231-4f97-802c-c7de3cd72561\") " pod="openshift-multus/multus-7plzz" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.560088 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qglsg\" (UniqueName: \"kubernetes.io/projected/e5052790-d231-4f97-802c-c7de3cd72561-kube-api-access-qglsg\") pod \"multus-7plzz\" (UID: \"e5052790-d231-4f97-802c-c7de3cd72561\") " pod="openshift-multus/multus-7plzz" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.560102 4594 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e5052790-d231-4f97-802c-c7de3cd72561-cnibin\") pod \"multus-7plzz\" (UID: \"e5052790-d231-4f97-802c-c7de3cd72561\") " pod="openshift-multus/multus-7plzz" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.560122 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e5052790-d231-4f97-802c-c7de3cd72561-host-run-netns\") pod \"multus-7plzz\" (UID: \"e5052790-d231-4f97-802c-c7de3cd72561\") " pod="openshift-multus/multus-7plzz" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.560137 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e5052790-d231-4f97-802c-c7de3cd72561-multus-conf-dir\") pod \"multus-7plzz\" (UID: \"e5052790-d231-4f97-802c-c7de3cd72561\") " pod="openshift-multus/multus-7plzz" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.560154 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e5052790-d231-4f97-802c-c7de3cd72561-etc-kubernetes\") pod \"multus-7plzz\" (UID: \"e5052790-d231-4f97-802c-c7de3cd72561\") " pod="openshift-multus/multus-7plzz" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.560203 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/080d2d67-a474-4974-94f3-81c5007e0a57-hosts-file\") pod \"node-resolver-87h4n\" (UID: \"080d2d67-a474-4974-94f3-81c5007e0a57\") " pod="openshift-dns/node-resolver-87h4n" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.560272 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/e5052790-d231-4f97-802c-c7de3cd72561-multus-cni-dir\") pod \"multus-7plzz\" (UID: \"e5052790-d231-4f97-802c-c7de3cd72561\") " pod="openshift-multus/multus-7plzz" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.560292 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e5052790-d231-4f97-802c-c7de3cd72561-host-run-k8s-cni-cncf-io\") pod \"multus-7plzz\" (UID: \"e5052790-d231-4f97-802c-c7de3cd72561\") " pod="openshift-multus/multus-7plzz" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.560341 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e5052790-d231-4f97-802c-c7de3cd72561-cni-binary-copy\") pod \"multus-7plzz\" (UID: \"e5052790-d231-4f97-802c-c7de3cd72561\") " pod="openshift-multus/multus-7plzz" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.608180 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.608218 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.608228 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.608247 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.608277 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:17Z","lastTransitionTime":"2025-11-29T05:28:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.661447 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e5052790-d231-4f97-802c-c7de3cd72561-multus-daemon-config\") pod \"multus-7plzz\" (UID: \"e5052790-d231-4f97-802c-c7de3cd72561\") " pod="openshift-multus/multus-7plzz" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.661482 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qglsg\" (UniqueName: \"kubernetes.io/projected/e5052790-d231-4f97-802c-c7de3cd72561-kube-api-access-qglsg\") pod \"multus-7plzz\" (UID: \"e5052790-d231-4f97-802c-c7de3cd72561\") " pod="openshift-multus/multus-7plzz" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.661507 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e5052790-d231-4f97-802c-c7de3cd72561-cnibin\") pod \"multus-7plzz\" (UID: \"e5052790-d231-4f97-802c-c7de3cd72561\") " pod="openshift-multus/multus-7plzz" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.661526 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e5052790-d231-4f97-802c-c7de3cd72561-host-run-netns\") pod \"multus-7plzz\" (UID: \"e5052790-d231-4f97-802c-c7de3cd72561\") " pod="openshift-multus/multus-7plzz" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.661546 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e5052790-d231-4f97-802c-c7de3cd72561-multus-cni-dir\") pod \"multus-7plzz\" (UID: \"e5052790-d231-4f97-802c-c7de3cd72561\") " pod="openshift-multus/multus-7plzz" Nov 29 
05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.661564 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e5052790-d231-4f97-802c-c7de3cd72561-host-run-k8s-cni-cncf-io\") pod \"multus-7plzz\" (UID: \"e5052790-d231-4f97-802c-c7de3cd72561\") " pod="openshift-multus/multus-7plzz" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.661588 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e5052790-d231-4f97-802c-c7de3cd72561-multus-conf-dir\") pod \"multus-7plzz\" (UID: \"e5052790-d231-4f97-802c-c7de3cd72561\") " pod="openshift-multus/multus-7plzz" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.661611 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e5052790-d231-4f97-802c-c7de3cd72561-etc-kubernetes\") pod \"multus-7plzz\" (UID: \"e5052790-d231-4f97-802c-c7de3cd72561\") " pod="openshift-multus/multus-7plzz" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.661637 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e5052790-d231-4f97-802c-c7de3cd72561-cni-binary-copy\") pod \"multus-7plzz\" (UID: \"e5052790-d231-4f97-802c-c7de3cd72561\") " pod="openshift-multus/multus-7plzz" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.661654 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e5052790-d231-4f97-802c-c7de3cd72561-os-release\") pod \"multus-7plzz\" (UID: \"e5052790-d231-4f97-802c-c7de3cd72561\") " pod="openshift-multus/multus-7plzz" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.661665 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" 
(UniqueName: \"kubernetes.io/host-path/e5052790-d231-4f97-802c-c7de3cd72561-host-run-k8s-cni-cncf-io\") pod \"multus-7plzz\" (UID: \"e5052790-d231-4f97-802c-c7de3cd72561\") " pod="openshift-multus/multus-7plzz" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.661694 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e5052790-d231-4f97-802c-c7de3cd72561-multus-cni-dir\") pod \"multus-7plzz\" (UID: \"e5052790-d231-4f97-802c-c7de3cd72561\") " pod="openshift-multus/multus-7plzz" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.661672 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e5052790-d231-4f97-802c-c7de3cd72561-multus-socket-dir-parent\") pod \"multus-7plzz\" (UID: \"e5052790-d231-4f97-802c-c7de3cd72561\") " pod="openshift-multus/multus-7plzz" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.661838 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e5052790-d231-4f97-802c-c7de3cd72561-system-cni-dir\") pod \"multus-7plzz\" (UID: \"e5052790-d231-4f97-802c-c7de3cd72561\") " pod="openshift-multus/multus-7plzz" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.661724 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e5052790-d231-4f97-802c-c7de3cd72561-multus-socket-dir-parent\") pod \"multus-7plzz\" (UID: \"e5052790-d231-4f97-802c-c7de3cd72561\") " pod="openshift-multus/multus-7plzz" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.661872 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e5052790-d231-4f97-802c-c7de3cd72561-host-var-lib-cni-bin\") pod \"multus-7plzz\" (UID: 
\"e5052790-d231-4f97-802c-c7de3cd72561\") " pod="openshift-multus/multus-7plzz" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.661731 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e5052790-d231-4f97-802c-c7de3cd72561-host-run-netns\") pod \"multus-7plzz\" (UID: \"e5052790-d231-4f97-802c-c7de3cd72561\") " pod="openshift-multus/multus-7plzz" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.661767 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e5052790-d231-4f97-802c-c7de3cd72561-os-release\") pod \"multus-7plzz\" (UID: \"e5052790-d231-4f97-802c-c7de3cd72561\") " pod="openshift-multus/multus-7plzz" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.661768 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e5052790-d231-4f97-802c-c7de3cd72561-etc-kubernetes\") pod \"multus-7plzz\" (UID: \"e5052790-d231-4f97-802c-c7de3cd72561\") " pod="openshift-multus/multus-7plzz" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.661633 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e5052790-d231-4f97-802c-c7de3cd72561-cnibin\") pod \"multus-7plzz\" (UID: \"e5052790-d231-4f97-802c-c7de3cd72561\") " pod="openshift-multus/multus-7plzz" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.661735 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e5052790-d231-4f97-802c-c7de3cd72561-multus-conf-dir\") pod \"multus-7plzz\" (UID: \"e5052790-d231-4f97-802c-c7de3cd72561\") " pod="openshift-multus/multus-7plzz" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.661991 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" 
(UniqueName: \"kubernetes.io/host-path/e5052790-d231-4f97-802c-c7de3cd72561-host-var-lib-cni-bin\") pod \"multus-7plzz\" (UID: \"e5052790-d231-4f97-802c-c7de3cd72561\") " pod="openshift-multus/multus-7plzz" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.662008 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e5052790-d231-4f97-802c-c7de3cd72561-system-cni-dir\") pod \"multus-7plzz\" (UID: \"e5052790-d231-4f97-802c-c7de3cd72561\") " pod="openshift-multus/multus-7plzz" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.662080 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e5052790-d231-4f97-802c-c7de3cd72561-host-var-lib-kubelet\") pod \"multus-7plzz\" (UID: \"e5052790-d231-4f97-802c-c7de3cd72561\") " pod="openshift-multus/multus-7plzz" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.662105 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e5052790-d231-4f97-802c-c7de3cd72561-hostroot\") pod \"multus-7plzz\" (UID: \"e5052790-d231-4f97-802c-c7de3cd72561\") " pod="openshift-multus/multus-7plzz" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.662135 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e5052790-d231-4f97-802c-c7de3cd72561-hostroot\") pod \"multus-7plzz\" (UID: \"e5052790-d231-4f97-802c-c7de3cd72561\") " pod="openshift-multus/multus-7plzz" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.662155 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e5052790-d231-4f97-802c-c7de3cd72561-host-var-lib-cni-multus\") pod \"multus-7plzz\" (UID: \"e5052790-d231-4f97-802c-c7de3cd72561\") " 
pod="openshift-multus/multus-7plzz" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.662204 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e5052790-d231-4f97-802c-c7de3cd72561-host-var-lib-cni-multus\") pod \"multus-7plzz\" (UID: \"e5052790-d231-4f97-802c-c7de3cd72561\") " pod="openshift-multus/multus-7plzz" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.662199 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e5052790-d231-4f97-802c-c7de3cd72561-host-var-lib-kubelet\") pod \"multus-7plzz\" (UID: \"e5052790-d231-4f97-802c-c7de3cd72561\") " pod="openshift-multus/multus-7plzz" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.662234 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e5052790-d231-4f97-802c-c7de3cd72561-host-run-multus-certs\") pod \"multus-7plzz\" (UID: \"e5052790-d231-4f97-802c-c7de3cd72561\") " pod="openshift-multus/multus-7plzz" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.662250 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e5052790-d231-4f97-802c-c7de3cd72561-host-run-multus-certs\") pod \"multus-7plzz\" (UID: \"e5052790-d231-4f97-802c-c7de3cd72561\") " pod="openshift-multus/multus-7plzz" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.662537 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e5052790-d231-4f97-802c-c7de3cd72561-multus-daemon-config\") pod \"multus-7plzz\" (UID: \"e5052790-d231-4f97-802c-c7de3cd72561\") " pod="openshift-multus/multus-7plzz" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.677087 4594 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-qglsg\" (UniqueName: \"kubernetes.io/projected/e5052790-d231-4f97-802c-c7de3cd72561-kube-api-access-qglsg\") pod \"multus-7plzz\" (UID: \"e5052790-d231-4f97-802c-c7de3cd72561\") " pod="openshift-multus/multus-7plzz" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.710722 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.711186 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.711211 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.711241 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.711277 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:17Z","lastTransitionTime":"2025-11-29T05:28:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.758573 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-lp4zm"] Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.759310 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-ggz4n"] Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.760178 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-zwqcq"] Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.762362 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.762740 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.763634 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.763765 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.763936 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.763950 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 05:28:17 crc kubenswrapper[4594]: E1129 05:28:17.763992 4594 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 29 05:28:17 crc kubenswrapper[4594]: E1129 05:28:17.764049 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-29 05:28:21.764033556 +0000 UTC m=+26.004542777 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 29 05:28:17 crc kubenswrapper[4594]: E1129 05:28:17.764121 4594 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 29 05:28:17 crc kubenswrapper[4594]: E1129 05:28:17.764160 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 05:28:21.764138072 +0000 UTC m=+26.004647293 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 05:28:17 crc kubenswrapper[4594]: E1129 05:28:17.764185 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-29 05:28:21.764176334 +0000 UTC m=+26.004685554 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.764544 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-zwqcq" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.770397 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.770579 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.770765 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.770891 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.770995 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.771106 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.771437 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.771468 4594 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.771569 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.771661 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.771689 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.771818 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.771823 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.795298 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:17Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.813366 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.813422 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.813433 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:17 crc 
kubenswrapper[4594]: I1129 05:28:17.813447 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.813458 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:17Z","lastTransitionTime":"2025-11-29T05:28:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.825355 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08de5891-72ca-488c-80b3-6b54c8c1a66e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lp4zm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:17Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.845039 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:17Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.856728 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba18db1b4391204c0fe395488d2726d914dc28c88b9cf449d83c9c181a7d04a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:17Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.864918 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.864955 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ad2c26bf-c1c8-44b5-b4ed-d487072d358b-os-release\") pod \"multus-additional-cni-plugins-zwqcq\" (UID: \"ad2c26bf-c1c8-44b5-b4ed-d487072d358b\") " pod="openshift-multus/multus-additional-cni-plugins-zwqcq" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.864973 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/08de5891-72ca-488c-80b3-6b54c8c1a66e-ovn-node-metrics-cert\") pod \"ovnkube-node-lp4zm\" (UID: \"08de5891-72ca-488c-80b3-6b54c8c1a66e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.864992 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/08de5891-72ca-488c-80b3-6b54c8c1a66e-ovnkube-script-lib\") pod \"ovnkube-node-lp4zm\" (UID: \"08de5891-72ca-488c-80b3-6b54c8c1a66e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.865018 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" 
(UniqueName: \"kubernetes.io/host-path/08de5891-72ca-488c-80b3-6b54c8c1a66e-host-run-ovn-kubernetes\") pod \"ovnkube-node-lp4zm\" (UID: \"08de5891-72ca-488c-80b3-6b54c8c1a66e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.865089 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/08de5891-72ca-488c-80b3-6b54c8c1a66e-var-lib-openvswitch\") pod \"ovnkube-node-lp4zm\" (UID: \"08de5891-72ca-488c-80b3-6b54c8c1a66e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.865147 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ad2c26bf-c1c8-44b5-b4ed-d487072d358b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-zwqcq\" (UID: \"ad2c26bf-c1c8-44b5-b4ed-d487072d358b\") " pod="openshift-multus/multus-additional-cni-plugins-zwqcq" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.865171 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/08de5891-72ca-488c-80b3-6b54c8c1a66e-run-ovn\") pod \"ovnkube-node-lp4zm\" (UID: \"08de5891-72ca-488c-80b3-6b54c8c1a66e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.865193 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/08de5891-72ca-488c-80b3-6b54c8c1a66e-host-cni-bin\") pod \"ovnkube-node-lp4zm\" (UID: \"08de5891-72ca-488c-80b3-6b54c8c1a66e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.865227 4594 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/08de5891-72ca-488c-80b3-6b54c8c1a66e-host-slash\") pod \"ovnkube-node-lp4zm\" (UID: \"08de5891-72ca-488c-80b3-6b54c8c1a66e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" Nov 29 05:28:17 crc kubenswrapper[4594]: E1129 05:28:17.865234 4594 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 29 05:28:17 crc kubenswrapper[4594]: E1129 05:28:17.865274 4594 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 29 05:28:17 crc kubenswrapper[4594]: E1129 05:28:17.865289 4594 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 29 05:28:17 crc kubenswrapper[4594]: E1129 05:28:17.865335 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-29 05:28:21.865320953 +0000 UTC m=+26.105830173 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.865248 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/08de5891-72ca-488c-80b3-6b54c8c1a66e-run-systemd\") pod \"ovnkube-node-lp4zm\" (UID: \"08de5891-72ca-488c-80b3-6b54c8c1a66e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.865379 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/08de5891-72ca-488c-80b3-6b54c8c1a66e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lp4zm\" (UID: \"08de5891-72ca-488c-80b3-6b54c8c1a66e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.865399 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/08de5891-72ca-488c-80b3-6b54c8c1a66e-ovnkube-config\") pod \"ovnkube-node-lp4zm\" (UID: \"08de5891-72ca-488c-80b3-6b54c8c1a66e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.865432 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ad2c26bf-c1c8-44b5-b4ed-d487072d358b-system-cni-dir\") pod \"multus-additional-cni-plugins-zwqcq\" (UID: 
\"ad2c26bf-c1c8-44b5-b4ed-d487072d358b\") " pod="openshift-multus/multus-additional-cni-plugins-zwqcq" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.865452 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/509844d4-3ee1-4059-84bb-6e90200f50c5-rootfs\") pod \"machine-config-daemon-ggz4n\" (UID: \"509844d4-3ee1-4059-84bb-6e90200f50c5\") " pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.865469 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/08de5891-72ca-488c-80b3-6b54c8c1a66e-host-cni-netd\") pod \"ovnkube-node-lp4zm\" (UID: \"08de5891-72ca-488c-80b3-6b54c8c1a66e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.865486 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjws8\" (UniqueName: \"kubernetes.io/projected/08de5891-72ca-488c-80b3-6b54c8c1a66e-kube-api-access-cjws8\") pod \"ovnkube-node-lp4zm\" (UID: \"08de5891-72ca-488c-80b3-6b54c8c1a66e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.865515 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ad2c26bf-c1c8-44b5-b4ed-d487072d358b-cni-binary-copy\") pod \"multus-additional-cni-plugins-zwqcq\" (UID: \"ad2c26bf-c1c8-44b5-b4ed-d487072d358b\") " pod="openshift-multus/multus-additional-cni-plugins-zwqcq" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.865531 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/08de5891-72ca-488c-80b3-6b54c8c1a66e-env-overrides\") pod \"ovnkube-node-lp4zm\" (UID: \"08de5891-72ca-488c-80b3-6b54c8c1a66e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.865604 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.865662 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/509844d4-3ee1-4059-84bb-6e90200f50c5-mcd-auth-proxy-config\") pod \"machine-config-daemon-ggz4n\" (UID: \"509844d4-3ee1-4059-84bb-6e90200f50c5\") " pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.865688 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/08de5891-72ca-488c-80b3-6b54c8c1a66e-host-run-netns\") pod \"ovnkube-node-lp4zm\" (UID: \"08de5891-72ca-488c-80b3-6b54c8c1a66e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.865706 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/08de5891-72ca-488c-80b3-6b54c8c1a66e-etc-openvswitch\") pod \"ovnkube-node-lp4zm\" (UID: \"08de5891-72ca-488c-80b3-6b54c8c1a66e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" Nov 29 05:28:17 crc kubenswrapper[4594]: E1129 05:28:17.865720 4594 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 29 05:28:17 crc kubenswrapper[4594]: E1129 05:28:17.865743 4594 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 29 05:28:17 crc kubenswrapper[4594]: E1129 05:28:17.865757 4594 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.865722 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/08de5891-72ca-488c-80b3-6b54c8c1a66e-node-log\") pod \"ovnkube-node-lp4zm\" (UID: \"08de5891-72ca-488c-80b3-6b54c8c1a66e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.865831 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/08de5891-72ca-488c-80b3-6b54c8c1a66e-host-kubelet\") pod \"ovnkube-node-lp4zm\" (UID: \"08de5891-72ca-488c-80b3-6b54c8c1a66e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" Nov 29 05:28:17 crc kubenswrapper[4594]: E1129 05:28:17.865857 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-29 05:28:21.865834746 +0000 UTC m=+26.106343957 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.865898 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ad2c26bf-c1c8-44b5-b4ed-d487072d358b-cnibin\") pod \"multus-additional-cni-plugins-zwqcq\" (UID: \"ad2c26bf-c1c8-44b5-b4ed-d487072d358b\") " pod="openshift-multus/multus-additional-cni-plugins-zwqcq" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.865926 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnd6t\" (UniqueName: \"kubernetes.io/projected/ad2c26bf-c1c8-44b5-b4ed-d487072d358b-kube-api-access-bnd6t\") pod \"multus-additional-cni-plugins-zwqcq\" (UID: \"ad2c26bf-c1c8-44b5-b4ed-d487072d358b\") " pod="openshift-multus/multus-additional-cni-plugins-zwqcq" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.865947 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/08de5891-72ca-488c-80b3-6b54c8c1a66e-run-openvswitch\") pod \"ovnkube-node-lp4zm\" (UID: \"08de5891-72ca-488c-80b3-6b54c8c1a66e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.865963 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/08de5891-72ca-488c-80b3-6b54c8c1a66e-log-socket\") pod \"ovnkube-node-lp4zm\" (UID: 
\"08de5891-72ca-488c-80b3-6b54c8c1a66e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.866037 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/509844d4-3ee1-4059-84bb-6e90200f50c5-proxy-tls\") pod \"machine-config-daemon-ggz4n\" (UID: \"509844d4-3ee1-4059-84bb-6e90200f50c5\") " pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.866069 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4qlc\" (UniqueName: \"kubernetes.io/projected/509844d4-3ee1-4059-84bb-6e90200f50c5-kube-api-access-f4qlc\") pod \"machine-config-daemon-ggz4n\" (UID: \"509844d4-3ee1-4059-84bb-6e90200f50c5\") " pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.866093 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ad2c26bf-c1c8-44b5-b4ed-d487072d358b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-zwqcq\" (UID: \"ad2c26bf-c1c8-44b5-b4ed-d487072d358b\") " pod="openshift-multus/multus-additional-cni-plugins-zwqcq" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.866156 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/08de5891-72ca-488c-80b3-6b54c8c1a66e-systemd-units\") pod \"ovnkube-node-lp4zm\" (UID: \"08de5891-72ca-488c-80b3-6b54c8c1a66e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.872364 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-87h4n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"080d2d67-a474-4974-94f3-81c5007e0a57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdnbj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-87h4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:17Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.882955 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1629901e-7133-4eb0-b634-5e458da9f205\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://583ca6d59501822022d5d15a450d4403592d248bce7be4b22b1c7baef4e93000\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b8d3be68869fef8b377ce1d84e3edf447496a867c40e88d013ef0d1c5629de2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6655c8a5c56e4d2b3e8fbd88308d04ded2b41f7243bf90cb2dbbd6ad93f3153f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a87a0320d47b3753baeb865c7ef9e024c65a96555fdd71c0f2a23c8c0b80d351\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ba49c3a694bc245ce18b54b74156e6e7aaccd4fcf707264a278f86f52943392\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T05:28:13Z\\\",\\\"message\\\":\\\"le observer\\\\nW1129 05:28:13.316789 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1129 05:28:13.316951 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1129 05:28:13.320123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1464513827/tls.crt::/tmp/serving-cert-1464513827/tls.key\\\\\\\"\\\\nI1129 05:28:13.540439 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1129 05:28:13.545990 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1129 05:28:13.546014 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1129 05:28:13.546050 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1129 05:28:13.546055 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1129 05:28:13.551788 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1129 05:28:13.551806 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 05:28:13.551809 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 05:28:13.551814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1129 05:28:13.551817 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1129 05:28:13.551820 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1129 05:28:13.551823 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1129 05:28:13.551915 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1129 05:28:13.553356 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e070e0ea87ae0cfa8fcc6aadd13e7ae3c2ab3eb1ea3daf6178c9d35796ba3ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:27:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:17Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.895276 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c84b578286ef0da9c863bedb4ebec224e34007c7f59cead26dac786d7e4b5e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1975ee2ce0af02676f9925944a31b218cb17cdf4dfb85c6edbfd124ac58fd22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:17Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.904635 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://867c23355f828dc4cc15d632f90c52174e9b9be07f3ec24f05c904709620de6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-29T05:28:17Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.914720 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:17Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.915759 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.915793 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.915803 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.915818 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.915827 4594 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:17Z","lastTransitionTime":"2025-11-29T05:28:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.927267 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7plzz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5052790-d231-4f97-802c-c7de3cd72561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qglsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7plzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:17Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.936405 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba18db1b4391204c0fe395488d2726d914dc28c88b9cf449d83c9c181a7d04a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d6
08d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:17Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.945957 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"509844d4-3ee1-4059-84bb-6e90200f50c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4qlc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4qlc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ggz4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:17Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.955174 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1629901e-7133-4eb0-b634-5e458da9f205\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://583ca6d59501822022d5d15a450d4403592d248bce7be4b22b1c7baef4e93000\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b8d3be68869fef8b377ce1d84e3edf447496a867c40e88d013ef0d1c5629de2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://6655c8a5c56e4d2b3e8fbd88308d04ded2b41f7243bf90cb2dbbd6ad93f3153f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a87a0320d47b3753baeb865c7ef9e024c65a96555fdd71c0f2a23c8c0b80d351\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ba49c3a694bc245ce18b54b74156e6e7aaccd4fcf707264a278f86f52943392\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T05:28:13Z\\\",\\\"message\\\":\\\"le observer\\\\nW1129 05:28:13.316789 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1129 05:28:13.316951 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1129 05:28:13.320123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1464513827/tls.crt::/tmp/serving-cert-1464513827/tls.key\\\\\\\"\\\\nI1129 05:28:13.540439 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1129 05:28:13.545990 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1129 05:28:13.546014 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1129 05:28:13.546050 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1129 05:28:13.546055 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1129 05:28:13.551788 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1129 05:28:13.551806 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 05:28:13.551809 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 05:28:13.551814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1129 05:28:13.551817 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1129 05:28:13.551820 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1129 05:28:13.551823 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1129 05:28:13.551915 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1129 05:28:13.553356 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e070e0ea87ae0cfa8fcc6aadd13e7ae3c2ab3eb1ea3daf6178c9d35796ba3ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-29T05:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:27:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:17Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.964082 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c84b578286ef0da9c863bedb4ebec224e34007c7f59cead26dac786d7e4b5e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1975ee2ce0af02676f9925944a31b218cb17cdf4dfb85c6edbfd124ac58fd22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:17Z is after 2025-08-24T17:21:41Z" Nov 29 
05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.966898 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ad2c26bf-c1c8-44b5-b4ed-d487072d358b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-zwqcq\" (UID: \"ad2c26bf-c1c8-44b5-b4ed-d487072d358b\") " pod="openshift-multus/multus-additional-cni-plugins-zwqcq" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.966931 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/08de5891-72ca-488c-80b3-6b54c8c1a66e-run-ovn\") pod \"ovnkube-node-lp4zm\" (UID: \"08de5891-72ca-488c-80b3-6b54c8c1a66e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.966953 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/08de5891-72ca-488c-80b3-6b54c8c1a66e-host-cni-bin\") pod \"ovnkube-node-lp4zm\" (UID: \"08de5891-72ca-488c-80b3-6b54c8c1a66e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.966987 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/08de5891-72ca-488c-80b3-6b54c8c1a66e-host-slash\") pod \"ovnkube-node-lp4zm\" (UID: \"08de5891-72ca-488c-80b3-6b54c8c1a66e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.967010 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/08de5891-72ca-488c-80b3-6b54c8c1a66e-run-systemd\") pod \"ovnkube-node-lp4zm\" (UID: \"08de5891-72ca-488c-80b3-6b54c8c1a66e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.967033 4594 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/08de5891-72ca-488c-80b3-6b54c8c1a66e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lp4zm\" (UID: \"08de5891-72ca-488c-80b3-6b54c8c1a66e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.967050 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/08de5891-72ca-488c-80b3-6b54c8c1a66e-ovnkube-config\") pod \"ovnkube-node-lp4zm\" (UID: \"08de5891-72ca-488c-80b3-6b54c8c1a66e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.967048 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/08de5891-72ca-488c-80b3-6b54c8c1a66e-run-ovn\") pod \"ovnkube-node-lp4zm\" (UID: \"08de5891-72ca-488c-80b3-6b54c8c1a66e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.967076 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/08de5891-72ca-488c-80b3-6b54c8c1a66e-host-cni-bin\") pod \"ovnkube-node-lp4zm\" (UID: \"08de5891-72ca-488c-80b3-6b54c8c1a66e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.967099 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/08de5891-72ca-488c-80b3-6b54c8c1a66e-run-systemd\") pod \"ovnkube-node-lp4zm\" (UID: \"08de5891-72ca-488c-80b3-6b54c8c1a66e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.967070 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ad2c26bf-c1c8-44b5-b4ed-d487072d358b-system-cni-dir\") pod \"multus-additional-cni-plugins-zwqcq\" (UID: \"ad2c26bf-c1c8-44b5-b4ed-d487072d358b\") " pod="openshift-multus/multus-additional-cni-plugins-zwqcq" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.967116 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/08de5891-72ca-488c-80b3-6b54c8c1a66e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lp4zm\" (UID: \"08de5891-72ca-488c-80b3-6b54c8c1a66e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.967135 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ad2c26bf-c1c8-44b5-b4ed-d487072d358b-system-cni-dir\") pod \"multus-additional-cni-plugins-zwqcq\" (UID: \"ad2c26bf-c1c8-44b5-b4ed-d487072d358b\") " pod="openshift-multus/multus-additional-cni-plugins-zwqcq" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.967149 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/509844d4-3ee1-4059-84bb-6e90200f50c5-rootfs\") pod \"machine-config-daemon-ggz4n\" (UID: \"509844d4-3ee1-4059-84bb-6e90200f50c5\") " pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.967130 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/08de5891-72ca-488c-80b3-6b54c8c1a66e-host-slash\") pod \"ovnkube-node-lp4zm\" (UID: \"08de5891-72ca-488c-80b3-6b54c8c1a66e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.967241 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/08de5891-72ca-488c-80b3-6b54c8c1a66e-host-cni-netd\") pod \"ovnkube-node-lp4zm\" (UID: \"08de5891-72ca-488c-80b3-6b54c8c1a66e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.967287 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/08de5891-72ca-488c-80b3-6b54c8c1a66e-host-cni-netd\") pod \"ovnkube-node-lp4zm\" (UID: \"08de5891-72ca-488c-80b3-6b54c8c1a66e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.967296 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjws8\" (UniqueName: \"kubernetes.io/projected/08de5891-72ca-488c-80b3-6b54c8c1a66e-kube-api-access-cjws8\") pod \"ovnkube-node-lp4zm\" (UID: \"08de5891-72ca-488c-80b3-6b54c8c1a66e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.967245 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/509844d4-3ee1-4059-84bb-6e90200f50c5-rootfs\") pod \"machine-config-daemon-ggz4n\" (UID: \"509844d4-3ee1-4059-84bb-6e90200f50c5\") " pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.967361 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ad2c26bf-c1c8-44b5-b4ed-d487072d358b-cni-binary-copy\") pod \"multus-additional-cni-plugins-zwqcq\" (UID: \"ad2c26bf-c1c8-44b5-b4ed-d487072d358b\") " pod="openshift-multus/multus-additional-cni-plugins-zwqcq" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.967398 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/08de5891-72ca-488c-80b3-6b54c8c1a66e-env-overrides\") pod \"ovnkube-node-lp4zm\" (UID: \"08de5891-72ca-488c-80b3-6b54c8c1a66e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.967439 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/509844d4-3ee1-4059-84bb-6e90200f50c5-mcd-auth-proxy-config\") pod \"machine-config-daemon-ggz4n\" (UID: \"509844d4-3ee1-4059-84bb-6e90200f50c5\") " pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.967456 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/08de5891-72ca-488c-80b3-6b54c8c1a66e-host-kubelet\") pod \"ovnkube-node-lp4zm\" (UID: \"08de5891-72ca-488c-80b3-6b54c8c1a66e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.967471 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/08de5891-72ca-488c-80b3-6b54c8c1a66e-host-run-netns\") pod \"ovnkube-node-lp4zm\" (UID: \"08de5891-72ca-488c-80b3-6b54c8c1a66e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.967489 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/08de5891-72ca-488c-80b3-6b54c8c1a66e-etc-openvswitch\") pod \"ovnkube-node-lp4zm\" (UID: \"08de5891-72ca-488c-80b3-6b54c8c1a66e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.967505 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/08de5891-72ca-488c-80b3-6b54c8c1a66e-node-log\") pod \"ovnkube-node-lp4zm\" (UID: \"08de5891-72ca-488c-80b3-6b54c8c1a66e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.967506 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/08de5891-72ca-488c-80b3-6b54c8c1a66e-host-kubelet\") pod \"ovnkube-node-lp4zm\" (UID: \"08de5891-72ca-488c-80b3-6b54c8c1a66e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.967520 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/08de5891-72ca-488c-80b3-6b54c8c1a66e-host-run-netns\") pod \"ovnkube-node-lp4zm\" (UID: \"08de5891-72ca-488c-80b3-6b54c8c1a66e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.967549 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ad2c26bf-c1c8-44b5-b4ed-d487072d358b-cnibin\") pod \"multus-additional-cni-plugins-zwqcq\" (UID: \"ad2c26bf-c1c8-44b5-b4ed-d487072d358b\") " pod="openshift-multus/multus-additional-cni-plugins-zwqcq" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.967555 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/08de5891-72ca-488c-80b3-6b54c8c1a66e-node-log\") pod \"ovnkube-node-lp4zm\" (UID: \"08de5891-72ca-488c-80b3-6b54c8c1a66e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.967534 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/08de5891-72ca-488c-80b3-6b54c8c1a66e-etc-openvswitch\") pod \"ovnkube-node-lp4zm\" (UID: 
\"08de5891-72ca-488c-80b3-6b54c8c1a66e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.967529 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ad2c26bf-c1c8-44b5-b4ed-d487072d358b-cnibin\") pod \"multus-additional-cni-plugins-zwqcq\" (UID: \"ad2c26bf-c1c8-44b5-b4ed-d487072d358b\") " pod="openshift-multus/multus-additional-cni-plugins-zwqcq" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.967590 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnd6t\" (UniqueName: \"kubernetes.io/projected/ad2c26bf-c1c8-44b5-b4ed-d487072d358b-kube-api-access-bnd6t\") pod \"multus-additional-cni-plugins-zwqcq\" (UID: \"ad2c26bf-c1c8-44b5-b4ed-d487072d358b\") " pod="openshift-multus/multus-additional-cni-plugins-zwqcq" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.967609 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/08de5891-72ca-488c-80b3-6b54c8c1a66e-run-openvswitch\") pod \"ovnkube-node-lp4zm\" (UID: \"08de5891-72ca-488c-80b3-6b54c8c1a66e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.967625 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/08de5891-72ca-488c-80b3-6b54c8c1a66e-log-socket\") pod \"ovnkube-node-lp4zm\" (UID: \"08de5891-72ca-488c-80b3-6b54c8c1a66e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.967642 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/08de5891-72ca-488c-80b3-6b54c8c1a66e-run-openvswitch\") pod \"ovnkube-node-lp4zm\" (UID: \"08de5891-72ca-488c-80b3-6b54c8c1a66e\") 
" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.967645 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/509844d4-3ee1-4059-84bb-6e90200f50c5-proxy-tls\") pod \"machine-config-daemon-ggz4n\" (UID: \"509844d4-3ee1-4059-84bb-6e90200f50c5\") " pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.967671 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/08de5891-72ca-488c-80b3-6b54c8c1a66e-log-socket\") pod \"ovnkube-node-lp4zm\" (UID: \"08de5891-72ca-488c-80b3-6b54c8c1a66e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.967688 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4qlc\" (UniqueName: \"kubernetes.io/projected/509844d4-3ee1-4059-84bb-6e90200f50c5-kube-api-access-f4qlc\") pod \"machine-config-daemon-ggz4n\" (UID: \"509844d4-3ee1-4059-84bb-6e90200f50c5\") " pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.967710 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ad2c26bf-c1c8-44b5-b4ed-d487072d358b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-zwqcq\" (UID: \"ad2c26bf-c1c8-44b5-b4ed-d487072d358b\") " pod="openshift-multus/multus-additional-cni-plugins-zwqcq" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.967728 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/08de5891-72ca-488c-80b3-6b54c8c1a66e-systemd-units\") pod \"ovnkube-node-lp4zm\" (UID: \"08de5891-72ca-488c-80b3-6b54c8c1a66e\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.967760 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ad2c26bf-c1c8-44b5-b4ed-d487072d358b-os-release\") pod \"multus-additional-cni-plugins-zwqcq\" (UID: \"ad2c26bf-c1c8-44b5-b4ed-d487072d358b\") " pod="openshift-multus/multus-additional-cni-plugins-zwqcq" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.967777 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/08de5891-72ca-488c-80b3-6b54c8c1a66e-ovn-node-metrics-cert\") pod \"ovnkube-node-lp4zm\" (UID: \"08de5891-72ca-488c-80b3-6b54c8c1a66e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.967783 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/08de5891-72ca-488c-80b3-6b54c8c1a66e-systemd-units\") pod \"ovnkube-node-lp4zm\" (UID: \"08de5891-72ca-488c-80b3-6b54c8c1a66e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.967790 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/08de5891-72ca-488c-80b3-6b54c8c1a66e-ovnkube-script-lib\") pod \"ovnkube-node-lp4zm\" (UID: \"08de5891-72ca-488c-80b3-6b54c8c1a66e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.967758 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/08de5891-72ca-488c-80b3-6b54c8c1a66e-ovnkube-config\") pod \"ovnkube-node-lp4zm\" (UID: \"08de5891-72ca-488c-80b3-6b54c8c1a66e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" Nov 29 
05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.967817 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/08de5891-72ca-488c-80b3-6b54c8c1a66e-host-run-ovn-kubernetes\") pod \"ovnkube-node-lp4zm\" (UID: \"08de5891-72ca-488c-80b3-6b54c8c1a66e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.967838 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/08de5891-72ca-488c-80b3-6b54c8c1a66e-var-lib-openvswitch\") pod \"ovnkube-node-lp4zm\" (UID: \"08de5891-72ca-488c-80b3-6b54c8c1a66e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.967865 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ad2c26bf-c1c8-44b5-b4ed-d487072d358b-os-release\") pod \"multus-additional-cni-plugins-zwqcq\" (UID: \"ad2c26bf-c1c8-44b5-b4ed-d487072d358b\") " pod="openshift-multus/multus-additional-cni-plugins-zwqcq" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.967882 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/08de5891-72ca-488c-80b3-6b54c8c1a66e-var-lib-openvswitch\") pod \"ovnkube-node-lp4zm\" (UID: \"08de5891-72ca-488c-80b3-6b54c8c1a66e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.967894 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/08de5891-72ca-488c-80b3-6b54c8c1a66e-host-run-ovn-kubernetes\") pod \"ovnkube-node-lp4zm\" (UID: \"08de5891-72ca-488c-80b3-6b54c8c1a66e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" Nov 29 05:28:17 crc 
kubenswrapper[4594]: I1129 05:28:17.967939 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/08de5891-72ca-488c-80b3-6b54c8c1a66e-env-overrides\") pod \"ovnkube-node-lp4zm\" (UID: \"08de5891-72ca-488c-80b3-6b54c8c1a66e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.968132 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/509844d4-3ee1-4059-84bb-6e90200f50c5-mcd-auth-proxy-config\") pod \"machine-config-daemon-ggz4n\" (UID: \"509844d4-3ee1-4059-84bb-6e90200f50c5\") " pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.968284 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/08de5891-72ca-488c-80b3-6b54c8c1a66e-ovnkube-script-lib\") pod \"ovnkube-node-lp4zm\" (UID: \"08de5891-72ca-488c-80b3-6b54c8c1a66e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.968295 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ad2c26bf-c1c8-44b5-b4ed-d487072d358b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-zwqcq\" (UID: \"ad2c26bf-c1c8-44b5-b4ed-d487072d358b\") " pod="openshift-multus/multus-additional-cni-plugins-zwqcq" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.968717 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ad2c26bf-c1c8-44b5-b4ed-d487072d358b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-zwqcq\" (UID: \"ad2c26bf-c1c8-44b5-b4ed-d487072d358b\") " pod="openshift-multus/multus-additional-cni-plugins-zwqcq" Nov 29 05:28:17 crc 
kubenswrapper[4594]: I1129 05:28:17.972307 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/509844d4-3ee1-4059-84bb-6e90200f50c5-proxy-tls\") pod \"machine-config-daemon-ggz4n\" (UID: \"509844d4-3ee1-4059-84bb-6e90200f50c5\") " pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.972330 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/08de5891-72ca-488c-80b3-6b54c8c1a66e-ovn-node-metrics-cert\") pod \"ovnkube-node-lp4zm\" (UID: \"08de5891-72ca-488c-80b3-6b54c8c1a66e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.974558 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-87h4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"080d2d67-a474-4974-94f3-81c5007e0a57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdnbj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-87h4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:17Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.984019 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4qlc\" (UniqueName: \"kubernetes.io/projected/509844d4-3ee1-4059-84bb-6e90200f50c5-kube-api-access-f4qlc\") pod \"machine-config-daemon-ggz4n\" (UID: \"509844d4-3ee1-4059-84bb-6e90200f50c5\") " pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.986389 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnd6t\" (UniqueName: 
\"kubernetes.io/projected/ad2c26bf-c1c8-44b5-b4ed-d487072d358b-kube-api-access-bnd6t\") pod \"multus-additional-cni-plugins-zwqcq\" (UID: \"ad2c26bf-c1c8-44b5-b4ed-d487072d358b\") " pod="openshift-multus/multus-additional-cni-plugins-zwqcq" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.990200 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjws8\" (UniqueName: \"kubernetes.io/projected/08de5891-72ca-488c-80b3-6b54c8c1a66e-kube-api-access-cjws8\") pod \"ovnkube-node-lp4zm\" (UID: \"08de5891-72ca-488c-80b3-6b54c8c1a66e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" Nov 29 05:28:17 crc kubenswrapper[4594]: I1129 05:28:17.991924 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://867c23355f828dc4cc15d632f90c52174e9b9be07f3ec24f05c904709620de6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCou
nt\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:17Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:18 crc kubenswrapper[4594]: I1129 05:28:18.001667 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:18Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:18 crc kubenswrapper[4594]: I1129 05:28:18.010770 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7plzz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5052790-d231-4f97-802c-c7de3cd72561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qglsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7plzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:18Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:18 crc kubenswrapper[4594]: I1129 05:28:18.018148 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:18 crc kubenswrapper[4594]: I1129 05:28:18.018186 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:18 crc kubenswrapper[4594]: I1129 05:28:18.018198 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:18 crc kubenswrapper[4594]: I1129 05:28:18.018214 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:18 crc kubenswrapper[4594]: I1129 05:28:18.018226 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:18Z","lastTransitionTime":"2025-11-29T05:28:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:18 crc kubenswrapper[4594]: I1129 05:28:18.020408 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:18Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:18 crc kubenswrapper[4594]: I1129 05:28:18.029479 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:18Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:18 crc kubenswrapper[4594]: I1129 05:28:18.044051 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"08de5891-72ca-488c-80b3-6b54c8c1a66e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lp4zm\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:18Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:18 crc kubenswrapper[4594]: I1129 05:28:18.054851 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zwqcq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad2c26bf-c1c8-44b5-b4ed-d487072d358b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zwqcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:18Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:18 crc kubenswrapper[4594]: I1129 05:28:18.074349 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" Nov 29 05:28:18 crc kubenswrapper[4594]: I1129 05:28:18.079548 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" Nov 29 05:28:18 crc kubenswrapper[4594]: I1129 05:28:18.084230 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 05:28:18 crc kubenswrapper[4594]: I1129 05:28:18.084274 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 05:28:18 crc kubenswrapper[4594]: E1129 05:28:18.084448 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 05:28:18 crc kubenswrapper[4594]: E1129 05:28:18.084538 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 05:28:18 crc kubenswrapper[4594]: I1129 05:28:18.087110 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 05:28:18 crc kubenswrapper[4594]: E1129 05:28:18.087217 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 05:28:18 crc kubenswrapper[4594]: W1129 05:28:18.092477 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod08de5891_72ca_488c_80b3_6b54c8c1a66e.slice/crio-ebad62949c66a06bbc352e12d8a068e3240c97cfceef191222d7dc266badfcd2 WatchSource:0}: Error finding container ebad62949c66a06bbc352e12d8a068e3240c97cfceef191222d7dc266badfcd2: Status 404 returned error can't find the container with id ebad62949c66a06bbc352e12d8a068e3240c97cfceef191222d7dc266badfcd2 Nov 29 05:28:18 crc kubenswrapper[4594]: I1129 05:28:18.120805 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:18 crc kubenswrapper[4594]: I1129 05:28:18.120838 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:18 crc kubenswrapper[4594]: I1129 05:28:18.120848 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:18 crc kubenswrapper[4594]: I1129 05:28:18.120863 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:18 crc kubenswrapper[4594]: I1129 05:28:18.120873 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:18Z","lastTransitionTime":"2025-11-29T05:28:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:18 crc kubenswrapper[4594]: I1129 05:28:18.184314 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" event={"ID":"509844d4-3ee1-4059-84bb-6e90200f50c5","Type":"ContainerStarted","Data":"eb25cb1d2d2a3aa84bc6fca3087cdf54b3b8e75bc258441b2895f37594df44e5"} Nov 29 05:28:18 crc kubenswrapper[4594]: I1129 05:28:18.184349 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" event={"ID":"509844d4-3ee1-4059-84bb-6e90200f50c5","Type":"ContainerStarted","Data":"4614de7710091620c7297139acf684e89fb43b853e817e7aeeba6ca01e83f4a8"} Nov 29 05:28:18 crc kubenswrapper[4594]: I1129 05:28:18.185200 4594 generic.go:334] "Generic (PLEG): container finished" podID="08de5891-72ca-488c-80b3-6b54c8c1a66e" containerID="b86e524b600a831fe6b8bb71a513172f974b457d39363da90cd6152139eae898" exitCode=0 Nov 29 05:28:18 crc kubenswrapper[4594]: I1129 05:28:18.185523 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" event={"ID":"08de5891-72ca-488c-80b3-6b54c8c1a66e","Type":"ContainerDied","Data":"b86e524b600a831fe6b8bb71a513172f974b457d39363da90cd6152139eae898"} Nov 29 05:28:18 crc kubenswrapper[4594]: I1129 05:28:18.185546 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" event={"ID":"08de5891-72ca-488c-80b3-6b54c8c1a66e","Type":"ContainerStarted","Data":"ebad62949c66a06bbc352e12d8a068e3240c97cfceef191222d7dc266badfcd2"} Nov 29 05:28:18 crc kubenswrapper[4594]: I1129 05:28:18.194768 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-87h4n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"080d2d67-a474-4974-94f3-81c5007e0a57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdnbj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-87h4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:18Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:18 crc kubenswrapper[4594]: I1129 05:28:18.205523 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1629901e-7133-4eb0-b634-5e458da9f205\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://583ca6d59501822022d5d15a450d4403592d248bce7be4b22b1c7baef4e93000\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b8d3be68869fef8b377ce1d84e3edf447496a867c40e88d013ef0d1c5629de2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6655c8a5c56e4d2b3e8fbd88308d04ded2b41f7243bf90cb2dbbd6ad93f3153f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a87a0320d47b3753baeb865c7ef9e024c65a96555fdd71c0f2a23c8c0b80d351\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ba49c3a694bc245ce18b54b74156e6e7aaccd4fcf707264a278f86f52943392\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T05:28:13Z\\\",\\\"message\\\":\\\"le observer\\\\nW1129 05:28:13.316789 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1129 05:28:13.316951 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1129 05:28:13.320123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1464513827/tls.crt::/tmp/serving-cert-1464513827/tls.key\\\\\\\"\\\\nI1129 05:28:13.540439 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1129 05:28:13.545990 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1129 05:28:13.546014 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1129 05:28:13.546050 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1129 05:28:13.546055 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1129 05:28:13.551788 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1129 05:28:13.551806 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 05:28:13.551809 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 05:28:13.551814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1129 05:28:13.551817 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1129 05:28:13.551820 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1129 05:28:13.551823 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1129 05:28:13.551915 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1129 05:28:13.553356 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e070e0ea87ae0cfa8fcc6aadd13e7ae3c2ab3eb1ea3daf6178c9d35796ba3ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:27:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:18Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:18 crc kubenswrapper[4594]: I1129 05:28:18.215972 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c84b578286ef0da9c863bedb4ebec224e34007c7f59cead26dac786d7e4b5e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1975ee2ce0af02676f9925944a31b218cb17cdf4dfb85c6edbfd124ac58fd22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:18Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:18 crc kubenswrapper[4594]: I1129 05:28:18.223420 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:18 crc kubenswrapper[4594]: I1129 05:28:18.223450 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:18 crc kubenswrapper[4594]: I1129 05:28:18.223460 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:18 crc kubenswrapper[4594]: I1129 05:28:18.223471 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:18 crc kubenswrapper[4594]: I1129 05:28:18.223479 4594 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:18Z","lastTransitionTime":"2025-11-29T05:28:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:18 crc kubenswrapper[4594]: I1129 05:28:18.227015 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://867c23355f828dc4cc15d632f90c52174e9b9be07f3ec24f05c904709620de6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:18Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:18 crc kubenswrapper[4594]: I1129 05:28:18.236790 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:18Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:18 crc kubenswrapper[4594]: I1129 05:28:18.247116 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7plzz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5052790-d231-4f97-802c-c7de3cd72561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qglsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7plzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:18Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:18 crc kubenswrapper[4594]: I1129 05:28:18.256835 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:18Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:18 crc kubenswrapper[4594]: I1129 05:28:18.271722 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"08de5891-72ca-488c-80b3-6b54c8c1a66e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b86e524b600a831fe6b8bb71a513172f974b457d39363da90cd6152139eae898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b86e524b600a831fe6b8bb71a513172f974b457d39363da90cd6152139eae898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lp4zm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:18Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:18 crc kubenswrapper[4594]: I1129 05:28:18.287217 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zwqcq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad2c26bf-c1c8-44b5-b4ed-d487072d358b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zwqcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:18Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:18 crc kubenswrapper[4594]: I1129 05:28:18.296664 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:18Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:18 crc kubenswrapper[4594]: I1129 05:28:18.310742 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba18db1b4391204c0fe395488d2726d914dc28c88b9cf449d83c9c181a7d04a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:18Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:18 crc kubenswrapper[4594]: I1129 05:28:18.325386 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:18 crc kubenswrapper[4594]: I1129 05:28:18.325420 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:18 crc kubenswrapper[4594]: I1129 05:28:18.325432 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:18 crc kubenswrapper[4594]: I1129 05:28:18.325448 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:18 crc kubenswrapper[4594]: I1129 05:28:18.325458 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:18Z","lastTransitionTime":"2025-11-29T05:28:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:18 crc kubenswrapper[4594]: I1129 05:28:18.327179 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"509844d4-3ee1-4059-84bb-6e90200f50c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4qlc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4qlc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ggz4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:18Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:18 crc kubenswrapper[4594]: I1129 05:28:18.411458 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Nov 29 05:28:18 crc kubenswrapper[4594]: I1129 05:28:18.416501 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdnbj\" (UniqueName: \"kubernetes.io/projected/080d2d67-a474-4974-94f3-81c5007e0a57-kube-api-access-gdnbj\") pod \"node-resolver-87h4n\" (UID: \"080d2d67-a474-4974-94f3-81c5007e0a57\") " pod="openshift-dns/node-resolver-87h4n" Nov 29 05:28:18 crc kubenswrapper[4594]: I1129 05:28:18.427373 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:18 crc kubenswrapper[4594]: I1129 05:28:18.427409 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:18 crc kubenswrapper[4594]: I1129 05:28:18.427418 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:18 crc kubenswrapper[4594]: I1129 05:28:18.427434 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:18 crc kubenswrapper[4594]: I1129 05:28:18.427443 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:18Z","lastTransitionTime":"2025-11-29T05:28:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:18 crc kubenswrapper[4594]: I1129 05:28:18.529026 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:18 crc kubenswrapper[4594]: I1129 05:28:18.529251 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:18 crc kubenswrapper[4594]: I1129 05:28:18.529274 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:18 crc kubenswrapper[4594]: I1129 05:28:18.529287 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:18 crc kubenswrapper[4594]: I1129 05:28:18.529296 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:18Z","lastTransitionTime":"2025-11-29T05:28:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:18 crc kubenswrapper[4594]: I1129 05:28:18.578786 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-87h4n" Nov 29 05:28:18 crc kubenswrapper[4594]: W1129 05:28:18.588168 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod080d2d67_a474_4974_94f3_81c5007e0a57.slice/crio-8803bba7ce6e950cacc64a86ee10a0eaffe9be030d444096be2c223d9d481e70 WatchSource:0}: Error finding container 8803bba7ce6e950cacc64a86ee10a0eaffe9be030d444096be2c223d9d481e70: Status 404 returned error can't find the container with id 8803bba7ce6e950cacc64a86ee10a0eaffe9be030d444096be2c223d9d481e70 Nov 29 05:28:18 crc kubenswrapper[4594]: I1129 05:28:18.618023 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Nov 29 05:28:18 crc kubenswrapper[4594]: I1129 05:28:18.623021 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e5052790-d231-4f97-802c-c7de3cd72561-cni-binary-copy\") pod \"multus-7plzz\" (UID: \"e5052790-d231-4f97-802c-c7de3cd72561\") " pod="openshift-multus/multus-7plzz" Nov 29 05:28:18 crc kubenswrapper[4594]: I1129 05:28:18.628099 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ad2c26bf-c1c8-44b5-b4ed-d487072d358b-cni-binary-copy\") pod \"multus-additional-cni-plugins-zwqcq\" (UID: \"ad2c26bf-c1c8-44b5-b4ed-d487072d358b\") " pod="openshift-multus/multus-additional-cni-plugins-zwqcq" Nov 29 05:28:18 crc kubenswrapper[4594]: I1129 05:28:18.631214 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:18 crc kubenswrapper[4594]: I1129 05:28:18.631240 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:18 crc kubenswrapper[4594]: I1129 05:28:18.631275 4594 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:18 crc kubenswrapper[4594]: I1129 05:28:18.631292 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:18 crc kubenswrapper[4594]: I1129 05:28:18.631301 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:18Z","lastTransitionTime":"2025-11-29T05:28:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:18 crc kubenswrapper[4594]: I1129 05:28:18.684930 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-zwqcq" Nov 29 05:28:18 crc kubenswrapper[4594]: W1129 05:28:18.696235 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad2c26bf_c1c8_44b5_b4ed_d487072d358b.slice/crio-05d409b0562b6dc43d44b94689bc0275b82949b7d237bfc2c4ac8942d9838f39 WatchSource:0}: Error finding container 05d409b0562b6dc43d44b94689bc0275b82949b7d237bfc2c4ac8942d9838f39: Status 404 returned error can't find the container with id 05d409b0562b6dc43d44b94689bc0275b82949b7d237bfc2c4ac8942d9838f39 Nov 29 05:28:18 crc kubenswrapper[4594]: I1129 05:28:18.734287 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:18 crc kubenswrapper[4594]: I1129 05:28:18.734549 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:18 crc kubenswrapper[4594]: I1129 05:28:18.734559 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:18 crc 
kubenswrapper[4594]: I1129 05:28:18.734604 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:18 crc kubenswrapper[4594]: I1129 05:28:18.734617 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:18Z","lastTransitionTime":"2025-11-29T05:28:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:18 crc kubenswrapper[4594]: I1129 05:28:18.837225 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:18 crc kubenswrapper[4594]: I1129 05:28:18.837288 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:18 crc kubenswrapper[4594]: I1129 05:28:18.837298 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:18 crc kubenswrapper[4594]: I1129 05:28:18.837310 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:18 crc kubenswrapper[4594]: I1129 05:28:18.837319 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:18Z","lastTransitionTime":"2025-11-29T05:28:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:18 crc kubenswrapper[4594]: I1129 05:28:18.884141 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-7plzz" Nov 29 05:28:18 crc kubenswrapper[4594]: W1129 05:28:18.931790 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5052790_d231_4f97_802c_c7de3cd72561.slice/crio-00cb27d498f35751901c3edc0cfb6d78064c5e4e847c29f25fd65d3e9833067f WatchSource:0}: Error finding container 00cb27d498f35751901c3edc0cfb6d78064c5e4e847c29f25fd65d3e9833067f: Status 404 returned error can't find the container with id 00cb27d498f35751901c3edc0cfb6d78064c5e4e847c29f25fd65d3e9833067f Nov 29 05:28:18 crc kubenswrapper[4594]: I1129 05:28:18.939202 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:18 crc kubenswrapper[4594]: I1129 05:28:18.939229 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:18 crc kubenswrapper[4594]: I1129 05:28:18.939269 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:18 crc kubenswrapper[4594]: I1129 05:28:18.939282 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:18 crc kubenswrapper[4594]: I1129 05:28:18.939290 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:18Z","lastTransitionTime":"2025-11-29T05:28:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.041051 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.041088 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.041098 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.041113 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.041123 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:19Z","lastTransitionTime":"2025-11-29T05:28:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.144988 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.145279 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.145293 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.145310 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.145320 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:19Z","lastTransitionTime":"2025-11-29T05:28:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.191018 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" event={"ID":"509844d4-3ee1-4059-84bb-6e90200f50c5","Type":"ContainerStarted","Data":"b5e396522be1cb7984f98fd416f9a6aeda1e6009e002e9c942a6cec0d4a4c22b"} Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.193385 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7plzz" event={"ID":"e5052790-d231-4f97-802c-c7de3cd72561","Type":"ContainerStarted","Data":"4c23b1472cd349ae6a56137165463429204ffbaf2c1b477d3ba9d07d2f2799f9"} Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.193438 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7plzz" event={"ID":"e5052790-d231-4f97-802c-c7de3cd72561","Type":"ContainerStarted","Data":"00cb27d498f35751901c3edc0cfb6d78064c5e4e847c29f25fd65d3e9833067f"} Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.196567 4594 generic.go:334] "Generic (PLEG): container finished" podID="ad2c26bf-c1c8-44b5-b4ed-d487072d358b" containerID="7b9e8c27b9b7a4a13a1f6ae5461acd535cc6784f4158213697307e4061f1e222" exitCode=0 Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.196626 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zwqcq" event={"ID":"ad2c26bf-c1c8-44b5-b4ed-d487072d358b","Type":"ContainerDied","Data":"7b9e8c27b9b7a4a13a1f6ae5461acd535cc6784f4158213697307e4061f1e222"} Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.196643 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zwqcq" event={"ID":"ad2c26bf-c1c8-44b5-b4ed-d487072d358b","Type":"ContainerStarted","Data":"05d409b0562b6dc43d44b94689bc0275b82949b7d237bfc2c4ac8942d9838f39"} Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.205023 4594 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" event={"ID":"08de5891-72ca-488c-80b3-6b54c8c1a66e","Type":"ContainerStarted","Data":"7a67e6b494618e268d6f5d75507c1cfe0e8e99c0e3cfff7548756a64f4f3a3ae"} Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.207836 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" event={"ID":"08de5891-72ca-488c-80b3-6b54c8c1a66e","Type":"ContainerStarted","Data":"77543b04b619c48bf800a3d9dda1a969e4e32aa113a49287717a9e83f9cff96f"} Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.207868 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" event={"ID":"08de5891-72ca-488c-80b3-6b54c8c1a66e","Type":"ContainerStarted","Data":"4cf2a15e6478f9f461a9927f570db4a81a8d752c8be46c7970b4956ae2031a38"} Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.207880 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" event={"ID":"08de5891-72ca-488c-80b3-6b54c8c1a66e","Type":"ContainerStarted","Data":"b9f96824f977ec455622772e91ed8e6534dc93a71868e2b5e2aa9302ff9be135"} Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.207889 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" event={"ID":"08de5891-72ca-488c-80b3-6b54c8c1a66e","Type":"ContainerStarted","Data":"78fd1c6bdf08a557f0c2223c538a946709b9a225032c92ea75277140441a59c6"} Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.207897 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" event={"ID":"08de5891-72ca-488c-80b3-6b54c8c1a66e","Type":"ContainerStarted","Data":"9db25dbee0954a026ecc2945dda40eb48c9caf78d31c44013055e9765024794d"} Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.210598 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-87h4n" 
event={"ID":"080d2d67-a474-4974-94f3-81c5007e0a57","Type":"ContainerStarted","Data":"5b58d5af901448b3e10a9a902daa31aecebe095bdae0e620c991323b99c52b2b"} Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.210633 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-87h4n" event={"ID":"080d2d67-a474-4974-94f3-81c5007e0a57","Type":"ContainerStarted","Data":"8803bba7ce6e950cacc64a86ee10a0eaffe9be030d444096be2c223d9d481e70"} Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.211820 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1629901e-7133-4eb0-b634-5e458da9f205\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://583ca6d59501822022d5d15a450d4403592d248bce7be4b22b1c7baef4e93000\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b8d3be68869fef8b377ce1d84e3edf447496a867c40e88d013ef0d1c5629de2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://6655c8a5c56e4d2b3e8fbd88308d04ded2b41f7243bf90cb2dbbd6ad93f3153f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a87a0320d47b3753baeb865c7ef9e024c65a96555fdd71c0f2a23c8c0b80d351\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ba49c3a694bc245ce18b54b74156e6e7aaccd4fcf707264a278f86f52943392\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T05:28:13Z\\\",\\\"message\\\":\\\"le observer\\\\nW1129 05:28:13.316789 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1129 05:28:13.316951 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1129 05:28:13.320123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1464513827/tls.crt::/tmp/serving-cert-1464513827/tls.key\\\\\\\"\\\\nI1129 05:28:13.540439 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1129 05:28:13.545990 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1129 05:28:13.546014 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1129 05:28:13.546050 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1129 05:28:13.546055 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1129 05:28:13.551788 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1129 05:28:13.551806 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 05:28:13.551809 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 05:28:13.551814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1129 05:28:13.551817 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1129 05:28:13.551820 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1129 05:28:13.551823 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1129 05:28:13.551915 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1129 05:28:13.553356 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e070e0ea87ae0cfa8fcc6aadd13e7ae3c2ab3eb1ea3daf6178c9d35796ba3ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-29T05:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:27:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:19Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.227589 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c84b578286ef0da9c863bedb4ebec224e34007c7f59cead26dac786d7e4b5e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1975ee2ce0af02676f9925944a31b218cb17cdf4dfb85c6edbfd124ac58fd22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:19Z is after 2025-08-24T17:21:41Z" Nov 29 
05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.235748 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-87h4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"080d2d67-a474-4974-94f3-81c5007e0a57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdnbj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-87h4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:19Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.246695 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.246731 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.246762 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.246778 4594 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.246787 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:19Z","lastTransitionTime":"2025-11-29T05:28:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.247194 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://867c23355f828dc4cc15d632f90c52174e9b9be07f3ec24f05c904709620de6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\
\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:19Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.256574 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:19Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.266873 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7plzz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5052790-d231-4f97-802c-c7de3cd72561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qglsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7plzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:19Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.278667 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:19Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.288408 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:19Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.302327 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08de5891-72ca-488c-80b3-6b54c8c1a66e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\
"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b86e524b600a831fe6b8bb71a513172f974b457d39363da90cd6152139eae898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b86e524b600a831fe6b8bb71a513172f974b457d39363da90cd6152139eae898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lp4zm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:19Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.312651 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zwqcq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad2c26bf-c1c8-44b5-b4ed-d487072d358b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins 
bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni
/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"
}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zwqcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:19Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.323791 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba18db1b4391204c0fe395488d2726d914dc28c88b9cf449d83c9c181a7d04a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:19Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.332370 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"509844d4-3ee1-4059-84bb-6e90200f50c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5e396522be1cb7984f98fd416f9a6aeda1e6009e002e9c942a6cec0d4a4c22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4qlc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb25cb1d2d2a3aa84bc6fca3087cdf54b3b8e75bc258441b2895f37594df44e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4qlc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ggz4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:19Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.342736 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7plzz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5052790-d231-4f97-802c-c7de3cd72561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c23b1472cd349ae6a56137165463429204ffbaf2c1b477d3ba9d07d2f2799f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qglsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7plzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:19Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.349740 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:19 crc 
kubenswrapper[4594]: I1129 05:28:19.349773 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.349783 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.349797 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.349806 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:19Z","lastTransitionTime":"2025-11-29T05:28:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.354853 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://867c23355f828dc4cc15d632f90c52174e9b9be07f3ec24f05c904709620de6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-29T05:28:19Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.363301 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:19Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.371590 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:19Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.379693 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:19Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.392437 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08de5891-72ca-488c-80b3-6b54c8c1a66e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\
"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b86e524b600a831fe6b8bb71a513172f974b457d39363da90cd6152139eae898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b86e524b600a831fe6b8bb71a513172f974b457d39363da90cd6152139eae898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lp4zm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:19Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.402018 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zwqcq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad2c26bf-c1c8-44b5-b4ed-d487072d358b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni 
whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9e8c27b9b7a4a13a1f6ae5461acd535cc6784f4158213697307e4061f1e222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termi
nated\\\":{\\\"containerID\\\":\\\"cri-o://7b9e8c27b9b7a4a13a1f6ae5461acd535cc6784f4158213697307e4061f1e222\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bon
d-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\
"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zwqcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:19Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.409119 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"509844d4-3ee1-4059-84bb-6e90200f50c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5e396522be1cb7984f98fd416f9a6aeda1e6009e002e9c942a6cec0d4a4c22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4qlc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb25cb1d2d2a3aa84bc6fca3087cdf54b3b8e75b
c258441b2895f37594df44e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4qlc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ggz4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:19Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.417901 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba18db1b4391204c0fe395488d2726d914dc28c88b9cf449d83c9c181a7d04a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:19Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.426381 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c84b578286ef0da9c863bedb4ebec224e34007c7f59cead26dac786d7e4b5e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://d1975ee2ce0af02676f9925944a31b218cb17cdf4dfb85c6edbfd124ac58fd22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:19Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.433224 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-87h4n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"080d2d67-a474-4974-94f3-81c5007e0a57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b58d5af901448b3e10a9a902daa31aecebe095bdae0e620c991323b99c52b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdnbj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-87h4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:19Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.445927 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1629901e-7133-4eb0-b634-5e458da9f205\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://583ca6d59501822022d5d15a450d4403592d248bce7be4b22b1c7baef4e93000\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b8d3be68869fef8b377ce1d84e3edf447496a867c40e88d013ef0d1c5629de2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://6655c8a5c56e4d2b3e8fbd88308d04ded2b41f7243bf90cb2dbbd6ad93f3153f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a87a0320d47b3753baeb865c7ef9e024c65a96555fdd71c0f2a23c8c0b80d351\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ba49c3a694bc245ce18b54b74156e6e7aaccd4fcf707264a278f86f52943392\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T05:28:13Z\\\",\\\"message\\\":\\\"le observer\\\\nW1129 05:28:13.316789 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1129 05:28:13.316951 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1129 05:28:13.320123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1464513827/tls.crt::/tmp/serving-cert-1464513827/tls.key\\\\\\\"\\\\nI1129 05:28:13.540439 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1129 05:28:13.545990 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1129 05:28:13.546014 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1129 05:28:13.546050 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1129 05:28:13.546055 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1129 05:28:13.551788 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1129 05:28:13.551806 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 05:28:13.551809 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 05:28:13.551814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1129 05:28:13.551817 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1129 05:28:13.551820 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1129 05:28:13.551823 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1129 05:28:13.551915 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1129 05:28:13.553356 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e070e0ea87ae0cfa8fcc6aadd13e7ae3c2ab3eb1ea3daf6178c9d35796ba3ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-29T05:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:27:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:19Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.451300 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.451337 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.451346 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.451360 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.451370 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:19Z","lastTransitionTime":"2025-11-29T05:28:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.553889 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.553932 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.553941 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.553958 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.553966 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:19Z","lastTransitionTime":"2025-11-29T05:28:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.585594 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-26zjk"] Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.585948 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-26zjk" Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.586396 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/77af4f9f-196d-436d-9fb6-69168bcb5f8f-host\") pod \"node-ca-26zjk\" (UID: \"77af4f9f-196d-436d-9fb6-69168bcb5f8f\") " pod="openshift-image-registry/node-ca-26zjk" Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.586445 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/77af4f9f-196d-436d-9fb6-69168bcb5f8f-serviceca\") pod \"node-ca-26zjk\" (UID: \"77af4f9f-196d-436d-9fb6-69168bcb5f8f\") " pod="openshift-image-registry/node-ca-26zjk" Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.586592 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxrks\" (UniqueName: \"kubernetes.io/projected/77af4f9f-196d-436d-9fb6-69168bcb5f8f-kube-api-access-zxrks\") pod \"node-ca-26zjk\" (UID: \"77af4f9f-196d-436d-9fb6-69168bcb5f8f\") " pod="openshift-image-registry/node-ca-26zjk" Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.588181 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.588294 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.588374 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.588405 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Nov 29 05:28:19 crc 
kubenswrapper[4594]: I1129 05:28:19.596804 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://867c23355f828dc4cc15d632f90c52174e9b9be07f3ec24f05c904709620de6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:19Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.606383 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:19Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.615861 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7plzz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5052790-d231-4f97-802c-c7de3cd72561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c23b1472cd349ae6a56137165463429204ffbaf2c1b477d3ba9d07d2f2799f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qglsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7plzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:19Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.625971 4594 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:19Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.635029 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:19Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.650355 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"08de5891-72ca-488c-80b3-6b54c8c1a66e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b86e524b600a831fe6b8bb71a513172f974b457d39363da90cd6152139eae898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b86e524b600a831fe6b8bb71a513172f974b457d39363da90cd6152139eae898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lp4zm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:19Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.656611 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.656644 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.656655 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.656671 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.656680 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:19Z","lastTransitionTime":"2025-11-29T05:28:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.659948 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zwqcq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad2c26bf-c1c8-44b5-b4ed-d487072d358b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9e8c27b9b7a4a13a1f6ae5461acd535cc6784f4158213697307e4061f1e222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b9e8c27b9b7a4a13a1f6ae5461acd535cc6784f4158213697307e4061f1e222\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zwqcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:19Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.668988 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba18db1b4391204c0fe395488d2726d914dc28c88b9cf449d83c9c181a7d04a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:19Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.677194 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"509844d4-3ee1-4059-84bb-6e90200f50c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5e396522be1cb7984f98fd416f9a6aeda1e6009e002e9c942a6cec0d4a4c22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4qlc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb25cb1d2d2a3aa84bc6fca3087cdf54b3b8e75bc258441b2895f37594df44e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4qlc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ggz4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:19Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.685059 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-26zjk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"77af4f9f-196d-436d-9fb6-69168bcb5f8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxrks\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-26zjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:19Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.687215 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/77af4f9f-196d-436d-9fb6-69168bcb5f8f-host\") pod \"node-ca-26zjk\" (UID: \"77af4f9f-196d-436d-9fb6-69168bcb5f8f\") " pod="openshift-image-registry/node-ca-26zjk" Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.687251 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/77af4f9f-196d-436d-9fb6-69168bcb5f8f-serviceca\") pod \"node-ca-26zjk\" (UID: \"77af4f9f-196d-436d-9fb6-69168bcb5f8f\") " pod="openshift-image-registry/node-ca-26zjk" Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.687310 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/77af4f9f-196d-436d-9fb6-69168bcb5f8f-host\") pod \"node-ca-26zjk\" (UID: \"77af4f9f-196d-436d-9fb6-69168bcb5f8f\") " pod="openshift-image-registry/node-ca-26zjk" Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.687335 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxrks\" (UniqueName: \"kubernetes.io/projected/77af4f9f-196d-436d-9fb6-69168bcb5f8f-kube-api-access-zxrks\") pod \"node-ca-26zjk\" (UID: \"77af4f9f-196d-436d-9fb6-69168bcb5f8f\") " pod="openshift-image-registry/node-ca-26zjk" Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.688327 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/77af4f9f-196d-436d-9fb6-69168bcb5f8f-serviceca\") pod \"node-ca-26zjk\" (UID: \"77af4f9f-196d-436d-9fb6-69168bcb5f8f\") " pod="openshift-image-registry/node-ca-26zjk" Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.694532 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1629901e-7133-4eb0-b634-5e458da9f205\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://583ca6d59501822022d5d15a450d4403592d248bce7be4b22b1c7baef4e93000\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b8d3be68869fef8b377ce1d84e3edf447496a867c40e88d013ef0d1c5629de2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6655c8a5c56e4d2b3e8fbd88308d04ded2b41f7243bf90cb2dbbd6ad93f3153f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a87a0320d47b3753baeb865c7ef9e024c65a96555fdd71c0f2a23c8c0b80d351\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ba49c3a694bc245ce18b54b74156e6e7aaccd4fcf707264a278f86f52943392\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T05:28:13Z\\\",\\\"message\\\":\\\"le observer\\\\nW1129 05:28:13.316789 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1129 05:28:13.316951 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1129 05:28:13.320123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1464513827/tls.crt::/tmp/serving-cert-1464513827/tls.key\\\\\\\"\\\\nI1129 05:28:13.540439 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1129 05:28:13.545990 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1129 05:28:13.546014 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1129 05:28:13.546050 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1129 05:28:13.546055 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1129 05:28:13.551788 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1129 05:28:13.551806 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 05:28:13.551809 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 05:28:13.551814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1129 05:28:13.551817 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1129 05:28:13.551820 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1129 05:28:13.551823 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1129 05:28:13.551915 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1129 05:28:13.553356 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e070e0ea87ae0cfa8fcc6aadd13e7ae3c2ab3eb1ea3daf6178c9d35796ba3ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:27:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:19Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.703606 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxrks\" (UniqueName: \"kubernetes.io/projected/77af4f9f-196d-436d-9fb6-69168bcb5f8f-kube-api-access-zxrks\") pod \"node-ca-26zjk\" (UID: \"77af4f9f-196d-436d-9fb6-69168bcb5f8f\") " pod="openshift-image-registry/node-ca-26zjk" Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.704309 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c84b578286ef0da9c863bedb4ebec224e34007c7f59cead26dac786d7e4b5e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1975ee2ce0af02676f9925944a31b218cb17cdf4dfb85c6edbfd124ac58fd22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:19Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.713618 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-87h4n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"080d2d67-a474-4974-94f3-81c5007e0a57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b58d5af901448b3e10a9a902daa31aecebe095bdae0e620c991323b99c52b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdnbj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-87h4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:19Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.758586 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.758633 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.758647 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.758672 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.758683 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:19Z","lastTransitionTime":"2025-11-29T05:28:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.861326 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.861369 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.861379 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.861400 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.861410 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:19Z","lastTransitionTime":"2025-11-29T05:28:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.946916 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-26zjk" Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.963276 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.963320 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.963332 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.963351 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:19 crc kubenswrapper[4594]: I1129 05:28:19.963367 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:19Z","lastTransitionTime":"2025-11-29T05:28:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:19 crc kubenswrapper[4594]: W1129 05:28:19.963761 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77af4f9f_196d_436d_9fb6_69168bcb5f8f.slice/crio-57679ee9923896eb3e782a6e8180314bf3150d3fe6863c7e9a68e234ff044f29 WatchSource:0}: Error finding container 57679ee9923896eb3e782a6e8180314bf3150d3fe6863c7e9a68e234ff044f29: Status 404 returned error can't find the container with id 57679ee9923896eb3e782a6e8180314bf3150d3fe6863c7e9a68e234ff044f29 Nov 29 05:28:20 crc kubenswrapper[4594]: I1129 05:28:20.066182 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:20 crc kubenswrapper[4594]: I1129 05:28:20.066217 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:20 crc kubenswrapper[4594]: I1129 05:28:20.066226 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:20 crc kubenswrapper[4594]: I1129 05:28:20.066240 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:20 crc kubenswrapper[4594]: I1129 05:28:20.066250 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:20Z","lastTransitionTime":"2025-11-29T05:28:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:20 crc kubenswrapper[4594]: I1129 05:28:20.082923 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 05:28:20 crc kubenswrapper[4594]: E1129 05:28:20.083114 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 05:28:20 crc kubenswrapper[4594]: I1129 05:28:20.083189 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 05:28:20 crc kubenswrapper[4594]: I1129 05:28:20.083298 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 05:28:20 crc kubenswrapper[4594]: E1129 05:28:20.083395 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 05:28:20 crc kubenswrapper[4594]: E1129 05:28:20.083526 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 05:28:20 crc kubenswrapper[4594]: I1129 05:28:20.168536 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:20 crc kubenswrapper[4594]: I1129 05:28:20.168579 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:20 crc kubenswrapper[4594]: I1129 05:28:20.168590 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:20 crc kubenswrapper[4594]: I1129 05:28:20.168609 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:20 crc kubenswrapper[4594]: I1129 05:28:20.168624 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:20Z","lastTransitionTime":"2025-11-29T05:28:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:20 crc kubenswrapper[4594]: I1129 05:28:20.214740 4594 generic.go:334] "Generic (PLEG): container finished" podID="ad2c26bf-c1c8-44b5-b4ed-d487072d358b" containerID="9fa32ebafd1f717a0d3ba44387b7b36827e9e12589812bdc33b1dfb84bd41772" exitCode=0 Nov 29 05:28:20 crc kubenswrapper[4594]: I1129 05:28:20.214815 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zwqcq" event={"ID":"ad2c26bf-c1c8-44b5-b4ed-d487072d358b","Type":"ContainerDied","Data":"9fa32ebafd1f717a0d3ba44387b7b36827e9e12589812bdc33b1dfb84bd41772"} Nov 29 05:28:20 crc kubenswrapper[4594]: I1129 05:28:20.217041 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-26zjk" event={"ID":"77af4f9f-196d-436d-9fb6-69168bcb5f8f","Type":"ContainerStarted","Data":"70467de0fada47ce38faab4cc04c54942aa5f7c0cacf29b1d21b64a11e6162c0"} Nov 29 05:28:20 crc kubenswrapper[4594]: I1129 05:28:20.217080 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-26zjk" event={"ID":"77af4f9f-196d-436d-9fb6-69168bcb5f8f","Type":"ContainerStarted","Data":"57679ee9923896eb3e782a6e8180314bf3150d3fe6863c7e9a68e234ff044f29"} Nov 29 05:28:20 crc kubenswrapper[4594]: I1129 05:28:20.228981 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://867c23355f828dc4cc15d632f90c52174e9b9be07f3ec24f05c904709620de6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-29T05:28:20Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:20 crc kubenswrapper[4594]: I1129 05:28:20.238724 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:20Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:20 crc kubenswrapper[4594]: I1129 05:28:20.249573 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7plzz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5052790-d231-4f97-802c-c7de3cd72561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c23b1472cd349ae6a56137165463429204ffbaf2c1b477d3ba9d07d2f2799f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qglsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7plzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:20Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:20 crc kubenswrapper[4594]: I1129 05:28:20.259617 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:20Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:20 crc kubenswrapper[4594]: I1129 05:28:20.270352 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:20 crc kubenswrapper[4594]: I1129 05:28:20.270390 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:20 crc kubenswrapper[4594]: I1129 05:28:20.270401 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:20 crc kubenswrapper[4594]: I1129 05:28:20.270416 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:20 crc kubenswrapper[4594]: I1129 05:28:20.270426 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:20Z","lastTransitionTime":"2025-11-29T05:28:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:20 crc kubenswrapper[4594]: I1129 05:28:20.274728 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08de5891-72ca-488c-80b3-6b54c8c1a66e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b86e524b600a831fe6b8bb71a513172f974b457d39363da90cd6152139eae898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b86e524b600a831fe6b8bb71a513172f974b457d39363da90cd6152139eae898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lp4zm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:20Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:20 crc kubenswrapper[4594]: I1129 05:28:20.285473 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zwqcq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad2c26bf-c1c8-44b5-b4ed-d487072d358b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9e8c27b9b7a4a13a1f6ae5461acd535cc6784f4158213697307e4061f1e222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b9e8c27b9b7a4a13a1f6ae5461acd535cc6784f4158213697307e4061f1e222\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa32ebafd1f717a0d3ba44387b7b36827e9e12589812bdc33b1dfb84bd41772\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fa32ebafd1f717a0d3ba44387b7b36827e9e12589812bdc33b1dfb84bd41772\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"re
ason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zwqcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:20Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:20 crc kubenswrapper[4594]: I1129 05:28:20.294928 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:20Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:20 crc kubenswrapper[4594]: I1129 05:28:20.302326 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-26zjk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77af4f9f-196d-436d-9fb6-69168bcb5f8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxrks\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-26zjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:20Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:20 crc kubenswrapper[4594]: I1129 05:28:20.316158 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba18db1b4391204c0fe395488d2726d914dc28c88b9cf449d83c9c181a7d04a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:20Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:20 crc kubenswrapper[4594]: I1129 05:28:20.326771 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"509844d4-3ee1-4059-84bb-6e90200f50c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5e396522be1cb7984f98fd416f9a6aeda1e6009e002e9c942a6cec0d4a4c22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4qlc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb25cb1d2d2a3aa84bc6fca3087cdf54b3b8e75bc258441b2895f37594df44e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4qlc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ggz4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:20Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:20 crc kubenswrapper[4594]: I1129 05:28:20.334867 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-87h4n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"080d2d67-a474-4974-94f3-81c5007e0a57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b58d5af901448b3e10a9a902daa31aecebe095bdae0e620c991323b99c52b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdnbj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-87h4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:20Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:20 crc kubenswrapper[4594]: I1129 05:28:20.344845 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1629901e-7133-4eb0-b634-5e458da9f205\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://583ca6d59501822022d5d15a450d4403592d248bce7be4b22b1c7baef4e93000\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b8d3be68869fef8b377ce1d84e3edf447496a867c40e88d013ef0d1c5629de2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://6655c8a5c56e4d2b3e8fbd88308d04ded2b41f7243bf90cb2dbbd6ad93f3153f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a87a0320d47b3753baeb865c7ef9e024c65a96555fdd71c0f2a23c8c0b80d351\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ba49c3a694bc245ce18b54b74156e6e7aaccd4fcf707264a278f86f52943392\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T05:28:13Z\\\",\\\"message\\\":\\\"le observer\\\\nW1129 05:28:13.316789 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1129 05:28:13.316951 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1129 05:28:13.320123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1464513827/tls.crt::/tmp/serving-cert-1464513827/tls.key\\\\\\\"\\\\nI1129 05:28:13.540439 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1129 05:28:13.545990 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1129 05:28:13.546014 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1129 05:28:13.546050 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1129 05:28:13.546055 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1129 05:28:13.551788 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1129 05:28:13.551806 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 05:28:13.551809 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 05:28:13.551814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1129 05:28:13.551817 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1129 05:28:13.551820 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1129 05:28:13.551823 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1129 05:28:13.551915 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1129 05:28:13.553356 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e070e0ea87ae0cfa8fcc6aadd13e7ae3c2ab3eb1ea3daf6178c9d35796ba3ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-29T05:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:27:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:20Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:20 crc kubenswrapper[4594]: I1129 05:28:20.354749 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c84b578286ef0da9c863bedb4ebec224e34007c7f59cead26dac786d7e4b5e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1975ee2ce0af02676f9925944a31b218cb17cdf4dfb85c6edbfd124ac58fd22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:20Z is after 2025-08-24T17:21:41Z" Nov 29 
05:28:20 crc kubenswrapper[4594]: I1129 05:28:20.364557 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba18db1b4391204c0fe395488d2726d914dc28c88b9cf449d83c9c181a7d04a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:20Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:20 crc kubenswrapper[4594]: I1129 05:28:20.372090 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"509844d4-3ee1-4059-84bb-6e90200f50c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5e396522be1cb7984f98fd416f9a6aeda1e6009e002e9c942a6cec0d4a4c22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube
-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4qlc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb25cb1d2d2a3aa84bc6fca3087cdf54b3b8e75bc258441b2895f37594df44e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4qlc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ggz4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-11-29T05:28:20Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:20 crc kubenswrapper[4594]: I1129 05:28:20.373606 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:20 crc kubenswrapper[4594]: I1129 05:28:20.373636 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:20 crc kubenswrapper[4594]: I1129 05:28:20.373646 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:20 crc kubenswrapper[4594]: I1129 05:28:20.373662 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:20 crc kubenswrapper[4594]: I1129 05:28:20.373673 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:20Z","lastTransitionTime":"2025-11-29T05:28:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:20 crc kubenswrapper[4594]: I1129 05:28:20.379887 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-26zjk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77af4f9f-196d-436d-9fb6-69168bcb5f8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70467de0fada47ce38faab4cc04c54942aa5f7c0cacf29b1d21b64a11e6162c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxrks\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-26zjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:20Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:20 crc kubenswrapper[4594]: I1129 05:28:20.391027 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1629901e-7133-4eb0-b634-5e458da9f205\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://583ca6d59501822022d5d15a450d4403592d248bce7be4b22b1c7baef4e93000\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b8d3be68869fef8b377ce1d84e3edf447496a867c40e88d013ef0d1c5629de2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://6655c8a5c56e4d2b3e8fbd88308d04ded2b41f7243bf90cb2dbbd6ad93f3153f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a87a0320d47b3753baeb865c7ef9e024c65a96555fdd71c0f2a23c8c0b80d351\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ba49c3a694bc245ce18b54b74156e6e7aaccd4fcf707264a278f86f52943392\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T05:28:13Z\\\",\\\"message\\\":\\\"le observer\\\\nW1129 05:28:13.316789 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1129 05:28:13.316951 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1129 05:28:13.320123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1464513827/tls.crt::/tmp/serving-cert-1464513827/tls.key\\\\\\\"\\\\nI1129 05:28:13.540439 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1129 05:28:13.545990 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1129 05:28:13.546014 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1129 05:28:13.546050 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1129 05:28:13.546055 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1129 05:28:13.551788 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1129 05:28:13.551806 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 05:28:13.551809 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 05:28:13.551814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1129 05:28:13.551817 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1129 05:28:13.551820 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1129 05:28:13.551823 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1129 05:28:13.551915 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1129 05:28:13.553356 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e070e0ea87ae0cfa8fcc6aadd13e7ae3c2ab3eb1ea3daf6178c9d35796ba3ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-29T05:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:27:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:20Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:20 crc kubenswrapper[4594]: I1129 05:28:20.400145 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c84b578286ef0da9c863bedb4ebec224e34007c7f59cead26dac786d7e4b5e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1975ee2ce0af02676f9925944a31b218cb17cdf4dfb85c6edbfd124ac58fd22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:20Z is after 2025-08-24T17:21:41Z" Nov 29 
05:28:20 crc kubenswrapper[4594]: I1129 05:28:20.428538 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-87h4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"080d2d67-a474-4974-94f3-81c5007e0a57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b58d5af901448b3e10a9a902daa31aecebe095bdae0e620c991323b99c52b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdnbj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\
\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-87h4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:20Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:20 crc kubenswrapper[4594]: I1129 05:28:20.468519 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://867c23355f828dc4cc15d632f90c52174e9b9be07f3ec24f05c904709620de6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:20Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:20 crc kubenswrapper[4594]: I1129 05:28:20.476453 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:20 crc kubenswrapper[4594]: I1129 05:28:20.476483 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:20 crc kubenswrapper[4594]: I1129 05:28:20.476496 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:20 crc kubenswrapper[4594]: I1129 05:28:20.476513 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:20 crc kubenswrapper[4594]: I1129 05:28:20.476526 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:20Z","lastTransitionTime":"2025-11-29T05:28:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:20 crc kubenswrapper[4594]: I1129 05:28:20.509175 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:20Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:20 crc kubenswrapper[4594]: I1129 05:28:20.549451 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7plzz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5052790-d231-4f97-802c-c7de3cd72561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c23b1472cd349ae6a56137165463429204ffbaf2c1b477d3ba9d07d2f2799f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qglsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7plzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:20Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:20 crc kubenswrapper[4594]: I1129 05:28:20.579401 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:20 crc 
kubenswrapper[4594]: I1129 05:28:20.579798 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:20 crc kubenswrapper[4594]: I1129 05:28:20.579814 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:20 crc kubenswrapper[4594]: I1129 05:28:20.579835 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:20 crc kubenswrapper[4594]: I1129 05:28:20.579848 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:20Z","lastTransitionTime":"2025-11-29T05:28:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:20 crc kubenswrapper[4594]: I1129 05:28:20.591608 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:20Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:20 crc kubenswrapper[4594]: I1129 05:28:20.643962 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:20Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:20 crc kubenswrapper[4594]: I1129 05:28:20.682327 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:20 crc kubenswrapper[4594]: I1129 05:28:20.682358 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:20 crc kubenswrapper[4594]: I1129 05:28:20.682368 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:20 crc kubenswrapper[4594]: I1129 05:28:20.682381 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:20 crc kubenswrapper[4594]: I1129 05:28:20.682390 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:20Z","lastTransitionTime":"2025-11-29T05:28:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:20 crc kubenswrapper[4594]: I1129 05:28:20.690283 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08de5891-72ca-488c-80b3-6b54c8c1a66e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b86e524b600a831fe6b8bb71a513172f974b457d39363da90cd6152139eae898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b86e524b600a831fe6b8bb71a513172f974b457d39363da90cd6152139eae898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lp4zm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:20Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:20 crc kubenswrapper[4594]: I1129 05:28:20.716580 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zwqcq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad2c26bf-c1c8-44b5-b4ed-d487072d358b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9e8c27b9b7a4a13a1f6ae5461acd535cc6784f4158213697307e4061f1e222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b9e8c27b9b7a4a13a1f6ae5461acd535cc6784f4158213697307e4061f1e222\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa32ebafd1f717a0d3ba44387b7b36827e9e12589812bdc33b1dfb84bd41772\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fa32ebafd1f717a0d3ba44387b7b36827e9e12589812bdc33b1dfb84bd41772\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"re
ason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zwqcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:20Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:20 crc kubenswrapper[4594]: I1129 05:28:20.785228 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:20 crc 
kubenswrapper[4594]: I1129 05:28:20.785285 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:20 crc kubenswrapper[4594]: I1129 05:28:20.785298 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:20 crc kubenswrapper[4594]: I1129 05:28:20.785319 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:20 crc kubenswrapper[4594]: I1129 05:28:20.785331 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:20Z","lastTransitionTime":"2025-11-29T05:28:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:20 crc kubenswrapper[4594]: I1129 05:28:20.820240 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 29 05:28:20 crc kubenswrapper[4594]: I1129 05:28:20.823897 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 29 05:28:20 crc kubenswrapper[4594]: I1129 05:28:20.829119 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Nov 29 05:28:20 crc kubenswrapper[4594]: I1129 05:28:20.831749 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba18db1b4391204c0fe395488d2726d914dc28c88b9cf449d83c9c181a7d04a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:20Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:20 crc kubenswrapper[4594]: I1129 05:28:20.843125 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"509844d4-3ee1-4059-84bb-6e90200f50c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5e396522be1cb7984f98fd416f9a6aeda1e6009e002e9c942a6cec0d4a4c22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4qlc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb25cb1d2d2a3aa84bc6fca3087cdf54b3b8e75bc258441b2895f37594df44e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4qlc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ggz4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:20Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:20 crc kubenswrapper[4594]: I1129 05:28:20.852351 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-26zjk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"77af4f9f-196d-436d-9fb6-69168bcb5f8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70467de0fada47ce38faab4cc04c54942aa5f7c0cacf29b1d21b64a11e6162c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxrks\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-26zjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:20Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:20 crc kubenswrapper[4594]: I1129 05:28:20.887375 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:20 crc kubenswrapper[4594]: I1129 05:28:20.887412 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:20 crc kubenswrapper[4594]: I1129 05:28:20.887421 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:20 crc kubenswrapper[4594]: I1129 05:28:20.887439 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:20 crc kubenswrapper[4594]: I1129 05:28:20.887454 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:20Z","lastTransitionTime":"2025-11-29T05:28:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:20 crc kubenswrapper[4594]: I1129 05:28:20.892165 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1629901e-7133-4eb0-b634-5e458da9f205\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://583ca6d59501822022d5d15a450d4403592d248bce7be4b22b1c7baef4e93000\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b8d3be68869fef8b377ce1d84e3edf447496a867c40e88d013ef0d1c5629de2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://6655c8a5c56e4d2b3e8fbd88308d04ded2b41f7243bf90cb2dbbd6ad93f3153f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a87a0320d47b3753baeb865c7ef9e024c65a96555fdd71c0f2a23c8c0b80d351\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ba49c3a694bc245ce18b54b74156e6e7aaccd4fcf707264a278f86f52943392\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T05:28:13Z\\\",\\\"message\\\":\\\"le observer\\\\nW1129 05:28:13.316789 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1129 05:28:13.316951 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1129 05:28:13.320123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1464513827/tls.crt::/tmp/serving-cert-1464513827/tls.key\\\\\\\"\\\\nI1129 05:28:13.540439 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1129 05:28:13.545990 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1129 05:28:13.546014 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1129 05:28:13.546050 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1129 05:28:13.546055 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1129 05:28:13.551788 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1129 05:28:13.551806 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 05:28:13.551809 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 05:28:13.551814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1129 05:28:13.551817 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1129 05:28:13.551820 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1129 05:28:13.551823 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1129 05:28:13.551915 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1129 05:28:13.553356 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e070e0ea87ae0cfa8fcc6aadd13e7ae3c2ab3eb1ea3daf6178c9d35796ba3ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-29T05:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:27:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:20Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:20 crc kubenswrapper[4594]: I1129 05:28:20.929119 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c84b578286ef0da9c863bedb4ebec224e34007c7f59cead26dac786d7e4b5e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1975ee2ce0af02676f9925944a31b218cb17cdf4dfb85c6edbfd124ac58fd22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:20Z is after 2025-08-24T17:21:41Z" Nov 29 
05:28:20 crc kubenswrapper[4594]: I1129 05:28:20.970312 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-87h4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"080d2d67-a474-4974-94f3-81c5007e0a57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b58d5af901448b3e10a9a902daa31aecebe095bdae0e620c991323b99c52b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdnbj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\
\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-87h4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:20Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:20 crc kubenswrapper[4594]: I1129 05:28:20.990067 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:20 crc kubenswrapper[4594]: I1129 05:28:20.990107 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:20 crc kubenswrapper[4594]: I1129 05:28:20.990119 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:20 crc kubenswrapper[4594]: I1129 05:28:20.990137 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:20 crc kubenswrapper[4594]: I1129 05:28:20.990152 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:20Z","lastTransitionTime":"2025-11-29T05:28:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:21 crc kubenswrapper[4594]: I1129 05:28:21.009583 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://867c23355f828dc4cc15d632f90c52174e9b9be07f3ec24f05c904709620de6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:21Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:21 crc kubenswrapper[4594]: I1129 05:28:21.050675 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:21Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:21 crc kubenswrapper[4594]: I1129 05:28:21.089549 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7plzz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5052790-d231-4f97-802c-c7de3cd72561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c23b1472cd349ae6a56137165463429204ffbaf2c1b477d3ba9d07d2f2799f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qglsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7plzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:21Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:21 crc kubenswrapper[4594]: I1129 05:28:21.092398 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:21 crc 
kubenswrapper[4594]: I1129 05:28:21.092442 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:21 crc kubenswrapper[4594]: I1129 05:28:21.092455 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:21 crc kubenswrapper[4594]: I1129 05:28:21.092474 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:21 crc kubenswrapper[4594]: I1129 05:28:21.092488 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:21Z","lastTransitionTime":"2025-11-29T05:28:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:21 crc kubenswrapper[4594]: I1129 05:28:21.130028 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:21Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:21 crc kubenswrapper[4594]: I1129 05:28:21.171105 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:21Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:21 crc kubenswrapper[4594]: I1129 05:28:21.194662 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:21 crc kubenswrapper[4594]: I1129 05:28:21.194699 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:21 crc kubenswrapper[4594]: I1129 05:28:21.194710 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:21 crc kubenswrapper[4594]: I1129 05:28:21.194723 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:21 crc kubenswrapper[4594]: I1129 05:28:21.194735 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:21Z","lastTransitionTime":"2025-11-29T05:28:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:21 crc kubenswrapper[4594]: I1129 05:28:21.213056 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08de5891-72ca-488c-80b3-6b54c8c1a66e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b86e524b600a831fe6b8bb71a513172f974b457d39363da90cd6152139eae898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b86e524b600a831fe6b8bb71a513172f974b457d39363da90cd6152139eae898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lp4zm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:21Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:21 crc kubenswrapper[4594]: I1129 05:28:21.222941 4594 generic.go:334] "Generic (PLEG): container finished" podID="ad2c26bf-c1c8-44b5-b4ed-d487072d358b" containerID="ec1085c6e05fe233979922ace4946592ed4a7afe00a0f1c3a256c41a0964d26a" exitCode=0 Nov 29 05:28:21 crc kubenswrapper[4594]: I1129 05:28:21.223040 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zwqcq" event={"ID":"ad2c26bf-c1c8-44b5-b4ed-d487072d358b","Type":"ContainerDied","Data":"ec1085c6e05fe233979922ace4946592ed4a7afe00a0f1c3a256c41a0964d26a"} Nov 29 05:28:21 crc kubenswrapper[4594]: I1129 05:28:21.232272 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" event={"ID":"08de5891-72ca-488c-80b3-6b54c8c1a66e","Type":"ContainerStarted","Data":"7cc8b34a52032d4716652fc9edcc35dd797309bf6886242671c068e65399d004"} Nov 29 05:28:21 crc kubenswrapper[4594]: I1129 05:28:21.252501 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zwqcq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad2c26bf-c1c8-44b5-b4ed-d487072d358b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9e8c27b9b7a4a13a1f6ae5461acd535cc6784f4158213697307e4061f1e222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b9e8c27b9b7a4a13a1f6ae5461acd535cc6784f4158213697307e4061f1e222\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa32ebafd1f717a0d3ba44387b7b36827e9e12589812bdc33b1dfb84bd41772\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fa32ebafd1f717a0d3ba44387b7b36827e9e12589812bdc33b1dfb84bd41772\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zwqcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:21Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:21 crc kubenswrapper[4594]: I1129 05:28:21.290271 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1629901e-7133-4eb0-b634-5e458da9f205\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://583ca6d59501822022d5d15a450d4403592d248bce7be4b22b1c7baef4e93000\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b8d3be68869fef8b377ce1d84e3edf447496a867c40e88d013ef0d1c5629de2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6655c8a5c56e4d2b3e8fbd88308d04ded2b41f7243bf90cb2dbbd6ad93f3153f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a87a0320d47b3753baeb865c7ef9e024c65a96555fdd71c0f2a23c8c0b80d351\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ba49c3a694bc245ce18b54b74156e6e7aaccd4fcf707264a278f86f52943392\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T05:28:13Z\\\",\\\"message\\\":\\\"le observer\\\\nW1129 05:28:13.316789 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1129 05:28:13.316951 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1129 05:28:13.320123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1464513827/tls.crt::/tmp/serving-cert-1464513827/tls.key\\\\\\\"\\\\nI1129 05:28:13.540439 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1129 05:28:13.545990 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1129 05:28:13.546014 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1129 05:28:13.546050 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1129 05:28:13.546055 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1129 05:28:13.551788 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1129 05:28:13.551806 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 05:28:13.551809 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 05:28:13.551814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1129 05:28:13.551817 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1129 05:28:13.551820 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1129 05:28:13.551823 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1129 05:28:13.551915 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1129 05:28:13.553356 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e070e0ea87ae0cfa8fcc6aadd13e7ae3c2ab3eb1ea3daf6178c9d35796ba3ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:27:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:21Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:21 crc kubenswrapper[4594]: I1129 05:28:21.296652 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:21 crc kubenswrapper[4594]: I1129 05:28:21.296690 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:21 crc kubenswrapper[4594]: I1129 05:28:21.296702 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:21 crc kubenswrapper[4594]: I1129 05:28:21.296726 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:21 crc kubenswrapper[4594]: I1129 05:28:21.296740 4594 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:21Z","lastTransitionTime":"2025-11-29T05:28:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:21 crc kubenswrapper[4594]: I1129 05:28:21.330363 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c84b578286ef0da9c863bedb4ebec224e34007c7f59cead26dac786d7e4b5e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1975ee2ce0af02676f9925944a31b218cb17cdf4dfb85c6edbfd124ac58fd22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:21Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:21 crc kubenswrapper[4594]: I1129 05:28:21.368308 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-87h4n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"080d2d67-a474-4974-94f3-81c5007e0a57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b58d5af901448b3e10a9a902daa31aecebe095bdae0e620c991323b99c52b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdnbj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-87h4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:21Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:21 crc kubenswrapper[4594]: I1129 05:28:21.399327 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:21 crc kubenswrapper[4594]: I1129 05:28:21.399348 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:21 crc kubenswrapper[4594]: I1129 05:28:21.399357 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:21 crc kubenswrapper[4594]: I1129 05:28:21.399374 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:21 crc kubenswrapper[4594]: I1129 05:28:21.399384 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:21Z","lastTransitionTime":"2025-11-29T05:28:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:21 crc kubenswrapper[4594]: I1129 05:28:21.411302 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://867c23355f828dc4cc15d632f90c52174e9b9be07f3ec24f05c904709620de6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:21Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:21 crc kubenswrapper[4594]: I1129 05:28:21.450558 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:21Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:21 crc kubenswrapper[4594]: I1129 05:28:21.491122 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7plzz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5052790-d231-4f97-802c-c7de3cd72561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c23b1472cd349ae6a56137165463429204ffbaf2c1b477d3ba9d07d2f2799f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qglsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7plzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:21Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:21 crc kubenswrapper[4594]: I1129 05:28:21.502601 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:21 crc 
kubenswrapper[4594]: I1129 05:28:21.502640 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:21 crc kubenswrapper[4594]: I1129 05:28:21.502651 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:21 crc kubenswrapper[4594]: I1129 05:28:21.502666 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:21 crc kubenswrapper[4594]: I1129 05:28:21.502680 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:21Z","lastTransitionTime":"2025-11-29T05:28:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:21 crc kubenswrapper[4594]: I1129 05:28:21.535310 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08de5891-72ca-488c-80b3-6b54c8c1a66e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller 
ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acc
ess-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\
\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b86e524b600a831fe6b8bb71a513172f974b457d39363da90cd6152139eae898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b86e524b600a831fe6b8bb71a513172f974b457d39363da90cd6152139eae898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lp4zm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:21Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:21 crc kubenswrapper[4594]: I1129 05:28:21.571642 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zwqcq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad2c26bf-c1c8-44b5-b4ed-d487072d358b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9e8c27b9b7a4a13a1f6ae5461acd535cc6784f4158213697307e4061f1e222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"s
tate\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b9e8c27b9b7a4a13a1f6ae5461acd535cc6784f4158213697307e4061f1e222\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa32ebafd1f717a0d3ba44387b7b36827e9e12589812bdc33b1dfb84bd41772\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fa32ebafd1f717a0d3ba44387b7b36827e9e12589812bdc33b1dfb84bd41772\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\
"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec1085c6e05fe233979922ace4946592ed4a7afe00a0f1c3a256c41a0964d26a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1085c6e05fe233979922ace4946592ed4a7afe00a0f1c3a256c41a0964d26a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"}
,{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zwqcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:21Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:21 crc kubenswrapper[4594]: I1129 05:28:21.604775 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:21 crc kubenswrapper[4594]: I1129 05:28:21.604812 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:21 crc kubenswrapper[4594]: I1129 05:28:21.604823 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:21 crc kubenswrapper[4594]: I1129 05:28:21.604839 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:21 crc kubenswrapper[4594]: I1129 05:28:21.604851 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:21Z","lastTransitionTime":"2025-11-29T05:28:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:21 crc kubenswrapper[4594]: I1129 05:28:21.609547 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:21Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:21 crc kubenswrapper[4594]: I1129 05:28:21.649743 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:21Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:21 crc kubenswrapper[4594]: I1129 05:28:21.688694 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3be916e6-7f0f-45ae-8fe5-85d2c7e459f0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf23cd4a300a49fabdf8fcaacfda760563c8e53cf70de7a4c07be717fd76b660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a960cfe07ae731cc2d07957dba63be2897328d662dd2672bd61344c8baac0228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b416f8ad36a09f58e68b1e23784b344534ccfb013a3272c0754602431a3a3737\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://261ab2af67dd877cead51f204629b59233865717d4805c3d4d80c1a5d51e2021\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:27:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:21Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:21 crc kubenswrapper[4594]: I1129 05:28:21.707482 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:21 crc kubenswrapper[4594]: I1129 05:28:21.707518 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:21 crc kubenswrapper[4594]: I1129 05:28:21.707529 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:21 crc kubenswrapper[4594]: I1129 05:28:21.707551 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:21 crc kubenswrapper[4594]: I1129 05:28:21.707564 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:21Z","lastTransitionTime":"2025-11-29T05:28:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:21 crc kubenswrapper[4594]: I1129 05:28:21.730023 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba18db1b4391204c0fe395488d2726d914dc28c88b9cf449d83c9c181a7d04a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:21Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:21 crc kubenswrapper[4594]: I1129 05:28:21.769065 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"509844d4-3ee1-4059-84bb-6e90200f50c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5e396522be1cb7984f98fd416f9a6aeda1e6009e002e9c942a6cec0d4a4c22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\
\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4qlc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb25cb1d2d2a3aa84bc6fca3087cdf54b3b8e75bc258441b2895f37594df44e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4qlc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ggz4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-11-29T05:28:21Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:21 crc kubenswrapper[4594]: I1129 05:28:21.802657 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 05:28:21 crc kubenswrapper[4594]: I1129 05:28:21.802764 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 05:28:21 crc kubenswrapper[4594]: I1129 05:28:21.802793 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 05:28:21 crc kubenswrapper[4594]: E1129 05:28:21.802855 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 05:28:29.802826077 +0000 UTC m=+34.043335316 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 05:28:21 crc kubenswrapper[4594]: E1129 05:28:21.802929 4594 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 29 05:28:21 crc kubenswrapper[4594]: E1129 05:28:21.802941 4594 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 29 05:28:21 crc kubenswrapper[4594]: E1129 05:28:21.803012 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-29 05:28:29.802996457 +0000 UTC m=+34.043505676 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 29 05:28:21 crc kubenswrapper[4594]: E1129 05:28:21.803043 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-11-29 05:28:29.803032704 +0000 UTC m=+34.043541924 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 29 05:28:21 crc kubenswrapper[4594]: I1129 05:28:21.808662 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-26zjk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77af4f9f-196d-436d-9fb6-69168bcb5f8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70467de0fada47ce38faab4cc04c54942aa5f7c0cacf29b1d21b64a11e6162c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc08
6a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxrks\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-26zjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:21Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:21 crc kubenswrapper[4594]: I1129 05:28:21.809916 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:21 crc kubenswrapper[4594]: I1129 05:28:21.809948 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:21 crc kubenswrapper[4594]: I1129 05:28:21.809962 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:21 crc kubenswrapper[4594]: I1129 05:28:21.809976 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:21 crc kubenswrapper[4594]: I1129 05:28:21.809998 4594 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:21Z","lastTransitionTime":"2025-11-29T05:28:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:21 crc kubenswrapper[4594]: I1129 05:28:21.903584 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 05:28:21 crc kubenswrapper[4594]: I1129 05:28:21.903634 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 05:28:21 crc kubenswrapper[4594]: E1129 05:28:21.903753 4594 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 29 05:28:21 crc kubenswrapper[4594]: E1129 05:28:21.903766 4594 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 29 05:28:21 crc kubenswrapper[4594]: E1129 05:28:21.903774 4594 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 29 05:28:21 crc 
kubenswrapper[4594]: E1129 05:28:21.903780 4594 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 29 05:28:21 crc kubenswrapper[4594]: E1129 05:28:21.903788 4594 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 29 05:28:21 crc kubenswrapper[4594]: E1129 05:28:21.903791 4594 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 29 05:28:21 crc kubenswrapper[4594]: E1129 05:28:21.903839 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-29 05:28:29.903826475 +0000 UTC m=+34.144335695 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 29 05:28:21 crc kubenswrapper[4594]: E1129 05:28:21.903856 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-29 05:28:29.903851182 +0000 UTC m=+34.144360402 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 29 05:28:21 crc kubenswrapper[4594]: I1129 05:28:21.911975 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:21 crc kubenswrapper[4594]: I1129 05:28:21.912017 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:21 crc kubenswrapper[4594]: I1129 05:28:21.912029 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:21 crc kubenswrapper[4594]: I1129 05:28:21.912045 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:21 crc kubenswrapper[4594]: I1129 05:28:21.912058 4594 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:21Z","lastTransitionTime":"2025-11-29T05:28:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:22 crc kubenswrapper[4594]: I1129 05:28:22.014205 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:22 crc kubenswrapper[4594]: I1129 05:28:22.014246 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:22 crc kubenswrapper[4594]: I1129 05:28:22.014272 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:22 crc kubenswrapper[4594]: I1129 05:28:22.014288 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:22 crc kubenswrapper[4594]: I1129 05:28:22.014299 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:22Z","lastTransitionTime":"2025-11-29T05:28:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:22 crc kubenswrapper[4594]: I1129 05:28:22.083160 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 05:28:22 crc kubenswrapper[4594]: I1129 05:28:22.083175 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 05:28:22 crc kubenswrapper[4594]: I1129 05:28:22.083400 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 05:28:22 crc kubenswrapper[4594]: E1129 05:28:22.083360 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 05:28:22 crc kubenswrapper[4594]: E1129 05:28:22.083483 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 05:28:22 crc kubenswrapper[4594]: E1129 05:28:22.083708 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 05:28:22 crc kubenswrapper[4594]: I1129 05:28:22.116647 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:22 crc kubenswrapper[4594]: I1129 05:28:22.116695 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:22 crc kubenswrapper[4594]: I1129 05:28:22.116708 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:22 crc kubenswrapper[4594]: I1129 05:28:22.116721 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:22 crc kubenswrapper[4594]: I1129 05:28:22.116731 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:22Z","lastTransitionTime":"2025-11-29T05:28:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:22 crc kubenswrapper[4594]: I1129 05:28:22.219900 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:22 crc kubenswrapper[4594]: I1129 05:28:22.219946 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:22 crc kubenswrapper[4594]: I1129 05:28:22.219958 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:22 crc kubenswrapper[4594]: I1129 05:28:22.219990 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:22 crc kubenswrapper[4594]: I1129 05:28:22.220004 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:22Z","lastTransitionTime":"2025-11-29T05:28:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:22 crc kubenswrapper[4594]: I1129 05:28:22.237461 4594 generic.go:334] "Generic (PLEG): container finished" podID="ad2c26bf-c1c8-44b5-b4ed-d487072d358b" containerID="819e8c7a8ea3f0f7a082da904dd9cc01b5d7e7e6520cafe67920a190063947ed" exitCode=0 Nov 29 05:28:22 crc kubenswrapper[4594]: I1129 05:28:22.237508 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zwqcq" event={"ID":"ad2c26bf-c1c8-44b5-b4ed-d487072d358b","Type":"ContainerDied","Data":"819e8c7a8ea3f0f7a082da904dd9cc01b5d7e7e6520cafe67920a190063947ed"} Nov 29 05:28:22 crc kubenswrapper[4594]: I1129 05:28:22.254299 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c84b578286ef0da9c863bedb4ebec224e34007c7f59cead26dac786d7e4b5e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\
\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1975ee2ce0af02676f9925944a31b218cb17cdf4dfb85c6edbfd124ac58fd22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:22Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:22 crc kubenswrapper[4594]: I1129 05:28:22.262790 4594 
status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-87h4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"080d2d67-a474-4974-94f3-81c5007e0a57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b58d5af901448b3e10a9a902daa31aecebe095bdae0e620c991323b99c52b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdnbj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"
hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-87h4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:22Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:22 crc kubenswrapper[4594]: I1129 05:28:22.272999 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1629901e-7133-4eb0-b634-5e458da9f205\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://583ca6d59501822022d5d15a450d4403592d248bce7be4b22b1c7baef4e93000\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b8d3be68869fef8b377ce1d84e3edf447496a867c40e88d013ef0d1c5629de2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://6655c8a5c56e4d2b3e8fbd88308d04ded2b41f7243bf90cb2dbbd6ad93f3153f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a87a0320d47b3753baeb865c7ef9e024c65a96555fdd71c0f2a23c8c0b80d351\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ba49c3a694bc245ce18b54b74156e6e7aaccd4fcf707264a278f86f52943392\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T05:28:13Z\\\",\\\"message\\\":\\\"le observer\\\\nW1129 05:28:13.316789 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1129 05:28:13.316951 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1129 05:28:13.320123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1464513827/tls.crt::/tmp/serving-cert-1464513827/tls.key\\\\\\\"\\\\nI1129 05:28:13.540439 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1129 05:28:13.545990 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1129 05:28:13.546014 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1129 05:28:13.546050 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1129 05:28:13.546055 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1129 05:28:13.551788 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1129 05:28:13.551806 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 05:28:13.551809 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 05:28:13.551814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1129 05:28:13.551817 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1129 05:28:13.551820 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1129 05:28:13.551823 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1129 05:28:13.551915 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1129 05:28:13.553356 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e070e0ea87ae0cfa8fcc6aadd13e7ae3c2ab3eb1ea3daf6178c9d35796ba3ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-29T05:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:27:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:22Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:22 crc kubenswrapper[4594]: I1129 05:28:22.283270 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7plzz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5052790-d231-4f97-802c-c7de3cd72561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c23b1472cd349ae6a56137165463429204ffbaf2c1b477d3ba9d07d2f2799f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qglsg\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7plzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:22Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:22 crc kubenswrapper[4594]: I1129 05:28:22.292140 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://867c23355f828dc4cc15d632f90c52174e9b9be07f3ec24f05c904709620de6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:22Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:22 crc kubenswrapper[4594]: I1129 05:28:22.301835 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:22Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:22 crc kubenswrapper[4594]: I1129 05:28:22.312126 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:22Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:22 crc kubenswrapper[4594]: I1129 05:28:22.321122 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:22Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:22 crc kubenswrapper[4594]: I1129 05:28:22.322732 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:22 crc kubenswrapper[4594]: I1129 05:28:22.322777 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:22 crc kubenswrapper[4594]: I1129 05:28:22.322792 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:22 crc 
kubenswrapper[4594]: I1129 05:28:22.322809 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:22 crc kubenswrapper[4594]: I1129 05:28:22.322821 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:22Z","lastTransitionTime":"2025-11-29T05:28:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:22 crc kubenswrapper[4594]: I1129 05:28:22.337373 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08de5891-72ca-488c-80b3-6b54c8c1a66e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919
d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCou
nt\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-con
troller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/va
r/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b86e524b600a831fe6b8bb71a513172f974b457d39363da90cd6152139eae898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b86e524b600a831fe6b8bb71a513172f974b457d39363da90cd6152139eae898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lp4zm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:22Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:22 crc kubenswrapper[4594]: I1129 05:28:22.350053 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zwqcq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad2c26bf-c1c8-44b5-b4ed-d487072d358b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9e8c27b9b7a4a13a1f6ae5461acd535cc6784f4158213697307e4061f1e222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b9e8c27b9b7a4a13a1f6ae5461acd535cc6784f4158213697307e4061f1e222\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa32ebafd1f717a0d3ba44387b7b36827e9e12589812bdc33b1dfb84bd41772\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fa32ebafd1f717a0d3ba44387b7b36827e9e12589812bdc33b1dfb84bd41772\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec1085c6e05fe233979922ace4946592ed4a7
afe00a0f1c3a256c41a0964d26a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1085c6e05fe233979922ace4946592ed4a7afe00a0f1c3a256c41a0964d26a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://819e8c7a8ea3f0f7a082da904dd9cc01b5d7e7e6520cafe67920a190063947ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://819e8c7a8ea3f0f7a082da904dd9cc01b5d7e7e6520cafe67920a190063947ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-1
1-29T05:28:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zwqcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:22Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:22 crc kubenswrapper[4594]: I1129 05:28:22.359278 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"509844d4-3ee1-4059-84bb-6e90200f50c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5e396522be1cb7984f98fd416f9a6aeda1e6009e002e9c942a6cec0d4a4c22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4qlc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb25cb1d2d2a3aa84bc6fca3087cdf54b3b8e75bc258441b2895f37594df44e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4qlc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-ggz4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:22Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:22 crc kubenswrapper[4594]: I1129 05:28:22.367773 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-26zjk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77af4f9f-196d-436d-9fb6-69168bcb5f8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70467de0fada47ce38faab4cc04c54942aa5f7c0cacf29b1d21b64a11e6162c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxrks\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-26zjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:22Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:22 crc kubenswrapper[4594]: I1129 05:28:22.378019 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3be916e6-7f0f-45ae-8fe5-85d2c7e459f0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf23cd4a300a49fabdf8fcaacfda760563c8e53cf70de7a4c07be717fd76b660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a960cfe07ae731cc2d07957dba63be2897328d662dd2672bd61344c8baac0228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b416f8ad36a09f58e68b1e23784b344534ccfb013a3272c0754602431a3a3737\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://261ab2af67dd877cead51f204629b59233865717d4805c3d4d80c1a5d51e2021\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:27:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:22Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:22 crc kubenswrapper[4594]: I1129 05:28:22.388317 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba18db1b4391204c0fe395488d2726d914dc28c88b9cf449d83c9c181a7d04a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:22Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:22 crc kubenswrapper[4594]: I1129 05:28:22.425141 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:22 crc kubenswrapper[4594]: I1129 05:28:22.425180 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:22 crc kubenswrapper[4594]: I1129 05:28:22.425192 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:22 crc kubenswrapper[4594]: I1129 05:28:22.425213 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:22 crc kubenswrapper[4594]: I1129 05:28:22.425228 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:22Z","lastTransitionTime":"2025-11-29T05:28:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:22 crc kubenswrapper[4594]: I1129 05:28:22.527745 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:22 crc kubenswrapper[4594]: I1129 05:28:22.527778 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:22 crc kubenswrapper[4594]: I1129 05:28:22.527788 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:22 crc kubenswrapper[4594]: I1129 05:28:22.527804 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:22 crc kubenswrapper[4594]: I1129 05:28:22.527814 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:22Z","lastTransitionTime":"2025-11-29T05:28:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:22 crc kubenswrapper[4594]: I1129 05:28:22.630415 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:22 crc kubenswrapper[4594]: I1129 05:28:22.630454 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:22 crc kubenswrapper[4594]: I1129 05:28:22.630464 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:22 crc kubenswrapper[4594]: I1129 05:28:22.630484 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:22 crc kubenswrapper[4594]: I1129 05:28:22.630498 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:22Z","lastTransitionTime":"2025-11-29T05:28:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:22 crc kubenswrapper[4594]: I1129 05:28:22.732340 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:22 crc kubenswrapper[4594]: I1129 05:28:22.732366 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:22 crc kubenswrapper[4594]: I1129 05:28:22.732376 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:22 crc kubenswrapper[4594]: I1129 05:28:22.732396 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:22 crc kubenswrapper[4594]: I1129 05:28:22.732408 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:22Z","lastTransitionTime":"2025-11-29T05:28:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:22 crc kubenswrapper[4594]: I1129 05:28:22.835656 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:22 crc kubenswrapper[4594]: I1129 05:28:22.835895 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:22 crc kubenswrapper[4594]: I1129 05:28:22.835910 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:22 crc kubenswrapper[4594]: I1129 05:28:22.835931 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:22 crc kubenswrapper[4594]: I1129 05:28:22.835942 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:22Z","lastTransitionTime":"2025-11-29T05:28:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:22 crc kubenswrapper[4594]: I1129 05:28:22.938936 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:22 crc kubenswrapper[4594]: I1129 05:28:22.939013 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:22 crc kubenswrapper[4594]: I1129 05:28:22.939029 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:22 crc kubenswrapper[4594]: I1129 05:28:22.939053 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:22 crc kubenswrapper[4594]: I1129 05:28:22.939069 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:22Z","lastTransitionTime":"2025-11-29T05:28:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:23 crc kubenswrapper[4594]: I1129 05:28:23.040892 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:23 crc kubenswrapper[4594]: I1129 05:28:23.040928 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:23 crc kubenswrapper[4594]: I1129 05:28:23.040938 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:23 crc kubenswrapper[4594]: I1129 05:28:23.040954 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:23 crc kubenswrapper[4594]: I1129 05:28:23.040968 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:23Z","lastTransitionTime":"2025-11-29T05:28:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:23 crc kubenswrapper[4594]: I1129 05:28:23.143864 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:23 crc kubenswrapper[4594]: I1129 05:28:23.143893 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:23 crc kubenswrapper[4594]: I1129 05:28:23.143905 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:23 crc kubenswrapper[4594]: I1129 05:28:23.143917 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:23 crc kubenswrapper[4594]: I1129 05:28:23.143928 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:23Z","lastTransitionTime":"2025-11-29T05:28:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:23 crc kubenswrapper[4594]: I1129 05:28:23.244504 4594 generic.go:334] "Generic (PLEG): container finished" podID="ad2c26bf-c1c8-44b5-b4ed-d487072d358b" containerID="1602d1ea5e426042c520a4f98ed23530fc8f94f67f242a639c32a29817495d73" exitCode=0 Nov 29 05:28:23 crc kubenswrapper[4594]: I1129 05:28:23.244615 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zwqcq" event={"ID":"ad2c26bf-c1c8-44b5-b4ed-d487072d358b","Type":"ContainerDied","Data":"1602d1ea5e426042c520a4f98ed23530fc8f94f67f242a639c32a29817495d73"} Nov 29 05:28:23 crc kubenswrapper[4594]: I1129 05:28:23.245694 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:23 crc kubenswrapper[4594]: I1129 05:28:23.245724 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:23 crc kubenswrapper[4594]: I1129 05:28:23.245736 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:23 crc kubenswrapper[4594]: I1129 05:28:23.245751 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:23 crc kubenswrapper[4594]: I1129 05:28:23.245763 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:23Z","lastTransitionTime":"2025-11-29T05:28:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:23 crc kubenswrapper[4594]: I1129 05:28:23.251709 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" event={"ID":"08de5891-72ca-488c-80b3-6b54c8c1a66e","Type":"ContainerStarted","Data":"5755b8e9afa4a0d89a3361dc97a76c5baede9605f0d44ad078cd9005135e1590"} Nov 29 05:28:23 crc kubenswrapper[4594]: I1129 05:28:23.252004 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" Nov 29 05:28:23 crc kubenswrapper[4594]: I1129 05:28:23.252032 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" Nov 29 05:28:23 crc kubenswrapper[4594]: I1129 05:28:23.255939 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:23Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:23 crc kubenswrapper[4594]: I1129 05:28:23.265301 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:23Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:23 crc kubenswrapper[4594]: I1129 05:28:23.275572 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" Nov 29 05:28:23 crc kubenswrapper[4594]: I1129 05:28:23.279982 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"08de5891-72ca-488c-80b3-6b54c8c1a66e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b86e524b600a831fe6b8bb71a513172f974b457d39363da90cd6152139eae898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b86e524b600a831fe6b8bb71a513172f974b457d39363da90cd6152139eae898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lp4zm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:23Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:23 crc kubenswrapper[4594]: I1129 05:28:23.293227 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zwqcq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad2c26bf-c1c8-44b5-b4ed-d487072d358b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9e8c27b9b7a4a13a1f6ae5461acd535cc6784f4158213697307e4061f1e222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b9e8c27b9b7a4a13a1f6ae5461acd535cc6784f4158213697307e4061f1e222\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa32ebafd1f717a0d3ba44387b7b36827e9e12589812bdc33b1dfb84bd41772\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fa32ebafd1f717a0d3ba44387b7b36827e9e12589812bdc33b1dfb84bd41772\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec1085c6e05fe233979922ace4946592ed4a7
afe00a0f1c3a256c41a0964d26a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1085c6e05fe233979922ace4946592ed4a7afe00a0f1c3a256c41a0964d26a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://819e8c7a8ea3f0f7a082da904dd9cc01b5d7e7e6520cafe67920a190063947ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://819e8c7a8ea3f0f7a082da904dd9cc01b5d7e7e6520cafe67920a190063947ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-1
1-29T05:28:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1602d1ea5e426042c520a4f98ed23530fc8f94f67f242a639c32a29817495d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1602d1ea5e426042c520a4f98ed23530fc8f94f67f242a639c32a29817495d73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zwqcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:23Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:23 crc kubenswrapper[4594]: I1129 05:28:23.307310 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3be916e6-7f0f-45ae-8fe5-85d2c7e459f0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf23cd4a300a49fabdf8fcaacfda760563c8e53cf70de7a4c07be717fd76b660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a960cfe07ae731cc2d07957dba63be2897328d662dd2672bd61344c8baac0228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b416f8ad36a09f58e68b1e23784b344534ccfb013a3272c0754602431a3a3737\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://261ab2af67dd877cead51f204629b59233865717d4805c3d4d80c1a5d51e2021\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:27:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:23Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:23 crc kubenswrapper[4594]: I1129 05:28:23.317181 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba18db1b4391204c0fe395488d2726d914dc28c88b9cf449d83c9c181a7d04a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:23Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:23 crc kubenswrapper[4594]: I1129 05:28:23.325556 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"509844d4-3ee1-4059-84bb-6e90200f50c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5e396522be1cb7984f98fd416f9a6aeda1e6009e002e9c942a6cec0d4a4c22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4qlc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb25cb1d2d2a3aa84bc6fca3087cdf54b3b8e75bc258441b2895f37594df44e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4qlc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ggz4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:23Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:23 crc kubenswrapper[4594]: I1129 05:28:23.332771 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-26zjk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"77af4f9f-196d-436d-9fb6-69168bcb5f8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70467de0fada47ce38faab4cc04c54942aa5f7c0cacf29b1d21b64a11e6162c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxrks\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-26zjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:23Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:23 crc kubenswrapper[4594]: I1129 05:28:23.343889 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1629901e-7133-4eb0-b634-5e458da9f205\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://583ca6d59501822022d5d15a450d4403592d248bce7be4b22b1c7baef4e93000\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b8d3be68869fef8b377ce1d84e3edf447496a867c40e88d013ef0d1c5629de2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://6655c8a5c56e4d2b3e8fbd88308d04ded2b41f7243bf90cb2dbbd6ad93f3153f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a87a0320d47b3753baeb865c7ef9e024c65a96555fdd71c0f2a23c8c0b80d351\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ba49c3a694bc245ce18b54b74156e6e7aaccd4fcf707264a278f86f52943392\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T05:28:13Z\\\",\\\"message\\\":\\\"le observer\\\\nW1129 05:28:13.316789 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1129 05:28:13.316951 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1129 05:28:13.320123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1464513827/tls.crt::/tmp/serving-cert-1464513827/tls.key\\\\\\\"\\\\nI1129 05:28:13.540439 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1129 05:28:13.545990 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1129 05:28:13.546014 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1129 05:28:13.546050 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1129 05:28:13.546055 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1129 05:28:13.551788 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1129 05:28:13.551806 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 05:28:13.551809 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 05:28:13.551814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1129 05:28:13.551817 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1129 05:28:13.551820 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1129 05:28:13.551823 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1129 05:28:13.551915 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1129 05:28:13.553356 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e070e0ea87ae0cfa8fcc6aadd13e7ae3c2ab3eb1ea3daf6178c9d35796ba3ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-29T05:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:27:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:23Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:23 crc kubenswrapper[4594]: I1129 05:28:23.348424 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:23 crc kubenswrapper[4594]: I1129 05:28:23.348460 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:23 crc kubenswrapper[4594]: I1129 05:28:23.348474 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:23 crc kubenswrapper[4594]: I1129 05:28:23.348494 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:23 crc kubenswrapper[4594]: I1129 05:28:23.348506 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:23Z","lastTransitionTime":"2025-11-29T05:28:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:23 crc kubenswrapper[4594]: I1129 05:28:23.355557 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c84b578286ef0da9c863bedb4ebec224e34007c7f59cead26dac786d7e4b5e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1975ee2ce0af02676f9925944a31b218cb17cdf4dfb85c6edbfd124ac58fd22\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:23Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:23 crc kubenswrapper[4594]: I1129 05:28:23.363494 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-87h4n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"080d2d67-a474-4974-94f3-81c5007e0a57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b58d5af901448b3e10a9a902daa31aecebe095bdae0e620c991323b99c52b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdnbj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-87h4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:23Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:23 crc kubenswrapper[4594]: I1129 05:28:23.373455 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://867c23355f828dc4cc15d632f90c52174e9b9be07f3ec24f05c904709620de6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-al
erter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:23Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:23 crc kubenswrapper[4594]: I1129 05:28:23.384736 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:23Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:23 crc kubenswrapper[4594]: I1129 05:28:23.394645 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7plzz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5052790-d231-4f97-802c-c7de3cd72561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c23b1472cd349ae6a56137165463429204ffbaf2c1b477d3ba9d07d2f2799f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qglsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7plzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:23Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:23 crc kubenswrapper[4594]: I1129 05:28:23.404378 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://867c23355f828dc4cc15d632f90c52174e9b9be07f3ec24f05c904709620de6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-11-29T05:28:23Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:23 crc kubenswrapper[4594]: I1129 05:28:23.414269 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:23Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:23 crc kubenswrapper[4594]: I1129 05:28:23.424034 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7plzz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5052790-d231-4f97-802c-c7de3cd72561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c23b1472cd349ae6a56137165463429204ffbaf2c1b477d3ba9d07d2f2799f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qglsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7plzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:23Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:23 crc kubenswrapper[4594]: I1129 05:28:23.437071 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08de5891-72ca-488c-80b3-6b54c8c1a66e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9f96824f977ec455622772e91ed8e6534dc93a71868e2b5e2aa9302ff9be135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cf2a15e6478f9f461a9927f570db4a81a8d752c8be46c7970b4956ae2031a38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a67e6b494618e268d6f5d75507c1cfe0e8e99c0e3cfff7548756a64f4f3a3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77543b04b619c48bf800a3d9dda1a969e4e32aa113a49287717a9e83f9cff96f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78fd1c6bdf08a557f0c2223c538a946709b9a225032c92ea75277140441a59c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db25dbee0954a026ecc2945dda40eb48c9caf78d31c44013055e9765024794d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5755b8e9afa4a0d89a3361dc97a76c5baede9605f0d44ad078cd9005135e1590\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cc8b34a52032d4716652fc9edcc35dd797309bf6886242671c068e65399d004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b86e524b600a831fe6b8bb71a513172f974b457d39363da90cd6152139eae898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b86e524b600a831fe6b8bb71a513172f974b457d39363da90cd6152139eae898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lp4zm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:23Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:23 crc kubenswrapper[4594]: I1129 05:28:23.446590 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zwqcq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad2c26bf-c1c8-44b5-b4ed-d487072d358b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9e8c27b9b7a4a13a1f6ae5461acd535cc6784f4158213697307e4061f1e222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b9e8c27b9b7a4a13a1f6ae5461acd535cc6784f4158213697307e4061f1e222\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa32ebafd1f717a0d3ba44387b7b36827e9e12589812bdc33b1dfb84bd41772\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fa32ebafd1f717a0d3ba44387b7b36827e9e12589812bdc33b1dfb84bd41772\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec1085c6e05fe233979922ace4946592ed4a7afe00a0f1c3a256c41a0964d26a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1085c6e05fe233979922ace4946592ed4a7afe00a0f1c3a256c41a0964d26a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://819e8c7a8ea3f0f7a082da904dd9cc01b5d7e7e6520cafe67920a190063947ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://819e8c7a8ea3f0f7a082da904dd9cc01b5d7e7e6520cafe67920a190063947ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1602d1ea5e426042c520a4f98ed23530fc8f94f67f242a639c32a29817495d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1602d1ea5e426042c520a4f98ed23530fc8f94f67f242a639c32a29817495d73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zwqcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:23Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:23 crc kubenswrapper[4594]: I1129 05:28:23.450300 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:23 crc kubenswrapper[4594]: I1129 05:28:23.450328 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:23 crc kubenswrapper[4594]: I1129 05:28:23.450337 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:23 crc kubenswrapper[4594]: I1129 05:28:23.450351 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:23 crc kubenswrapper[4594]: I1129 05:28:23.450362 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:23Z","lastTransitionTime":"2025-11-29T05:28:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:23 crc kubenswrapper[4594]: I1129 05:28:23.455146 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:23Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:23 crc kubenswrapper[4594]: I1129 05:28:23.464236 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:23Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:23 crc kubenswrapper[4594]: I1129 05:28:23.472915 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3be916e6-7f0f-45ae-8fe5-85d2c7e459f0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf23cd4a300a49fabdf8fcaacfda760563c8e53cf70de7a4c07be717fd76b660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a960cfe07ae731cc2d07957dba63be2897328d662dd2672bd61344c8baac0228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b416f8ad36a09f58e68b1e23784b344534ccfb013a3272c0754602431a3a3737\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://261ab2af67dd877cead51f204629b59233865717d4805c3d4d80c1a5d51e2021\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:27:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:23Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:23 crc kubenswrapper[4594]: I1129 05:28:23.482998 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba18db1b4391204c0fe395488d2726d914dc28c88b9cf449d83c9c181a7d04a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:23Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:23 crc kubenswrapper[4594]: I1129 05:28:23.490982 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"509844d4-3ee1-4059-84bb-6e90200f50c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5e396522be1cb7984f98fd416f9a6aeda1e6009e002e9c942a6cec0d4a4c22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4qlc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb25cb1d2d2a3aa84bc6fca3087cdf54b3b8e75bc258441b2895f37594df44e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4qlc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ggz4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:23Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:23 crc kubenswrapper[4594]: I1129 05:28:23.498089 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-26zjk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"77af4f9f-196d-436d-9fb6-69168bcb5f8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70467de0fada47ce38faab4cc04c54942aa5f7c0cacf29b1d21b64a11e6162c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxrks\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-26zjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:23Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:23 crc kubenswrapper[4594]: I1129 05:28:23.506967 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1629901e-7133-4eb0-b634-5e458da9f205\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://583ca6d59501822022d5d15a450d4403592d248bce7be4b22b1c7baef4e93000\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b8d3be68869fef8b377ce1d84e3edf447496a867c40e88d013ef0d1c5629de2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://6655c8a5c56e4d2b3e8fbd88308d04ded2b41f7243bf90cb2dbbd6ad93f3153f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a87a0320d47b3753baeb865c7ef9e024c65a96555fdd71c0f2a23c8c0b80d351\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ba49c3a694bc245ce18b54b74156e6e7aaccd4fcf707264a278f86f52943392\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T05:28:13Z\\\",\\\"message\\\":\\\"le observer\\\\nW1129 05:28:13.316789 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1129 05:28:13.316951 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1129 05:28:13.320123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1464513827/tls.crt::/tmp/serving-cert-1464513827/tls.key\\\\\\\"\\\\nI1129 05:28:13.540439 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1129 05:28:13.545990 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1129 05:28:13.546014 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1129 05:28:13.546050 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1129 05:28:13.546055 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1129 05:28:13.551788 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1129 05:28:13.551806 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 05:28:13.551809 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 05:28:13.551814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1129 05:28:13.551817 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1129 05:28:13.551820 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1129 05:28:13.551823 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1129 05:28:13.551915 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1129 05:28:13.553356 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e070e0ea87ae0cfa8fcc6aadd13e7ae3c2ab3eb1ea3daf6178c9d35796ba3ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-29T05:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:27:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:23Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:23 crc kubenswrapper[4594]: I1129 05:28:23.515193 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c84b578286ef0da9c863bedb4ebec224e34007c7f59cead26dac786d7e4b5e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1975ee2ce0af02676f9925944a31b218cb17cdf4dfb85c6edbfd124ac58fd22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:23Z is after 2025-08-24T17:21:41Z" Nov 29 
05:28:23 crc kubenswrapper[4594]: I1129 05:28:23.522661 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-87h4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"080d2d67-a474-4974-94f3-81c5007e0a57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b58d5af901448b3e10a9a902daa31aecebe095bdae0e620c991323b99c52b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdnbj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\
\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-87h4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:23Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:23 crc kubenswrapper[4594]: I1129 05:28:23.552558 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:23 crc kubenswrapper[4594]: I1129 05:28:23.552594 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:23 crc kubenswrapper[4594]: I1129 05:28:23.552605 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:23 crc kubenswrapper[4594]: I1129 05:28:23.552620 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:23 crc kubenswrapper[4594]: I1129 05:28:23.552631 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:23Z","lastTransitionTime":"2025-11-29T05:28:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:23 crc kubenswrapper[4594]: I1129 05:28:23.654538 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:23 crc kubenswrapper[4594]: I1129 05:28:23.654572 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:23 crc kubenswrapper[4594]: I1129 05:28:23.654582 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:23 crc kubenswrapper[4594]: I1129 05:28:23.654597 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:23 crc kubenswrapper[4594]: I1129 05:28:23.654608 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:23Z","lastTransitionTime":"2025-11-29T05:28:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:23 crc kubenswrapper[4594]: I1129 05:28:23.756078 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:23 crc kubenswrapper[4594]: I1129 05:28:23.756107 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:23 crc kubenswrapper[4594]: I1129 05:28:23.756115 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:23 crc kubenswrapper[4594]: I1129 05:28:23.756128 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:23 crc kubenswrapper[4594]: I1129 05:28:23.756139 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:23Z","lastTransitionTime":"2025-11-29T05:28:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:23 crc kubenswrapper[4594]: I1129 05:28:23.858226 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:23 crc kubenswrapper[4594]: I1129 05:28:23.858281 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:23 crc kubenswrapper[4594]: I1129 05:28:23.858294 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:23 crc kubenswrapper[4594]: I1129 05:28:23.858307 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:23 crc kubenswrapper[4594]: I1129 05:28:23.858318 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:23Z","lastTransitionTime":"2025-11-29T05:28:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:23 crc kubenswrapper[4594]: I1129 05:28:23.960946 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:23 crc kubenswrapper[4594]: I1129 05:28:23.961007 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:23 crc kubenswrapper[4594]: I1129 05:28:23.961023 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:23 crc kubenswrapper[4594]: I1129 05:28:23.961045 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:23 crc kubenswrapper[4594]: I1129 05:28:23.961060 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:23Z","lastTransitionTime":"2025-11-29T05:28:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:24 crc kubenswrapper[4594]: I1129 05:28:24.063354 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:24 crc kubenswrapper[4594]: I1129 05:28:24.063388 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:24 crc kubenswrapper[4594]: I1129 05:28:24.063399 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:24 crc kubenswrapper[4594]: I1129 05:28:24.063410 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:24 crc kubenswrapper[4594]: I1129 05:28:24.063420 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:24Z","lastTransitionTime":"2025-11-29T05:28:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:24 crc kubenswrapper[4594]: I1129 05:28:24.083107 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 05:28:24 crc kubenswrapper[4594]: I1129 05:28:24.083129 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 05:28:24 crc kubenswrapper[4594]: I1129 05:28:24.083143 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 05:28:24 crc kubenswrapper[4594]: E1129 05:28:24.083248 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 05:28:24 crc kubenswrapper[4594]: E1129 05:28:24.083357 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 05:28:24 crc kubenswrapper[4594]: E1129 05:28:24.083427 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 05:28:24 crc kubenswrapper[4594]: I1129 05:28:24.165151 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:24 crc kubenswrapper[4594]: I1129 05:28:24.165184 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:24 crc kubenswrapper[4594]: I1129 05:28:24.165802 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:24 crc kubenswrapper[4594]: I1129 05:28:24.165831 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:24 crc kubenswrapper[4594]: I1129 05:28:24.165844 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:24Z","lastTransitionTime":"2025-11-29T05:28:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:24 crc kubenswrapper[4594]: I1129 05:28:24.257834 4594 generic.go:334] "Generic (PLEG): container finished" podID="ad2c26bf-c1c8-44b5-b4ed-d487072d358b" containerID="81866e9b3e8415fc1e5af0eec4a7222deab6b282b1758dae13759a073192a554" exitCode=0 Nov 29 05:28:24 crc kubenswrapper[4594]: I1129 05:28:24.258771 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zwqcq" event={"ID":"ad2c26bf-c1c8-44b5-b4ed-d487072d358b","Type":"ContainerDied","Data":"81866e9b3e8415fc1e5af0eec4a7222deab6b282b1758dae13759a073192a554"} Nov 29 05:28:24 crc kubenswrapper[4594]: I1129 05:28:24.259280 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" Nov 29 05:28:24 crc kubenswrapper[4594]: I1129 05:28:24.271600 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:24 crc kubenswrapper[4594]: I1129 05:28:24.271633 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:24 crc kubenswrapper[4594]: I1129 05:28:24.271644 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:24 crc kubenswrapper[4594]: I1129 05:28:24.271663 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:24 crc kubenswrapper[4594]: I1129 05:28:24.271679 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:24Z","lastTransitionTime":"2025-11-29T05:28:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:24 crc kubenswrapper[4594]: I1129 05:28:24.273106 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1629901e-7133-4eb0-b634-5e458da9f205\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://583ca6d59501822022d5d15a450d4403592d248bce7be4b22b1c7baef4e93000\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b8d3be68869fef8b377ce1d84e3edf447496a867c40e88d013ef0d1c5629de2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://6655c8a5c56e4d2b3e8fbd88308d04ded2b41f7243bf90cb2dbbd6ad93f3153f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a87a0320d47b3753baeb865c7ef9e024c65a96555fdd71c0f2a23c8c0b80d351\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ba49c3a694bc245ce18b54b74156e6e7aaccd4fcf707264a278f86f52943392\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T05:28:13Z\\\",\\\"message\\\":\\\"le observer\\\\nW1129 05:28:13.316789 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1129 05:28:13.316951 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1129 05:28:13.320123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1464513827/tls.crt::/tmp/serving-cert-1464513827/tls.key\\\\\\\"\\\\nI1129 05:28:13.540439 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1129 05:28:13.545990 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1129 05:28:13.546014 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1129 05:28:13.546050 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1129 05:28:13.546055 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1129 05:28:13.551788 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1129 05:28:13.551806 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 05:28:13.551809 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 05:28:13.551814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1129 05:28:13.551817 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1129 05:28:13.551820 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1129 05:28:13.551823 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1129 05:28:13.551915 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1129 05:28:13.553356 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e070e0ea87ae0cfa8fcc6aadd13e7ae3c2ab3eb1ea3daf6178c9d35796ba3ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-29T05:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:27:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:24Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:24 crc kubenswrapper[4594]: I1129 05:28:24.286499 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" Nov 29 05:28:24 crc kubenswrapper[4594]: I1129 05:28:24.293919 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c84b578286ef0da9c863bedb4ebec224e34007c7f59cead26dac786d7e4b5e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1975ee2ce0af02676f9925944a31b218cb17cdf4dfb85c6edbfd124ac58fd22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:24Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:24 crc kubenswrapper[4594]: I1129 05:28:24.302513 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-87h4n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"080d2d67-a474-4974-94f3-81c5007e0a57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b58d5af901448b3e10a9a902daa31aecebe095bdae0e620c991323b99c52b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdnbj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-87h4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:24Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:24 crc kubenswrapper[4594]: I1129 05:28:24.311128 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://867c23355f828dc4cc15d632f90c52174e9b9be07f3ec24f05c904709620de6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-al
erter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:24Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:24 crc kubenswrapper[4594]: I1129 05:28:24.321973 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:24Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:24 crc kubenswrapper[4594]: I1129 05:28:24.331296 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7plzz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5052790-d231-4f97-802c-c7de3cd72561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c23b1472cd349ae6a56137165463429204ffbaf2c1b477d3ba9d07d2f2799f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qglsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7plzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:24Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:24 crc kubenswrapper[4594]: I1129 05:28:24.341391 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zwqcq" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad2c26bf-c1c8-44b5-b4ed-d487072d358b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9e8c27b9b7a4a13a1f6ae5461acd535cc6784f4158213697307e4061f1e222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b9e8c27b9b7a4a13a1f6ae5461acd535cc6784f4158213697307e4061f1e222\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa32ebafd1f717a0d3ba44387b7b36827e9e12589812bdc33b1dfb84bd41772\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fa32ebafd1f717a0d3ba44387b7b36827e9e12589812bdc33b1dfb84bd41772\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec1085c6e05fe233979922ace4946592ed4a7afe00a0f1c3a256c41a0964d26a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1085c6e05fe233979922ace4946592ed4a7afe00a0f1c3a256c41a0964d26a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://819e8c7a8ea3f0f7a082da904dd9cc01b5d7e7e6520cafe67920a190063947ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://819e8c7a8ea3f0f7a082da904dd9cc01b5d7e7e6520cafe67920a190063947ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1602d1ea5e426042c520a4f98ed23530fc8f94f67f242a639c32a29817495d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1602d1ea5e426042c520a4f98ed23530fc8f94f67f242a639c32a29817495d73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81866e9b3e8415fc1e5af0eec4a7222deab6b282b1758dae13759a073192a554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81866e9b3e8415fc1e5af0eec4a7222deab6b282b1758dae13759a073192a554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zwqcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:24Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:24 crc kubenswrapper[4594]: I1129 05:28:24.351427 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:24Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:24 crc kubenswrapper[4594]: I1129 05:28:24.363638 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:24Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:24 crc kubenswrapper[4594]: I1129 05:28:24.374470 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:24 crc kubenswrapper[4594]: I1129 05:28:24.374527 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:24 crc kubenswrapper[4594]: I1129 05:28:24.374542 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:24 crc kubenswrapper[4594]: I1129 05:28:24.374566 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:24 crc kubenswrapper[4594]: I1129 05:28:24.374580 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:24Z","lastTransitionTime":"2025-11-29T05:28:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:24 crc kubenswrapper[4594]: I1129 05:28:24.380084 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08de5891-72ca-488c-80b3-6b54c8c1a66e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9f96824f977ec455622772e91ed8e6534dc93a71868e2b5e2aa9302ff9be135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cf2a15e6478f9f461a9927f570db4a81a8d752c8be46c7970b4956ae2031a38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a67e6b494618e268d6f5d75507c1cfe0e8e99c0e3cfff7548756a64f4f3a3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77543b04b619c48bf800a3d9dda1a969e4e32aa113a49287717a9e83f9cff96f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78fd1c6bdf08a557f0c2223c538a946709b9a225032c92ea75277140441a59c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db25dbee0954a026ecc2945dda40eb48c9caf78d31c44013055e9765024794d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5755b8e9afa4a0d89a3361dc97a76c5baede9605f0d44ad078cd9005135e1590\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cc8b34a52032d4716652fc9edcc35dd797309bf6886242671c068e65399d004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b86e524b600a831fe6b8bb71a513172f974b457d39363da90cd6152139eae898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b86e524b600a831fe6b8bb71a513172f974b457d39363da90cd6152139eae898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lp4zm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:24Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:24 crc kubenswrapper[4594]: I1129 05:28:24.390503 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3be916e6-7f0f-45ae-8fe5-85d2c7e459f0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf23cd4a300a49fabdf8fcaacfda760563c8e53cf70de7a4c07be717fd76b660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026
b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a960cfe07ae731cc2d07957dba63be2897328d662dd2672bd61344c8baac0228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b416f8ad36a09f58e68b1e23784b344534ccfb013a3272c0754602431a3a3737\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"
name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://261ab2af67dd877cead51f204629b59233865717d4805c3d4d80c1a5d51e2021\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:27:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:24Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:24 crc kubenswrapper[4594]: I1129 05:28:24.401154 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba18db1b4391204c0fe395488d2726d914dc28c88b9cf449d83c9c181a7d04a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:24Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:24 crc kubenswrapper[4594]: I1129 05:28:24.410824 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"509844d4-3ee1-4059-84bb-6e90200f50c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5e396522be1cb7984f98fd416f9a6aeda1e6009e002e9c942a6cec0d4a4c22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4qlc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb25cb1d2d2a3aa84bc6fca3087cdf54b3b8e75bc258441b2895f37594df44e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4qlc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ggz4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:24Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:24 crc kubenswrapper[4594]: I1129 05:28:24.419939 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-26zjk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"77af4f9f-196d-436d-9fb6-69168bcb5f8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70467de0fada47ce38faab4cc04c54942aa5f7c0cacf29b1d21b64a11e6162c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxrks\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-26zjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:24Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:24 crc kubenswrapper[4594]: I1129 05:28:24.430449 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1629901e-7133-4eb0-b634-5e458da9f205\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://583ca6d59501822022d5d15a450d4403592d248bce7be4b22b1c7baef4e93000\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b8d3be68869fef8b377ce1d84e3edf447496a867c40e88d013ef0d1c5629de2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://6655c8a5c56e4d2b3e8fbd88308d04ded2b41f7243bf90cb2dbbd6ad93f3153f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a87a0320d47b3753baeb865c7ef9e024c65a96555fdd71c0f2a23c8c0b80d351\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ba49c3a694bc245ce18b54b74156e6e7aaccd4fcf707264a278f86f52943392\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T05:28:13Z\\\",\\\"message\\\":\\\"le observer\\\\nW1129 05:28:13.316789 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1129 05:28:13.316951 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1129 05:28:13.320123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1464513827/tls.crt::/tmp/serving-cert-1464513827/tls.key\\\\\\\"\\\\nI1129 05:28:13.540439 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1129 05:28:13.545990 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1129 05:28:13.546014 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1129 05:28:13.546050 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1129 05:28:13.546055 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1129 05:28:13.551788 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1129 05:28:13.551806 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 05:28:13.551809 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 05:28:13.551814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1129 05:28:13.551817 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1129 05:28:13.551820 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1129 05:28:13.551823 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1129 05:28:13.551915 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1129 05:28:13.553356 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e070e0ea87ae0cfa8fcc6aadd13e7ae3c2ab3eb1ea3daf6178c9d35796ba3ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-29T05:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:27:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:24Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:24 crc kubenswrapper[4594]: I1129 05:28:24.441977 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c84b578286ef0da9c863bedb4ebec224e34007c7f59cead26dac786d7e4b5e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1975ee2ce0af02676f9925944a31b218cb17cdf4dfb85c6edbfd124ac58fd22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:24Z is after 2025-08-24T17:21:41Z" Nov 29 
05:28:24 crc kubenswrapper[4594]: I1129 05:28:24.450200 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-87h4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"080d2d67-a474-4974-94f3-81c5007e0a57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b58d5af901448b3e10a9a902daa31aecebe095bdae0e620c991323b99c52b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdnbj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\
\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-87h4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:24Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:24 crc kubenswrapper[4594]: I1129 05:28:24.459682 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://867c23355f828dc4cc15d632f90c52174e9b9be07f3ec24f05c904709620de6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:24Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:24 crc kubenswrapper[4594]: I1129 05:28:24.469909 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:24Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:24 crc kubenswrapper[4594]: I1129 05:28:24.477873 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:24 crc kubenswrapper[4594]: I1129 05:28:24.477913 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:24 crc kubenswrapper[4594]: I1129 05:28:24.477924 4594 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:24 crc kubenswrapper[4594]: I1129 05:28:24.477946 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:24 crc kubenswrapper[4594]: I1129 05:28:24.477961 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:24Z","lastTransitionTime":"2025-11-29T05:28:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:24 crc kubenswrapper[4594]: I1129 05:28:24.479853 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7plzz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5052790-d231-4f97-802c-c7de3cd72561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c23b1472cd349ae6a56137165463429204ffbaf2c1
b477d3ba9d07d2f2799f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernet
es.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qglsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7plzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:24Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:24 crc kubenswrapper[4594]: I1129 05:28:24.489996 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:24Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:24 crc kubenswrapper[4594]: I1129 05:28:24.499195 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:24Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:24 crc kubenswrapper[4594]: I1129 05:28:24.512517 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08de5891-72ca-488c-80b3-6b54c8c1a66e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9f96824f977ec455622772e91ed8e6534dc93a71868e2b5e2aa9302ff9be135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cf2a15e6478f9f461a9927f570db4a81a8d752c8be46c7970b4956ae2031a38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a67e6b494618e268d6f5d75507c1cfe0e8e99c0e3cfff7548756a64f4f3a3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77543b04b619c48bf800a3d9dda1a969e4e32aa113a49287717a9e83f9cff96f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78fd1c6bdf08a557f0c2223c538a946709b9a225032c92ea75277140441a59c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db25dbee0954a026ecc2945dda40eb48c9caf78d31c44013055e9765024794d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5755b8e9afa4a0d89a3361dc97a76c5baede9605f0d44ad078cd9005135e1590\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mou
ntPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cc8b34a52032d4716652fc9edcc35dd797309bf6886242671c068e65399d004\\\",\
\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b86e524b600a831fe6b8bb71a513172f974b457d39363da90cd6152139eae898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b86e524b600a831fe6b8bb71a513172f974b457d39363da90cd6152139eae898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lp4zm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:24Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:24 crc kubenswrapper[4594]: I1129 05:28:24.525018 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zwqcq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad2c26bf-c1c8-44b5-b4ed-d487072d358b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9e8c27b9b7a4a13a1f6ae5461acd535cc6784f4158213697307e4061f1e222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b9e8c27b9b7a4a13a1f6ae5461acd535cc6784f4158213697307e4061f1e222\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa32ebafd1f717a0d3ba44387b7b36827e9e12589812bdc33b1dfb84bd41772\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fa32ebafd1f717a0d3ba44387b7b36827e9e12589812bdc33b1dfb84bd41772\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec1085c6e05fe233979922ace4946592ed4a7
afe00a0f1c3a256c41a0964d26a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1085c6e05fe233979922ace4946592ed4a7afe00a0f1c3a256c41a0964d26a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://819e8c7a8ea3f0f7a082da904dd9cc01b5d7e7e6520cafe67920a190063947ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://819e8c7a8ea3f0f7a082da904dd9cc01b5d7e7e6520cafe67920a190063947ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-1
1-29T05:28:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1602d1ea5e426042c520a4f98ed23530fc8f94f67f242a639c32a29817495d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1602d1ea5e426042c520a4f98ed23530fc8f94f67f242a639c32a29817495d73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81866e9b3e8415fc1e5af0eec4a7222deab6b282b1758dae13759a073192a554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81866e9b3e8415fc1e5af0eec4a7222deab6b282b1758dae13759a073192a554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zwqcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:24Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:24 crc kubenswrapper[4594]: I1129 05:28:24.538036 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3be916e6-7f0f-45ae-8fe5-85d2c7e459f0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf23cd4a300a49fabdf8fcaacfda760563c8e53cf70de7a4c07be717fd76b660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a960cfe07ae731cc2d07957dba63be2897328d662dd2672bd61344c8baac0228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b416f8ad36a09f58e68b1e23784b344534ccfb013a3272c0754602431a3a3737\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://261ab2af67dd877cead51f204629b59233865717d4805c3d4d80c1a5d51e2021\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:27:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:24Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:24 crc kubenswrapper[4594]: I1129 05:28:24.548218 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba18db1b4391204c0fe395488d2726d914dc28c88b9cf449d83c9c181a7d04a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:24Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:24 crc kubenswrapper[4594]: I1129 05:28:24.568916 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"509844d4-3ee1-4059-84bb-6e90200f50c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5e396522be1cb7984f98fd416f9a6aeda1e6009e002e9c942a6cec0d4a4c22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4qlc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb25cb1d2d2a3aa84bc6fca3087cdf54b3b8e75bc258441b2895f37594df44e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4qlc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ggz4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:24Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:24 crc kubenswrapper[4594]: I1129 05:28:24.581181 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:24 crc 
kubenswrapper[4594]: I1129 05:28:24.581219 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:24 crc kubenswrapper[4594]: I1129 05:28:24.581229 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:24 crc kubenswrapper[4594]: I1129 05:28:24.581275 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:24 crc kubenswrapper[4594]: I1129 05:28:24.581289 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:24Z","lastTransitionTime":"2025-11-29T05:28:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:24 crc kubenswrapper[4594]: I1129 05:28:24.608175 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-26zjk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"77af4f9f-196d-436d-9fb6-69168bcb5f8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70467de0fada47ce38faab4cc04c54942aa5f7c0cacf29b1d21b64a11e6162c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxrks\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-26zjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:24Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:24 crc kubenswrapper[4594]: I1129 05:28:24.684361 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:24 crc kubenswrapper[4594]: I1129 05:28:24.684400 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:24 crc kubenswrapper[4594]: I1129 05:28:24.684414 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:24 crc kubenswrapper[4594]: I1129 05:28:24.684433 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:24 crc kubenswrapper[4594]: I1129 05:28:24.684749 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:24Z","lastTransitionTime":"2025-11-29T05:28:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:24 crc kubenswrapper[4594]: I1129 05:28:24.787312 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:24 crc kubenswrapper[4594]: I1129 05:28:24.787348 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:24 crc kubenswrapper[4594]: I1129 05:28:24.787358 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:24 crc kubenswrapper[4594]: I1129 05:28:24.787372 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:24 crc kubenswrapper[4594]: I1129 05:28:24.787380 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:24Z","lastTransitionTime":"2025-11-29T05:28:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:24 crc kubenswrapper[4594]: I1129 05:28:24.889513 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:24 crc kubenswrapper[4594]: I1129 05:28:24.889551 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:24 crc kubenswrapper[4594]: I1129 05:28:24.889564 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:24 crc kubenswrapper[4594]: I1129 05:28:24.889581 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:24 crc kubenswrapper[4594]: I1129 05:28:24.889591 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:24Z","lastTransitionTime":"2025-11-29T05:28:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:24 crc kubenswrapper[4594]: I1129 05:28:24.991794 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:24 crc kubenswrapper[4594]: I1129 05:28:24.991826 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:24 crc kubenswrapper[4594]: I1129 05:28:24.991836 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:24 crc kubenswrapper[4594]: I1129 05:28:24.991846 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:24 crc kubenswrapper[4594]: I1129 05:28:24.991855 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:24Z","lastTransitionTime":"2025-11-29T05:28:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:25 crc kubenswrapper[4594]: I1129 05:28:25.094275 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:25 crc kubenswrapper[4594]: I1129 05:28:25.094321 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:25 crc kubenswrapper[4594]: I1129 05:28:25.094332 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:25 crc kubenswrapper[4594]: I1129 05:28:25.094352 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:25 crc kubenswrapper[4594]: I1129 05:28:25.094364 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:25Z","lastTransitionTime":"2025-11-29T05:28:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:25 crc kubenswrapper[4594]: I1129 05:28:25.196327 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:25 crc kubenswrapper[4594]: I1129 05:28:25.196369 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:25 crc kubenswrapper[4594]: I1129 05:28:25.196378 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:25 crc kubenswrapper[4594]: I1129 05:28:25.196394 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:25 crc kubenswrapper[4594]: I1129 05:28:25.196404 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:25Z","lastTransitionTime":"2025-11-29T05:28:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:25 crc kubenswrapper[4594]: I1129 05:28:25.263861 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lp4zm_08de5891-72ca-488c-80b3-6b54c8c1a66e/ovnkube-controller/0.log" Nov 29 05:28:25 crc kubenswrapper[4594]: I1129 05:28:25.266829 4594 generic.go:334] "Generic (PLEG): container finished" podID="08de5891-72ca-488c-80b3-6b54c8c1a66e" containerID="5755b8e9afa4a0d89a3361dc97a76c5baede9605f0d44ad078cd9005135e1590" exitCode=1 Nov 29 05:28:25 crc kubenswrapper[4594]: I1129 05:28:25.266887 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" event={"ID":"08de5891-72ca-488c-80b3-6b54c8c1a66e","Type":"ContainerDied","Data":"5755b8e9afa4a0d89a3361dc97a76c5baede9605f0d44ad078cd9005135e1590"} Nov 29 05:28:25 crc kubenswrapper[4594]: I1129 05:28:25.267416 4594 scope.go:117] "RemoveContainer" containerID="5755b8e9afa4a0d89a3361dc97a76c5baede9605f0d44ad078cd9005135e1590" Nov 29 05:28:25 crc kubenswrapper[4594]: I1129 05:28:25.271016 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zwqcq" event={"ID":"ad2c26bf-c1c8-44b5-b4ed-d487072d358b","Type":"ContainerStarted","Data":"9db8528eef5425d9af2b8f11f244556f195b844e239a2137eef24c40fa158d6c"} Nov 29 05:28:25 crc kubenswrapper[4594]: I1129 05:28:25.281041 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1629901e-7133-4eb0-b634-5e458da9f205\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://583ca6d59501822022d5d15a450d4403592d248bce7be4b22b1c7baef4e93000\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b8d3be68869fef8b377ce1d84e3edf447496a867c40e88d013ef0d1c5629de2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6655c8a5c56e4d2b3e8fbd88308d04ded2b41f7243bf90cb2dbbd6ad93f3153f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a87a0320d47b3753baeb865c7ef9e024c65a96555fdd71c0f2a23c8c0b80d351\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ba49c3a694bc245ce18b54b74156e6e7aaccd4fcf707264a278f86f52943392\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T05:28:13Z\\\",\\\"message\\\":\\\"le observer\\\\nW1129 05:28:13.316789 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1129 05:28:13.316951 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1129 05:28:13.320123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1464513827/tls.crt::/tmp/serving-cert-1464513827/tls.key\\\\\\\"\\\\nI1129 05:28:13.540439 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1129 05:28:13.545990 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1129 05:28:13.546014 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1129 05:28:13.546050 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1129 05:28:13.546055 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1129 05:28:13.551788 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1129 05:28:13.551806 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 05:28:13.551809 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 05:28:13.551814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1129 05:28:13.551817 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1129 05:28:13.551820 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1129 05:28:13.551823 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1129 05:28:13.551915 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1129 05:28:13.553356 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e070e0ea87ae0cfa8fcc6aadd13e7ae3c2ab3eb1ea3daf6178c9d35796ba3ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:27:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:25Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:25 crc kubenswrapper[4594]: I1129 05:28:25.292178 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c84b578286ef0da9c863bedb4ebec224e34007c7f59cead26dac786d7e4b5e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1975ee2ce0af02676f9925944a31b218cb17cdf4dfb85c6edbfd124ac58fd22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:25Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:25 crc kubenswrapper[4594]: I1129 05:28:25.298281 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:25 crc kubenswrapper[4594]: I1129 05:28:25.298315 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:25 crc kubenswrapper[4594]: I1129 05:28:25.298325 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:25 crc kubenswrapper[4594]: I1129 05:28:25.298337 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:25 crc kubenswrapper[4594]: I1129 05:28:25.298347 4594 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:25Z","lastTransitionTime":"2025-11-29T05:28:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:25 crc kubenswrapper[4594]: I1129 05:28:25.301374 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-87h4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"080d2d67-a474-4974-94f3-81c5007e0a57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b58d5af901448b3e10a9a902daa31aecebe095bdae0e620c991323b99c52b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdnbj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-87h4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:25Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:25 crc kubenswrapper[4594]: I1129 05:28:25.310692 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://867c23355f828dc4cc15d632f90c52174e9b9be07f3ec24f05c904709620de6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-29T05:28:25Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:25 crc kubenswrapper[4594]: I1129 05:28:25.322058 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:25Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:25 crc kubenswrapper[4594]: I1129 05:28:25.331557 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7plzz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5052790-d231-4f97-802c-c7de3cd72561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c23b1472cd349ae6a56137165463429204ffbaf2c1b477d3ba9d07d2f2799f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qglsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7plzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:25Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:25 crc kubenswrapper[4594]: I1129 05:28:25.341526 4594 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:25Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:25 crc kubenswrapper[4594]: I1129 05:28:25.351732 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:25Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:25 crc kubenswrapper[4594]: I1129 05:28:25.364492 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"08de5891-72ca-488c-80b3-6b54c8c1a66e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9f96824f977ec455622772e91ed8e6534dc93a71868e2b5e2aa9302ff9be135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cf2a15e6478f9f461a9927f570db4a81a8d752c8be46c7970b4956ae2031a38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a67e6b494618e268d6f5d75507c1cfe0e8e99c0e3cfff7548756a64f4f3a3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77543b04b619c48bf800a3d9dda1a969e4e32aa113a49287717a9e83f9cff96f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78fd1c6bdf08a557f0c2223c538a946709b9a225032c92ea75277140441a59c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db25dbee0954a026ecc2945dda40eb48c9caf78d31c44013055e9765024794d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5755b8e9afa4a0d89a3361dc97a76c5baede9605f0d44ad078cd9005135e1590\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5755b8e9afa4a0d89a3361dc97a76c5baede9605f0d44ad078cd9005135e1590\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T05:28:24Z\\\",\\\"message\\\":\\\"shutting down\\\\nI1129 05:28:24.803701 5946 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1129 05:28:24.803238 5946 handler.go:208] Removed *v1.Pod event handler 
3\\\\nI1129 05:28:24.803981 5946 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1129 05:28:24.803455 5946 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1129 05:28:24.804214 5946 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1129 05:28:24.803041 5946 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1129 05:28:24.804392 5946 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1129 05:28:24.804431 5946 factory.go:656] Stopping watch factory\\\\nI1129 05:28:24.804445 5946 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1129 05:28:24.804921 5946 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1129 05:28:24.805008 5946 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env
-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cc8b34a52032d4716652fc9edcc35dd797309bf6886242671c068e65399d004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b86e524b600a831fe6b8bb71a513172f974b457d39363da90cd6152139eae898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termin
ated\\\":{\\\"containerID\\\":\\\"cri-o://b86e524b600a831fe6b8bb71a513172f974b457d39363da90cd6152139eae898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lp4zm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:25Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:25 crc kubenswrapper[4594]: I1129 05:28:25.376228 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zwqcq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad2c26bf-c1c8-44b5-b4ed-d487072d358b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9e8c27b9b7a4a13a1f6ae5461acd535cc6784f4158213697307e4061f1e222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b9e8c27b9b7a4a13a1f6ae5461acd535cc6784f4158213697307e4061f1e222\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa32ebafd1f717a0d3ba44387b7b36827e9e12589812bdc33b1dfb84bd41772\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fa32ebafd1f717a0d3ba44387b7b36827e9e12589812bdc33b1dfb84bd41772\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec1085c6e05fe233979922ace4946592ed4a7afe00a0f1c3a256c41a0964d26a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1085c6e05fe233979922ace4946592ed4a7afe00a0f1c3a256c41a0964d26a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://819e8c7a8ea3f0f7a082da904dd9cc01b5d7e7e6520cafe67920a190063947ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://819e8c7a8ea3f0f7a082da904dd9cc01b5d7e7e6520cafe67920a190063947ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1602d1ea5e426042c520a4f98ed23530fc8f94f67f242a639c32a29817495d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1602d1ea5e426042c520a4f98ed23530fc8f94f67f242a639c32a29817495d73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81866e9b3e8415fc1e5af0eec4a7222deab6b282b1758dae13759a073192a554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81866e9b3e8415fc1e5af0eec4a7222deab6b282b1758dae13759a073192a554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zwqcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:25Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:25 crc kubenswrapper[4594]: I1129 05:28:25.386678 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3be916e6-7f0f-45ae-8fe5-85d2c7e459f0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf23cd4a300a49fabdf8fcaacfda760563c8e53cf70de7a4c07be717fd76b660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a960cfe07ae731cc2d07957dba63be2897328d662dd2672bd61344c8baac0228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b416f8ad36a09f58e68b1e23784b344534ccfb013a3272c0754602431a3a3737\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://261ab2af67dd877cead51f204629b59233865717d4805c3d4d80c1a5d51e2021\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:27:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:25Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:25 crc kubenswrapper[4594]: I1129 05:28:25.397173 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba18db1b4391204c0fe395488d2726d914dc28c88b9cf449d83c9c181a7d04a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:25Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:25 crc kubenswrapper[4594]: I1129 05:28:25.402560 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:25 crc kubenswrapper[4594]: I1129 05:28:25.402607 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:25 crc kubenswrapper[4594]: I1129 05:28:25.402623 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:25 crc kubenswrapper[4594]: I1129 05:28:25.402642 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:25 crc kubenswrapper[4594]: I1129 05:28:25.402659 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:25Z","lastTransitionTime":"2025-11-29T05:28:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:25 crc kubenswrapper[4594]: I1129 05:28:25.410549 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"509844d4-3ee1-4059-84bb-6e90200f50c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5e396522be1cb7984f98fd416f9a6aeda1e6009e002e9c942a6cec0d4a4c22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4qlc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb25cb1d2d2a3aa84bc6fca3087cdf54b3b8e75bc258441b2895f37594df44e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4qlc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ggz4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:25Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:25 crc kubenswrapper[4594]: I1129 05:28:25.419317 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-26zjk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"77af4f9f-196d-436d-9fb6-69168bcb5f8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70467de0fada47ce38faab4cc04c54942aa5f7c0cacf29b1d21b64a11e6162c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxrks\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-26zjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:25Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:25 crc kubenswrapper[4594]: I1129 05:28:25.427379 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://867c23355f828dc4cc15d632f90c52174e9b9be07f3ec24f05c904709620de6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T0
5:28:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:25Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:25 crc kubenswrapper[4594]: I1129 05:28:25.435943 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:25Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:25 crc kubenswrapper[4594]: I1129 05:28:25.445311 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7plzz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5052790-d231-4f97-802c-c7de3cd72561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c23b1472cd349ae6a56137165463429204ffbaf2c1b477d3ba9d07d2f2799f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qglsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7plzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:25Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:25 crc kubenswrapper[4594]: I1129 05:28:25.453636 4594 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:25Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:25 crc kubenswrapper[4594]: I1129 05:28:25.462242 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:25Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:25 crc kubenswrapper[4594]: I1129 05:28:25.474795 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"08de5891-72ca-488c-80b3-6b54c8c1a66e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9f96824f977ec455622772e91ed8e6534dc93a71868e2b5e2aa9302ff9be135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cf2a15e6478f9f461a9927f570db4a81a8d752c8be46c7970b4956ae2031a38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a67e6b494618e268d6f5d75507c1cfe0e8e99c0e3cfff7548756a64f4f3a3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77543b04b619c48bf800a3d9dda1a969e4e32aa113a49287717a9e83f9cff96f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78fd1c6bdf08a557f0c2223c538a946709b9a225032c92ea75277140441a59c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db25dbee0954a026ecc2945dda40eb48c9caf78d31c44013055e9765024794d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5755b8e9afa4a0d89a3361dc97a76c5baede9605f0d44ad078cd9005135e1590\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5755b8e9afa4a0d89a3361dc97a76c5baede9605f0d44ad078cd9005135e1590\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T05:28:24Z\\\",\\\"message\\\":\\\"shutting down\\\\nI1129 05:28:24.803701 5946 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1129 05:28:24.803238 5946 handler.go:208] Removed *v1.Pod event handler 
3\\\\nI1129 05:28:24.803981 5946 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1129 05:28:24.803455 5946 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1129 05:28:24.804214 5946 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1129 05:28:24.803041 5946 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1129 05:28:24.804392 5946 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1129 05:28:24.804431 5946 factory.go:656] Stopping watch factory\\\\nI1129 05:28:24.804445 5946 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1129 05:28:24.804921 5946 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1129 05:28:24.805008 5946 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env
-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cc8b34a52032d4716652fc9edcc35dd797309bf6886242671c068e65399d004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b86e524b600a831fe6b8bb71a513172f974b457d39363da90cd6152139eae898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termin
ated\\\":{\\\"containerID\\\":\\\"cri-o://b86e524b600a831fe6b8bb71a513172f974b457d39363da90cd6152139eae898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lp4zm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:25Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:25 crc kubenswrapper[4594]: I1129 05:28:25.485076 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zwqcq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad2c26bf-c1c8-44b5-b4ed-d487072d358b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9db8528eef5425d9af2b8f11f244556f195b844e239a2137eef24c40fa158d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9e8c27b9b7a4a13a1f6ae5461acd535cc6784f4158213697307e4061f1e222\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b9e8c27b9b7a4a13a1f6ae5461acd535cc6784f4158213697307e4061f1e222\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa32ebafd1f717a0d3ba44387b7b36827e9e12589812bdc33b1dfb84bd41772\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fa32ebafd1f717a0d3ba44387b7b36827e9e12589812bdc33b1dfb84bd41772\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec1085c6e05fe233979922ace4946592ed4a7afe00a0f1c3a256c41a0964d26a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1085c6e05fe233979922ace4946592ed4a7afe00a0f1c3a256c41a0964d26a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://819e8
c7a8ea3f0f7a082da904dd9cc01b5d7e7e6520cafe67920a190063947ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://819e8c7a8ea3f0f7a082da904dd9cc01b5d7e7e6520cafe67920a190063947ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1602d1ea5e426042c520a4f98ed23530fc8f94f67f242a639c32a29817495d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1602d1ea5e426042c520a4f98ed23530fc8f94f67f242a639c32a29817495d73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:22Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81866e9b3e8415fc1e5af0eec4a7222deab6b282b1758dae13759a073192a554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81866e9b3e8415fc1e5af0eec4a7222deab6b282b1758dae13759a073192a554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zwqcq\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:25Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:25 crc kubenswrapper[4594]: I1129 05:28:25.493068 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3be916e6-7f0f-45ae-8fe5-85d2c7e459f0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf23cd4a300a49fabdf8fcaacfda760563c8e53cf70de7a4c07be717fd76b660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a960cfe07ae731cc2d07957dba63be2897328d662dd2672bd61344c8baac0228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b416f8ad36a09f58e68b1e23784b344534ccfb013a3272c0754602431a3a3737\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://261ab2af67dd877cead51f204629b59233865717d4805c3d4d80c1a5d51e2021\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:27:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:25Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:25 crc kubenswrapper[4594]: I1129 05:28:25.505140 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:25 crc kubenswrapper[4594]: I1129 05:28:25.505167 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:25 crc kubenswrapper[4594]: I1129 05:28:25.505177 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:25 crc kubenswrapper[4594]: I1129 05:28:25.505197 
4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:25 crc kubenswrapper[4594]: I1129 05:28:25.505209 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:25Z","lastTransitionTime":"2025-11-29T05:28:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:25 crc kubenswrapper[4594]: I1129 05:28:25.530359 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba18db1b4391204c0fe395488d2726d914dc28c88b9cf449d83c9c181a7d04a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"sta
rted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:25Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:25 crc kubenswrapper[4594]: I1129 05:28:25.568921 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"509844d4-3ee1-4059-84bb-6e90200f50c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5e396522be1cb7984f98fd416f9a6aeda1e6009e002e9c942a6cec0d4a4c22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4qlc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb25cb1d2d2a3aa84bc6fca3087cdf54b3b8e75b
c258441b2895f37594df44e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4qlc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ggz4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:25Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:25 crc kubenswrapper[4594]: I1129 05:28:25.607574 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:25 crc kubenswrapper[4594]: I1129 05:28:25.607595 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:25 crc kubenswrapper[4594]: I1129 05:28:25.607604 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:25 crc 
kubenswrapper[4594]: I1129 05:28:25.607617 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:25 crc kubenswrapper[4594]: I1129 05:28:25.607632 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:25Z","lastTransitionTime":"2025-11-29T05:28:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:25 crc kubenswrapper[4594]: I1129 05:28:25.610023 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-26zjk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77af4f9f-196d-436d-9fb6-69168bcb5f8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70467de0fada47ce38faab4cc04c54942aa5f7c0cacf29b1d21b64a11e6162c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1
e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxrks\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-26zjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:25Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:25 crc kubenswrapper[4594]: I1129 05:28:25.651171 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1629901e-7133-4eb0-b634-5e458da9f205\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://583ca6d59501822022d5d15a450d4403592d248bce7be4b22b1c7baef4e93000\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b8d3be68869fef8b377ce1d84e3edf447496a867c40e88d013ef0d1c5629de2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6655c8a5c56e4d2b3e8fbd88308d04ded2b41f7243bf90cb2dbbd6ad93f3153f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a87a0320d47b3753baeb865c7ef9e024c65a96555fdd71c0f2a23c8c0b80d351\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ba49c3a694bc245ce18b54b74156e6e7aaccd4fcf707264a278f86f52943392\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T05:28:13Z\\\",\\\"message\\\":\\\"le observer\\\\nW1129 05:28:13.316789 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1129 05:28:13.316951 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1129 05:28:13.320123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1464513827/tls.crt::/tmp/serving-cert-1464513827/tls.key\\\\\\\"\\\\nI1129 05:28:13.540439 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1129 05:28:13.545990 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1129 05:28:13.546014 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1129 05:28:13.546050 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1129 05:28:13.546055 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1129 05:28:13.551788 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1129 05:28:13.551806 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 05:28:13.551809 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 05:28:13.551814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1129 05:28:13.551817 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1129 05:28:13.551820 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1129 05:28:13.551823 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1129 05:28:13.551915 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1129 05:28:13.553356 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e070e0ea87ae0cfa8fcc6aadd13e7ae3c2ab3eb1ea3daf6178c9d35796ba3ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:27:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:25Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:25 crc kubenswrapper[4594]: I1129 05:28:25.691034 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c84b578286ef0da9c863bedb4ebec224e34007c7f59cead26dac786d7e4b5e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1975ee2ce0af02676f9925944a31b218cb17cdf4dfb85c6edbfd124ac58fd22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:25Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:25 crc kubenswrapper[4594]: I1129 05:28:25.710300 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:25 crc kubenswrapper[4594]: I1129 05:28:25.710335 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:25 crc kubenswrapper[4594]: I1129 05:28:25.710345 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:25 crc kubenswrapper[4594]: I1129 05:28:25.710365 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:25 crc kubenswrapper[4594]: I1129 05:28:25.710376 4594 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:25Z","lastTransitionTime":"2025-11-29T05:28:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:25 crc kubenswrapper[4594]: I1129 05:28:25.729227 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-87h4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"080d2d67-a474-4974-94f3-81c5007e0a57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b58d5af901448b3e10a9a902daa31aecebe095bdae0e620c991323b99c52b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdnbj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-87h4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:25Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:25 crc kubenswrapper[4594]: I1129 05:28:25.812884 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:25 crc kubenswrapper[4594]: I1129 05:28:25.812927 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:25 crc kubenswrapper[4594]: I1129 05:28:25.812937 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:25 crc kubenswrapper[4594]: I1129 05:28:25.812967 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:25 crc kubenswrapper[4594]: I1129 05:28:25.812981 4594 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:25Z","lastTransitionTime":"2025-11-29T05:28:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:25 crc kubenswrapper[4594]: I1129 05:28:25.915250 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:25 crc kubenswrapper[4594]: I1129 05:28:25.915424 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:25 crc kubenswrapper[4594]: I1129 05:28:25.915485 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:25 crc kubenswrapper[4594]: I1129 05:28:25.915551 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:25 crc kubenswrapper[4594]: I1129 05:28:25.915609 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:25Z","lastTransitionTime":"2025-11-29T05:28:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.017601 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.017637 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.017646 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.017662 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.017673 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:26Z","lastTransitionTime":"2025-11-29T05:28:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.083434 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.083464 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.083470 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 05:28:26 crc kubenswrapper[4594]: E1129 05:28:26.083595 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 05:28:26 crc kubenswrapper[4594]: E1129 05:28:26.083704 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 05:28:26 crc kubenswrapper[4594]: E1129 05:28:26.083796 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.096104 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7plzz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5052790-d231-4f97-802c-c7de3cd72561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c23b1472cd349ae6a56137165463429204ffbaf2c1b477d3ba9d07d2f2799f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-releas
e\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qglsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7plzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:26Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.105655 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://867c23355f828dc4cc15d632f90c52174e9b9be07f3ec24f05c904709620de6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:26Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.115378 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:26Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.119331 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.119368 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.119381 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.119401 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.119413 4594 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:26Z","lastTransitionTime":"2025-11-29T05:28:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.126157 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:26Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.136805 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:26Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.150377 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"08de5891-72ca-488c-80b3-6b54c8c1a66e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9f96824f977ec455622772e91ed8e6534dc93a71868e2b5e2aa9302ff9be135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cf2a15e6478f9f461a9927f570db4a81a8d752c8be46c7970b4956ae2031a38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a67e6b494618e268d6f5d75507c1cfe0e8e99c0e3cfff7548756a64f4f3a3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77543b04b619c48bf800a3d9dda1a969e4e32aa113a49287717a9e83f9cff96f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78fd1c6bdf08a557f0c2223c538a946709b9a225032c92ea75277140441a59c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db25dbee0954a026ecc2945dda40eb48c9caf78d31c44013055e9765024794d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5755b8e9afa4a0d89a3361dc97a76c5baede9605f0d44ad078cd9005135e1590\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5755b8e9afa4a0d89a3361dc97a76c5baede9605f0d44ad078cd9005135e1590\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T05:28:24Z\\\",\\\"message\\\":\\\"shutting down\\\\nI1129 05:28:24.803701 5946 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1129 05:28:24.803238 5946 handler.go:208] Removed *v1.Pod event handler 
3\\\\nI1129 05:28:24.803981 5946 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1129 05:28:24.803455 5946 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1129 05:28:24.804214 5946 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1129 05:28:24.803041 5946 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1129 05:28:24.804392 5946 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1129 05:28:24.804431 5946 factory.go:656] Stopping watch factory\\\\nI1129 05:28:24.804445 5946 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1129 05:28:24.804921 5946 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1129 05:28:24.805008 5946 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env
-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cc8b34a52032d4716652fc9edcc35dd797309bf6886242671c068e65399d004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b86e524b600a831fe6b8bb71a513172f974b457d39363da90cd6152139eae898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termin
ated\\\":{\\\"containerID\\\":\\\"cri-o://b86e524b600a831fe6b8bb71a513172f974b457d39363da90cd6152139eae898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lp4zm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:26Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.162095 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zwqcq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad2c26bf-c1c8-44b5-b4ed-d487072d358b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9db8528eef5425d9af2b8f11f244556f195b844e239a2137eef24c40fa158d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9e8c27b9b7a4a13a1f6ae5461acd535cc6784f4158213697307e4061f1e222\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b9e8c27b9b7a4a13a1f6ae5461acd535cc6784f4158213697307e4061f1e222\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa32ebafd1f717a0d3ba44387b7b36827e9e12589812bdc33b1dfb84bd41772\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fa32ebafd1f717a0d3ba44387b7b36827e9e12589812bdc33b1dfb84bd41772\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec1085c6e05fe233979922ace4946592ed4a7afe00a0f1c3a256c41a0964d26a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1085c6e05fe233979922ace4946592ed4a7afe00a0f1c3a256c41a0964d26a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://819e8
c7a8ea3f0f7a082da904dd9cc01b5d7e7e6520cafe67920a190063947ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://819e8c7a8ea3f0f7a082da904dd9cc01b5d7e7e6520cafe67920a190063947ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1602d1ea5e426042c520a4f98ed23530fc8f94f67f242a639c32a29817495d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1602d1ea5e426042c520a4f98ed23530fc8f94f67f242a639c32a29817495d73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:22Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81866e9b3e8415fc1e5af0eec4a7222deab6b282b1758dae13759a073192a554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81866e9b3e8415fc1e5af0eec4a7222deab6b282b1758dae13759a073192a554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zwqcq\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:26Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.170193 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"509844d4-3ee1-4059-84bb-6e90200f50c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5e396522be1cb7984f98fd416f9a6aeda1e6009e002e9c942a6cec0d4a4c22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4qlc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb25cb1d2d2a3aa84bc6fca3087cdf54b3b8e75bc258441b2895f37594df44e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4qlc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ggz4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:26Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:26 crc kubenswrapper[4594]: 
I1129 05:28:26.178287 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-26zjk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77af4f9f-196d-436d-9fb6-69168bcb5f8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70467de0fada47ce38faab4cc04c54942aa5f7c0cacf29b1d21b64a11e6162c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxrks\\\",\\\"readOnly
\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-26zjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:26Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.188888 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3be916e6-7f0f-45ae-8fe5-85d2c7e459f0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf23cd4a300a49fabdf8fcaacfda760563c8e53cf70de7a4c07be717fd76b660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7937
9b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a960cfe07ae731cc2d07957dba63be2897328d662dd2672bd61344c8baac0228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b416f8ad36a09f58e68b1e23784b344534ccfb013a3272c0754602431a3a3737\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"st
ate\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://261ab2af67dd877cead51f204629b59233865717d4805c3d4d80c1a5d51e2021\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:27:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:26Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.201894 4594 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba18db1b4391204c0fe395488d2726d914dc28c88b9cf449d83c9c181a7d04a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:26Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.211072 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c84b578286ef0da9c863bedb4ebec224e34007c7f59cead26dac786d7e4b5e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1975ee2ce0af02676f9925944a31b218cb17cdf4dfb85c6edbfd124ac58fd22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:26Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.222584 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.222705 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.222776 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.222847 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.222924 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:26Z","lastTransitionTime":"2025-11-29T05:28:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.248636 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-87h4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"080d2d67-a474-4974-94f3-81c5007e0a57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b58d5af901448b3e10a9a902daa31aecebe095bdae0e620c991323b99c52b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdnbj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-87h4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:26Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.277216 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lp4zm_08de5891-72ca-488c-80b3-6b54c8c1a66e/ovnkube-controller/1.log" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.277903 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lp4zm_08de5891-72ca-488c-80b3-6b54c8c1a66e/ovnkube-controller/0.log" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.281206 4594 generic.go:334] "Generic (PLEG): container finished" podID="08de5891-72ca-488c-80b3-6b54c8c1a66e" 
containerID="4be9e9d41c9c964659c66458605455733afcc76dfcbf74fe74f5dff0e99ff171" exitCode=1 Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.281274 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" event={"ID":"08de5891-72ca-488c-80b3-6b54c8c1a66e","Type":"ContainerDied","Data":"4be9e9d41c9c964659c66458605455733afcc76dfcbf74fe74f5dff0e99ff171"} Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.281398 4594 scope.go:117] "RemoveContainer" containerID="5755b8e9afa4a0d89a3361dc97a76c5baede9605f0d44ad078cd9005135e1590" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.281985 4594 scope.go:117] "RemoveContainer" containerID="4be9e9d41c9c964659c66458605455733afcc76dfcbf74fe74f5dff0e99ff171" Nov 29 05:28:26 crc kubenswrapper[4594]: E1129 05:28:26.282231 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-lp4zm_openshift-ovn-kubernetes(08de5891-72ca-488c-80b3-6b54c8c1a66e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" podUID="08de5891-72ca-488c-80b3-6b54c8c1a66e" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.291056 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1629901e-7133-4eb0-b634-5e458da9f205\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://583ca6d59501822022d5d15a450d4403592d248bce7be4b22b1c7baef4e93000\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b8d3be68869fef8b377ce1d84e3edf447496a867c40e88d013ef0d1c5629de2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6655c8a5c56e4d2b3e8fbd88308d04ded2b41f7243bf90cb2dbbd6ad93f3153f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a87a0320d47b3753baeb865c7ef9e024c65a96555fdd71c0f2a23c8c0b80d351\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ba49c3a694bc245ce18b54b74156e6e7aaccd4fcf707264a278f86f52943392\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T05:28:13Z\\\",\\\"message\\\":\\\"le observer\\\\nW1129 05:28:13.316789 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1129 05:28:13.316951 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1129 05:28:13.320123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1464513827/tls.crt::/tmp/serving-cert-1464513827/tls.key\\\\\\\"\\\\nI1129 05:28:13.540439 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1129 05:28:13.545990 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1129 05:28:13.546014 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1129 05:28:13.546050 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1129 05:28:13.546055 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1129 05:28:13.551788 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1129 05:28:13.551806 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 05:28:13.551809 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 05:28:13.551814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1129 05:28:13.551817 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1129 05:28:13.551820 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1129 05:28:13.551823 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1129 05:28:13.551915 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1129 05:28:13.553356 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e070e0ea87ae0cfa8fcc6aadd13e7ae3c2ab3eb1ea3daf6178c9d35796ba3ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-29T05:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:27:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:26Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.325703 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.325747 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.325760 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.325782 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.325797 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:26Z","lastTransitionTime":"2025-11-29T05:28:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.330325 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://867c23355f828dc4cc15d632f90c52174e9b9be07f3ec24f05c904709620de6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:26Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.369388 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:26Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.411089 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7plzz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5052790-d231-4f97-802c-c7de3cd72561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c23b1472cd349ae6a56137165463429204ffbaf2c1b477d3ba9d07d2f2799f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qglsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7plzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:26Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.428104 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:26 crc 
kubenswrapper[4594]: I1129 05:28:26.428144 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.428156 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.428177 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.428193 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:26Z","lastTransitionTime":"2025-11-29T05:28:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.452401 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zwqcq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad2c26bf-c1c8-44b5-b4ed-d487072d358b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9db8528eef5425d9af2b8f11f244556f195b844e239a2137eef24c40fa158d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9e8c27b9b7a4a13a1f6ae5461acd535cc6784f4158213697307e4061f1e222\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b9e8c27b9b7a4a13a1f6ae5461acd535cc6784f4158213697307e4061f1e222\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa32ebafd1f717a0d3ba44387b7b36827e9e12589812bdc33b1dfb84bd41772\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fa32ebafd1f717a0d3ba44387b7b36827e9e12589812bdc33b1dfb84bd41772\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec1085c6e05fe233979922ace4946592ed4a7afe00a0f1c3a256c41a0964d26a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1085c6e05fe233979922ace4946592ed4a7afe00a0f1c3a256c41a0964d26a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://819e8
c7a8ea3f0f7a082da904dd9cc01b5d7e7e6520cafe67920a190063947ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://819e8c7a8ea3f0f7a082da904dd9cc01b5d7e7e6520cafe67920a190063947ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1602d1ea5e426042c520a4f98ed23530fc8f94f67f242a639c32a29817495d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1602d1ea5e426042c520a4f98ed23530fc8f94f67f242a639c32a29817495d73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:22Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81866e9b3e8415fc1e5af0eec4a7222deab6b282b1758dae13759a073192a554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81866e9b3e8415fc1e5af0eec4a7222deab6b282b1758dae13759a073192a554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zwqcq\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:26Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.489703 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:26Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.530049 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.530078 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.530090 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.530103 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.530112 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:26Z","lastTransitionTime":"2025-11-29T05:28:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.532012 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:26Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.573178 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08de5891-72ca-488c-80b3-6b54c8c1a66e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9f96824f977ec455622772e91ed8e6534dc93a71868e2b5e2aa9302ff9be135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cf2a15e6478f9f461a9927f570db4a81a8d752c8be46c7970b4956ae2031a38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a67e6b494618e268d6f5d75507c1cfe0e8e99c0e3cfff7548756a64f4f3a3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77543b04b619c48bf800a3d9dda1a969e4e32aa113a49287717a9e83f9cff96f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78fd1c6bdf08a557f0c2223c538a946709b9a225032c92ea75277140441a59c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db25dbee0954a026ecc2945dda40eb48c9caf78d31c44013055e9765024794d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4be9e9d41c9c964659c66458605455733afcc76dfcbf74fe74f5dff0e99ff171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5755b8e9afa4a0d89a3361dc97a76c5baede9605f0d44ad078cd9005135e1590\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T05:28:24Z\\\",\\\"message\\\":\\\"shutting down\\\\nI1129 05:28:24.803701 5946 reflector.go:311] Stopping reflector 
*v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1129 05:28:24.803238 5946 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1129 05:28:24.803981 5946 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1129 05:28:24.803455 5946 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1129 05:28:24.804214 5946 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1129 05:28:24.803041 5946 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1129 05:28:24.804392 5946 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1129 05:28:24.804431 5946 factory.go:656] Stopping watch factory\\\\nI1129 05:28:24.804445 5946 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1129 05:28:24.804921 5946 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1129 05:28:24.805008 5946 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4be9e9d41c9c964659c66458605455733afcc76dfcbf74fe74f5dff0e99ff171\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T05:28:25Z\\\",\\\"message\\\":\\\"10 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1129 05:28:25.959460 6110 services_controller.go:356] Processing sync for service openshift-service-ca-operator/metrics for network=default\\\\nI1129 05:28:25.959462 6110 
ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI1129 05:28:25.959481 6110 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI1129 05:28:25.959494 6110 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nF1129 05:28:25.959445 6110 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: 
Po\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cc8b34a52032d4716652fc9edcc35dd797309bf6886242671c068e65399d004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b86e524b600a831fe6b8bb71a513172f974b457d39363da90cd6152139eae898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b86e524b600a831fe6b8bb71a513172f974b457d39363da90cd6152139eae898\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lp4zm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:26Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.609887 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3be916e6-7f0f-45ae-8fe5-85d2c7e459f0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf23cd4a300a49fabdf8fcaacfda760563c8e53cf70de7a4c07be717fd76b660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a960cfe07ae731cc2d07957dba63be2897328d662dd2672bd61344c8baac0228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b416f8ad36a09f58e68b1e23784b344534ccfb013a3272c0754602431a3a3737\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://261ab2af67dd877cead51f204629b59233865717d4805c3d4d80c1a5d51e2021\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:27:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:26Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.632407 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.632449 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.632464 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.632487 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.632502 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:26Z","lastTransitionTime":"2025-11-29T05:28:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.651240 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba18db1b4391204c0fe395488d2726d914dc28c88b9cf449d83c9c181a7d04a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:26Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.673441 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.673483 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.673498 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.673519 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.673533 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:26Z","lastTransitionTime":"2025-11-29T05:28:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:26 crc kubenswrapper[4594]: E1129 05:28:26.683299 4594 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:28:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:28:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:28:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:28:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"278d67da-c900-4140-8f95-1590a0940f46\\\",\\\"systemUUID\\\":\\\"64455698-ce5a-4229-a093-dc9c12114354\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:26Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.686637 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.686674 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.686685 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.686704 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.686718 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:26Z","lastTransitionTime":"2025-11-29T05:28:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.690072 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"509844d4-3ee1-4059-84bb-6e90200f50c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5e396522be1cb7984f98fd416f9a6aeda1e6009e002e9c942a6cec0d4a4c22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4qlc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb25cb1d2d2a3aa84bc6fca3087cdf54b3b8e75bc258441b2895f37594df44e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4qlc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ggz4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:26Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:26 crc kubenswrapper[4594]: E1129 05:28:26.696352 4594 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:28:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:28:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:28:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:28:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"278d67da-c900-4140-8f95-1590a0940f46\\\",\\\"systemUUID\\\":\\\"64455698-ce5a-4229-a093-dc9c12114354\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:26Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.699221 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.699273 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.699285 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.699296 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.699306 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:26Z","lastTransitionTime":"2025-11-29T05:28:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:26 crc kubenswrapper[4594]: E1129 05:28:26.708654 4594 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:28:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:28:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:28:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:28:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"278d67da-c900-4140-8f95-1590a0940f46\\\",\\\"systemUUID\\\":\\\"64455698-ce5a-4229-a093-dc9c12114354\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:26Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.711788 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.711832 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.711848 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.711862 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.711871 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:26Z","lastTransitionTime":"2025-11-29T05:28:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:26 crc kubenswrapper[4594]: E1129 05:28:26.721360 4594 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:28:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:28:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:28:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:28:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"278d67da-c900-4140-8f95-1590a0940f46\\\",\\\"systemUUID\\\":\\\"64455698-ce5a-4229-a093-dc9c12114354\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:26Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.724077 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.724110 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.724119 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.724132 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.724140 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:26Z","lastTransitionTime":"2025-11-29T05:28:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.728011 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-26zjk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77af4f9f-196d-436d-9fb6-69168bcb5f8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70467de0fada47ce38faab4cc04c54942aa5f7c0cacf29b1d21b64a11e6162c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxrks\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-26zjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:26Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:26 crc kubenswrapper[4594]: E1129 05:28:26.735978 4594 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:28:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:28:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:26Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:28:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:28:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"278d67da-c900-4140-8f95-1590a0940f46\\\",\\\"systemUUID\\\":\\\"64455698-ce5a-4229-a093-dc9c12114354\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:26Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:26 crc kubenswrapper[4594]: E1129 05:28:26.736110 4594 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.737609 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.737643 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.737652 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.737664 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.737674 4594 setters.go:603] "Node became not 
ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:26Z","lastTransitionTime":"2025-11-29T05:28:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.770074 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1629901e-7133-4eb0-b634-5e458da9f205\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://583ca6d59501822022d5d15a450d4403592d248bce7be4b22b1c7baef4e93000\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b8d3be68869fef8b377ce1d84e3edf447496a867c40e88d013ef0d1c5629de2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://6655c8a5c56e4d2b3e8fbd88308d04ded2b41f7243bf90cb2dbbd6ad93f3153f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a87a0320d47b3753baeb865c7ef9e024c65a96555fdd71c0f2a23c8c0b80d351\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ba49c3a694bc245ce18b54b74156e6e7aaccd4fcf707264a278f86f52943392\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T05:28:13Z\\\",\\\"message\\\":\\\"le observer\\\\nW1129 05:28:13.316789 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1129 05:28:13.316951 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1129 05:28:13.320123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1464513827/tls.crt::/tmp/serving-cert-1464513827/tls.key\\\\\\\"\\\\nI1129 05:28:13.540439 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1129 05:28:13.545990 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1129 05:28:13.546014 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1129 05:28:13.546050 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1129 05:28:13.546055 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1129 05:28:13.551788 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1129 05:28:13.551806 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 05:28:13.551809 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 05:28:13.551814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1129 05:28:13.551817 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1129 05:28:13.551820 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1129 05:28:13.551823 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1129 05:28:13.551915 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1129 05:28:13.553356 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e070e0ea87ae0cfa8fcc6aadd13e7ae3c2ab3eb1ea3daf6178c9d35796ba3ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-29T05:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:27:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:26Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.809507 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c84b578286ef0da9c863bedb4ebec224e34007c7f59cead26dac786d7e4b5e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1975ee2ce0af02676f9925944a31b218cb17cdf4dfb85c6edbfd124ac58fd22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:26Z is after 2025-08-24T17:21:41Z" Nov 29 
05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.839817 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.839846 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.839854 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.839864 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.839871 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:26Z","lastTransitionTime":"2025-11-29T05:28:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.847225 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-87h4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"080d2d67-a474-4974-94f3-81c5007e0a57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b58d5af901448b3e10a9a902daa31aecebe095bdae0e620c991323b99c52b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdnbj\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-87h4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:26Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.942692 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.942727 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.942738 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.942750 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:26 crc kubenswrapper[4594]: I1129 05:28:26.942761 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:26Z","lastTransitionTime":"2025-11-29T05:28:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:27 crc kubenswrapper[4594]: I1129 05:28:27.045064 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:27 crc kubenswrapper[4594]: I1129 05:28:27.045116 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:27 crc kubenswrapper[4594]: I1129 05:28:27.045128 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:27 crc kubenswrapper[4594]: I1129 05:28:27.045140 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:27 crc kubenswrapper[4594]: I1129 05:28:27.045149 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:27Z","lastTransitionTime":"2025-11-29T05:28:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:27 crc kubenswrapper[4594]: I1129 05:28:27.147510 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:27 crc kubenswrapper[4594]: I1129 05:28:27.147560 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:27 crc kubenswrapper[4594]: I1129 05:28:27.147571 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:27 crc kubenswrapper[4594]: I1129 05:28:27.147588 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:27 crc kubenswrapper[4594]: I1129 05:28:27.147601 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:27Z","lastTransitionTime":"2025-11-29T05:28:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:27 crc kubenswrapper[4594]: I1129 05:28:27.249129 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:27 crc kubenswrapper[4594]: I1129 05:28:27.249177 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:27 crc kubenswrapper[4594]: I1129 05:28:27.249185 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:27 crc kubenswrapper[4594]: I1129 05:28:27.249202 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:27 crc kubenswrapper[4594]: I1129 05:28:27.249214 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:27Z","lastTransitionTime":"2025-11-29T05:28:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:27 crc kubenswrapper[4594]: I1129 05:28:27.285945 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lp4zm_08de5891-72ca-488c-80b3-6b54c8c1a66e/ovnkube-controller/1.log" Nov 29 05:28:27 crc kubenswrapper[4594]: I1129 05:28:27.290294 4594 scope.go:117] "RemoveContainer" containerID="4be9e9d41c9c964659c66458605455733afcc76dfcbf74fe74f5dff0e99ff171" Nov 29 05:28:27 crc kubenswrapper[4594]: E1129 05:28:27.290490 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-lp4zm_openshift-ovn-kubernetes(08de5891-72ca-488c-80b3-6b54c8c1a66e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" podUID="08de5891-72ca-488c-80b3-6b54c8c1a66e" Nov 29 05:28:27 crc kubenswrapper[4594]: I1129 05:28:27.300658 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3be916e6-7f0f-45ae-8fe5-85d2c7e459f0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf23cd4a300a49fabdf8fcaacfda760563c8e53cf70de7a4c07be717fd76b660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a960cfe07ae731cc2d07957dba63be2897328d662dd2672bd61344c8baac0228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b416f8ad36a09f58e68b1e23784b344534ccfb013a3272c0754602431a3a3737\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://261ab2af67dd877cead51f204629b59233865717d4805c3d4d80c1a5d51e2021\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:27:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:27Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:27 crc kubenswrapper[4594]: I1129 05:28:27.311544 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba18db1b4391204c0fe395488d2726d914dc28c88b9cf449d83c9c181a7d04a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:27Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:27 crc kubenswrapper[4594]: I1129 05:28:27.319763 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"509844d4-3ee1-4059-84bb-6e90200f50c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5e396522be1cb7984f98fd416f9a6aeda1e6009e002e9c942a6cec0d4a4c22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4qlc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb25cb1d2d2a3aa84bc6fca3087cdf54b3b8e75bc258441b2895f37594df44e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4qlc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ggz4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:27Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:27 crc kubenswrapper[4594]: I1129 05:28:27.327853 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-26zjk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"77af4f9f-196d-436d-9fb6-69168bcb5f8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70467de0fada47ce38faab4cc04c54942aa5f7c0cacf29b1d21b64a11e6162c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxrks\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-26zjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:27Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:27 crc kubenswrapper[4594]: I1129 05:28:27.337470 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1629901e-7133-4eb0-b634-5e458da9f205\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://583ca6d59501822022d5d15a450d4403592d248bce7be4b22b1c7baef4e93000\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b8d3be68869fef8b377ce1d84e3edf447496a867c40e88d013ef0d1c5629de2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://6655c8a5c56e4d2b3e8fbd88308d04ded2b41f7243bf90cb2dbbd6ad93f3153f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a87a0320d47b3753baeb865c7ef9e024c65a96555fdd71c0f2a23c8c0b80d351\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ba49c3a694bc245ce18b54b74156e6e7aaccd4fcf707264a278f86f52943392\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T05:28:13Z\\\",\\\"message\\\":\\\"le observer\\\\nW1129 05:28:13.316789 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1129 05:28:13.316951 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1129 05:28:13.320123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1464513827/tls.crt::/tmp/serving-cert-1464513827/tls.key\\\\\\\"\\\\nI1129 05:28:13.540439 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1129 05:28:13.545990 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1129 05:28:13.546014 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1129 05:28:13.546050 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1129 05:28:13.546055 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1129 05:28:13.551788 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1129 05:28:13.551806 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 05:28:13.551809 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 05:28:13.551814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1129 05:28:13.551817 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1129 05:28:13.551820 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1129 05:28:13.551823 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1129 05:28:13.551915 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1129 05:28:13.553356 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e070e0ea87ae0cfa8fcc6aadd13e7ae3c2ab3eb1ea3daf6178c9d35796ba3ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-29T05:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:27:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:27Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:27 crc kubenswrapper[4594]: I1129 05:28:27.346087 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c84b578286ef0da9c863bedb4ebec224e34007c7f59cead26dac786d7e4b5e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1975ee2ce0af02676f9925944a31b218cb17cdf4dfb85c6edbfd124ac58fd22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:27Z is after 2025-08-24T17:21:41Z" Nov 29 
05:28:27 crc kubenswrapper[4594]: I1129 05:28:27.351240 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:27 crc kubenswrapper[4594]: I1129 05:28:27.351330 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:27 crc kubenswrapper[4594]: I1129 05:28:27.351351 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:27 crc kubenswrapper[4594]: I1129 05:28:27.351380 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:27 crc kubenswrapper[4594]: I1129 05:28:27.351400 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:27Z","lastTransitionTime":"2025-11-29T05:28:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:27 crc kubenswrapper[4594]: I1129 05:28:27.354000 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-87h4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"080d2d67-a474-4974-94f3-81c5007e0a57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b58d5af901448b3e10a9a902daa31aecebe095bdae0e620c991323b99c52b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdnbj\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-87h4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:27Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:27 crc kubenswrapper[4594]: I1129 05:28:27.364700 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://867c23355f828dc4cc15d632f90c52174e9b9be07f3ec24f05c904709620de6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\
\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:27Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:27 crc kubenswrapper[4594]: I1129 05:28:27.374158 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:27Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:27 crc kubenswrapper[4594]: I1129 05:28:27.384391 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7plzz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5052790-d231-4f97-802c-c7de3cd72561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c23b1472cd349ae6a56137165463429204ffbaf2c1b477d3ba9d07d2f2799f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qglsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7plzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:27Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:27 crc kubenswrapper[4594]: I1129 05:28:27.393795 4594 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:27Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:27 crc kubenswrapper[4594]: I1129 05:28:27.403102 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:27Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:27 crc kubenswrapper[4594]: I1129 05:28:27.416530 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"08de5891-72ca-488c-80b3-6b54c8c1a66e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9f96824f977ec455622772e91ed8e6534dc93a71868e2b5e2aa9302ff9be135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cf2a15e6478f9f461a9927f570db4a81a8d752c8be46c7970b4956ae2031a38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a67e6b494618e268d6f5d75507c1cfe0e8e99c0e3cfff7548756a64f4f3a3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77543b04b619c48bf800a3d9dda1a969e4e32aa113a49287717a9e83f9cff96f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78fd1c6bdf08a557f0c2223c538a946709b9a225032c92ea75277140441a59c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db25dbee0954a026ecc2945dda40eb48c9caf78d31c44013055e9765024794d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4be9e9d41c9c964659c66458605455733afcc76dfcbf74fe74f5dff0e99ff171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4be9e9d41c9c964659c66458605455733afcc76dfcbf74fe74f5dff0e99ff171\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T05:28:25Z\\\",\\\"message\\\":\\\"10 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1129 05:28:25.959460 6110 services_controller.go:356] Processing sync for service openshift-service-ca-operator/metrics for network=default\\\\nI1129 05:28:25.959462 6110 ovn.go:134] Ensuring zone local for Pod 
openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI1129 05:28:25.959481 6110 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI1129 05:28:25.959494 6110 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nF1129 05:28:25.959445 6110 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Po\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-lp4zm_openshift-ovn-kubernetes(08de5891-72ca-488c-80b3-6b54c8c1a66e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cc8b34a52032d4716652fc9edcc35dd797309bf6886242671c068e65399d004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b86e524b600a831fe6b8bb71a513172f974b457d39363da90cd6152139eae898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b86e524b600a831fe6
b8bb71a513172f974b457d39363da90cd6152139eae898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lp4zm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:27Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:27 crc kubenswrapper[4594]: I1129 05:28:27.426516 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zwqcq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad2c26bf-c1c8-44b5-b4ed-d487072d358b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9db8528eef5425d9af2b8f11f244556f195b844e239a2137eef24c40fa158d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9e8c27b9b7a4a13a1f6ae5461acd535cc6784f4158213697307e4061f1e222\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b9e8c27b9b7a4a13a1f6ae5461acd535cc6784f4158213697307e4061f1e222\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa32ebafd1f717a0d3ba44387b7b36827e9e12589812bdc33b1dfb84bd41772\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fa32ebafd1f717a0d3ba44387b7b36827e9e12589812bdc33b1dfb84bd41772\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec1085c6e05fe233979922ace4946592ed4a7afe00a0f1c3a256c41a0964d26a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1085c6e05fe233979922ace4946592ed4a7afe00a0f1c3a256c41a0964d26a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://819e8
c7a8ea3f0f7a082da904dd9cc01b5d7e7e6520cafe67920a190063947ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://819e8c7a8ea3f0f7a082da904dd9cc01b5d7e7e6520cafe67920a190063947ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1602d1ea5e426042c520a4f98ed23530fc8f94f67f242a639c32a29817495d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1602d1ea5e426042c520a4f98ed23530fc8f94f67f242a639c32a29817495d73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:22Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81866e9b3e8415fc1e5af0eec4a7222deab6b282b1758dae13759a073192a554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81866e9b3e8415fc1e5af0eec4a7222deab6b282b1758dae13759a073192a554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zwqcq\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:27Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:27 crc kubenswrapper[4594]: I1129 05:28:27.453386 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:27 crc kubenswrapper[4594]: I1129 05:28:27.453425 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:27 crc kubenswrapper[4594]: I1129 05:28:27.453439 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:27 crc kubenswrapper[4594]: I1129 05:28:27.453464 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:27 crc kubenswrapper[4594]: I1129 05:28:27.453478 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:27Z","lastTransitionTime":"2025-11-29T05:28:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:27 crc kubenswrapper[4594]: I1129 05:28:27.555600 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:27 crc kubenswrapper[4594]: I1129 05:28:27.555635 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:27 crc kubenswrapper[4594]: I1129 05:28:27.555648 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:27 crc kubenswrapper[4594]: I1129 05:28:27.555668 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:27 crc kubenswrapper[4594]: I1129 05:28:27.555683 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:27Z","lastTransitionTime":"2025-11-29T05:28:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:27 crc kubenswrapper[4594]: I1129 05:28:27.657906 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:27 crc kubenswrapper[4594]: I1129 05:28:27.657943 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:27 crc kubenswrapper[4594]: I1129 05:28:27.657967 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:27 crc kubenswrapper[4594]: I1129 05:28:27.657984 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:27 crc kubenswrapper[4594]: I1129 05:28:27.657999 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:27Z","lastTransitionTime":"2025-11-29T05:28:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:27 crc kubenswrapper[4594]: I1129 05:28:27.759800 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:27 crc kubenswrapper[4594]: I1129 05:28:27.759837 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:27 crc kubenswrapper[4594]: I1129 05:28:27.759849 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:27 crc kubenswrapper[4594]: I1129 05:28:27.759862 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:27 crc kubenswrapper[4594]: I1129 05:28:27.759871 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:27Z","lastTransitionTime":"2025-11-29T05:28:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:27 crc kubenswrapper[4594]: I1129 05:28:27.861624 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:27 crc kubenswrapper[4594]: I1129 05:28:27.861650 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:27 crc kubenswrapper[4594]: I1129 05:28:27.861661 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:27 crc kubenswrapper[4594]: I1129 05:28:27.861675 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:27 crc kubenswrapper[4594]: I1129 05:28:27.861690 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:27Z","lastTransitionTime":"2025-11-29T05:28:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:27 crc kubenswrapper[4594]: I1129 05:28:27.963771 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:27 crc kubenswrapper[4594]: I1129 05:28:27.963806 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:27 crc kubenswrapper[4594]: I1129 05:28:27.963819 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:27 crc kubenswrapper[4594]: I1129 05:28:27.963829 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:27 crc kubenswrapper[4594]: I1129 05:28:27.963838 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:27Z","lastTransitionTime":"2025-11-29T05:28:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:28 crc kubenswrapper[4594]: I1129 05:28:28.065563 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:28 crc kubenswrapper[4594]: I1129 05:28:28.065592 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:28 crc kubenswrapper[4594]: I1129 05:28:28.065600 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:28 crc kubenswrapper[4594]: I1129 05:28:28.065611 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:28 crc kubenswrapper[4594]: I1129 05:28:28.065643 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:28Z","lastTransitionTime":"2025-11-29T05:28:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:28 crc kubenswrapper[4594]: I1129 05:28:28.082716 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 05:28:28 crc kubenswrapper[4594]: I1129 05:28:28.082764 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 05:28:28 crc kubenswrapper[4594]: I1129 05:28:28.082786 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 05:28:28 crc kubenswrapper[4594]: E1129 05:28:28.082848 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 05:28:28 crc kubenswrapper[4594]: E1129 05:28:28.082999 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 05:28:28 crc kubenswrapper[4594]: E1129 05:28:28.083305 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 05:28:28 crc kubenswrapper[4594]: I1129 05:28:28.167389 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:28 crc kubenswrapper[4594]: I1129 05:28:28.167424 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:28 crc kubenswrapper[4594]: I1129 05:28:28.167434 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:28 crc kubenswrapper[4594]: I1129 05:28:28.167450 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:28 crc kubenswrapper[4594]: I1129 05:28:28.167460 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:28Z","lastTransitionTime":"2025-11-29T05:28:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:28 crc kubenswrapper[4594]: I1129 05:28:28.270044 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:28 crc kubenswrapper[4594]: I1129 05:28:28.270077 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:28 crc kubenswrapper[4594]: I1129 05:28:28.270088 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:28 crc kubenswrapper[4594]: I1129 05:28:28.270103 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:28 crc kubenswrapper[4594]: I1129 05:28:28.270114 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:28Z","lastTransitionTime":"2025-11-29T05:28:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:28 crc kubenswrapper[4594]: I1129 05:28:28.371619 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:28 crc kubenswrapper[4594]: I1129 05:28:28.371651 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:28 crc kubenswrapper[4594]: I1129 05:28:28.371660 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:28 crc kubenswrapper[4594]: I1129 05:28:28.371672 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:28 crc kubenswrapper[4594]: I1129 05:28:28.371680 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:28Z","lastTransitionTime":"2025-11-29T05:28:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:28 crc kubenswrapper[4594]: I1129 05:28:28.474103 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:28 crc kubenswrapper[4594]: I1129 05:28:28.474138 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:28 crc kubenswrapper[4594]: I1129 05:28:28.474151 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:28 crc kubenswrapper[4594]: I1129 05:28:28.474164 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:28 crc kubenswrapper[4594]: I1129 05:28:28.474173 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:28Z","lastTransitionTime":"2025-11-29T05:28:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:28 crc kubenswrapper[4594]: I1129 05:28:28.576211 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:28 crc kubenswrapper[4594]: I1129 05:28:28.576334 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:28 crc kubenswrapper[4594]: I1129 05:28:28.576348 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:28 crc kubenswrapper[4594]: I1129 05:28:28.576367 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:28 crc kubenswrapper[4594]: I1129 05:28:28.576377 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:28Z","lastTransitionTime":"2025-11-29T05:28:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:28 crc kubenswrapper[4594]: I1129 05:28:28.679307 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:28 crc kubenswrapper[4594]: I1129 05:28:28.679335 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:28 crc kubenswrapper[4594]: I1129 05:28:28.679347 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:28 crc kubenswrapper[4594]: I1129 05:28:28.679358 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:28 crc kubenswrapper[4594]: I1129 05:28:28.679366 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:28Z","lastTransitionTime":"2025-11-29T05:28:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:28 crc kubenswrapper[4594]: I1129 05:28:28.781442 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:28 crc kubenswrapper[4594]: I1129 05:28:28.781485 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:28 crc kubenswrapper[4594]: I1129 05:28:28.781494 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:28 crc kubenswrapper[4594]: I1129 05:28:28.781514 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:28 crc kubenswrapper[4594]: I1129 05:28:28.781527 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:28Z","lastTransitionTime":"2025-11-29T05:28:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:28 crc kubenswrapper[4594]: I1129 05:28:28.883435 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:28 crc kubenswrapper[4594]: I1129 05:28:28.883462 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:28 crc kubenswrapper[4594]: I1129 05:28:28.883472 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:28 crc kubenswrapper[4594]: I1129 05:28:28.883486 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:28 crc kubenswrapper[4594]: I1129 05:28:28.883495 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:28Z","lastTransitionTime":"2025-11-29T05:28:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:28 crc kubenswrapper[4594]: I1129 05:28:28.985351 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:28 crc kubenswrapper[4594]: I1129 05:28:28.985388 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:28 crc kubenswrapper[4594]: I1129 05:28:28.985402 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:28 crc kubenswrapper[4594]: I1129 05:28:28.985414 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:28 crc kubenswrapper[4594]: I1129 05:28:28.985422 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:28Z","lastTransitionTime":"2025-11-29T05:28:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:29 crc kubenswrapper[4594]: I1129 05:28:29.087950 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:29 crc kubenswrapper[4594]: I1129 05:28:29.088004 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:29 crc kubenswrapper[4594]: I1129 05:28:29.088015 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:29 crc kubenswrapper[4594]: I1129 05:28:29.088031 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:29 crc kubenswrapper[4594]: I1129 05:28:29.088063 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:29Z","lastTransitionTime":"2025-11-29T05:28:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:29 crc kubenswrapper[4594]: I1129 05:28:29.190431 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:29 crc kubenswrapper[4594]: I1129 05:28:29.190459 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:29 crc kubenswrapper[4594]: I1129 05:28:29.190469 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:29 crc kubenswrapper[4594]: I1129 05:28:29.190481 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:29 crc kubenswrapper[4594]: I1129 05:28:29.190491 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:29Z","lastTransitionTime":"2025-11-29T05:28:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:29 crc kubenswrapper[4594]: I1129 05:28:29.292622 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:29 crc kubenswrapper[4594]: I1129 05:28:29.292658 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:29 crc kubenswrapper[4594]: I1129 05:28:29.292675 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:29 crc kubenswrapper[4594]: I1129 05:28:29.292690 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:29 crc kubenswrapper[4594]: I1129 05:28:29.292701 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:29Z","lastTransitionTime":"2025-11-29T05:28:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:29 crc kubenswrapper[4594]: I1129 05:28:29.395169 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:29 crc kubenswrapper[4594]: I1129 05:28:29.395222 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:29 crc kubenswrapper[4594]: I1129 05:28:29.395262 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:29 crc kubenswrapper[4594]: I1129 05:28:29.395285 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:29 crc kubenswrapper[4594]: I1129 05:28:29.395297 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:29Z","lastTransitionTime":"2025-11-29T05:28:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:29 crc kubenswrapper[4594]: I1129 05:28:29.497798 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:29 crc kubenswrapper[4594]: I1129 05:28:29.497841 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:29 crc kubenswrapper[4594]: I1129 05:28:29.497856 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:29 crc kubenswrapper[4594]: I1129 05:28:29.497874 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:29 crc kubenswrapper[4594]: I1129 05:28:29.497890 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:29Z","lastTransitionTime":"2025-11-29T05:28:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:29 crc kubenswrapper[4594]: I1129 05:28:29.600273 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:29 crc kubenswrapper[4594]: I1129 05:28:29.600312 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:29 crc kubenswrapper[4594]: I1129 05:28:29.600322 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:29 crc kubenswrapper[4594]: I1129 05:28:29.600340 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:29 crc kubenswrapper[4594]: I1129 05:28:29.600350 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:29Z","lastTransitionTime":"2025-11-29T05:28:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:29 crc kubenswrapper[4594]: I1129 05:28:29.702402 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:29 crc kubenswrapper[4594]: I1129 05:28:29.702435 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:29 crc kubenswrapper[4594]: I1129 05:28:29.702445 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:29 crc kubenswrapper[4594]: I1129 05:28:29.702457 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:29 crc kubenswrapper[4594]: I1129 05:28:29.702465 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:29Z","lastTransitionTime":"2025-11-29T05:28:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:29 crc kubenswrapper[4594]: I1129 05:28:29.804511 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:29 crc kubenswrapper[4594]: I1129 05:28:29.804545 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:29 crc kubenswrapper[4594]: I1129 05:28:29.804555 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:29 crc kubenswrapper[4594]: I1129 05:28:29.804566 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:29 crc kubenswrapper[4594]: I1129 05:28:29.804574 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:29Z","lastTransitionTime":"2025-11-29T05:28:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:29 crc kubenswrapper[4594]: I1129 05:28:29.871870 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 05:28:29 crc kubenswrapper[4594]: I1129 05:28:29.871998 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 05:28:29 crc kubenswrapper[4594]: E1129 05:28:29.872022 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 05:28:45.871997113 +0000 UTC m=+50.112506333 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 05:28:29 crc kubenswrapper[4594]: I1129 05:28:29.872099 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 05:28:29 crc kubenswrapper[4594]: E1129 05:28:29.872144 4594 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 29 05:28:29 crc kubenswrapper[4594]: E1129 05:28:29.872199 4594 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 29 05:28:29 crc kubenswrapper[4594]: E1129 05:28:29.872220 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-29 05:28:45.872200854 +0000 UTC m=+50.112710084 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 29 05:28:29 crc kubenswrapper[4594]: E1129 05:28:29.872245 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-29 05:28:45.872237903 +0000 UTC m=+50.112747133 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 29 05:28:29 crc kubenswrapper[4594]: I1129 05:28:29.906526 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:29 crc kubenswrapper[4594]: I1129 05:28:29.906578 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:29 crc kubenswrapper[4594]: I1129 05:28:29.906631 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:29 crc kubenswrapper[4594]: I1129 05:28:29.906653 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:29 crc kubenswrapper[4594]: I1129 05:28:29.906662 4594 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:29Z","lastTransitionTime":"2025-11-29T05:28:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:29 crc kubenswrapper[4594]: I1129 05:28:29.973104 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 05:28:29 crc kubenswrapper[4594]: I1129 05:28:29.973140 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 05:28:29 crc kubenswrapper[4594]: E1129 05:28:29.973244 4594 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 29 05:28:29 crc kubenswrapper[4594]: E1129 05:28:29.973274 4594 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 29 05:28:29 crc kubenswrapper[4594]: E1129 05:28:29.973285 4594 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 29 05:28:29 crc kubenswrapper[4594]: E1129 05:28:29.973322 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-29 05:28:45.973314094 +0000 UTC m=+50.213823314 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 29 05:28:29 crc kubenswrapper[4594]: E1129 05:28:29.973410 4594 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 29 05:28:29 crc kubenswrapper[4594]: E1129 05:28:29.973455 4594 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 29 05:28:29 crc kubenswrapper[4594]: E1129 05:28:29.973471 4594 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 29 05:28:29 crc kubenswrapper[4594]: E1129 05:28:29.973530 4594 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-29 05:28:45.973508649 +0000 UTC m=+50.214017879 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.008362 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.008394 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.008404 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.008421 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.008434 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:30Z","lastTransitionTime":"2025-11-29T05:28:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.075368 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.082633 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.082728 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 05:28:30 crc kubenswrapper[4594]: E1129 05:28:30.082836 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.082979 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 05:28:30 crc kubenswrapper[4594]: E1129 05:28:30.083086 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 05:28:30 crc kubenswrapper[4594]: E1129 05:28:30.083301 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.086338 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:30Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.095996 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:30Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.110152 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.110231 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.110244 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:30 crc 
kubenswrapper[4594]: I1129 05:28:30.110292 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.110307 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:30Z","lastTransitionTime":"2025-11-29T05:28:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.110996 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08de5891-72ca-488c-80b3-6b54c8c1a66e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9f96824f977ec455622772e91ed8e6534dc93a71868e2b5e2aa9302ff9be135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cf2a15e6478f9f461a9927f570db4a81a8d752c8be46c7970b4956ae2031a38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a67e6b494618e268d6f5d75507c1cfe0e8e99c0e3cfff7548756a64f4f3a3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77543b04b619c48bf800a3d9dda1a969e4e32aa113a49287717a9e83f9cff96f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78fd1c6bdf08a557f0c2223c538a946709b9a225032c92ea75277140441a59c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db25dbee0954a026ecc2945dda40eb48c9caf78d31c44013055e9765024794d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4be9e9d41c9c964659c66458605455733afcc76dfcbf74fe74f5dff0e99ff171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4be9e9d41c9c964659c66458605455733afcc76dfcbf74fe74f5dff0e99ff171\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T05:28:25Z\\\",\\\"message\\\":\\\"10 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1129 05:28:25.959460 6110 services_controller.go:356] Processing sync for service openshift-service-ca-operator/metrics for network=default\\\\nI1129 05:28:25.959462 6110 ovn.go:134] Ensuring zone local for Pod 
openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI1129 05:28:25.959481 6110 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI1129 05:28:25.959494 6110 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nF1129 05:28:25.959445 6110 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Po\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-lp4zm_openshift-ovn-kubernetes(08de5891-72ca-488c-80b3-6b54c8c1a66e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cc8b34a52032d4716652fc9edcc35dd797309bf6886242671c068e65399d004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b86e524b600a831fe6b8bb71a513172f974b457d39363da90cd6152139eae898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b86e524b600a831fe6
b8bb71a513172f974b457d39363da90cd6152139eae898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lp4zm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:30Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.121531 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zwqcq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad2c26bf-c1c8-44b5-b4ed-d487072d358b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9db8528eef5425d9af2b8f11f244556f195b844e239a2137eef24c40fa158d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9e8c27b9b7a4a13a1f6ae5461acd535cc6784f4158213697307e4061f1e222\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b9e8c27b9b7a4a13a1f6ae5461acd535cc6784f4158213697307e4061f1e222\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa32ebafd1f717a0d3ba44387b7b36827e9e12589812bdc33b1dfb84bd41772\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fa32ebafd1f717a0d3ba44387b7b36827e9e12589812bdc33b1dfb84bd41772\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec1085c6e05fe233979922ace4946592ed4a7afe00a0f1c3a256c41a0964d26a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1085c6e05fe233979922ace4946592ed4a7afe00a0f1c3a256c41a0964d26a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://819e8
c7a8ea3f0f7a082da904dd9cc01b5d7e7e6520cafe67920a190063947ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://819e8c7a8ea3f0f7a082da904dd9cc01b5d7e7e6520cafe67920a190063947ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1602d1ea5e426042c520a4f98ed23530fc8f94f67f242a639c32a29817495d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1602d1ea5e426042c520a4f98ed23530fc8f94f67f242a639c32a29817495d73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:22Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81866e9b3e8415fc1e5af0eec4a7222deab6b282b1758dae13759a073192a554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81866e9b3e8415fc1e5af0eec4a7222deab6b282b1758dae13759a073192a554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zwqcq\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:30Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.130849 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba18db1b4391204c0fe395488d2726d914dc28c88b9cf449d83c9c181a7d04a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\
\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:30Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.138653 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"509844d4-3ee1-4059-84bb-6e90200f50c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5e396522be1cb7984f98fd416f9a6aeda1e6009e002e9c942a6cec0d4a4c22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318b
deaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4qlc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb25cb1d2d2a3aa84bc6fca3087cdf54b3b8e75bc258441b2895f37594df44e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4qlc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ggz4n\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:30Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.145312 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-26zjk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77af4f9f-196d-436d-9fb6-69168bcb5f8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70467de0fada47ce38faab4cc04c54942aa5f7c0cacf29b1d21b64a11e6162c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\
\":\\\"2025-11-29T05:28:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxrks\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-26zjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:30Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.154508 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3be916e6-7f0f-45ae-8fe5-85d2c7e459f0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf23cd4a300a49fabdf8fcaacfda760563c8e53cf70de7a4c07be717fd76b660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a960cfe07ae731cc2d07957dba63be2897328d662dd2672bd61344c8baac0228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b416f8ad36a09f58e68b1e23784b344534ccfb013a3272c0754602431a3a3737\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://261ab2af67dd877cead51f204629b59233865717d4805c3d4d80c1a5d51e2021\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:27:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:30Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.163133 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c84b578286ef0da9c863bedb4ebec224e34007c7f59cead26dac786d7e4b5e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1975ee2ce0af02676f9925944a31b218cb17cdf4dfb85c6edbfd124ac58fd22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:30Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.170057 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-87h4n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"080d2d67-a474-4974-94f3-81c5007e0a57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b58d5af901448b3e10a9a902daa31aecebe095bdae0e620c991323b99c52b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdnbj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-87h4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:30Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.179653 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1629901e-7133-4eb0-b634-5e458da9f205\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://583ca6d59501822022d5d15a450d4403592d248bce7be4b22b1c7baef4e93000\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b8d3be68869fef8b377ce1d84e3edf447496a867c40e88d013ef0d1c5629de2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6655c8a5c56e4d2b3e8fbd88308d04ded2b41f7243bf90cb2dbbd6ad93f3153f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a87a0320d47b3753baeb865c7ef9e024c65a96555fdd71c0f2a23c8c0b80d351\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ba49c3a694bc245ce18b54b74156e6e7aaccd4fcf707264a278f86f52943392\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T05:28:13Z\\\",\\\"message\\\":\\\"le observer\\\\nW1129 05:28:13.316789 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1129 05:28:13.316951 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1129 05:28:13.320123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1464513827/tls.crt::/tmp/serving-cert-1464513827/tls.key\\\\\\\"\\\\nI1129 05:28:13.540439 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1129 05:28:13.545990 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1129 05:28:13.546014 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1129 05:28:13.546050 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1129 05:28:13.546055 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1129 05:28:13.551788 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1129 05:28:13.551806 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' 
detected.\\\\nW1129 05:28:13.551809 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 05:28:13.551814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1129 05:28:13.551817 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1129 05:28:13.551820 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1129 05:28:13.551823 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1129 05:28:13.551915 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1129 05:28:13.553356 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e070e0ea87ae0cfa8fcc6aadd13e7ae3c2ab3eb1ea3daf6178c9d35796ba3ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\
"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:27:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:30Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.188376 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:30Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.198107 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7plzz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5052790-d231-4f97-802c-c7de3cd72561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c23b1472cd349ae6a56137165463429204ffbaf2c1b477d3ba9d07d2f2799f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qglsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7plzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:30Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.206358 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://867c23355f828dc4cc15d632f90c52174e9b9be07f3ec24f05c904709620de6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-11-29T05:28:30Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.213212 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.213269 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.213283 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.213297 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.213306 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:30Z","lastTransitionTime":"2025-11-29T05:28:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.315357 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.315387 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.315396 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.315410 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.315420 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:30Z","lastTransitionTime":"2025-11-29T05:28:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.334654 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vn6fz"] Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.335208 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vn6fz" Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.337241 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.337400 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.346341 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:30Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.360105 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08de5891-72ca-488c-80b3-6b54c8c1a66e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9f96824f977ec455622772e91ed8e6534dc93a71868e2b5e2aa9302ff9be135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cf2a15e6478f9f461a9927f570db4a81a8d752c8be46c7970b4956ae2031a38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a67e6b494618e268d6f5d75507c1cfe0e8e99c0e3cfff7548756a64f4f3a3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77543b04b619c48bf800a3d9dda1a969e4e32aa113a49287717a9e83f9cff96f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78fd1c6bdf08a557f0c2223c538a946709b9a225032c92ea75277140441a59c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db25dbee0954a026ecc2945dda40eb48c9caf78d31c44013055e9765024794d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4be9e9d41c9c964659c66458605455733afcc76dfcbf74fe74f5dff0e99ff171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4be9e9d41c9c964659c66458605455733afcc76dfcbf74fe74f5dff0e99ff171\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T05:28:25Z\\\",\\\"message\\\":\\\"10 obj_retry.go:365] Adding new object: *v1.Pod 
openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1129 05:28:25.959460 6110 services_controller.go:356] Processing sync for service openshift-service-ca-operator/metrics for network=default\\\\nI1129 05:28:25.959462 6110 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI1129 05:28:25.959481 6110 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI1129 05:28:25.959494 6110 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nF1129 05:28:25.959445 6110 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Po\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-lp4zm_openshift-ovn-kubernetes(08de5891-72ca-488c-80b3-6b54c8c1a66e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cc8b34a52032d4716652fc9edcc35dd797309bf6886242671c068e65399d004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b86e524b600a831fe6b8bb71a513172f974b457d39363da90cd6152139eae898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b86e524b600a831fe6
b8bb71a513172f974b457d39363da90cd6152139eae898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lp4zm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:30Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.370148 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zwqcq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad2c26bf-c1c8-44b5-b4ed-d487072d358b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9db8528eef5425d9af2b8f11f244556f195b844e239a2137eef24c40fa158d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9e8c27b9b7a4a13a1f6ae5461acd535cc6784f4158213697307e4061f1e222\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b9e8c27b9b7a4a13a1f6ae5461acd535cc6784f4158213697307e4061f1e222\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa32ebafd1f717a0d3ba44387b7b36827e9e12589812bdc33b1dfb84bd41772\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fa32ebafd1f717a0d3ba44387b7b36827e9e12589812bdc33b1dfb84bd41772\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec1085c6e05fe233979922ace4946592ed4a7afe00a0f1c3a256c41a0964d26a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1085c6e05fe233979922ace4946592ed4a7afe00a0f1c3a256c41a0964d26a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://819e8
c7a8ea3f0f7a082da904dd9cc01b5d7e7e6520cafe67920a190063947ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://819e8c7a8ea3f0f7a082da904dd9cc01b5d7e7e6520cafe67920a190063947ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1602d1ea5e426042c520a4f98ed23530fc8f94f67f242a639c32a29817495d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1602d1ea5e426042c520a4f98ed23530fc8f94f67f242a639c32a29817495d73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:22Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81866e9b3e8415fc1e5af0eec4a7222deab6b282b1758dae13759a073192a554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81866e9b3e8415fc1e5af0eec4a7222deab6b282b1758dae13759a073192a554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zwqcq\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:30Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.376219 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/240291e8-d7ec-4fc7-919f-e082a6890e2d-env-overrides\") pod \"ovnkube-control-plane-749d76644c-vn6fz\" (UID: \"240291e8-d7ec-4fc7-919f-e082a6890e2d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vn6fz" Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.376275 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/240291e8-d7ec-4fc7-919f-e082a6890e2d-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-vn6fz\" (UID: \"240291e8-d7ec-4fc7-919f-e082a6890e2d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vn6fz" Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.376312 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x25zr\" (UniqueName: \"kubernetes.io/projected/240291e8-d7ec-4fc7-919f-e082a6890e2d-kube-api-access-x25zr\") pod \"ovnkube-control-plane-749d76644c-vn6fz\" (UID: \"240291e8-d7ec-4fc7-919f-e082a6890e2d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vn6fz" Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.376357 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/240291e8-d7ec-4fc7-919f-e082a6890e2d-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-vn6fz\" (UID: 
\"240291e8-d7ec-4fc7-919f-e082a6890e2d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vn6fz" Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.380391 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:30Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.387698 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-26zjk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"77af4f9f-196d-436d-9fb6-69168bcb5f8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70467de0fada47ce38faab4cc04c54942aa5f7c0cacf29b1d21b64a11e6162c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxrks\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-26zjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:30Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.396360 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3be916e6-7f0f-45ae-8fe5-85d2c7e459f0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf23cd4a300a49fabdf8fcaacfda760563c8e53cf70de7a4c07be717fd76b660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a960cfe07ae731cc2d07957dba63be2897328d662dd2672bd61344c8baac0228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b416f8ad36a09f58e68b1e23784b344534ccfb013a3272c0754602431a3a3737\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://261ab2af67dd877cead51f204629b59233865717d4805c3d4d80c1a5d51e2021\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:27:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:30Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.405221 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba18db1b4391204c0fe395488d2726d914dc28c88b9cf449d83c9c181a7d04a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:30Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.413862 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"509844d4-3ee1-4059-84bb-6e90200f50c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5e396522be1cb7984f98fd416f9a6aeda1e6009e002e9c942a6cec0d4a4c22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4qlc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb25cb1d2d2a3aa84bc6fca3087cdf54b3b8e75bc258441b2895f37594df44e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4qlc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ggz4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:30Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.417134 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:30 crc 
kubenswrapper[4594]: I1129 05:28:30.417167 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.417179 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.417195 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.417205 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:30Z","lastTransitionTime":"2025-11-29T05:28:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.421722 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-87h4n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"080d2d67-a474-4974-94f3-81c5007e0a57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b58d5af901448b3e10a9a902daa31aecebe095bdae0e620c991323b99c52b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdnbj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-87h4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:30Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.432805 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1629901e-7133-4eb0-b634-5e458da9f205\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://583ca6d59501822022d5d15a450d4403592d248bce7be4b22b1c7baef4e93000\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b8d3be68869fef8b377ce1d84e3edf447496a867c40e88d013ef0d1c5629de2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6655c8a5c56e4d2b3e8fbd88308d04ded2b41f7243bf90cb2dbbd6ad93f3153f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a87a0320d47b3753baeb865c7ef9e024c65a96555fdd71c0f2a23c8c0b80d351\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ba49c3a694bc245ce18b54b74156e6e7aaccd4fcf707264a278f86f52943392\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T05:28:13Z\\\",\\\"message\\\":\\\"le observer\\\\nW1129 05:28:13.316789 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1129 05:28:13.316951 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1129 05:28:13.320123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1464513827/tls.crt::/tmp/serving-cert-1464513827/tls.key\\\\\\\"\\\\nI1129 05:28:13.540439 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1129 05:28:13.545990 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1129 05:28:13.546014 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1129 05:28:13.546050 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1129 05:28:13.546055 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1129 05:28:13.551788 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1129 05:28:13.551806 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' 
detected.\\\\nW1129 05:28:13.551809 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 05:28:13.551814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1129 05:28:13.551817 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1129 05:28:13.551820 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1129 05:28:13.551823 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1129 05:28:13.551915 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1129 05:28:13.553356 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e070e0ea87ae0cfa8fcc6aadd13e7ae3c2ab3eb1ea3daf6178c9d35796ba3ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\
"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:27:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:30Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.441948 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c84b578286ef0da9c863bedb4ebec224e34007c7f59cead26dac786d7e4b5e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1975ee2ce0af02676f9925944a31b218cb17cdf4dfb85c6edbfd124ac58fd22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:30Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.449873 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vn6fz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"240291e8-d7ec-4fc7-919f-e082a6890e2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x25zr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x25zr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vn6fz\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:30Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.458244 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://867c23355f828dc4cc15d632f90c52174e9b9be07f3ec24f05c904709620de6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},
{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:30Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.466701 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could 
not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:30Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.476073 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7plzz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5052790-d231-4f97-802c-c7de3cd72561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c23b1472cd349ae6a56137165463429204ffbaf2c1b477d3ba9d07d2f2799f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qglsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7plzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:30Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.477641 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x25zr\" (UniqueName: 
\"kubernetes.io/projected/240291e8-d7ec-4fc7-919f-e082a6890e2d-kube-api-access-x25zr\") pod \"ovnkube-control-plane-749d76644c-vn6fz\" (UID: \"240291e8-d7ec-4fc7-919f-e082a6890e2d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vn6fz" Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.477729 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/240291e8-d7ec-4fc7-919f-e082a6890e2d-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-vn6fz\" (UID: \"240291e8-d7ec-4fc7-919f-e082a6890e2d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vn6fz" Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.477775 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/240291e8-d7ec-4fc7-919f-e082a6890e2d-env-overrides\") pod \"ovnkube-control-plane-749d76644c-vn6fz\" (UID: \"240291e8-d7ec-4fc7-919f-e082a6890e2d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vn6fz" Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.477800 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/240291e8-d7ec-4fc7-919f-e082a6890e2d-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-vn6fz\" (UID: \"240291e8-d7ec-4fc7-919f-e082a6890e2d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vn6fz" Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.478783 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/240291e8-d7ec-4fc7-919f-e082a6890e2d-env-overrides\") pod \"ovnkube-control-plane-749d76644c-vn6fz\" (UID: \"240291e8-d7ec-4fc7-919f-e082a6890e2d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vn6fz" Nov 29 05:28:30 crc 
kubenswrapper[4594]: I1129 05:28:30.478860 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/240291e8-d7ec-4fc7-919f-e082a6890e2d-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-vn6fz\" (UID: \"240291e8-d7ec-4fc7-919f-e082a6890e2d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vn6fz" Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.484455 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/240291e8-d7ec-4fc7-919f-e082a6890e2d-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-vn6fz\" (UID: \"240291e8-d7ec-4fc7-919f-e082a6890e2d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vn6fz" Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.490394 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x25zr\" (UniqueName: \"kubernetes.io/projected/240291e8-d7ec-4fc7-919f-e082a6890e2d-kube-api-access-x25zr\") pod \"ovnkube-control-plane-749d76644c-vn6fz\" (UID: \"240291e8-d7ec-4fc7-919f-e082a6890e2d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vn6fz" Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.519182 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.519216 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.519228 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.519242 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 
05:28:30.519272 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:30Z","lastTransitionTime":"2025-11-29T05:28:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.621474 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.621512 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.621523 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.621548 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.621561 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:30Z","lastTransitionTime":"2025-11-29T05:28:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.648710 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vn6fz" Nov 29 05:28:30 crc kubenswrapper[4594]: W1129 05:28:30.660823 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod240291e8_d7ec_4fc7_919f_e082a6890e2d.slice/crio-a72bdec7933692368edc25e7cc6a54326e39df73c5978c7b43a23c5f3d0a0256 WatchSource:0}: Error finding container a72bdec7933692368edc25e7cc6a54326e39df73c5978c7b43a23c5f3d0a0256: Status 404 returned error can't find the container with id a72bdec7933692368edc25e7cc6a54326e39df73c5978c7b43a23c5f3d0a0256 Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.723600 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.723634 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.723642 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.723657 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.723667 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:30Z","lastTransitionTime":"2025-11-29T05:28:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.826315 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.826350 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.826362 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.826381 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.826392 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:30Z","lastTransitionTime":"2025-11-29T05:28:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.929758 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.929810 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.929825 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.929848 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:30 crc kubenswrapper[4594]: I1129 05:28:30.929863 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:30Z","lastTransitionTime":"2025-11-29T05:28:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:31 crc kubenswrapper[4594]: I1129 05:28:31.032807 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:31 crc kubenswrapper[4594]: I1129 05:28:31.032851 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:31 crc kubenswrapper[4594]: I1129 05:28:31.032862 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:31 crc kubenswrapper[4594]: I1129 05:28:31.032879 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:31 crc kubenswrapper[4594]: I1129 05:28:31.032891 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:31Z","lastTransitionTime":"2025-11-29T05:28:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:31 crc kubenswrapper[4594]: I1129 05:28:31.049567 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-lzr56"] Nov 29 05:28:31 crc kubenswrapper[4594]: I1129 05:28:31.049960 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lzr56" Nov 29 05:28:31 crc kubenswrapper[4594]: E1129 05:28:31.050026 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lzr56" podUID="217088b9-a48b-40c7-8d83-f9ff0eb24908" Nov 29 05:28:31 crc kubenswrapper[4594]: I1129 05:28:31.060744 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3be916e6-7f0f-45ae-8fe5-85d2c7e459f0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf23cd4a300a49fabdf8fcaacfda760563c8e53cf70de7a4c07be717fd76b660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\
"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a960cfe07ae731cc2d07957dba63be2897328d662dd2672bd61344c8baac0228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b416f8ad36a09f58e68b1e23784b344534ccfb013a3272c0754602431a3a3737\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://261ab2af67dd877cead51f204629b59233865717d4805c3d4d80c1a5d51e2021\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/open
shift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:27:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:31Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:31 crc kubenswrapper[4594]: I1129 05:28:31.071187 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba18db1b4391204c0fe395488d2726d914dc28c88b9cf449d83c9c181a7d04a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:31Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:31 crc kubenswrapper[4594]: I1129 05:28:31.081072 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"509844d4-3ee1-4059-84bb-6e90200f50c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5e396522be1cb7984f98fd416f9a6aeda1e6009e002e9c942a6cec0d4a4c22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4qlc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb25cb1d2d2a3aa84bc6fca3087cdf54b3b8e75bc258441b2895f37594df44e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4qlc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ggz4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:31Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:31 crc kubenswrapper[4594]: I1129 05:28:31.081445 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/217088b9-a48b-40c7-8d83-f9ff0eb24908-metrics-certs\") pod \"network-metrics-daemon-lzr56\" (UID: \"217088b9-a48b-40c7-8d83-f9ff0eb24908\") " pod="openshift-multus/network-metrics-daemon-lzr56" Nov 29 05:28:31 crc kubenswrapper[4594]: I1129 05:28:31.081479 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8gwz\" (UniqueName: \"kubernetes.io/projected/217088b9-a48b-40c7-8d83-f9ff0eb24908-kube-api-access-v8gwz\") pod \"network-metrics-daemon-lzr56\" (UID: \"217088b9-a48b-40c7-8d83-f9ff0eb24908\") " pod="openshift-multus/network-metrics-daemon-lzr56" Nov 29 05:28:31 crc kubenswrapper[4594]: I1129 05:28:31.089766 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-26zjk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77af4f9f-196d-436d-9fb6-69168bcb5f8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70467de0fada47ce38faab4cc04c54942aa5f7c0cacf29b1d21b64a11e6162c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1
bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxrks\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-26zjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:31Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:31 crc kubenswrapper[4594]: I1129 05:28:31.099672 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1629901e-7133-4eb0-b634-5e458da9f205\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://583ca6d59501822022d5d15a450d4403592d248bce7be4b22b1c7baef4e93000\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b8d3be68869fef8b377ce1d84e3edf447496a867c40e88d013ef0d1c5629de2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6655c8a5c56e4d2b3e8fbd88308d04ded2b41f7243bf90cb2dbbd6ad93f3153f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a87a0320d47b3753baeb865c7ef9e024c65a96555fdd71c0f2a23c8c0b80d351\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ba49c3a694bc245ce18b54b74156e6e7aaccd4fcf707264a278f86f52943392\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T05:28:13Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1129 05:28:13.316789 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1129 05:28:13.316951 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1129 05:28:13.320123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1464513827/tls.crt::/tmp/serving-cert-1464513827/tls.key\\\\\\\"\\\\nI1129 05:28:13.540439 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1129 05:28:13.545990 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1129 05:28:13.546014 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1129 05:28:13.546050 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1129 05:28:13.546055 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1129 05:28:13.551788 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1129 05:28:13.551806 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 05:28:13.551809 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 05:28:13.551814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1129 05:28:13.551817 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1129 05:28:13.551820 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1129 05:28:13.551823 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1129 05:28:13.551915 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1129 05:28:13.553356 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e070e0ea87ae0cfa8fcc6aadd13e7ae3c2ab3eb1ea3daf6178c9d35796ba3ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://306c2ee6f8204da653775c897fd81f88c
1df5e1d92da18a301843a3ab150a699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:27:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:31Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:31 crc kubenswrapper[4594]: I1129 05:28:31.108961 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c84b578286ef0da9c863bedb4ebec224e34007c7f59cead26dac786d7e4b5e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1975ee2ce0af02676f9925944a31b218cb17cdf4dfb85c6edbfd124ac58fd22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:31Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:31 crc kubenswrapper[4594]: I1129 05:28:31.116605 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-87h4n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"080d2d67-a474-4974-94f3-81c5007e0a57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b58d5af901448b3e10a9a902daa31aecebe095bdae0e620c991323b99c52b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdnbj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-87h4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:31Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:31 crc kubenswrapper[4594]: I1129 05:28:31.131389 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://867c23355f828dc4cc15d632f90c52174e9b9be07f3ec24f05c904709620de6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-al
erter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:31Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:31 crc kubenswrapper[4594]: I1129 05:28:31.135032 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:31 crc kubenswrapper[4594]: I1129 05:28:31.135065 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:31 crc kubenswrapper[4594]: I1129 05:28:31.135077 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:31 crc kubenswrapper[4594]: I1129 05:28:31.135092 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:31 crc kubenswrapper[4594]: I1129 05:28:31.135106 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:31Z","lastTransitionTime":"2025-11-29T05:28:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:31 crc kubenswrapper[4594]: I1129 05:28:31.153839 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:31Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:31 crc kubenswrapper[4594]: I1129 05:28:31.168383 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7plzz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5052790-d231-4f97-802c-c7de3cd72561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c23b1472cd349ae6a56137165463429204ffbaf2c1b477d3ba9d07d2f2799f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qglsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7plzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:31Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:31 crc kubenswrapper[4594]: I1129 05:28:31.182050 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/217088b9-a48b-40c7-8d83-f9ff0eb24908-metrics-certs\") pod \"network-metrics-daemon-lzr56\" (UID: \"217088b9-a48b-40c7-8d83-f9ff0eb24908\") " pod="openshift-multus/network-metrics-daemon-lzr56" Nov 29 05:28:31 crc kubenswrapper[4594]: I1129 05:28:31.182082 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8gwz\" (UniqueName: \"kubernetes.io/projected/217088b9-a48b-40c7-8d83-f9ff0eb24908-kube-api-access-v8gwz\") pod \"network-metrics-daemon-lzr56\" (UID: \"217088b9-a48b-40c7-8d83-f9ff0eb24908\") " pod="openshift-multus/network-metrics-daemon-lzr56" Nov 29 05:28:31 crc kubenswrapper[4594]: E1129 05:28:31.182194 4594 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 29 05:28:31 crc kubenswrapper[4594]: E1129 05:28:31.182238 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/217088b9-a48b-40c7-8d83-f9ff0eb24908-metrics-certs podName:217088b9-a48b-40c7-8d83-f9ff0eb24908 nodeName:}" failed. No retries permitted until 2025-11-29 05:28:31.682226865 +0000 UTC m=+35.922736085 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/217088b9-a48b-40c7-8d83-f9ff0eb24908-metrics-certs") pod "network-metrics-daemon-lzr56" (UID: "217088b9-a48b-40c7-8d83-f9ff0eb24908") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 29 05:28:31 crc kubenswrapper[4594]: I1129 05:28:31.185395 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vn6fz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"240291e8-d7ec-4fc7-919f-e082a6890e2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x25zr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x25zr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vn6fz\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:31Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:31 crc kubenswrapper[4594]: I1129 05:28:31.195664 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-lzr56" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"217088b9-a48b-40c7-8d83-f9ff0eb24908\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8gwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8gwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:31Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lzr56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:31Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:31 crc 
kubenswrapper[4594]: I1129 05:28:31.197856 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8gwz\" (UniqueName: \"kubernetes.io/projected/217088b9-a48b-40c7-8d83-f9ff0eb24908-kube-api-access-v8gwz\") pod \"network-metrics-daemon-lzr56\" (UID: \"217088b9-a48b-40c7-8d83-f9ff0eb24908\") " pod="openshift-multus/network-metrics-daemon-lzr56" Nov 29 05:28:31 crc kubenswrapper[4594]: I1129 05:28:31.204999 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:31Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:31 crc kubenswrapper[4594]: I1129 05:28:31.214445 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:31Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:31 crc kubenswrapper[4594]: I1129 05:28:31.228110 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"08de5891-72ca-488c-80b3-6b54c8c1a66e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9f96824f977ec455622772e91ed8e6534dc93a71868e2b5e2aa9302ff9be135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cf2a15e6478f9f461a9927f570db4a81a8d752c8be46c7970b4956ae2031a38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a67e6b494618e268d6f5d75507c1cfe0e8e99c0e3cfff7548756a64f4f3a3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77543b04b619c48bf800a3d9dda1a969e4e32aa113a49287717a9e83f9cff96f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78fd1c6bdf08a557f0c2223c538a946709b9a225032c92ea75277140441a59c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db25dbee0954a026ecc2945dda40eb48c9caf78d31c44013055e9765024794d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4be9e9d41c9c964659c66458605455733afcc76dfcbf74fe74f5dff0e99ff171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4be9e9d41c9c964659c66458605455733afcc76dfcbf74fe74f5dff0e99ff171\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T05:28:25Z\\\",\\\"message\\\":\\\"10 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1129 05:28:25.959460 6110 services_controller.go:356] Processing sync for service openshift-service-ca-operator/metrics for network=default\\\\nI1129 05:28:25.959462 6110 ovn.go:134] Ensuring zone local for Pod 
openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI1129 05:28:25.959481 6110 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI1129 05:28:25.959494 6110 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nF1129 05:28:25.959445 6110 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Po\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-lp4zm_openshift-ovn-kubernetes(08de5891-72ca-488c-80b3-6b54c8c1a66e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cc8b34a52032d4716652fc9edcc35dd797309bf6886242671c068e65399d004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b86e524b600a831fe6b8bb71a513172f974b457d39363da90cd6152139eae898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b86e524b600a831fe6
b8bb71a513172f974b457d39363da90cd6152139eae898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lp4zm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:31Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:31 crc kubenswrapper[4594]: I1129 05:28:31.237890 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:31 crc kubenswrapper[4594]: I1129 05:28:31.237921 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:31 crc kubenswrapper[4594]: I1129 05:28:31.237941 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:31 crc kubenswrapper[4594]: I1129 05:28:31.237958 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:31 crc kubenswrapper[4594]: I1129 05:28:31.237971 4594 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:31Z","lastTransitionTime":"2025-11-29T05:28:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:31 crc kubenswrapper[4594]: I1129 05:28:31.239195 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zwqcq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad2c26bf-c1c8-44b5-b4ed-d487072d358b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9db8528eef5425d9af2b8f11f244556f195b844e239a2137eef24c40fa158d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9e8c27b9b7a4a13a1f6ae5461acd535cc6784f4158213697307e4061f1e222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b9e8c27b9b7a4a13a1f6ae5461acd535cc6784f4158213697307e4061f1e222\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa32ebafd1f717a0d3ba44387b7b36827e9e12589812bdc33b1dfb84bd41772\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fa32ebafd1f717a0d3ba44387b7b36827e9e12589812bdc33b1dfb84bd41772\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec1085c6e05fe233979922ace4946592ed4a7afe00a0f1c3a256c41a0964d26a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1085c6e05fe233979922ace4946592ed4a7afe00a0f1c3a256c41a0964d26a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://819e8c7a8ea3f0f7a082da904dd9cc01b5d7e7e6520cafe67920a190063947ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://819e8c7a8ea3f0f7a082da904dd9cc01b5d7e7e6520cafe67920a190063947ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1602d1ea5e426042c520a4f98ed23530fc8f94f67f242a639c32a29817495d73\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1602d1ea5e426042c520a4f98ed23530fc8f94f67f242a639c32a29817495d73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81866e9b3e8415fc1e5af0eec4a7222deab6b282b1758dae13759a073192a554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81866e9b3e8415fc1e5af0eec4a7222deab6b282b1758dae13759a073192a554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zwqcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:31Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:31 crc kubenswrapper[4594]: I1129 05:28:31.304566 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vn6fz" event={"ID":"240291e8-d7ec-4fc7-919f-e082a6890e2d","Type":"ContainerStarted","Data":"af579fe3e5cce8fdb2dff72f206f5504582b8eee906db5180191ef91b990c0a2"} Nov 29 05:28:31 crc kubenswrapper[4594]: I1129 05:28:31.304782 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vn6fz" event={"ID":"240291e8-d7ec-4fc7-919f-e082a6890e2d","Type":"ContainerStarted","Data":"1142d4b616972d87b8ac7b83cacdc37c3fd02df7cb4adfcabbaaaa18c5d2b8d0"} Nov 29 05:28:31 crc kubenswrapper[4594]: I1129 05:28:31.304800 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vn6fz" event={"ID":"240291e8-d7ec-4fc7-919f-e082a6890e2d","Type":"ContainerStarted","Data":"a72bdec7933692368edc25e7cc6a54326e39df73c5978c7b43a23c5f3d0a0256"} Nov 29 05:28:31 crc kubenswrapper[4594]: I1129 05:28:31.316918 4594 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:31Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:31 crc kubenswrapper[4594]: I1129 05:28:31.326560 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:31Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:31 crc kubenswrapper[4594]: I1129 05:28:31.339967 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:31 crc kubenswrapper[4594]: I1129 05:28:31.340007 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:31 crc kubenswrapper[4594]: I1129 05:28:31.340017 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:31 crc 
kubenswrapper[4594]: I1129 05:28:31.340033 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:31 crc kubenswrapper[4594]: I1129 05:28:31.340044 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:31Z","lastTransitionTime":"2025-11-29T05:28:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:31 crc kubenswrapper[4594]: I1129 05:28:31.341716 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08de5891-72ca-488c-80b3-6b54c8c1a66e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9f96824f977ec455622772e91ed8e6534dc93a71868e2b5e2aa9302ff9be135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cf2a15e6478f9f461a9927f570db4a81a8d752c8be46c7970b4956ae2031a38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a67e6b494618e268d6f5d75507c1cfe0e8e99c0e3cfff7548756a64f4f3a3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77543b04b619c48bf800a3d9dda1a969e4e32aa113a49287717a9e83f9cff96f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78fd1c6bdf08a557f0c2223c538a946709b9a225032c92ea75277140441a59c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db25dbee0954a026ecc2945dda40eb48c9caf78d31c44013055e9765024794d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4be9e9d41c9c964659c66458605455733afcc76dfcbf74fe74f5dff0e99ff171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4be9e9d41c9c964659c66458605455733afcc76dfcbf74fe74f5dff0e99ff171\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T05:28:25Z\\\",\\\"message\\\":\\\"10 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1129 05:28:25.959460 6110 services_controller.go:356] Processing sync for service openshift-service-ca-operator/metrics for network=default\\\\nI1129 05:28:25.959462 6110 ovn.go:134] Ensuring zone local for Pod 
openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI1129 05:28:25.959481 6110 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI1129 05:28:25.959494 6110 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nF1129 05:28:25.959445 6110 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Po\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-lp4zm_openshift-ovn-kubernetes(08de5891-72ca-488c-80b3-6b54c8c1a66e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cc8b34a52032d4716652fc9edcc35dd797309bf6886242671c068e65399d004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b86e524b600a831fe6b8bb71a513172f974b457d39363da90cd6152139eae898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b86e524b600a831fe6
b8bb71a513172f974b457d39363da90cd6152139eae898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lp4zm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:31Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:31 crc kubenswrapper[4594]: I1129 05:28:31.352639 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zwqcq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad2c26bf-c1c8-44b5-b4ed-d487072d358b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9db8528eef5425d9af2b8f11f244556f195b844e239a2137eef24c40fa158d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9e8c27b9b7a4a13a1f6ae5461acd535cc6784f4158213697307e4061f1e222\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b9e8c27b9b7a4a13a1f6ae5461acd535cc6784f4158213697307e4061f1e222\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa32ebafd1f717a0d3ba44387b7b36827e9e12589812bdc33b1dfb84bd41772\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fa32ebafd1f717a0d3ba44387b7b36827e9e12589812bdc33b1dfb84bd41772\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec1085c6e05fe233979922ace4946592ed4a7afe00a0f1c3a256c41a0964d26a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1085c6e05fe233979922ace4946592ed4a7afe00a0f1c3a256c41a0964d26a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://819e8
c7a8ea3f0f7a082da904dd9cc01b5d7e7e6520cafe67920a190063947ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://819e8c7a8ea3f0f7a082da904dd9cc01b5d7e7e6520cafe67920a190063947ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1602d1ea5e426042c520a4f98ed23530fc8f94f67f242a639c32a29817495d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1602d1ea5e426042c520a4f98ed23530fc8f94f67f242a639c32a29817495d73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:22Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81866e9b3e8415fc1e5af0eec4a7222deab6b282b1758dae13759a073192a554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81866e9b3e8415fc1e5af0eec4a7222deab6b282b1758dae13759a073192a554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zwqcq\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:31Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:31 crc kubenswrapper[4594]: I1129 05:28:31.361267 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"509844d4-3ee1-4059-84bb-6e90200f50c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5e396522be1cb7984f98fd416f9a6aeda1e6009e002e9c942a6cec0d4a4c22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4qlc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb25cb1d2d2a3aa84bc6fca3087cdf54b3b8e75bc258441b2895f37594df44e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4qlc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ggz4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:31Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:31 crc kubenswrapper[4594]: 
I1129 05:28:31.369635 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-26zjk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77af4f9f-196d-436d-9fb6-69168bcb5f8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70467de0fada47ce38faab4cc04c54942aa5f7c0cacf29b1d21b64a11e6162c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxrks\\\",\\\"readOnly
\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-26zjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:31Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:31 crc kubenswrapper[4594]: I1129 05:28:31.379512 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3be916e6-7f0f-45ae-8fe5-85d2c7e459f0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf23cd4a300a49fabdf8fcaacfda760563c8e53cf70de7a4c07be717fd76b660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7937
9b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a960cfe07ae731cc2d07957dba63be2897328d662dd2672bd61344c8baac0228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b416f8ad36a09f58e68b1e23784b344534ccfb013a3272c0754602431a3a3737\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"st
ate\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://261ab2af67dd877cead51f204629b59233865717d4805c3d4d80c1a5d51e2021\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:27:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:31Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:31 crc kubenswrapper[4594]: I1129 05:28:31.389718 4594 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba18db1b4391204c0fe395488d2726d914dc28c88b9cf449d83c9c181a7d04a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:31Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:31 crc kubenswrapper[4594]: I1129 05:28:31.399581 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c84b578286ef0da9c863bedb4ebec224e34007c7f59cead26dac786d7e4b5e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1975ee2ce0af02676f9925944a31b218cb17cdf4dfb85c6edbfd124ac58fd22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:31Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:31 crc kubenswrapper[4594]: I1129 05:28:31.409057 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-87h4n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"080d2d67-a474-4974-94f3-81c5007e0a57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b58d5af901448b3e10a9a902daa31aecebe095bdae0e620c991323b99c52b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdnbj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-87h4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:31Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:31 crc kubenswrapper[4594]: I1129 05:28:31.424503 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1629901e-7133-4eb0-b634-5e458da9f205\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://583ca6d59501822022d5d15a450d4403592d248bce7be4b22b1c7baef4e93000\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b8d3be68869fef8b377ce1d84e3edf447496a867c40e88d013ef0d1c5629de2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6655c8a5c56e4d2b3e8fbd88308d04ded2b41f7243bf90cb2dbbd6ad93f3153f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a87a0320d47b3753baeb865c7ef9e024c65a96555fdd71c0f2a23c8c0b80d351\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ba49c3a694bc245ce18b54b74156e6e7aaccd4fcf707264a278f86f52943392\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T05:28:13Z\\\",\\\"message\\\":\\\"le observer\\\\nW1129 05:28:13.316789 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1129 05:28:13.316951 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1129 05:28:13.320123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1464513827/tls.crt::/tmp/serving-cert-1464513827/tls.key\\\\\\\"\\\\nI1129 05:28:13.540439 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1129 05:28:13.545990 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1129 05:28:13.546014 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1129 05:28:13.546050 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1129 05:28:13.546055 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1129 05:28:13.551788 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1129 05:28:13.551806 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' 
detected.\\\\nW1129 05:28:13.551809 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 05:28:13.551814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1129 05:28:13.551817 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1129 05:28:13.551820 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1129 05:28:13.551823 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1129 05:28:13.551915 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1129 05:28:13.553356 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e070e0ea87ae0cfa8fcc6aadd13e7ae3c2ab3eb1ea3daf6178c9d35796ba3ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\
"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:27:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:31Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:31 crc kubenswrapper[4594]: I1129 05:28:31.435217 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7plzz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5052790-d231-4f97-802c-c7de3cd72561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c23b1472cd349ae6a56137165463429204ffbaf2c1b477d3ba9d07d2f2799f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qglsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7plzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:31Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:31 crc kubenswrapper[4594]: I1129 05:28:31.442660 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:31 crc 
kubenswrapper[4594]: I1129 05:28:31.442700 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:31 crc kubenswrapper[4594]: I1129 05:28:31.442712 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:31 crc kubenswrapper[4594]: I1129 05:28:31.442729 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:31 crc kubenswrapper[4594]: I1129 05:28:31.442744 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:31Z","lastTransitionTime":"2025-11-29T05:28:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:31 crc kubenswrapper[4594]: I1129 05:28:31.446218 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vn6fz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"240291e8-d7ec-4fc7-919f-e082a6890e2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1142d4b616972d87b8ac7b83cacdc37c3fd02df7cb4adfcabbaaaa18c5d2b8d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x25zr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af579fe3e5cce8fdb2dff72f206f5504582b8
eee906db5180191ef91b990c0a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x25zr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vn6fz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:31Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:31 crc kubenswrapper[4594]: I1129 05:28:31.454424 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-lzr56" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"217088b9-a48b-40c7-8d83-f9ff0eb24908\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8gwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8gwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:31Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lzr56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:31Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:31 crc 
kubenswrapper[4594]: I1129 05:28:31.463828 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://867c23355f828dc4cc15d632f90c52174e9b9be07f3ec24f05c904709620de6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:31Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:31 crc kubenswrapper[4594]: I1129 05:28:31.473563 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:31Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:31 crc kubenswrapper[4594]: I1129 05:28:31.544831 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:31 crc kubenswrapper[4594]: I1129 05:28:31.544871 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:31 crc kubenswrapper[4594]: I1129 05:28:31.544884 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:31 crc kubenswrapper[4594]: I1129 05:28:31.544903 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:31 crc kubenswrapper[4594]: I1129 05:28:31.544923 4594 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:31Z","lastTransitionTime":"2025-11-29T05:28:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:31 crc kubenswrapper[4594]: I1129 05:28:31.647568 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:31 crc kubenswrapper[4594]: I1129 05:28:31.647621 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:31 crc kubenswrapper[4594]: I1129 05:28:31.647633 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:31 crc kubenswrapper[4594]: I1129 05:28:31.647651 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:31 crc kubenswrapper[4594]: I1129 05:28:31.647662 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:31Z","lastTransitionTime":"2025-11-29T05:28:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:31 crc kubenswrapper[4594]: I1129 05:28:31.688213 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/217088b9-a48b-40c7-8d83-f9ff0eb24908-metrics-certs\") pod \"network-metrics-daemon-lzr56\" (UID: \"217088b9-a48b-40c7-8d83-f9ff0eb24908\") " pod="openshift-multus/network-metrics-daemon-lzr56" Nov 29 05:28:31 crc kubenswrapper[4594]: E1129 05:28:31.688359 4594 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 29 05:28:31 crc kubenswrapper[4594]: E1129 05:28:31.688435 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/217088b9-a48b-40c7-8d83-f9ff0eb24908-metrics-certs podName:217088b9-a48b-40c7-8d83-f9ff0eb24908 nodeName:}" failed. No retries permitted until 2025-11-29 05:28:32.688418044 +0000 UTC m=+36.928927265 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/217088b9-a48b-40c7-8d83-f9ff0eb24908-metrics-certs") pod "network-metrics-daemon-lzr56" (UID: "217088b9-a48b-40c7-8d83-f9ff0eb24908") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 29 05:28:31 crc kubenswrapper[4594]: I1129 05:28:31.750047 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:31 crc kubenswrapper[4594]: I1129 05:28:31.750084 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:31 crc kubenswrapper[4594]: I1129 05:28:31.750097 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:31 crc kubenswrapper[4594]: I1129 05:28:31.750117 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:31 crc kubenswrapper[4594]: I1129 05:28:31.750131 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:31Z","lastTransitionTime":"2025-11-29T05:28:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:31 crc kubenswrapper[4594]: I1129 05:28:31.852625 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:31 crc kubenswrapper[4594]: I1129 05:28:31.852666 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:31 crc kubenswrapper[4594]: I1129 05:28:31.852676 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:31 crc kubenswrapper[4594]: I1129 05:28:31.852691 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:31 crc kubenswrapper[4594]: I1129 05:28:31.852703 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:31Z","lastTransitionTime":"2025-11-29T05:28:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:31 crc kubenswrapper[4594]: I1129 05:28:31.954637 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:31 crc kubenswrapper[4594]: I1129 05:28:31.954681 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:31 crc kubenswrapper[4594]: I1129 05:28:31.954690 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:31 crc kubenswrapper[4594]: I1129 05:28:31.954704 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:31 crc kubenswrapper[4594]: I1129 05:28:31.954719 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:31Z","lastTransitionTime":"2025-11-29T05:28:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:32 crc kubenswrapper[4594]: I1129 05:28:32.056744 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:32 crc kubenswrapper[4594]: I1129 05:28:32.056783 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:32 crc kubenswrapper[4594]: I1129 05:28:32.056793 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:32 crc kubenswrapper[4594]: I1129 05:28:32.056809 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:32 crc kubenswrapper[4594]: I1129 05:28:32.056822 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:32Z","lastTransitionTime":"2025-11-29T05:28:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:32 crc kubenswrapper[4594]: I1129 05:28:32.082551 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 05:28:32 crc kubenswrapper[4594]: I1129 05:28:32.082551 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 05:28:32 crc kubenswrapper[4594]: E1129 05:28:32.082680 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 05:28:32 crc kubenswrapper[4594]: I1129 05:28:32.082551 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 05:28:32 crc kubenswrapper[4594]: E1129 05:28:32.082756 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 05:28:32 crc kubenswrapper[4594]: E1129 05:28:32.082845 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 05:28:32 crc kubenswrapper[4594]: I1129 05:28:32.158600 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:32 crc kubenswrapper[4594]: I1129 05:28:32.158635 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:32 crc kubenswrapper[4594]: I1129 05:28:32.158646 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:32 crc kubenswrapper[4594]: I1129 05:28:32.158659 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:32 crc kubenswrapper[4594]: I1129 05:28:32.158670 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:32Z","lastTransitionTime":"2025-11-29T05:28:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:32 crc kubenswrapper[4594]: I1129 05:28:32.260464 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:32 crc kubenswrapper[4594]: I1129 05:28:32.260503 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:32 crc kubenswrapper[4594]: I1129 05:28:32.260515 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:32 crc kubenswrapper[4594]: I1129 05:28:32.260530 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:32 crc kubenswrapper[4594]: I1129 05:28:32.260543 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:32Z","lastTransitionTime":"2025-11-29T05:28:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:32 crc kubenswrapper[4594]: I1129 05:28:32.362857 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:32 crc kubenswrapper[4594]: I1129 05:28:32.362881 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:32 crc kubenswrapper[4594]: I1129 05:28:32.362890 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:32 crc kubenswrapper[4594]: I1129 05:28:32.362901 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:32 crc kubenswrapper[4594]: I1129 05:28:32.362909 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:32Z","lastTransitionTime":"2025-11-29T05:28:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:32 crc kubenswrapper[4594]: I1129 05:28:32.465354 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:32 crc kubenswrapper[4594]: I1129 05:28:32.465410 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:32 crc kubenswrapper[4594]: I1129 05:28:32.465424 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:32 crc kubenswrapper[4594]: I1129 05:28:32.465445 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:32 crc kubenswrapper[4594]: I1129 05:28:32.465458 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:32Z","lastTransitionTime":"2025-11-29T05:28:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:32 crc kubenswrapper[4594]: I1129 05:28:32.567296 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:32 crc kubenswrapper[4594]: I1129 05:28:32.567353 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:32 crc kubenswrapper[4594]: I1129 05:28:32.567365 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:32 crc kubenswrapper[4594]: I1129 05:28:32.567380 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:32 crc kubenswrapper[4594]: I1129 05:28:32.567390 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:32Z","lastTransitionTime":"2025-11-29T05:28:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:32 crc kubenswrapper[4594]: I1129 05:28:32.669607 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:32 crc kubenswrapper[4594]: I1129 05:28:32.669648 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:32 crc kubenswrapper[4594]: I1129 05:28:32.669657 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:32 crc kubenswrapper[4594]: I1129 05:28:32.669675 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:32 crc kubenswrapper[4594]: I1129 05:28:32.669686 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:32Z","lastTransitionTime":"2025-11-29T05:28:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:32 crc kubenswrapper[4594]: I1129 05:28:32.697349 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/217088b9-a48b-40c7-8d83-f9ff0eb24908-metrics-certs\") pod \"network-metrics-daemon-lzr56\" (UID: \"217088b9-a48b-40c7-8d83-f9ff0eb24908\") " pod="openshift-multus/network-metrics-daemon-lzr56" Nov 29 05:28:32 crc kubenswrapper[4594]: E1129 05:28:32.697601 4594 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 29 05:28:32 crc kubenswrapper[4594]: E1129 05:28:32.697705 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/217088b9-a48b-40c7-8d83-f9ff0eb24908-metrics-certs podName:217088b9-a48b-40c7-8d83-f9ff0eb24908 nodeName:}" failed. No retries permitted until 2025-11-29 05:28:34.697678422 +0000 UTC m=+38.938187652 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/217088b9-a48b-40c7-8d83-f9ff0eb24908-metrics-certs") pod "network-metrics-daemon-lzr56" (UID: "217088b9-a48b-40c7-8d83-f9ff0eb24908") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 29 05:28:32 crc kubenswrapper[4594]: I1129 05:28:32.772135 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:32 crc kubenswrapper[4594]: I1129 05:28:32.772169 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:32 crc kubenswrapper[4594]: I1129 05:28:32.772180 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:32 crc kubenswrapper[4594]: I1129 05:28:32.772196 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:32 crc kubenswrapper[4594]: I1129 05:28:32.772205 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:32Z","lastTransitionTime":"2025-11-29T05:28:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:32 crc kubenswrapper[4594]: I1129 05:28:32.874807 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:32 crc kubenswrapper[4594]: I1129 05:28:32.874838 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:32 crc kubenswrapper[4594]: I1129 05:28:32.874849 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:32 crc kubenswrapper[4594]: I1129 05:28:32.874861 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:32 crc kubenswrapper[4594]: I1129 05:28:32.874876 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:32Z","lastTransitionTime":"2025-11-29T05:28:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:32 crc kubenswrapper[4594]: I1129 05:28:32.976730 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:32 crc kubenswrapper[4594]: I1129 05:28:32.976777 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:32 crc kubenswrapper[4594]: I1129 05:28:32.976794 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:32 crc kubenswrapper[4594]: I1129 05:28:32.976816 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:32 crc kubenswrapper[4594]: I1129 05:28:32.976831 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:32Z","lastTransitionTime":"2025-11-29T05:28:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:33 crc kubenswrapper[4594]: I1129 05:28:33.078937 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:33 crc kubenswrapper[4594]: I1129 05:28:33.078992 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:33 crc kubenswrapper[4594]: I1129 05:28:33.079001 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:33 crc kubenswrapper[4594]: I1129 05:28:33.079017 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:33 crc kubenswrapper[4594]: I1129 05:28:33.079036 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:33Z","lastTransitionTime":"2025-11-29T05:28:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:33 crc kubenswrapper[4594]: I1129 05:28:33.083338 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lzr56" Nov 29 05:28:33 crc kubenswrapper[4594]: E1129 05:28:33.083463 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lzr56" podUID="217088b9-a48b-40c7-8d83-f9ff0eb24908" Nov 29 05:28:33 crc kubenswrapper[4594]: I1129 05:28:33.181075 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:33 crc kubenswrapper[4594]: I1129 05:28:33.181110 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:33 crc kubenswrapper[4594]: I1129 05:28:33.181121 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:33 crc kubenswrapper[4594]: I1129 05:28:33.181135 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:33 crc kubenswrapper[4594]: I1129 05:28:33.181143 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:33Z","lastTransitionTime":"2025-11-29T05:28:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:33 crc kubenswrapper[4594]: I1129 05:28:33.283212 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:33 crc kubenswrapper[4594]: I1129 05:28:33.283313 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:33 crc kubenswrapper[4594]: I1129 05:28:33.283324 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:33 crc kubenswrapper[4594]: I1129 05:28:33.283336 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:33 crc kubenswrapper[4594]: I1129 05:28:33.283345 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:33Z","lastTransitionTime":"2025-11-29T05:28:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:33 crc kubenswrapper[4594]: I1129 05:28:33.385825 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:33 crc kubenswrapper[4594]: I1129 05:28:33.385866 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:33 crc kubenswrapper[4594]: I1129 05:28:33.385876 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:33 crc kubenswrapper[4594]: I1129 05:28:33.385892 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:33 crc kubenswrapper[4594]: I1129 05:28:33.385909 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:33Z","lastTransitionTime":"2025-11-29T05:28:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:33 crc kubenswrapper[4594]: I1129 05:28:33.487642 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:33 crc kubenswrapper[4594]: I1129 05:28:33.487675 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:33 crc kubenswrapper[4594]: I1129 05:28:33.487685 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:33 crc kubenswrapper[4594]: I1129 05:28:33.487698 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:33 crc kubenswrapper[4594]: I1129 05:28:33.487707 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:33Z","lastTransitionTime":"2025-11-29T05:28:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:33 crc kubenswrapper[4594]: I1129 05:28:33.590910 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:33 crc kubenswrapper[4594]: I1129 05:28:33.590958 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:33 crc kubenswrapper[4594]: I1129 05:28:33.590969 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:33 crc kubenswrapper[4594]: I1129 05:28:33.590982 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:33 crc kubenswrapper[4594]: I1129 05:28:33.590992 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:33Z","lastTransitionTime":"2025-11-29T05:28:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:33 crc kubenswrapper[4594]: I1129 05:28:33.692693 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:33 crc kubenswrapper[4594]: I1129 05:28:33.692725 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:33 crc kubenswrapper[4594]: I1129 05:28:33.692735 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:33 crc kubenswrapper[4594]: I1129 05:28:33.692749 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:33 crc kubenswrapper[4594]: I1129 05:28:33.692758 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:33Z","lastTransitionTime":"2025-11-29T05:28:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:33 crc kubenswrapper[4594]: I1129 05:28:33.794895 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:33 crc kubenswrapper[4594]: I1129 05:28:33.794947 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:33 crc kubenswrapper[4594]: I1129 05:28:33.794957 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:33 crc kubenswrapper[4594]: I1129 05:28:33.794972 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:33 crc kubenswrapper[4594]: I1129 05:28:33.794984 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:33Z","lastTransitionTime":"2025-11-29T05:28:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:33 crc kubenswrapper[4594]: I1129 05:28:33.897070 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:33 crc kubenswrapper[4594]: I1129 05:28:33.897098 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:33 crc kubenswrapper[4594]: I1129 05:28:33.897107 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:33 crc kubenswrapper[4594]: I1129 05:28:33.897124 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:33 crc kubenswrapper[4594]: I1129 05:28:33.897136 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:33Z","lastTransitionTime":"2025-11-29T05:28:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:33 crc kubenswrapper[4594]: I1129 05:28:33.999161 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:33 crc kubenswrapper[4594]: I1129 05:28:33.999184 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:33 crc kubenswrapper[4594]: I1129 05:28:33.999193 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:33 crc kubenswrapper[4594]: I1129 05:28:33.999202 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:33 crc kubenswrapper[4594]: I1129 05:28:33.999210 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:33Z","lastTransitionTime":"2025-11-29T05:28:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:34 crc kubenswrapper[4594]: I1129 05:28:34.083538 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 05:28:34 crc kubenswrapper[4594]: I1129 05:28:34.083538 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 05:28:34 crc kubenswrapper[4594]: I1129 05:28:34.083741 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 05:28:34 crc kubenswrapper[4594]: E1129 05:28:34.083656 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 05:28:34 crc kubenswrapper[4594]: E1129 05:28:34.083904 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 05:28:34 crc kubenswrapper[4594]: E1129 05:28:34.084211 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 05:28:34 crc kubenswrapper[4594]: I1129 05:28:34.101123 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:34 crc kubenswrapper[4594]: I1129 05:28:34.101156 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:34 crc kubenswrapper[4594]: I1129 05:28:34.101165 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:34 crc kubenswrapper[4594]: I1129 05:28:34.101176 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:34 crc kubenswrapper[4594]: I1129 05:28:34.101186 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:34Z","lastTransitionTime":"2025-11-29T05:28:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:34 crc kubenswrapper[4594]: I1129 05:28:34.203292 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:34 crc kubenswrapper[4594]: I1129 05:28:34.203329 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:34 crc kubenswrapper[4594]: I1129 05:28:34.203339 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:34 crc kubenswrapper[4594]: I1129 05:28:34.203353 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:34 crc kubenswrapper[4594]: I1129 05:28:34.203363 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:34Z","lastTransitionTime":"2025-11-29T05:28:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:34 crc kubenswrapper[4594]: I1129 05:28:34.305611 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:34 crc kubenswrapper[4594]: I1129 05:28:34.305644 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:34 crc kubenswrapper[4594]: I1129 05:28:34.305654 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:34 crc kubenswrapper[4594]: I1129 05:28:34.305666 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:34 crc kubenswrapper[4594]: I1129 05:28:34.305674 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:34Z","lastTransitionTime":"2025-11-29T05:28:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:34 crc kubenswrapper[4594]: I1129 05:28:34.406847 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:34 crc kubenswrapper[4594]: I1129 05:28:34.406875 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:34 crc kubenswrapper[4594]: I1129 05:28:34.406895 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:34 crc kubenswrapper[4594]: I1129 05:28:34.406906 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:34 crc kubenswrapper[4594]: I1129 05:28:34.406929 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:34Z","lastTransitionTime":"2025-11-29T05:28:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:34 crc kubenswrapper[4594]: I1129 05:28:34.509146 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:34 crc kubenswrapper[4594]: I1129 05:28:34.509174 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:34 crc kubenswrapper[4594]: I1129 05:28:34.509184 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:34 crc kubenswrapper[4594]: I1129 05:28:34.509195 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:34 crc kubenswrapper[4594]: I1129 05:28:34.509203 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:34Z","lastTransitionTime":"2025-11-29T05:28:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:34 crc kubenswrapper[4594]: I1129 05:28:34.611306 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:34 crc kubenswrapper[4594]: I1129 05:28:34.611330 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:34 crc kubenswrapper[4594]: I1129 05:28:34.611340 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:34 crc kubenswrapper[4594]: I1129 05:28:34.611351 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:34 crc kubenswrapper[4594]: I1129 05:28:34.611360 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:34Z","lastTransitionTime":"2025-11-29T05:28:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:34 crc kubenswrapper[4594]: I1129 05:28:34.713347 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:34 crc kubenswrapper[4594]: I1129 05:28:34.713374 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:34 crc kubenswrapper[4594]: I1129 05:28:34.713386 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:34 crc kubenswrapper[4594]: I1129 05:28:34.713398 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:34 crc kubenswrapper[4594]: I1129 05:28:34.713406 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:34Z","lastTransitionTime":"2025-11-29T05:28:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:34 crc kubenswrapper[4594]: I1129 05:28:34.714878 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/217088b9-a48b-40c7-8d83-f9ff0eb24908-metrics-certs\") pod \"network-metrics-daemon-lzr56\" (UID: \"217088b9-a48b-40c7-8d83-f9ff0eb24908\") " pod="openshift-multus/network-metrics-daemon-lzr56" Nov 29 05:28:34 crc kubenswrapper[4594]: E1129 05:28:34.715006 4594 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 29 05:28:34 crc kubenswrapper[4594]: E1129 05:28:34.715067 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/217088b9-a48b-40c7-8d83-f9ff0eb24908-metrics-certs podName:217088b9-a48b-40c7-8d83-f9ff0eb24908 nodeName:}" failed. No retries permitted until 2025-11-29 05:28:38.715051083 +0000 UTC m=+42.955560303 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/217088b9-a48b-40c7-8d83-f9ff0eb24908-metrics-certs") pod "network-metrics-daemon-lzr56" (UID: "217088b9-a48b-40c7-8d83-f9ff0eb24908") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 29 05:28:34 crc kubenswrapper[4594]: I1129 05:28:34.815681 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:34 crc kubenswrapper[4594]: I1129 05:28:34.815713 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:34 crc kubenswrapper[4594]: I1129 05:28:34.815724 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:34 crc kubenswrapper[4594]: I1129 05:28:34.815735 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:34 crc kubenswrapper[4594]: I1129 05:28:34.815743 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:34Z","lastTransitionTime":"2025-11-29T05:28:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:34 crc kubenswrapper[4594]: I1129 05:28:34.917705 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:34 crc kubenswrapper[4594]: I1129 05:28:34.917735 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:34 crc kubenswrapper[4594]: I1129 05:28:34.917746 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:34 crc kubenswrapper[4594]: I1129 05:28:34.917756 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:34 crc kubenswrapper[4594]: I1129 05:28:34.917763 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:34Z","lastTransitionTime":"2025-11-29T05:28:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:35 crc kubenswrapper[4594]: I1129 05:28:35.019232 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:35 crc kubenswrapper[4594]: I1129 05:28:35.019276 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:35 crc kubenswrapper[4594]: I1129 05:28:35.019284 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:35 crc kubenswrapper[4594]: I1129 05:28:35.019295 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:35 crc kubenswrapper[4594]: I1129 05:28:35.019305 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:35Z","lastTransitionTime":"2025-11-29T05:28:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:35 crc kubenswrapper[4594]: I1129 05:28:35.083169 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lzr56" Nov 29 05:28:35 crc kubenswrapper[4594]: E1129 05:28:35.083322 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lzr56" podUID="217088b9-a48b-40c7-8d83-f9ff0eb24908" Nov 29 05:28:35 crc kubenswrapper[4594]: I1129 05:28:35.121582 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:35 crc kubenswrapper[4594]: I1129 05:28:35.121603 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:35 crc kubenswrapper[4594]: I1129 05:28:35.121612 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:35 crc kubenswrapper[4594]: I1129 05:28:35.121624 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:35 crc kubenswrapper[4594]: I1129 05:28:35.121635 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:35Z","lastTransitionTime":"2025-11-29T05:28:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:35 crc kubenswrapper[4594]: I1129 05:28:35.222954 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:35 crc kubenswrapper[4594]: I1129 05:28:35.222983 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:35 crc kubenswrapper[4594]: I1129 05:28:35.222993 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:35 crc kubenswrapper[4594]: I1129 05:28:35.223005 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:35 crc kubenswrapper[4594]: I1129 05:28:35.223013 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:35Z","lastTransitionTime":"2025-11-29T05:28:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:35 crc kubenswrapper[4594]: I1129 05:28:35.324796 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:35 crc kubenswrapper[4594]: I1129 05:28:35.324854 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:35 crc kubenswrapper[4594]: I1129 05:28:35.324870 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:35 crc kubenswrapper[4594]: I1129 05:28:35.324889 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:35 crc kubenswrapper[4594]: I1129 05:28:35.324901 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:35Z","lastTransitionTime":"2025-11-29T05:28:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:35 crc kubenswrapper[4594]: I1129 05:28:35.426684 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:35 crc kubenswrapper[4594]: I1129 05:28:35.426716 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:35 crc kubenswrapper[4594]: I1129 05:28:35.426724 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:35 crc kubenswrapper[4594]: I1129 05:28:35.426735 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:35 crc kubenswrapper[4594]: I1129 05:28:35.426743 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:35Z","lastTransitionTime":"2025-11-29T05:28:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:35 crc kubenswrapper[4594]: I1129 05:28:35.528777 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:35 crc kubenswrapper[4594]: I1129 05:28:35.528813 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:35 crc kubenswrapper[4594]: I1129 05:28:35.528822 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:35 crc kubenswrapper[4594]: I1129 05:28:35.528835 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:35 crc kubenswrapper[4594]: I1129 05:28:35.528846 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:35Z","lastTransitionTime":"2025-11-29T05:28:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:35 crc kubenswrapper[4594]: I1129 05:28:35.631030 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:35 crc kubenswrapper[4594]: I1129 05:28:35.631078 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:35 crc kubenswrapper[4594]: I1129 05:28:35.631087 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:35 crc kubenswrapper[4594]: I1129 05:28:35.631101 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:35 crc kubenswrapper[4594]: I1129 05:28:35.631111 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:35Z","lastTransitionTime":"2025-11-29T05:28:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:35 crc kubenswrapper[4594]: I1129 05:28:35.732922 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:35 crc kubenswrapper[4594]: I1129 05:28:35.732961 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:35 crc kubenswrapper[4594]: I1129 05:28:35.732972 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:35 crc kubenswrapper[4594]: I1129 05:28:35.732987 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:35 crc kubenswrapper[4594]: I1129 05:28:35.733001 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:35Z","lastTransitionTime":"2025-11-29T05:28:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:35 crc kubenswrapper[4594]: I1129 05:28:35.835605 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:35 crc kubenswrapper[4594]: I1129 05:28:35.835642 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:35 crc kubenswrapper[4594]: I1129 05:28:35.835654 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:35 crc kubenswrapper[4594]: I1129 05:28:35.835669 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:35 crc kubenswrapper[4594]: I1129 05:28:35.835680 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:35Z","lastTransitionTime":"2025-11-29T05:28:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:35 crc kubenswrapper[4594]: I1129 05:28:35.937817 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:35 crc kubenswrapper[4594]: I1129 05:28:35.937871 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:35 crc kubenswrapper[4594]: I1129 05:28:35.937880 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:35 crc kubenswrapper[4594]: I1129 05:28:35.937900 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:35 crc kubenswrapper[4594]: I1129 05:28:35.937923 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:35Z","lastTransitionTime":"2025-11-29T05:28:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:36 crc kubenswrapper[4594]: I1129 05:28:36.040127 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:36 crc kubenswrapper[4594]: I1129 05:28:36.040172 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:36 crc kubenswrapper[4594]: I1129 05:28:36.040181 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:36 crc kubenswrapper[4594]: I1129 05:28:36.040197 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:36 crc kubenswrapper[4594]: I1129 05:28:36.040208 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:36Z","lastTransitionTime":"2025-11-29T05:28:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:36 crc kubenswrapper[4594]: I1129 05:28:36.082586 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 05:28:36 crc kubenswrapper[4594]: I1129 05:28:36.082611 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 05:28:36 crc kubenswrapper[4594]: I1129 05:28:36.082762 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 05:28:36 crc kubenswrapper[4594]: E1129 05:28:36.082915 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 05:28:36 crc kubenswrapper[4594]: E1129 05:28:36.083072 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 05:28:36 crc kubenswrapper[4594]: E1129 05:28:36.083128 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 05:28:36 crc kubenswrapper[4594]: I1129 05:28:36.094780 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vn6fz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"240291e8-d7ec-4fc7-919f-e082a6890e2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1142d4b616972d87b8ac7b83cacdc37c3fd02df7cb4adfcabbaaaa18c5d2b8d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-
plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x25zr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af579fe3e5cce8fdb2dff72f206f5504582b8eee906db5180191ef91b990c0a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x25zr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vn6fz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:36Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:36 crc kubenswrapper[4594]: I1129 05:28:36.104182 4594 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-multus/network-metrics-daemon-lzr56" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"217088b9-a48b-40c7-8d83-f9ff0eb24908\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8gwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8gwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:31Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lzr56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:36Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:36 crc 
kubenswrapper[4594]: I1129 05:28:36.113194 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://867c23355f828dc4cc15d632f90c52174e9b9be07f3ec24f05c904709620de6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:36Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:36 crc kubenswrapper[4594]: I1129 05:28:36.122892 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:36Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:36 crc kubenswrapper[4594]: I1129 05:28:36.131888 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7plzz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5052790-d231-4f97-802c-c7de3cd72561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c23b1472cd349ae6a56137165463429204ffbaf2c1b477d3ba9d07d2f2799f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qglsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7plzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:36Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:36 crc kubenswrapper[4594]: I1129 05:28:36.141839 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:36 crc 
kubenswrapper[4594]: I1129 05:28:36.141867 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:36 crc kubenswrapper[4594]: I1129 05:28:36.141878 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:36 crc kubenswrapper[4594]: I1129 05:28:36.141890 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:36 crc kubenswrapper[4594]: I1129 05:28:36.141911 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:36Z","lastTransitionTime":"2025-11-29T05:28:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:36 crc kubenswrapper[4594]: I1129 05:28:36.143323 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:36Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:36 crc kubenswrapper[4594]: I1129 05:28:36.158834 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"08de5891-72ca-488c-80b3-6b54c8c1a66e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9f96824f977ec455622772e91ed8e6534dc93a71868e2b5e2aa9302ff9be135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cf2a15e6478f9f461a9927f570db4a81a8d752c8be46c7970b4956ae2031a38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a67e6b494618e268d6f5d75507c1cfe0e8e99c0e3cfff7548756a64f4f3a3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77543b04b619c48bf800a3d9dda1a969e4e32aa113a49287717a9e83f9cff96f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78fd1c6bdf08a557f0c2223c538a946709b9a225032c92ea75277140441a59c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db25dbee0954a026ecc2945dda40eb48c9caf78d31c44013055e9765024794d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4be9e9d41c9c964659c66458605455733afcc76dfcbf74fe74f5dff0e99ff171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4be9e9d41c9c964659c66458605455733afcc76dfcbf74fe74f5dff0e99ff171\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T05:28:25Z\\\",\\\"message\\\":\\\"10 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1129 05:28:25.959460 6110 services_controller.go:356] Processing sync for service openshift-service-ca-operator/metrics for network=default\\\\nI1129 05:28:25.959462 6110 ovn.go:134] Ensuring zone local for Pod 
openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI1129 05:28:25.959481 6110 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI1129 05:28:25.959494 6110 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nF1129 05:28:25.959445 6110 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Po\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-lp4zm_openshift-ovn-kubernetes(08de5891-72ca-488c-80b3-6b54c8c1a66e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cc8b34a52032d4716652fc9edcc35dd797309bf6886242671c068e65399d004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b86e524b600a831fe6b8bb71a513172f974b457d39363da90cd6152139eae898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b86e524b600a831fe6
b8bb71a513172f974b457d39363da90cd6152139eae898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lp4zm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:36Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:36 crc kubenswrapper[4594]: I1129 05:28:36.170808 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zwqcq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad2c26bf-c1c8-44b5-b4ed-d487072d358b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9db8528eef5425d9af2b8f11f244556f195b844e239a2137eef24c40fa158d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9e8c27b9b7a4a13a1f6ae5461acd535cc6784f4158213697307e4061f1e222\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b9e8c27b9b7a4a13a1f6ae5461acd535cc6784f4158213697307e4061f1e222\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa32ebafd1f717a0d3ba44387b7b36827e9e12589812bdc33b1dfb84bd41772\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fa32ebafd1f717a0d3ba44387b7b36827e9e12589812bdc33b1dfb84bd41772\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec1085c6e05fe233979922ace4946592ed4a7afe00a0f1c3a256c41a0964d26a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1085c6e05fe233979922ace4946592ed4a7afe00a0f1c3a256c41a0964d26a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://819e8
c7a8ea3f0f7a082da904dd9cc01b5d7e7e6520cafe67920a190063947ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://819e8c7a8ea3f0f7a082da904dd9cc01b5d7e7e6520cafe67920a190063947ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1602d1ea5e426042c520a4f98ed23530fc8f94f67f242a639c32a29817495d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1602d1ea5e426042c520a4f98ed23530fc8f94f67f242a639c32a29817495d73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:22Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81866e9b3e8415fc1e5af0eec4a7222deab6b282b1758dae13759a073192a554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81866e9b3e8415fc1e5af0eec4a7222deab6b282b1758dae13759a073192a554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zwqcq\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:36Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:36 crc kubenswrapper[4594]: I1129 05:28:36.180378 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:36Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:36 crc kubenswrapper[4594]: I1129 05:28:36.189387 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-26zjk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"77af4f9f-196d-436d-9fb6-69168bcb5f8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70467de0fada47ce38faab4cc04c54942aa5f7c0cacf29b1d21b64a11e6162c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxrks\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-26zjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:36Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:36 crc kubenswrapper[4594]: I1129 05:28:36.199291 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3be916e6-7f0f-45ae-8fe5-85d2c7e459f0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf23cd4a300a49fabdf8fcaacfda760563c8e53cf70de7a4c07be717fd76b660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a960cfe07ae731cc2d07957dba63be2897328d662dd2672bd61344c8baac0228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b416f8ad36a09f58e68b1e23784b344534ccfb013a3272c0754602431a3a3737\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://261ab2af67dd877cead51f204629b59233865717d4805c3d4d80c1a5d51e2021\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:27:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:36Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:36 crc kubenswrapper[4594]: I1129 05:28:36.208834 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba18db1b4391204c0fe395488d2726d914dc28c88b9cf449d83c9c181a7d04a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:36Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:36 crc kubenswrapper[4594]: I1129 05:28:36.217917 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"509844d4-3ee1-4059-84bb-6e90200f50c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5e396522be1cb7984f98fd416f9a6aeda1e6009e002e9c942a6cec0d4a4c22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4qlc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb25cb1d2d2a3aa84bc6fca3087cdf54b3b8e75bc258441b2895f37594df44e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4qlc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ggz4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:36Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:36 crc kubenswrapper[4594]: I1129 05:28:36.227048 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-87h4n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"080d2d67-a474-4974-94f3-81c5007e0a57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b58d5af901448b3e10a9a902daa31aecebe095bdae0e620c991323b99c52b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdnbj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-87h4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:36Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:36 crc kubenswrapper[4594]: I1129 05:28:36.236760 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1629901e-7133-4eb0-b634-5e458da9f205\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://583ca6d59501822022d5d15a450d4403592d248bce7be4b22b1c7baef4e93000\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b8d3be68869fef8b377ce1d84e3edf447496a867c40e88d013ef0d1c5629de2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6655c8a5c56e4d2b3e8fbd88308d04ded2b41f7243bf90cb2dbbd6ad93f3153f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a87a0320d47b3753baeb865c7ef9e024c65a96555fdd71c0f2a23c8c0b80d351\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ba49c3a694bc245ce18b54b74156e6e7aaccd4fcf707264a278f86f52943392\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T05:28:13Z\\\",\\\"message\\\":\\\"le observer\\\\nW1129 05:28:13.316789 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1129 05:28:13.316951 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1129 05:28:13.320123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1464513827/tls.crt::/tmp/serving-cert-1464513827/tls.key\\\\\\\"\\\\nI1129 05:28:13.540439 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1129 05:28:13.545990 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1129 05:28:13.546014 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1129 05:28:13.546050 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1129 05:28:13.546055 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1129 05:28:13.551788 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1129 05:28:13.551806 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' 
detected.\\\\nW1129 05:28:13.551809 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 05:28:13.551814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1129 05:28:13.551817 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1129 05:28:13.551820 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1129 05:28:13.551823 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1129 05:28:13.551915 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1129 05:28:13.553356 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e070e0ea87ae0cfa8fcc6aadd13e7ae3c2ab3eb1ea3daf6178c9d35796ba3ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\
"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:27:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:36Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:36 crc kubenswrapper[4594]: I1129 05:28:36.243450 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:36 crc kubenswrapper[4594]: I1129 05:28:36.243482 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:36 crc kubenswrapper[4594]: I1129 05:28:36.243495 4594 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Nov 29 05:28:36 crc kubenswrapper[4594]: I1129 05:28:36.243513 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:36 crc kubenswrapper[4594]: I1129 05:28:36.243525 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:36Z","lastTransitionTime":"2025-11-29T05:28:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:36 crc kubenswrapper[4594]: I1129 05:28:36.246787 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c84b578286ef0da9c863bedb4ebec224e34007c7f59cead26dac786d7e4b5e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1975ee2ce0af02676f9925944a31b218cb17cdf4dfb85c6edbfd124ac58fd22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:36Z is after 2025-08-24T17:21:41Z" 
Nov 29 05:28:36 crc kubenswrapper[4594]: I1129 05:28:36.345463 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:36 crc kubenswrapper[4594]: I1129 05:28:36.345503 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:36 crc kubenswrapper[4594]: I1129 05:28:36.345514 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:36 crc kubenswrapper[4594]: I1129 05:28:36.345534 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:36 crc kubenswrapper[4594]: I1129 05:28:36.345546 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:36Z","lastTransitionTime":"2025-11-29T05:28:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:36 crc kubenswrapper[4594]: I1129 05:28:36.448372 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:36 crc kubenswrapper[4594]: I1129 05:28:36.448411 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:36 crc kubenswrapper[4594]: I1129 05:28:36.448420 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:36 crc kubenswrapper[4594]: I1129 05:28:36.448436 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:36 crc kubenswrapper[4594]: I1129 05:28:36.448449 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:36Z","lastTransitionTime":"2025-11-29T05:28:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:36 crc kubenswrapper[4594]: I1129 05:28:36.551341 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:36 crc kubenswrapper[4594]: I1129 05:28:36.551387 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:36 crc kubenswrapper[4594]: I1129 05:28:36.551396 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:36 crc kubenswrapper[4594]: I1129 05:28:36.551413 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:36 crc kubenswrapper[4594]: I1129 05:28:36.551426 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:36Z","lastTransitionTime":"2025-11-29T05:28:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:36 crc kubenswrapper[4594]: I1129 05:28:36.653496 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:36 crc kubenswrapper[4594]: I1129 05:28:36.653532 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:36 crc kubenswrapper[4594]: I1129 05:28:36.653543 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:36 crc kubenswrapper[4594]: I1129 05:28:36.653554 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:36 crc kubenswrapper[4594]: I1129 05:28:36.653564 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:36Z","lastTransitionTime":"2025-11-29T05:28:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:36 crc kubenswrapper[4594]: I1129 05:28:36.755638 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:36 crc kubenswrapper[4594]: I1129 05:28:36.755683 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:36 crc kubenswrapper[4594]: I1129 05:28:36.755692 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:36 crc kubenswrapper[4594]: I1129 05:28:36.755711 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:36 crc kubenswrapper[4594]: I1129 05:28:36.755723 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:36Z","lastTransitionTime":"2025-11-29T05:28:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:36 crc kubenswrapper[4594]: I1129 05:28:36.811890 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:36 crc kubenswrapper[4594]: I1129 05:28:36.811937 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:36 crc kubenswrapper[4594]: I1129 05:28:36.811949 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:36 crc kubenswrapper[4594]: I1129 05:28:36.811960 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:36 crc kubenswrapper[4594]: I1129 05:28:36.811968 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:36Z","lastTransitionTime":"2025-11-29T05:28:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:36 crc kubenswrapper[4594]: E1129 05:28:36.822339 4594 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:28:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:28:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:28:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:28:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"278d67da-c900-4140-8f95-1590a0940f46\\\",\\\"systemUUID\\\":\\\"64455698-ce5a-4229-a093-dc9c12114354\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:36Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:36 crc kubenswrapper[4594]: I1129 05:28:36.826421 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:36 crc kubenswrapper[4594]: I1129 05:28:36.826449 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:36 crc kubenswrapper[4594]: I1129 05:28:36.826461 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:36 crc kubenswrapper[4594]: I1129 05:28:36.826476 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:36 crc kubenswrapper[4594]: I1129 05:28:36.826486 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:36Z","lastTransitionTime":"2025-11-29T05:28:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:36 crc kubenswrapper[4594]: E1129 05:28:36.836065 4594 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:28:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:28:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:28:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:28:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"278d67da-c900-4140-8f95-1590a0940f46\\\",\\\"systemUUID\\\":\\\"64455698-ce5a-4229-a093-dc9c12114354\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:36Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:36 crc kubenswrapper[4594]: I1129 05:28:36.838957 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:36 crc kubenswrapper[4594]: I1129 05:28:36.838992 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:36 crc kubenswrapper[4594]: I1129 05:28:36.839004 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:36 crc kubenswrapper[4594]: I1129 05:28:36.839015 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:36 crc kubenswrapper[4594]: I1129 05:28:36.839024 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:36Z","lastTransitionTime":"2025-11-29T05:28:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:36 crc kubenswrapper[4594]: E1129 05:28:36.848468 4594 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:28:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:28:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:28:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:28:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"278d67da-c900-4140-8f95-1590a0940f46\\\",\\\"systemUUID\\\":\\\"64455698-ce5a-4229-a093-dc9c12114354\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:36Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:36 crc kubenswrapper[4594]: I1129 05:28:36.851381 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:36 crc kubenswrapper[4594]: I1129 05:28:36.851423 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:36 crc kubenswrapper[4594]: I1129 05:28:36.851437 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:36 crc kubenswrapper[4594]: I1129 05:28:36.851452 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:36 crc kubenswrapper[4594]: I1129 05:28:36.851465 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:36Z","lastTransitionTime":"2025-11-29T05:28:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:36 crc kubenswrapper[4594]: E1129 05:28:36.860830 4594 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:28:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:28:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:28:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:28:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"278d67da-c900-4140-8f95-1590a0940f46\\\",\\\"systemUUID\\\":\\\"64455698-ce5a-4229-a093-dc9c12114354\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:36Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:36 crc kubenswrapper[4594]: I1129 05:28:36.863797 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:36 crc kubenswrapper[4594]: I1129 05:28:36.863834 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:36 crc kubenswrapper[4594]: I1129 05:28:36.863846 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:36 crc kubenswrapper[4594]: I1129 05:28:36.863857 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:36 crc kubenswrapper[4594]: I1129 05:28:36.863867 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:36Z","lastTransitionTime":"2025-11-29T05:28:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:36 crc kubenswrapper[4594]: E1129 05:28:36.873075 4594 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:28:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:28:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:28:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:28:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"278d67da-c900-4140-8f95-1590a0940f46\\\",\\\"systemUUID\\\":\\\"64455698-ce5a-4229-a093-dc9c12114354\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:36Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:36 crc kubenswrapper[4594]: E1129 05:28:36.873192 4594 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 29 05:28:36 crc kubenswrapper[4594]: I1129 05:28:36.874557 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:36 crc kubenswrapper[4594]: I1129 05:28:36.874616 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:36 crc kubenswrapper[4594]: I1129 05:28:36.874630 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:36 crc kubenswrapper[4594]: I1129 05:28:36.874643 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:36 crc kubenswrapper[4594]: I1129 05:28:36.874656 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:36Z","lastTransitionTime":"2025-11-29T05:28:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:36 crc kubenswrapper[4594]: I1129 05:28:36.976947 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:36 crc kubenswrapper[4594]: I1129 05:28:36.977006 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:36 crc kubenswrapper[4594]: I1129 05:28:36.977019 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:36 crc kubenswrapper[4594]: I1129 05:28:36.977036 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:36 crc kubenswrapper[4594]: I1129 05:28:36.977046 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:36Z","lastTransitionTime":"2025-11-29T05:28:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:37 crc kubenswrapper[4594]: I1129 05:28:37.078918 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:37 crc kubenswrapper[4594]: I1129 05:28:37.078945 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:37 crc kubenswrapper[4594]: I1129 05:28:37.078952 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:37 crc kubenswrapper[4594]: I1129 05:28:37.078964 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:37 crc kubenswrapper[4594]: I1129 05:28:37.078971 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:37Z","lastTransitionTime":"2025-11-29T05:28:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:37 crc kubenswrapper[4594]: I1129 05:28:37.082623 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lzr56" Nov 29 05:28:37 crc kubenswrapper[4594]: E1129 05:28:37.082753 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lzr56" podUID="217088b9-a48b-40c7-8d83-f9ff0eb24908" Nov 29 05:28:37 crc kubenswrapper[4594]: I1129 05:28:37.181496 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:37 crc kubenswrapper[4594]: I1129 05:28:37.181523 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:37 crc kubenswrapper[4594]: I1129 05:28:37.181534 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:37 crc kubenswrapper[4594]: I1129 05:28:37.181548 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:37 crc kubenswrapper[4594]: I1129 05:28:37.181556 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:37Z","lastTransitionTime":"2025-11-29T05:28:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:37 crc kubenswrapper[4594]: I1129 05:28:37.283078 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:37 crc kubenswrapper[4594]: I1129 05:28:37.283116 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:37 crc kubenswrapper[4594]: I1129 05:28:37.283125 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:37 crc kubenswrapper[4594]: I1129 05:28:37.283140 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:37 crc kubenswrapper[4594]: I1129 05:28:37.283152 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:37Z","lastTransitionTime":"2025-11-29T05:28:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:37 crc kubenswrapper[4594]: I1129 05:28:37.385772 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:37 crc kubenswrapper[4594]: I1129 05:28:37.385823 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:37 crc kubenswrapper[4594]: I1129 05:28:37.385836 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:37 crc kubenswrapper[4594]: I1129 05:28:37.385859 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:37 crc kubenswrapper[4594]: I1129 05:28:37.385878 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:37Z","lastTransitionTime":"2025-11-29T05:28:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:37 crc kubenswrapper[4594]: I1129 05:28:37.488202 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:37 crc kubenswrapper[4594]: I1129 05:28:37.488343 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:37 crc kubenswrapper[4594]: I1129 05:28:37.488457 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:37 crc kubenswrapper[4594]: I1129 05:28:37.488521 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:37 crc kubenswrapper[4594]: I1129 05:28:37.488579 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:37Z","lastTransitionTime":"2025-11-29T05:28:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:37 crc kubenswrapper[4594]: I1129 05:28:37.590479 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:37 crc kubenswrapper[4594]: I1129 05:28:37.590509 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:37 crc kubenswrapper[4594]: I1129 05:28:37.590518 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:37 crc kubenswrapper[4594]: I1129 05:28:37.590531 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:37 crc kubenswrapper[4594]: I1129 05:28:37.590543 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:37Z","lastTransitionTime":"2025-11-29T05:28:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:37 crc kubenswrapper[4594]: I1129 05:28:37.692511 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:37 crc kubenswrapper[4594]: I1129 05:28:37.692554 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:37 crc kubenswrapper[4594]: I1129 05:28:37.692564 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:37 crc kubenswrapper[4594]: I1129 05:28:37.692579 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:37 crc kubenswrapper[4594]: I1129 05:28:37.692590 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:37Z","lastTransitionTime":"2025-11-29T05:28:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:37 crc kubenswrapper[4594]: I1129 05:28:37.794121 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:37 crc kubenswrapper[4594]: I1129 05:28:37.794151 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:37 crc kubenswrapper[4594]: I1129 05:28:37.794161 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:37 crc kubenswrapper[4594]: I1129 05:28:37.794172 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:37 crc kubenswrapper[4594]: I1129 05:28:37.794182 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:37Z","lastTransitionTime":"2025-11-29T05:28:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:37 crc kubenswrapper[4594]: I1129 05:28:37.895841 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:37 crc kubenswrapper[4594]: I1129 05:28:37.895885 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:37 crc kubenswrapper[4594]: I1129 05:28:37.895904 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:37 crc kubenswrapper[4594]: I1129 05:28:37.895939 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:37 crc kubenswrapper[4594]: I1129 05:28:37.895951 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:37Z","lastTransitionTime":"2025-11-29T05:28:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:37 crc kubenswrapper[4594]: I1129 05:28:37.997362 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:37 crc kubenswrapper[4594]: I1129 05:28:37.998034 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:37 crc kubenswrapper[4594]: I1129 05:28:37.998085 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:37 crc kubenswrapper[4594]: I1129 05:28:37.998116 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:37 crc kubenswrapper[4594]: I1129 05:28:37.998133 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:37Z","lastTransitionTime":"2025-11-29T05:28:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:38 crc kubenswrapper[4594]: I1129 05:28:38.083334 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 05:28:38 crc kubenswrapper[4594]: I1129 05:28:38.083354 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 05:28:38 crc kubenswrapper[4594]: E1129 05:28:38.083532 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 05:28:38 crc kubenswrapper[4594]: I1129 05:28:38.083372 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 05:28:38 crc kubenswrapper[4594]: E1129 05:28:38.083580 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 05:28:38 crc kubenswrapper[4594]: E1129 05:28:38.083675 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 05:28:38 crc kubenswrapper[4594]: I1129 05:28:38.099774 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:38 crc kubenswrapper[4594]: I1129 05:28:38.099811 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:38 crc kubenswrapper[4594]: I1129 05:28:38.099822 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:38 crc kubenswrapper[4594]: I1129 05:28:38.099833 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:38 crc kubenswrapper[4594]: I1129 05:28:38.099845 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:38Z","lastTransitionTime":"2025-11-29T05:28:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:38 crc kubenswrapper[4594]: I1129 05:28:38.202197 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:38 crc kubenswrapper[4594]: I1129 05:28:38.202232 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:38 crc kubenswrapper[4594]: I1129 05:28:38.202242 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:38 crc kubenswrapper[4594]: I1129 05:28:38.202282 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:38 crc kubenswrapper[4594]: I1129 05:28:38.202299 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:38Z","lastTransitionTime":"2025-11-29T05:28:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:38 crc kubenswrapper[4594]: I1129 05:28:38.304109 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:38 crc kubenswrapper[4594]: I1129 05:28:38.304146 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:38 crc kubenswrapper[4594]: I1129 05:28:38.304159 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:38 crc kubenswrapper[4594]: I1129 05:28:38.304172 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:38 crc kubenswrapper[4594]: I1129 05:28:38.304184 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:38Z","lastTransitionTime":"2025-11-29T05:28:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:38 crc kubenswrapper[4594]: I1129 05:28:38.406381 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:38 crc kubenswrapper[4594]: I1129 05:28:38.406406 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:38 crc kubenswrapper[4594]: I1129 05:28:38.406415 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:38 crc kubenswrapper[4594]: I1129 05:28:38.406427 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:38 crc kubenswrapper[4594]: I1129 05:28:38.406435 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:38Z","lastTransitionTime":"2025-11-29T05:28:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:38 crc kubenswrapper[4594]: I1129 05:28:38.508790 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:38 crc kubenswrapper[4594]: I1129 05:28:38.508822 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:38 crc kubenswrapper[4594]: I1129 05:28:38.508832 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:38 crc kubenswrapper[4594]: I1129 05:28:38.508844 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:38 crc kubenswrapper[4594]: I1129 05:28:38.508863 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:38Z","lastTransitionTime":"2025-11-29T05:28:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:38 crc kubenswrapper[4594]: I1129 05:28:38.611033 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:38 crc kubenswrapper[4594]: I1129 05:28:38.611056 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:38 crc kubenswrapper[4594]: I1129 05:28:38.611065 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:38 crc kubenswrapper[4594]: I1129 05:28:38.611078 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:38 crc kubenswrapper[4594]: I1129 05:28:38.611087 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:38Z","lastTransitionTime":"2025-11-29T05:28:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:38 crc kubenswrapper[4594]: I1129 05:28:38.713204 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:38 crc kubenswrapper[4594]: I1129 05:28:38.713246 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:38 crc kubenswrapper[4594]: I1129 05:28:38.713280 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:38 crc kubenswrapper[4594]: I1129 05:28:38.713303 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:38 crc kubenswrapper[4594]: I1129 05:28:38.713325 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:38Z","lastTransitionTime":"2025-11-29T05:28:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:38 crc kubenswrapper[4594]: I1129 05:28:38.749861 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/217088b9-a48b-40c7-8d83-f9ff0eb24908-metrics-certs\") pod \"network-metrics-daemon-lzr56\" (UID: \"217088b9-a48b-40c7-8d83-f9ff0eb24908\") " pod="openshift-multus/network-metrics-daemon-lzr56" Nov 29 05:28:38 crc kubenswrapper[4594]: E1129 05:28:38.750007 4594 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 29 05:28:38 crc kubenswrapper[4594]: E1129 05:28:38.750063 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/217088b9-a48b-40c7-8d83-f9ff0eb24908-metrics-certs podName:217088b9-a48b-40c7-8d83-f9ff0eb24908 nodeName:}" failed. No retries permitted until 2025-11-29 05:28:46.750050342 +0000 UTC m=+50.990559561 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/217088b9-a48b-40c7-8d83-f9ff0eb24908-metrics-certs") pod "network-metrics-daemon-lzr56" (UID: "217088b9-a48b-40c7-8d83-f9ff0eb24908") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 29 05:28:38 crc kubenswrapper[4594]: I1129 05:28:38.815960 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:38 crc kubenswrapper[4594]: I1129 05:28:38.815994 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:38 crc kubenswrapper[4594]: I1129 05:28:38.816004 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:38 crc kubenswrapper[4594]: I1129 05:28:38.816018 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:38 crc kubenswrapper[4594]: I1129 05:28:38.816030 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:38Z","lastTransitionTime":"2025-11-29T05:28:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:38 crc kubenswrapper[4594]: I1129 05:28:38.918142 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:38 crc kubenswrapper[4594]: I1129 05:28:38.918182 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:38 crc kubenswrapper[4594]: I1129 05:28:38.918193 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:38 crc kubenswrapper[4594]: I1129 05:28:38.918209 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:38 crc kubenswrapper[4594]: I1129 05:28:38.918218 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:38Z","lastTransitionTime":"2025-11-29T05:28:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:39 crc kubenswrapper[4594]: I1129 05:28:39.020335 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:39 crc kubenswrapper[4594]: I1129 05:28:39.020372 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:39 crc kubenswrapper[4594]: I1129 05:28:39.020391 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:39 crc kubenswrapper[4594]: I1129 05:28:39.020410 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:39 crc kubenswrapper[4594]: I1129 05:28:39.020420 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:39Z","lastTransitionTime":"2025-11-29T05:28:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:39 crc kubenswrapper[4594]: I1129 05:28:39.083540 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lzr56" Nov 29 05:28:39 crc kubenswrapper[4594]: E1129 05:28:39.083697 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lzr56" podUID="217088b9-a48b-40c7-8d83-f9ff0eb24908" Nov 29 05:28:39 crc kubenswrapper[4594]: I1129 05:28:39.122479 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:39 crc kubenswrapper[4594]: I1129 05:28:39.122505 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:39 crc kubenswrapper[4594]: I1129 05:28:39.122516 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:39 crc kubenswrapper[4594]: I1129 05:28:39.122527 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:39 crc kubenswrapper[4594]: I1129 05:28:39.122535 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:39Z","lastTransitionTime":"2025-11-29T05:28:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:39 crc kubenswrapper[4594]: I1129 05:28:39.224866 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:39 crc kubenswrapper[4594]: I1129 05:28:39.224895 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:39 crc kubenswrapper[4594]: I1129 05:28:39.224904 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:39 crc kubenswrapper[4594]: I1129 05:28:39.224914 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:39 crc kubenswrapper[4594]: I1129 05:28:39.224921 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:39Z","lastTransitionTime":"2025-11-29T05:28:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:39 crc kubenswrapper[4594]: I1129 05:28:39.326760 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:39 crc kubenswrapper[4594]: I1129 05:28:39.326794 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:39 crc kubenswrapper[4594]: I1129 05:28:39.326803 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:39 crc kubenswrapper[4594]: I1129 05:28:39.326813 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:39 crc kubenswrapper[4594]: I1129 05:28:39.326826 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:39Z","lastTransitionTime":"2025-11-29T05:28:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:39 crc kubenswrapper[4594]: I1129 05:28:39.428860 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:39 crc kubenswrapper[4594]: I1129 05:28:39.428923 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:39 crc kubenswrapper[4594]: I1129 05:28:39.428934 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:39 crc kubenswrapper[4594]: I1129 05:28:39.428959 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:39 crc kubenswrapper[4594]: I1129 05:28:39.428972 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:39Z","lastTransitionTime":"2025-11-29T05:28:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:39 crc kubenswrapper[4594]: I1129 05:28:39.530835 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:39 crc kubenswrapper[4594]: I1129 05:28:39.530860 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:39 crc kubenswrapper[4594]: I1129 05:28:39.530868 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:39 crc kubenswrapper[4594]: I1129 05:28:39.530877 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:39 crc kubenswrapper[4594]: I1129 05:28:39.530891 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:39Z","lastTransitionTime":"2025-11-29T05:28:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:39 crc kubenswrapper[4594]: I1129 05:28:39.633198 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:39 crc kubenswrapper[4594]: I1129 05:28:39.633231 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:39 crc kubenswrapper[4594]: I1129 05:28:39.633242 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:39 crc kubenswrapper[4594]: I1129 05:28:39.633280 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:39 crc kubenswrapper[4594]: I1129 05:28:39.633293 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:39Z","lastTransitionTime":"2025-11-29T05:28:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:39 crc kubenswrapper[4594]: I1129 05:28:39.735588 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:39 crc kubenswrapper[4594]: I1129 05:28:39.735640 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:39 crc kubenswrapper[4594]: I1129 05:28:39.735649 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:39 crc kubenswrapper[4594]: I1129 05:28:39.735670 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:39 crc kubenswrapper[4594]: I1129 05:28:39.735683 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:39Z","lastTransitionTime":"2025-11-29T05:28:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:39 crc kubenswrapper[4594]: I1129 05:28:39.837636 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:39 crc kubenswrapper[4594]: I1129 05:28:39.837665 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:39 crc kubenswrapper[4594]: I1129 05:28:39.837673 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:39 crc kubenswrapper[4594]: I1129 05:28:39.837701 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:39 crc kubenswrapper[4594]: I1129 05:28:39.837711 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:39Z","lastTransitionTime":"2025-11-29T05:28:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:39 crc kubenswrapper[4594]: I1129 05:28:39.940136 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:39 crc kubenswrapper[4594]: I1129 05:28:39.940177 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:39 crc kubenswrapper[4594]: I1129 05:28:39.940188 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:39 crc kubenswrapper[4594]: I1129 05:28:39.940202 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:39 crc kubenswrapper[4594]: I1129 05:28:39.940213 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:39Z","lastTransitionTime":"2025-11-29T05:28:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:40 crc kubenswrapper[4594]: I1129 05:28:40.042145 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:40 crc kubenswrapper[4594]: I1129 05:28:40.042171 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:40 crc kubenswrapper[4594]: I1129 05:28:40.042179 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:40 crc kubenswrapper[4594]: I1129 05:28:40.042191 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:40 crc kubenswrapper[4594]: I1129 05:28:40.042199 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:40Z","lastTransitionTime":"2025-11-29T05:28:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:40 crc kubenswrapper[4594]: I1129 05:28:40.083088 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 05:28:40 crc kubenswrapper[4594]: I1129 05:28:40.083097 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 05:28:40 crc kubenswrapper[4594]: I1129 05:28:40.083160 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 05:28:40 crc kubenswrapper[4594]: E1129 05:28:40.083197 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 05:28:40 crc kubenswrapper[4594]: E1129 05:28:40.083328 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 05:28:40 crc kubenswrapper[4594]: E1129 05:28:40.083486 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 05:28:40 crc kubenswrapper[4594]: I1129 05:28:40.144000 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:40 crc kubenswrapper[4594]: I1129 05:28:40.144039 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:40 crc kubenswrapper[4594]: I1129 05:28:40.144048 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:40 crc kubenswrapper[4594]: I1129 05:28:40.144062 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:40 crc kubenswrapper[4594]: I1129 05:28:40.144076 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:40Z","lastTransitionTime":"2025-11-29T05:28:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:40 crc kubenswrapper[4594]: I1129 05:28:40.246490 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:40 crc kubenswrapper[4594]: I1129 05:28:40.246528 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:40 crc kubenswrapper[4594]: I1129 05:28:40.246539 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:40 crc kubenswrapper[4594]: I1129 05:28:40.246555 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:40 crc kubenswrapper[4594]: I1129 05:28:40.246566 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:40Z","lastTransitionTime":"2025-11-29T05:28:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:40 crc kubenswrapper[4594]: I1129 05:28:40.348222 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:40 crc kubenswrapper[4594]: I1129 05:28:40.348283 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:40 crc kubenswrapper[4594]: I1129 05:28:40.348295 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:40 crc kubenswrapper[4594]: I1129 05:28:40.348306 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:40 crc kubenswrapper[4594]: I1129 05:28:40.348313 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:40Z","lastTransitionTime":"2025-11-29T05:28:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:40 crc kubenswrapper[4594]: I1129 05:28:40.450330 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:40 crc kubenswrapper[4594]: I1129 05:28:40.450362 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:40 crc kubenswrapper[4594]: I1129 05:28:40.450371 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:40 crc kubenswrapper[4594]: I1129 05:28:40.450387 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:40 crc kubenswrapper[4594]: I1129 05:28:40.450396 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:40Z","lastTransitionTime":"2025-11-29T05:28:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:40 crc kubenswrapper[4594]: I1129 05:28:40.551973 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:40 crc kubenswrapper[4594]: I1129 05:28:40.552007 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:40 crc kubenswrapper[4594]: I1129 05:28:40.552017 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:40 crc kubenswrapper[4594]: I1129 05:28:40.552027 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:40 crc kubenswrapper[4594]: I1129 05:28:40.552037 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:40Z","lastTransitionTime":"2025-11-29T05:28:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:40 crc kubenswrapper[4594]: I1129 05:28:40.654456 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:40 crc kubenswrapper[4594]: I1129 05:28:40.654545 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:40 crc kubenswrapper[4594]: I1129 05:28:40.654564 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:40 crc kubenswrapper[4594]: I1129 05:28:40.654590 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:40 crc kubenswrapper[4594]: I1129 05:28:40.654604 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:40Z","lastTransitionTime":"2025-11-29T05:28:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:40 crc kubenswrapper[4594]: I1129 05:28:40.756745 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:40 crc kubenswrapper[4594]: I1129 05:28:40.756782 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:40 crc kubenswrapper[4594]: I1129 05:28:40.756792 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:40 crc kubenswrapper[4594]: I1129 05:28:40.756805 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:40 crc kubenswrapper[4594]: I1129 05:28:40.756816 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:40Z","lastTransitionTime":"2025-11-29T05:28:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:40 crc kubenswrapper[4594]: I1129 05:28:40.859496 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:40 crc kubenswrapper[4594]: I1129 05:28:40.859563 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:40 crc kubenswrapper[4594]: I1129 05:28:40.859576 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:40 crc kubenswrapper[4594]: I1129 05:28:40.859602 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:40 crc kubenswrapper[4594]: I1129 05:28:40.859620 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:40Z","lastTransitionTime":"2025-11-29T05:28:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:40 crc kubenswrapper[4594]: I1129 05:28:40.961556 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:40 crc kubenswrapper[4594]: I1129 05:28:40.961590 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:40 crc kubenswrapper[4594]: I1129 05:28:40.961603 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:40 crc kubenswrapper[4594]: I1129 05:28:40.961617 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:40 crc kubenswrapper[4594]: I1129 05:28:40.961627 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:40Z","lastTransitionTime":"2025-11-29T05:28:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:41 crc kubenswrapper[4594]: I1129 05:28:41.064854 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:41 crc kubenswrapper[4594]: I1129 05:28:41.064907 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:41 crc kubenswrapper[4594]: I1129 05:28:41.064921 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:41 crc kubenswrapper[4594]: I1129 05:28:41.064939 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:41 crc kubenswrapper[4594]: I1129 05:28:41.064951 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:41Z","lastTransitionTime":"2025-11-29T05:28:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:41 crc kubenswrapper[4594]: I1129 05:28:41.083338 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lzr56" Nov 29 05:28:41 crc kubenswrapper[4594]: E1129 05:28:41.083471 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lzr56" podUID="217088b9-a48b-40c7-8d83-f9ff0eb24908" Nov 29 05:28:41 crc kubenswrapper[4594]: I1129 05:28:41.084343 4594 scope.go:117] "RemoveContainer" containerID="4be9e9d41c9c964659c66458605455733afcc76dfcbf74fe74f5dff0e99ff171" Nov 29 05:28:41 crc kubenswrapper[4594]: I1129 05:28:41.168144 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:41 crc kubenswrapper[4594]: I1129 05:28:41.168369 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:41 crc kubenswrapper[4594]: I1129 05:28:41.168380 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:41 crc kubenswrapper[4594]: I1129 05:28:41.168396 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:41 crc kubenswrapper[4594]: I1129 05:28:41.168407 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:41Z","lastTransitionTime":"2025-11-29T05:28:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:41 crc kubenswrapper[4594]: I1129 05:28:41.270107 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:41 crc kubenswrapper[4594]: I1129 05:28:41.270169 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:41 crc kubenswrapper[4594]: I1129 05:28:41.270188 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:41 crc kubenswrapper[4594]: I1129 05:28:41.270210 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:41 crc kubenswrapper[4594]: I1129 05:28:41.270226 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:41Z","lastTransitionTime":"2025-11-29T05:28:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:41 crc kubenswrapper[4594]: I1129 05:28:41.336915 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lp4zm_08de5891-72ca-488c-80b3-6b54c8c1a66e/ovnkube-controller/1.log" Nov 29 05:28:41 crc kubenswrapper[4594]: I1129 05:28:41.340155 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" event={"ID":"08de5891-72ca-488c-80b3-6b54c8c1a66e","Type":"ContainerStarted","Data":"54ed3d74941dcb3f33ee5231512aac9b31eaf9c55171df9201d2bd6da265954a"} Nov 29 05:28:41 crc kubenswrapper[4594]: I1129 05:28:41.340680 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" Nov 29 05:28:41 crc kubenswrapper[4594]: I1129 05:28:41.351617 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-87h4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"080d2d67-a474-4974-94f3-81c5007e0a57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b58d5af901448b3e10a9a902daa31aecebe
095bdae0e620c991323b99c52b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdnbj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-87h4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:41Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:41 crc kubenswrapper[4594]: I1129 05:28:41.366225 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1629901e-7133-4eb0-b634-5e458da9f205\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://583ca6d59501822022d5d15a450d4403592d248bce7be4b22b1c7baef4e93000\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b8d3be68869fef8b377ce1d84e3edf447496a867c40e88d013ef0d1c5629de2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6655c8a5c56e4d2b3e8fbd88308d04ded2b41f7243bf90cb2dbbd6ad93f3153f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a87a0320d47b3753baeb865c7ef9e024c65a96555fdd71c0f2a23c8c0b80d351\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ba49c3a694bc245ce18b54b74156e6e7aaccd4fcf707264a278f86f52943392\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T05:28:13Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1129 05:28:13.316789 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1129 05:28:13.316951 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1129 05:28:13.320123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1464513827/tls.crt::/tmp/serving-cert-1464513827/tls.key\\\\\\\"\\\\nI1129 05:28:13.540439 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1129 05:28:13.545990 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1129 05:28:13.546014 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1129 05:28:13.546050 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1129 05:28:13.546055 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1129 05:28:13.551788 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1129 05:28:13.551806 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 05:28:13.551809 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 05:28:13.551814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1129 05:28:13.551817 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1129 05:28:13.551820 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1129 05:28:13.551823 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1129 05:28:13.551915 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1129 05:28:13.553356 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e070e0ea87ae0cfa8fcc6aadd13e7ae3c2ab3eb1ea3daf6178c9d35796ba3ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://306c2ee6f8204da653775c897fd81f88c
1df5e1d92da18a301843a3ab150a699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:27:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:41Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:41 crc kubenswrapper[4594]: I1129 05:28:41.371980 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:41 crc kubenswrapper[4594]: I1129 05:28:41.372018 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:41 crc kubenswrapper[4594]: I1129 05:28:41.372030 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:41 crc kubenswrapper[4594]: I1129 05:28:41.372047 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:41 crc kubenswrapper[4594]: I1129 05:28:41.372062 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:41Z","lastTransitionTime":"2025-11-29T05:28:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:41 crc kubenswrapper[4594]: I1129 05:28:41.378087 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c84b578286ef0da9c863bedb4ebec224e34007c7f59cead26dac786d7e4b5e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1975ee2ce0af02676f9925944a31b218cb17cdf4dfb85c6edbfd124ac58fd22\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:41Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:41 crc kubenswrapper[4594]: I1129 05:28:41.387588 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vn6fz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"240291e8-d7ec-4fc7-919f-e082a6890e2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1142d4b616972d87b8ac7b83cacdc37c3fd02df7cb4adfcabbaaaa18c5d2b8d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x25zr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af579fe3e5cce8fdb2dff72f206f5504582b8
eee906db5180191ef91b990c0a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x25zr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vn6fz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:41Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:41 crc kubenswrapper[4594]: I1129 05:28:41.393995 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-lzr56" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"217088b9-a48b-40c7-8d83-f9ff0eb24908\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8gwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8gwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:31Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lzr56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:41Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:41 crc 
kubenswrapper[4594]: I1129 05:28:41.401982 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://867c23355f828dc4cc15d632f90c52174e9b9be07f3ec24f05c904709620de6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:41Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:41 crc kubenswrapper[4594]: I1129 05:28:41.416095 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:41Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:41 crc kubenswrapper[4594]: I1129 05:28:41.428986 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7plzz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5052790-d231-4f97-802c-c7de3cd72561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c23b1472cd349ae6a56137165463429204ffbaf2c1b477d3ba9d07d2f2799f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qglsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7plzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:41Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:41 crc kubenswrapper[4594]: I1129 05:28:41.441980 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:41Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:41 crc kubenswrapper[4594]: I1129 05:28:41.457929 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08de5891-72ca-488c-80b3-6b54c8c1a66e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9f96824f977ec455622772e91ed8e6534dc93a71868e2b5e2aa9302ff9be135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cf2a15e6478f9f461a9927f570db4a81a8d752c8be46c7970b4956ae2031a38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a67e6b494618e268d6f5d75507c1cfe0e8e99c0e3cfff7548756a64f4f3a3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77543b04b619c48bf800a3d9dda1a969e4e32aa113a49287717a9e83f9cff96f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78fd1c6bdf08a557f0c2223c538a946709b9a225032c92ea75277140441a59c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db25dbee0954a026ecc2945dda40eb48c9caf78d31c44013055e9765024794d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54ed3d74941dcb3f33ee5231512aac9b31eaf9c55171df9201d2bd6da265954a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4be9e9d41c9c964659c66458605455733afcc76dfcbf74fe74f5dff0e99ff171\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T05:28:25Z\\\",\\\"message\\\":\\\"10 obj_retry.go:365] Adding new object: *v1.Pod 
openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1129 05:28:25.959460 6110 services_controller.go:356] Processing sync for service openshift-service-ca-operator/metrics for network=default\\\\nI1129 05:28:25.959462 6110 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI1129 05:28:25.959481 6110 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI1129 05:28:25.959494 6110 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nF1129 05:28:25.959445 6110 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: 
Po\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cc8b34a52032d4716652fc9edcc35dd797309bf6886242671c068e65399d004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b86e524b600a831fe6b8bb71a513172f974b457d39363da90cd6152139eae898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b86e524b600a831fe6b8bb71a513172f974b457d39363da90cd6152139eae898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lp4zm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:41Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:41 crc kubenswrapper[4594]: I1129 05:28:41.469582 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zwqcq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad2c26bf-c1c8-44b5-b4ed-d487072d358b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9db8528eef5425d9af2b8f11f244556f195b844e239a2137eef24c40fa158d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9e8c27b9b7a4a13a1f6ae5461acd535cc6784f4158213697307e4061f1e222\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b9e8c27b9b7a4a13a1f6ae5461acd535cc6784f4158213697307e4061f1e222\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa32ebafd1f717a0d3ba44387b7b36827e9e12589812bdc33b1dfb84bd41772\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fa32ebafd1f717a0d3ba44387b7b36827e9e12589812bdc33b1dfb84bd41772\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec1085c6e05fe233979922ace4946592ed4a7afe00a0f1c3a256c41a0964d26a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1085c6e05fe233979922ace4946592ed4a7afe00a0f1c3a256c41a0964d26a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://819e8
c7a8ea3f0f7a082da904dd9cc01b5d7e7e6520cafe67920a190063947ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://819e8c7a8ea3f0f7a082da904dd9cc01b5d7e7e6520cafe67920a190063947ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1602d1ea5e426042c520a4f98ed23530fc8f94f67f242a639c32a29817495d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1602d1ea5e426042c520a4f98ed23530fc8f94f67f242a639c32a29817495d73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:22Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81866e9b3e8415fc1e5af0eec4a7222deab6b282b1758dae13759a073192a554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81866e9b3e8415fc1e5af0eec4a7222deab6b282b1758dae13759a073192a554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zwqcq\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:41Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:41 crc kubenswrapper[4594]: I1129 05:28:41.474103 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:41 crc kubenswrapper[4594]: I1129 05:28:41.474134 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:41 crc kubenswrapper[4594]: I1129 05:28:41.474143 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:41 crc kubenswrapper[4594]: I1129 05:28:41.474160 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:41 crc kubenswrapper[4594]: I1129 05:28:41.474169 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:41Z","lastTransitionTime":"2025-11-29T05:28:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:41 crc kubenswrapper[4594]: I1129 05:28:41.478425 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:41Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:41 crc kubenswrapper[4594]: I1129 05:28:41.487822 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-26zjk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"77af4f9f-196d-436d-9fb6-69168bcb5f8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70467de0fada47ce38faab4cc04c54942aa5f7c0cacf29b1d21b64a11e6162c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxrks\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-26zjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:41Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:41 crc kubenswrapper[4594]: I1129 05:28:41.498588 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3be916e6-7f0f-45ae-8fe5-85d2c7e459f0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf23cd4a300a49fabdf8fcaacfda760563c8e53cf70de7a4c07be717fd76b660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a960cfe07ae731cc2d07957dba63be2897328d662dd2672bd61344c8baac0228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b416f8ad36a09f58e68b1e23784b344534ccfb013a3272c0754602431a3a3737\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://261ab2af67dd877cead51f204629b59233865717d4805c3d4d80c1a5d51e2021\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:27:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:41Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:41 crc kubenswrapper[4594]: I1129 05:28:41.508982 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba18db1b4391204c0fe395488d2726d914dc28c88b9cf449d83c9c181a7d04a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:41Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:41 crc kubenswrapper[4594]: I1129 05:28:41.518003 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"509844d4-3ee1-4059-84bb-6e90200f50c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5e396522be1cb7984f98fd416f9a6aeda1e6009e002e9c942a6cec0d4a4c22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4qlc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb25cb1d2d2a3aa84bc6fca3087cdf54b3b8e75bc258441b2895f37594df44e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4qlc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ggz4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:41Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:41 crc kubenswrapper[4594]: I1129 05:28:41.576786 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:41 crc 
kubenswrapper[4594]: I1129 05:28:41.576829 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:41 crc kubenswrapper[4594]: I1129 05:28:41.576850 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:41 crc kubenswrapper[4594]: I1129 05:28:41.576885 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:41 crc kubenswrapper[4594]: I1129 05:28:41.576899 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:41Z","lastTransitionTime":"2025-11-29T05:28:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:41 crc kubenswrapper[4594]: I1129 05:28:41.679585 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:41 crc kubenswrapper[4594]: I1129 05:28:41.679631 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:41 crc kubenswrapper[4594]: I1129 05:28:41.679640 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:41 crc kubenswrapper[4594]: I1129 05:28:41.679658 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:41 crc kubenswrapper[4594]: I1129 05:28:41.679668 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:41Z","lastTransitionTime":"2025-11-29T05:28:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:41 crc kubenswrapper[4594]: I1129 05:28:41.781793 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:41 crc kubenswrapper[4594]: I1129 05:28:41.781831 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:41 crc kubenswrapper[4594]: I1129 05:28:41.781840 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:41 crc kubenswrapper[4594]: I1129 05:28:41.781856 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:41 crc kubenswrapper[4594]: I1129 05:28:41.781864 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:41Z","lastTransitionTime":"2025-11-29T05:28:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:41 crc kubenswrapper[4594]: I1129 05:28:41.883845 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:41 crc kubenswrapper[4594]: I1129 05:28:41.883892 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:41 crc kubenswrapper[4594]: I1129 05:28:41.883903 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:41 crc kubenswrapper[4594]: I1129 05:28:41.883916 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:41 crc kubenswrapper[4594]: I1129 05:28:41.883926 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:41Z","lastTransitionTime":"2025-11-29T05:28:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:41 crc kubenswrapper[4594]: I1129 05:28:41.986224 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:41 crc kubenswrapper[4594]: I1129 05:28:41.986300 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:41 crc kubenswrapper[4594]: I1129 05:28:41.986312 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:41 crc kubenswrapper[4594]: I1129 05:28:41.986340 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:41 crc kubenswrapper[4594]: I1129 05:28:41.986354 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:41Z","lastTransitionTime":"2025-11-29T05:28:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:42 crc kubenswrapper[4594]: I1129 05:28:42.083605 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 05:28:42 crc kubenswrapper[4594]: I1129 05:28:42.083635 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 05:28:42 crc kubenswrapper[4594]: I1129 05:28:42.083734 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 05:28:42 crc kubenswrapper[4594]: E1129 05:28:42.083848 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 05:28:42 crc kubenswrapper[4594]: E1129 05:28:42.084132 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 05:28:42 crc kubenswrapper[4594]: E1129 05:28:42.084027 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 05:28:42 crc kubenswrapper[4594]: I1129 05:28:42.091541 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:42 crc kubenswrapper[4594]: I1129 05:28:42.091576 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:42 crc kubenswrapper[4594]: I1129 05:28:42.091595 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:42 crc kubenswrapper[4594]: I1129 05:28:42.091630 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:42 crc kubenswrapper[4594]: I1129 05:28:42.091644 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:42Z","lastTransitionTime":"2025-11-29T05:28:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:42 crc kubenswrapper[4594]: I1129 05:28:42.194791 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:42 crc kubenswrapper[4594]: I1129 05:28:42.194843 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:42 crc kubenswrapper[4594]: I1129 05:28:42.194853 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:42 crc kubenswrapper[4594]: I1129 05:28:42.194866 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:42 crc kubenswrapper[4594]: I1129 05:28:42.194884 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:42Z","lastTransitionTime":"2025-11-29T05:28:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:42 crc kubenswrapper[4594]: I1129 05:28:42.297559 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:42 crc kubenswrapper[4594]: I1129 05:28:42.298103 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:42 crc kubenswrapper[4594]: I1129 05:28:42.298181 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:42 crc kubenswrapper[4594]: I1129 05:28:42.298268 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:42 crc kubenswrapper[4594]: I1129 05:28:42.298342 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:42Z","lastTransitionTime":"2025-11-29T05:28:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:42 crc kubenswrapper[4594]: I1129 05:28:42.344398 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lp4zm_08de5891-72ca-488c-80b3-6b54c8c1a66e/ovnkube-controller/2.log" Nov 29 05:28:42 crc kubenswrapper[4594]: I1129 05:28:42.345017 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lp4zm_08de5891-72ca-488c-80b3-6b54c8c1a66e/ovnkube-controller/1.log" Nov 29 05:28:42 crc kubenswrapper[4594]: I1129 05:28:42.347404 4594 generic.go:334] "Generic (PLEG): container finished" podID="08de5891-72ca-488c-80b3-6b54c8c1a66e" containerID="54ed3d74941dcb3f33ee5231512aac9b31eaf9c55171df9201d2bd6da265954a" exitCode=1 Nov 29 05:28:42 crc kubenswrapper[4594]: I1129 05:28:42.347464 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" event={"ID":"08de5891-72ca-488c-80b3-6b54c8c1a66e","Type":"ContainerDied","Data":"54ed3d74941dcb3f33ee5231512aac9b31eaf9c55171df9201d2bd6da265954a"} Nov 29 05:28:42 crc kubenswrapper[4594]: I1129 05:28:42.347528 4594 scope.go:117] "RemoveContainer" containerID="4be9e9d41c9c964659c66458605455733afcc76dfcbf74fe74f5dff0e99ff171" Nov 29 05:28:42 crc kubenswrapper[4594]: I1129 05:28:42.348212 4594 scope.go:117] "RemoveContainer" containerID="54ed3d74941dcb3f33ee5231512aac9b31eaf9c55171df9201d2bd6da265954a" Nov 29 05:28:42 crc kubenswrapper[4594]: E1129 05:28:42.348447 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-lp4zm_openshift-ovn-kubernetes(08de5891-72ca-488c-80b3-6b54c8c1a66e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" podUID="08de5891-72ca-488c-80b3-6b54c8c1a66e" Nov 29 05:28:42 crc kubenswrapper[4594]: I1129 05:28:42.363445 4594 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba18db1b4391204c0fe395488d2726d914dc28c88b9cf449d83c9c181a7d04a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:42Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:42 crc kubenswrapper[4594]: I1129 05:28:42.373893 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"509844d4-3ee1-4059-84bb-6e90200f50c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5e396522be1cb7984f98fd416f9a6aeda1e6009e002e9c942a6cec0d4a4c22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\
\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4qlc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb25cb1d2d2a3aa84bc6fca3087cdf54b3b8e75bc258441b2895f37594df44e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4qlc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ggz4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:42Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:42 crc kubenswrapper[4594]: I1129 05:28:42.382167 4594 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-26zjk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77af4f9f-196d-436d-9fb6-69168bcb5f8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70467de0fada47ce38faab4cc04c54942aa5f7c0cacf29b1d21b64a11e6162c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxrks\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"
hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-26zjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:42Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:42 crc kubenswrapper[4594]: I1129 05:28:42.393550 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3be916e6-7f0f-45ae-8fe5-85d2c7e459f0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf23cd4a300a49fabdf8fcaacfda760563c8e53cf70de7a4c07be717fd76b660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a960cfe07ae731cc2d07957dba63be2897328d662dd2672bd61344c8baac0228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b416f8ad36a09f58e68b1e23784b344534ccfb013a3272c0754602431a3a3737\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05
:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://261ab2af67dd877cead51f204629b59233865717d4805c3d4d80c1a5d51e2021\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:27:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:42Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:42 crc kubenswrapper[4594]: I1129 05:28:42.401088 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:42 crc kubenswrapper[4594]: I1129 05:28:42.401129 
4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:42 crc kubenswrapper[4594]: I1129 05:28:42.401162 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:42 crc kubenswrapper[4594]: I1129 05:28:42.401185 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:42 crc kubenswrapper[4594]: I1129 05:28:42.401217 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:42Z","lastTransitionTime":"2025-11-29T05:28:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:42 crc kubenswrapper[4594]: I1129 05:28:42.405841 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c84b578286ef0da9c863bedb4ebec224e34007c7f59cead26dac786d7e4b5e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1975ee2ce0af02676f9925944a31b218cb17cdf4dfb85c6edbfd124ac58fd22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:42Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:42 crc kubenswrapper[4594]: I1129 05:28:42.413919 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-87h4n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"080d2d67-a474-4974-94f3-81c5007e0a57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b58d5af901448b3e10a9a902daa31aecebe095bdae0e620c991323b99c52b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdnbj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-87h4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:42Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:42 crc kubenswrapper[4594]: I1129 05:28:42.425048 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1629901e-7133-4eb0-b634-5e458da9f205\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://583ca6d59501822022d5d15a450d4403592d248bce7be4b22b1c7baef4e93000\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b8d3be68869fef8b377ce1d84e3edf447496a867c40e88d013ef0d1c5629de2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6655c8a5c56e4d2b3e8fbd88308d04ded2b41f7243bf90cb2dbbd6ad93f3153f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a87a0320d47b3753baeb865c7ef9e024c65a96555fdd71c0f2a23c8c0b80d351\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ba49c3a694bc245ce18b54b74156e6e7aaccd4fcf707264a278f86f52943392\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T05:28:13Z\\\",\\\"message\\\":\\\"le observer\\\\nW1129 05:28:13.316789 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1129 05:28:13.316951 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1129 05:28:13.320123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1464513827/tls.crt::/tmp/serving-cert-1464513827/tls.key\\\\\\\"\\\\nI1129 05:28:13.540439 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1129 05:28:13.545990 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1129 05:28:13.546014 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1129 05:28:13.546050 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1129 05:28:13.546055 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1129 05:28:13.551788 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1129 05:28:13.551806 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' 
detected.\\\\nW1129 05:28:13.551809 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 05:28:13.551814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1129 05:28:13.551817 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1129 05:28:13.551820 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1129 05:28:13.551823 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1129 05:28:13.551915 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1129 05:28:13.553356 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e070e0ea87ae0cfa8fcc6aadd13e7ae3c2ab3eb1ea3daf6178c9d35796ba3ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\
"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:27:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:42Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:42 crc kubenswrapper[4594]: I1129 05:28:42.435780 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:42Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:42 crc kubenswrapper[4594]: I1129 05:28:42.446423 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7plzz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5052790-d231-4f97-802c-c7de3cd72561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c23b1472cd349ae6a56137165463429204ffbaf2c1b477d3ba9d07d2f2799f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qglsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7plzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:42Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:42 crc kubenswrapper[4594]: I1129 05:28:42.456588 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vn6fz" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"240291e8-d7ec-4fc7-919f-e082a6890e2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1142d4b616972d87b8ac7b83cacdc37c3fd02df7cb4adfcabbaaaa18c5d2b8d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x25zr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af579fe3e
5cce8fdb2dff72f206f5504582b8eee906db5180191ef91b990c0a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x25zr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vn6fz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:42Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:42 crc kubenswrapper[4594]: I1129 05:28:42.465474 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-lzr56" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"217088b9-a48b-40c7-8d83-f9ff0eb24908\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8gwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8gwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:31Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lzr56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:42Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:42 crc 
kubenswrapper[4594]: I1129 05:28:42.474304 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://867c23355f828dc4cc15d632f90c52174e9b9be07f3ec24f05c904709620de6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:42Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:42 crc kubenswrapper[4594]: I1129 05:28:42.483595 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:42Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:42 crc kubenswrapper[4594]: I1129 05:28:42.492937 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:42Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:42 crc kubenswrapper[4594]: I1129 05:28:42.504479 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:42 crc kubenswrapper[4594]: I1129 05:28:42.504523 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:42 crc kubenswrapper[4594]: I1129 05:28:42.504538 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:42 crc 
kubenswrapper[4594]: I1129 05:28:42.504560 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:42 crc kubenswrapper[4594]: I1129 05:28:42.504573 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:42Z","lastTransitionTime":"2025-11-29T05:28:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:42 crc kubenswrapper[4594]: I1129 05:28:42.506655 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08de5891-72ca-488c-80b3-6b54c8c1a66e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9f96824f977ec455622772e91ed8e6534dc93a71868e2b5e2aa9302ff9be135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cf2a15e6478f9f461a9927f570db4a81a8d752c8be46c7970b4956ae2031a38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a67e6b494618e268d6f5d75507c1cfe0e8e99c0e3cfff7548756a64f4f3a3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77543b04b619c48bf800a3d9dda1a969e4e32aa113a49287717a9e83f9cff96f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78fd1c6bdf08a557f0c2223c538a946709b9a225032c92ea75277140441a59c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db25dbee0954a026ecc2945dda40eb48c9caf78d31c44013055e9765024794d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54ed3d74941dcb3f33ee5231512aac9b31eaf9c55171df9201d2bd6da265954a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4be9e9d41c9c964659c66458605455733afcc76dfcbf74fe74f5dff0e99ff171\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T05:28:25Z\\\",\\\"message\\\":\\\"10 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1129 05:28:25.959460 6110 services_controller.go:356] Processing sync for service openshift-service-ca-operator/metrics for network=default\\\\nI1129 05:28:25.959462 6110 ovn.go:134] Ensuring zone local for Pod 
openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI1129 05:28:25.959481 6110 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI1129 05:28:25.959494 6110 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nF1129 05:28:25.959445 6110 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Po\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54ed3d74941dcb3f33ee5231512aac9b31eaf9c55171df9201d2bd6da265954a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T05:28:41Z\\\",\\\"message\\\":\\\"332 obj_retry.go:365] Adding new object: *v1.Pod openshift-dns/node-resolver-87h4n\\\\nI1129 05:28:41.803817 6332 default_network_controller.go:776] Recording success event on pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nF1129 05:28:41.803749 6332 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy 
controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:41Z is after 2025-08-24T17:21:41Z]\\\\nI1129 05:28:41.803846 6332 services_controller.go:356] Processing sync for service openshift-network-diagnostics/network-check-target for network=default\\\\nI1129 05:28:41.803719 6332 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/iptables-alerter\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\
"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cc8b34a52032d4716652fc9edcc35dd797309bf6886242671c068e65399d004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\
",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b86e524b600a831fe6b8bb71a513172f974b457d39363da90cd6152139eae898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b86e524b600a831fe6b8bb71a513172f974b457d39363da90cd6152139eae898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lp4zm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:42Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:42 crc kubenswrapper[4594]: I1129 05:28:42.517569 4594 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-additional-cni-plugins-zwqcq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad2c26bf-c1c8-44b5-b4ed-d487072d358b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9db8528eef5425d9af2b8f11f244556f195b844e239a2137eef24c40fa158d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"contai
nerID\\\":\\\"cri-o://7b9e8c27b9b7a4a13a1f6ae5461acd535cc6784f4158213697307e4061f1e222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b9e8c27b9b7a4a13a1f6ae5461acd535cc6784f4158213697307e4061f1e222\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa32ebafd1f717a0d3ba44387b7b36827e9e12589812bdc33b1dfb84bd41772\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fa32ebafd1f717a0d3ba44387b7b36827e9e12589812bdc33b1dfb84bd41772\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:19Z\\\
",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec1085c6e05fe233979922ace4946592ed4a7afe00a0f1c3a256c41a0964d26a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1085c6e05fe233979922ace4946592ed4a7afe00a0f1c3a256c41a0964d26a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://819e8c7a8ea3f0f7a082da904dd9cc01b5d7e7e6520cafe67920a190063947ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://819e8c7a8ea3f0f7a082da904dd9cc01b5d7e7e6520cafe67920a190063947ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1602d1ea5e426042c520a4f98ed23530fc8f94f67f242a639c32a29817495d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1602d1ea5e426042c520a4f98ed23530fc8f94f67f242a639c32a29817495d73\\\",\
\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81866e9b3e8415fc1e5af0eec4a7222deab6b282b1758dae13759a073192a554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81866e9b3e8415fc1e5af0eec4a7222deab6b282b1758dae13759a073192a554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-zwqcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:42Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:42 crc kubenswrapper[4594]: I1129 05:28:42.607203 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:42 crc kubenswrapper[4594]: I1129 05:28:42.607250 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:42 crc kubenswrapper[4594]: I1129 05:28:42.607276 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:42 crc kubenswrapper[4594]: I1129 05:28:42.607293 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:42 crc kubenswrapper[4594]: I1129 05:28:42.607307 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:42Z","lastTransitionTime":"2025-11-29T05:28:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:42 crc kubenswrapper[4594]: I1129 05:28:42.709668 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:42 crc kubenswrapper[4594]: I1129 05:28:42.709833 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:42 crc kubenswrapper[4594]: I1129 05:28:42.709901 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:42 crc kubenswrapper[4594]: I1129 05:28:42.709978 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:42 crc kubenswrapper[4594]: I1129 05:28:42.710066 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:42Z","lastTransitionTime":"2025-11-29T05:28:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:42 crc kubenswrapper[4594]: I1129 05:28:42.812955 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:42 crc kubenswrapper[4594]: I1129 05:28:42.813153 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:42 crc kubenswrapper[4594]: I1129 05:28:42.813217 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:42 crc kubenswrapper[4594]: I1129 05:28:42.813314 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:42 crc kubenswrapper[4594]: I1129 05:28:42.813375 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:42Z","lastTransitionTime":"2025-11-29T05:28:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:42 crc kubenswrapper[4594]: I1129 05:28:42.915127 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:42 crc kubenswrapper[4594]: I1129 05:28:42.915186 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:42 crc kubenswrapper[4594]: I1129 05:28:42.915197 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:42 crc kubenswrapper[4594]: I1129 05:28:42.915220 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:42 crc kubenswrapper[4594]: I1129 05:28:42.915236 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:42Z","lastTransitionTime":"2025-11-29T05:28:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:43 crc kubenswrapper[4594]: I1129 05:28:43.017956 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:43 crc kubenswrapper[4594]: I1129 05:28:43.018015 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:43 crc kubenswrapper[4594]: I1129 05:28:43.018028 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:43 crc kubenswrapper[4594]: I1129 05:28:43.018055 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:43 crc kubenswrapper[4594]: I1129 05:28:43.018070 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:43Z","lastTransitionTime":"2025-11-29T05:28:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:43 crc kubenswrapper[4594]: I1129 05:28:43.082881 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lzr56" Nov 29 05:28:43 crc kubenswrapper[4594]: E1129 05:28:43.083050 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lzr56" podUID="217088b9-a48b-40c7-8d83-f9ff0eb24908" Nov 29 05:28:43 crc kubenswrapper[4594]: I1129 05:28:43.120065 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:43 crc kubenswrapper[4594]: I1129 05:28:43.120101 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:43 crc kubenswrapper[4594]: I1129 05:28:43.120112 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:43 crc kubenswrapper[4594]: I1129 05:28:43.120131 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:43 crc kubenswrapper[4594]: I1129 05:28:43.120143 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:43Z","lastTransitionTime":"2025-11-29T05:28:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:43 crc kubenswrapper[4594]: I1129 05:28:43.222339 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:43 crc kubenswrapper[4594]: I1129 05:28:43.222394 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:43 crc kubenswrapper[4594]: I1129 05:28:43.222409 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:43 crc kubenswrapper[4594]: I1129 05:28:43.222425 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:43 crc kubenswrapper[4594]: I1129 05:28:43.222435 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:43Z","lastTransitionTime":"2025-11-29T05:28:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:43 crc kubenswrapper[4594]: I1129 05:28:43.324419 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:43 crc kubenswrapper[4594]: I1129 05:28:43.324457 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:43 crc kubenswrapper[4594]: I1129 05:28:43.324468 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:43 crc kubenswrapper[4594]: I1129 05:28:43.324486 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:43 crc kubenswrapper[4594]: I1129 05:28:43.324494 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:43Z","lastTransitionTime":"2025-11-29T05:28:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:43 crc kubenswrapper[4594]: I1129 05:28:43.351688 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lp4zm_08de5891-72ca-488c-80b3-6b54c8c1a66e/ovnkube-controller/2.log" Nov 29 05:28:43 crc kubenswrapper[4594]: I1129 05:28:43.357854 4594 scope.go:117] "RemoveContainer" containerID="54ed3d74941dcb3f33ee5231512aac9b31eaf9c55171df9201d2bd6da265954a" Nov 29 05:28:43 crc kubenswrapper[4594]: E1129 05:28:43.358087 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-lp4zm_openshift-ovn-kubernetes(08de5891-72ca-488c-80b3-6b54c8c1a66e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" podUID="08de5891-72ca-488c-80b3-6b54c8c1a66e" Nov 29 05:28:43 crc kubenswrapper[4594]: I1129 05:28:43.368764 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://867c23355f828dc4cc15d632f90c52174e9b9be07f3ec24f05c904709620de6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-29T05:28:43Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:43 crc kubenswrapper[4594]: I1129 05:28:43.380006 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:43Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:43 crc kubenswrapper[4594]: I1129 05:28:43.389791 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7plzz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5052790-d231-4f97-802c-c7de3cd72561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c23b1472cd349ae6a56137165463429204ffbaf2c1b477d3ba9d07d2f2799f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qglsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7plzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:43Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:43 crc kubenswrapper[4594]: I1129 05:28:43.398814 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vn6fz" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"240291e8-d7ec-4fc7-919f-e082a6890e2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1142d4b616972d87b8ac7b83cacdc37c3fd02df7cb4adfcabbaaaa18c5d2b8d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x25zr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af579fe3e
5cce8fdb2dff72f206f5504582b8eee906db5180191ef91b990c0a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x25zr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vn6fz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:43Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:43 crc kubenswrapper[4594]: I1129 05:28:43.408115 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-lzr56" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"217088b9-a48b-40c7-8d83-f9ff0eb24908\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8gwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8gwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:31Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lzr56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:43Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:43 crc 
kubenswrapper[4594]: I1129 05:28:43.417774 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:43Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:43 crc kubenswrapper[4594]: I1129 05:28:43.426339 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:43Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:43 crc kubenswrapper[4594]: I1129 05:28:43.426963 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:43 crc kubenswrapper[4594]: I1129 05:28:43.427001 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:43 crc kubenswrapper[4594]: I1129 05:28:43.427012 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:43 crc 
kubenswrapper[4594]: I1129 05:28:43.427026 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:43 crc kubenswrapper[4594]: I1129 05:28:43.427038 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:43Z","lastTransitionTime":"2025-11-29T05:28:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:43 crc kubenswrapper[4594]: I1129 05:28:43.440036 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08de5891-72ca-488c-80b3-6b54c8c1a66e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9f96824f977ec455622772e91ed8e6534dc93a71868e2b5e2aa9302ff9be135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cf2a15e6478f9f461a9927f570db4a81a8d752c8be46c7970b4956ae2031a38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a67e6b494618e268d6f5d75507c1cfe0e8e99c0e3cfff7548756a64f4f3a3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77543b04b619c48bf800a3d9dda1a969e4e32aa113a49287717a9e83f9cff96f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78fd1c6bdf08a557f0c2223c538a946709b9a225032c92ea75277140441a59c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db25dbee0954a026ecc2945dda40eb48c9caf78d31c44013055e9765024794d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54ed3d74941dcb3f33ee5231512aac9b31eaf9c55171df9201d2bd6da265954a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54ed3d74941dcb3f33ee5231512aac9b31eaf9c55171df9201d2bd6da265954a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T05:28:41Z\\\",\\\"message\\\":\\\"332 obj_retry.go:365] Adding new object: *v1.Pod openshift-dns/node-resolver-87h4n\\\\nI1129 05:28:41.803817 6332 default_network_controller.go:776] Recording success event on pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nF1129 05:28:41.803749 6332 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: 
failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:41Z is after 2025-08-24T17:21:41Z]\\\\nI1129 05:28:41.803846 6332 services_controller.go:356] Processing sync for service openshift-network-diagnostics/network-check-target for network=default\\\\nI1129 05:28:41.803719 6332 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/iptables-alerter\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-lp4zm_openshift-ovn-kubernetes(08de5891-72ca-488c-80b3-6b54c8c1a66e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cc8b34a52032d4716652fc9edcc35dd797309bf6886242671c068e65399d004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b86e524b600a831fe6b8bb71a513172f974b457d39363da90cd6152139eae898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b86e524b600a831fe6
b8bb71a513172f974b457d39363da90cd6152139eae898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lp4zm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:43Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:43 crc kubenswrapper[4594]: I1129 05:28:43.451089 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zwqcq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad2c26bf-c1c8-44b5-b4ed-d487072d358b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9db8528eef5425d9af2b8f11f244556f195b844e239a2137eef24c40fa158d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9e8c27b9b7a4a13a1f6ae5461acd535cc6784f4158213697307e4061f1e222\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b9e8c27b9b7a4a13a1f6ae5461acd535cc6784f4158213697307e4061f1e222\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa32ebafd1f717a0d3ba44387b7b36827e9e12589812bdc33b1dfb84bd41772\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fa32ebafd1f717a0d3ba44387b7b36827e9e12589812bdc33b1dfb84bd41772\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec1085c6e05fe233979922ace4946592ed4a7afe00a0f1c3a256c41a0964d26a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1085c6e05fe233979922ace4946592ed4a7afe00a0f1c3a256c41a0964d26a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://819e8
c7a8ea3f0f7a082da904dd9cc01b5d7e7e6520cafe67920a190063947ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://819e8c7a8ea3f0f7a082da904dd9cc01b5d7e7e6520cafe67920a190063947ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1602d1ea5e426042c520a4f98ed23530fc8f94f67f242a639c32a29817495d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1602d1ea5e426042c520a4f98ed23530fc8f94f67f242a639c32a29817495d73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:22Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81866e9b3e8415fc1e5af0eec4a7222deab6b282b1758dae13759a073192a554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81866e9b3e8415fc1e5af0eec4a7222deab6b282b1758dae13759a073192a554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zwqcq\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:43Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:43 crc kubenswrapper[4594]: I1129 05:28:43.460354 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3be916e6-7f0f-45ae-8fe5-85d2c7e459f0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf23cd4a300a49fabdf8fcaacfda760563c8e53cf70de7a4c07be717fd76b660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a960cfe07ae731cc2d07957dba63be2897328d662dd2672bd61344c8baac0228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b416f8ad36a09f58e68b1e23784b344534ccfb013a3272c0754602431a3a3737\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://261ab2af67dd877cead51f204629b59233865717d4805c3d4d80c1a5d51e2021\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:27:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:43Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:43 crc kubenswrapper[4594]: I1129 05:28:43.471005 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba18db1b4391204c0fe395488d2726d914dc28c88b9cf449d83c9c181a7d04a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:43Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:43 crc kubenswrapper[4594]: I1129 05:28:43.479348 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"509844d4-3ee1-4059-84bb-6e90200f50c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5e396522be1cb7984f98fd416f9a6aeda1e6009e002e9c942a6cec0d4a4c22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4qlc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb25cb1d2d2a3aa84bc6fca3087cdf54b3b8e75bc258441b2895f37594df44e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4qlc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ggz4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:43Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:43 crc kubenswrapper[4594]: I1129 05:28:43.486987 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-26zjk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"77af4f9f-196d-436d-9fb6-69168bcb5f8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70467de0fada47ce38faab4cc04c54942aa5f7c0cacf29b1d21b64a11e6162c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxrks\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-26zjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:43Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:43 crc kubenswrapper[4594]: I1129 05:28:43.496755 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1629901e-7133-4eb0-b634-5e458da9f205\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://583ca6d59501822022d5d15a450d4403592d248bce7be4b22b1c7baef4e93000\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b8d3be68869fef8b377ce1d84e3edf447496a867c40e88d013ef0d1c5629de2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6655c8a5c56e4d2b3e8fbd88308d04ded2b41f7243bf90cb2dbbd6ad93f3153f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a87a0320d47b3753baeb865c7ef9e024c65a96555fdd71c0f2a23c8c0b80d351\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ba49c3a694bc245ce18b54b74156e6e7aaccd4fcf707264a278f86f52943392\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T05:28:13Z\\\",\\\"message\\\":\\\"le observer\\\\nW1129 05:28:13.316789 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1129 05:28:13.316951 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1129 05:28:13.320123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1464513827/tls.crt::/tmp/serving-cert-1464513827/tls.key\\\\\\\"\\\\nI1129 05:28:13.540439 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1129 05:28:13.545990 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1129 05:28:13.546014 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1129 05:28:13.546050 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1129 05:28:13.546055 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1129 05:28:13.551788 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1129 05:28:13.551806 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 05:28:13.551809 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 05:28:13.551814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1129 05:28:13.551817 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1129 05:28:13.551820 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1129 05:28:13.551823 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1129 05:28:13.551915 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1129 05:28:13.553356 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e070e0ea87ae0cfa8fcc6aadd13e7ae3c2ab3eb1ea3daf6178c9d35796ba3ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T0
5:27:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:27:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:43Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:43 crc kubenswrapper[4594]: I1129 05:28:43.505900 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c84b578286ef0da9c863bedb4ebec224e34007c7f59cead26dac786d7e4b5e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1975ee2ce0af02676f9925944a31b218cb17cdf4dfb85c6edbfd124ac58fd22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:43Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:43 crc kubenswrapper[4594]: I1129 05:28:43.513687 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-87h4n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"080d2d67-a474-4974-94f3-81c5007e0a57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b58d5af901448b3e10a9a902daa31aecebe095bdae0e620c991323b99c52b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdnbj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-87h4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:43Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:43 crc kubenswrapper[4594]: I1129 05:28:43.529294 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:43 crc kubenswrapper[4594]: I1129 05:28:43.529325 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:43 crc kubenswrapper[4594]: I1129 05:28:43.529336 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:43 crc kubenswrapper[4594]: I1129 05:28:43.529351 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:43 crc kubenswrapper[4594]: I1129 05:28:43.529362 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:43Z","lastTransitionTime":"2025-11-29T05:28:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:43 crc kubenswrapper[4594]: I1129 05:28:43.631875 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:43 crc kubenswrapper[4594]: I1129 05:28:43.631902 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:43 crc kubenswrapper[4594]: I1129 05:28:43.631938 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:43 crc kubenswrapper[4594]: I1129 05:28:43.631954 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:43 crc kubenswrapper[4594]: I1129 05:28:43.631963 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:43Z","lastTransitionTime":"2025-11-29T05:28:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:43 crc kubenswrapper[4594]: I1129 05:28:43.733504 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:43 crc kubenswrapper[4594]: I1129 05:28:43.733544 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:43 crc kubenswrapper[4594]: I1129 05:28:43.733553 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:43 crc kubenswrapper[4594]: I1129 05:28:43.733570 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:43 crc kubenswrapper[4594]: I1129 05:28:43.733583 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:43Z","lastTransitionTime":"2025-11-29T05:28:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:43 crc kubenswrapper[4594]: I1129 05:28:43.834973 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:43 crc kubenswrapper[4594]: I1129 05:28:43.835013 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:43 crc kubenswrapper[4594]: I1129 05:28:43.835024 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:43 crc kubenswrapper[4594]: I1129 05:28:43.835040 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:43 crc kubenswrapper[4594]: I1129 05:28:43.835056 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:43Z","lastTransitionTime":"2025-11-29T05:28:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:43 crc kubenswrapper[4594]: I1129 05:28:43.937204 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:43 crc kubenswrapper[4594]: I1129 05:28:43.937303 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:43 crc kubenswrapper[4594]: I1129 05:28:43.937323 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:43 crc kubenswrapper[4594]: I1129 05:28:43.937355 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:43 crc kubenswrapper[4594]: I1129 05:28:43.937376 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:43Z","lastTransitionTime":"2025-11-29T05:28:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:44 crc kubenswrapper[4594]: I1129 05:28:44.039365 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:44 crc kubenswrapper[4594]: I1129 05:28:44.039416 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:44 crc kubenswrapper[4594]: I1129 05:28:44.039432 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:44 crc kubenswrapper[4594]: I1129 05:28:44.039462 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:44 crc kubenswrapper[4594]: I1129 05:28:44.039476 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:44Z","lastTransitionTime":"2025-11-29T05:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:44 crc kubenswrapper[4594]: I1129 05:28:44.040514 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 29 05:28:44 crc kubenswrapper[4594]: I1129 05:28:44.049803 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Nov 29 05:28:44 crc kubenswrapper[4594]: I1129 05:28:44.051461 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:44Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:44 crc kubenswrapper[4594]: I1129 05:28:44.061918 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:44Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:44 crc kubenswrapper[4594]: I1129 05:28:44.076125 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"08de5891-72ca-488c-80b3-6b54c8c1a66e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9f96824f977ec455622772e91ed8e6534dc93a71868e2b5e2aa9302ff9be135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cf2a15e6478f9f461a9927f570db4a81a8d752c8be46c7970b4956ae2031a38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a67e6b494618e268d6f5d75507c1cfe0e8e99c0e3cfff7548756a64f4f3a3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77543b04b619c48bf800a3d9dda1a969e4e32aa113a49287717a9e83f9cff96f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78fd1c6bdf08a557f0c2223c538a946709b9a225032c92ea75277140441a59c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db25dbee0954a026ecc2945dda40eb48c9caf78d31c44013055e9765024794d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54ed3d74941dcb3f33ee5231512aac9b31eaf9c55171df9201d2bd6da265954a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54ed3d74941dcb3f33ee5231512aac9b31eaf9c55171df9201d2bd6da265954a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T05:28:41Z\\\",\\\"message\\\":\\\"332 obj_retry.go:365] Adding new object: *v1.Pod openshift-dns/node-resolver-87h4n\\\\nI1129 05:28:41.803817 6332 default_network_controller.go:776] Recording success event on pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nF1129 05:28:41.803749 6332 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: 
failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:41Z is after 2025-08-24T17:21:41Z]\\\\nI1129 05:28:41.803846 6332 services_controller.go:356] Processing sync for service openshift-network-diagnostics/network-check-target for network=default\\\\nI1129 05:28:41.803719 6332 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/iptables-alerter\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-lp4zm_openshift-ovn-kubernetes(08de5891-72ca-488c-80b3-6b54c8c1a66e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cc8b34a52032d4716652fc9edcc35dd797309bf6886242671c068e65399d004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b86e524b600a831fe6b8bb71a513172f974b457d39363da90cd6152139eae898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b86e524b600a831fe6
b8bb71a513172f974b457d39363da90cd6152139eae898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lp4zm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:44Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:44 crc kubenswrapper[4594]: I1129 05:28:44.083250 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 05:28:44 crc kubenswrapper[4594]: E1129 05:28:44.083358 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 05:28:44 crc kubenswrapper[4594]: I1129 05:28:44.083429 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 05:28:44 crc kubenswrapper[4594]: I1129 05:28:44.083463 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 05:28:44 crc kubenswrapper[4594]: E1129 05:28:44.083560 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 05:28:44 crc kubenswrapper[4594]: E1129 05:28:44.083664 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 05:28:44 crc kubenswrapper[4594]: I1129 05:28:44.089577 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zwqcq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad2c26bf-c1c8-44b5-b4ed-d487072d358b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9db8528eef5425d9af2b8f11f244556f195b844e239a2137eef24c40fa158d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9e8c27b9b7a4a13a1f6ae5461acd535cc6784f4158213697307e4061f1e222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b9e8c27b9b7a4a13a1f6ae5461acd535cc6784f4158213697307e4061f1e222\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa32ebafd1f717a0d3ba44387b7b36827e9e12589812bdc33b1dfb84bd41772\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,
\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fa32ebafd1f717a0d3ba44387b7b36827e9e12589812bdc33b1dfb84bd41772\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec1085c6e05fe233979922ace4946592ed4a7afe00a0f1c3a256c41a0964d26a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1085c6e05fe233979922ace4946592ed4a7afe00a0f1c3a256c41a0964d26a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os
-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://819e8c7a8ea3f0f7a082da904dd9cc01b5d7e7e6520cafe67920a190063947ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://819e8c7a8ea3f0f7a082da904dd9cc01b5d7e7e6520cafe67920a190063947ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1602d1ea5e426042c520a4f98ed23530fc8f94f67f242a639c32a29817495d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},
\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1602d1ea5e426042c520a4f98ed23530fc8f94f67f242a639c32a29817495d73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81866e9b3e8415fc1e5af0eec4a7222deab6b282b1758dae13759a073192a554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81866e9b3e8415fc1e5af0eec4a7222deab6b282b1758dae13759a073192a554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zwqcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:44Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:44 crc kubenswrapper[4594]: I1129 05:28:44.102599 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3be916e6-7f0f-45ae-8fe5-85d2c7e459f0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf23cd4a300a49fabdf8fcaacfda760563c8e53cf70de7a4c07be717fd76b660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a960cfe07ae731cc2d07957dba63be2897328d662dd2672bd61344c8baac0228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b416f8ad36a09f58e68b1e23784b344534ccfb013a3272c0754602431a3a3737\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"v
olumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://261ab2af67dd877cead51f204629b59233865717d4805c3d4d80c1a5d51e2021\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:27:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:44Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:44 crc kubenswrapper[4594]: I1129 05:28:44.112382 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba18db1b4391204c0fe395488d2726d914dc28c88b9cf449d83c9c181a7d04a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:44Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:44 crc kubenswrapper[4594]: I1129 05:28:44.139451 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"509844d4-3ee1-4059-84bb-6e90200f50c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5e396522be1cb7984f98fd416f9a6aeda1e6009e002e9c942a6cec0d4a4c22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4qlc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb25cb1d2d2a3aa84bc6fca3087cdf54b3b8e75bc258441b2895f37594df44e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4qlc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ggz4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:44Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:44 crc kubenswrapper[4594]: I1129 05:28:44.141765 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:44 crc 
kubenswrapper[4594]: I1129 05:28:44.141813 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:44 crc kubenswrapper[4594]: I1129 05:28:44.141823 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:44 crc kubenswrapper[4594]: I1129 05:28:44.141840 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:44 crc kubenswrapper[4594]: I1129 05:28:44.141850 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:44Z","lastTransitionTime":"2025-11-29T05:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:44 crc kubenswrapper[4594]: I1129 05:28:44.152438 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-26zjk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"77af4f9f-196d-436d-9fb6-69168bcb5f8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70467de0fada47ce38faab4cc04c54942aa5f7c0cacf29b1d21b64a11e6162c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxrks\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-26zjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:44Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:44 crc kubenswrapper[4594]: I1129 05:28:44.172020 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1629901e-7133-4eb0-b634-5e458da9f205\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://583ca6d59501822022d5d15a450d4403592d248bce7be4b22b1c7baef4e93000\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b8d3be68869fef8b377ce1d84e3edf447496a867c40e88d013ef0d1c5629de2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6655c8a5c56e4d2b3e8fbd88308d04ded2b41f7243bf90cb2dbbd6ad93f3153f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a87a0320d47b3753baeb865c7ef9e024c65a96555fdd71c0f2a23c8c0b80d351\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ba49c3a694bc245ce18b54b74156e6e7aaccd4fcf707264a278f86f52943392\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T05:28:13Z\\\",\\\"message\\\":\\\"le observer\\\\nW1129 05:28:13.316789 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1129 05:28:13.316951 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1129 05:28:13.320123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1464513827/tls.crt::/tmp/serving-cert-1464513827/tls.key\\\\\\\"\\\\nI1129 05:28:13.540439 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1129 05:28:13.545990 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1129 05:28:13.546014 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1129 05:28:13.546050 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1129 05:28:13.546055 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1129 05:28:13.551788 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1129 05:28:13.551806 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 05:28:13.551809 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 05:28:13.551814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1129 05:28:13.551817 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1129 05:28:13.551820 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1129 05:28:13.551823 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1129 05:28:13.551915 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1129 05:28:13.553356 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e070e0ea87ae0cfa8fcc6aadd13e7ae3c2ab3eb1ea3daf6178c9d35796ba3ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T0
5:27:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:27:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:44Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:44 crc kubenswrapper[4594]: I1129 05:28:44.183179 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c84b578286ef0da9c863bedb4ebec224e34007c7f59cead26dac786d7e4b5e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1975ee2ce0af02676f9925944a31b218cb17cdf4dfb85c6edbfd124ac58fd22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:44Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:44 crc kubenswrapper[4594]: I1129 05:28:44.191586 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-87h4n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"080d2d67-a474-4974-94f3-81c5007e0a57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b58d5af901448b3e10a9a902daa31aecebe095bdae0e620c991323b99c52b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdnbj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-87h4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:44Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:44 crc kubenswrapper[4594]: I1129 05:28:44.200365 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://867c23355f828dc4cc15d632f90c52174e9b9be07f3ec24f05c904709620de6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-al
erter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:44Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:44 crc kubenswrapper[4594]: I1129 05:28:44.212507 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:44Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:44 crc kubenswrapper[4594]: I1129 05:28:44.223458 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7plzz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5052790-d231-4f97-802c-c7de3cd72561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c23b1472cd349ae6a56137165463429204ffbaf2c1b477d3ba9d07d2f2799f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qglsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7plzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:44Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:44 crc kubenswrapper[4594]: I1129 05:28:44.233053 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vn6fz" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"240291e8-d7ec-4fc7-919f-e082a6890e2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1142d4b616972d87b8ac7b83cacdc37c3fd02df7cb4adfcabbaaaa18c5d2b8d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x25zr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af579fe3e
5cce8fdb2dff72f206f5504582b8eee906db5180191ef91b990c0a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x25zr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vn6fz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:44Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:44 crc kubenswrapper[4594]: I1129 05:28:44.242548 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-lzr56" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"217088b9-a48b-40c7-8d83-f9ff0eb24908\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8gwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8gwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:31Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lzr56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:44Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:44 crc 
kubenswrapper[4594]: I1129 05:28:44.244225 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:44 crc kubenswrapper[4594]: I1129 05:28:44.244269 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:44 crc kubenswrapper[4594]: I1129 05:28:44.244280 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:44 crc kubenswrapper[4594]: I1129 05:28:44.244300 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:44 crc kubenswrapper[4594]: I1129 05:28:44.244313 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:44Z","lastTransitionTime":"2025-11-29T05:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:44 crc kubenswrapper[4594]: I1129 05:28:44.346303 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:44 crc kubenswrapper[4594]: I1129 05:28:44.346585 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:44 crc kubenswrapper[4594]: I1129 05:28:44.346644 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:44 crc kubenswrapper[4594]: I1129 05:28:44.346734 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:44 crc kubenswrapper[4594]: I1129 05:28:44.346809 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:44Z","lastTransitionTime":"2025-11-29T05:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:44 crc kubenswrapper[4594]: I1129 05:28:44.448678 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:44 crc kubenswrapper[4594]: I1129 05:28:44.448723 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:44 crc kubenswrapper[4594]: I1129 05:28:44.448733 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:44 crc kubenswrapper[4594]: I1129 05:28:44.448748 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:44 crc kubenswrapper[4594]: I1129 05:28:44.448760 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:44Z","lastTransitionTime":"2025-11-29T05:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:44 crc kubenswrapper[4594]: I1129 05:28:44.551275 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:44 crc kubenswrapper[4594]: I1129 05:28:44.551320 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:44 crc kubenswrapper[4594]: I1129 05:28:44.551333 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:44 crc kubenswrapper[4594]: I1129 05:28:44.551353 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:44 crc kubenswrapper[4594]: I1129 05:28:44.551366 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:44Z","lastTransitionTime":"2025-11-29T05:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:44 crc kubenswrapper[4594]: I1129 05:28:44.653508 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:44 crc kubenswrapper[4594]: I1129 05:28:44.653547 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:44 crc kubenswrapper[4594]: I1129 05:28:44.653558 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:44 crc kubenswrapper[4594]: I1129 05:28:44.653574 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:44 crc kubenswrapper[4594]: I1129 05:28:44.653588 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:44Z","lastTransitionTime":"2025-11-29T05:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:44 crc kubenswrapper[4594]: I1129 05:28:44.756318 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:44 crc kubenswrapper[4594]: I1129 05:28:44.756357 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:44 crc kubenswrapper[4594]: I1129 05:28:44.756367 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:44 crc kubenswrapper[4594]: I1129 05:28:44.756382 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:44 crc kubenswrapper[4594]: I1129 05:28:44.756393 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:44Z","lastTransitionTime":"2025-11-29T05:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:44 crc kubenswrapper[4594]: I1129 05:28:44.858695 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:44 crc kubenswrapper[4594]: I1129 05:28:44.858731 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:44 crc kubenswrapper[4594]: I1129 05:28:44.858743 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:44 crc kubenswrapper[4594]: I1129 05:28:44.858757 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:44 crc kubenswrapper[4594]: I1129 05:28:44.858770 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:44Z","lastTransitionTime":"2025-11-29T05:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:44 crc kubenswrapper[4594]: I1129 05:28:44.960587 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:44 crc kubenswrapper[4594]: I1129 05:28:44.960619 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:44 crc kubenswrapper[4594]: I1129 05:28:44.960630 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:44 crc kubenswrapper[4594]: I1129 05:28:44.960642 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:44 crc kubenswrapper[4594]: I1129 05:28:44.960651 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:44Z","lastTransitionTime":"2025-11-29T05:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:45 crc kubenswrapper[4594]: I1129 05:28:45.062706 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:45 crc kubenswrapper[4594]: I1129 05:28:45.062733 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:45 crc kubenswrapper[4594]: I1129 05:28:45.062742 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:45 crc kubenswrapper[4594]: I1129 05:28:45.062751 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:45 crc kubenswrapper[4594]: I1129 05:28:45.062760 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:45Z","lastTransitionTime":"2025-11-29T05:28:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:45 crc kubenswrapper[4594]: I1129 05:28:45.083355 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lzr56" Nov 29 05:28:45 crc kubenswrapper[4594]: E1129 05:28:45.083455 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lzr56" podUID="217088b9-a48b-40c7-8d83-f9ff0eb24908" Nov 29 05:28:45 crc kubenswrapper[4594]: I1129 05:28:45.164473 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:45 crc kubenswrapper[4594]: I1129 05:28:45.164510 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:45 crc kubenswrapper[4594]: I1129 05:28:45.164521 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:45 crc kubenswrapper[4594]: I1129 05:28:45.164533 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:45 crc kubenswrapper[4594]: I1129 05:28:45.164542 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:45Z","lastTransitionTime":"2025-11-29T05:28:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:45 crc kubenswrapper[4594]: I1129 05:28:45.266925 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:45 crc kubenswrapper[4594]: I1129 05:28:45.266957 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:45 crc kubenswrapper[4594]: I1129 05:28:45.266967 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:45 crc kubenswrapper[4594]: I1129 05:28:45.266982 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:45 crc kubenswrapper[4594]: I1129 05:28:45.266992 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:45Z","lastTransitionTime":"2025-11-29T05:28:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:45 crc kubenswrapper[4594]: I1129 05:28:45.369360 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:45 crc kubenswrapper[4594]: I1129 05:28:45.369409 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:45 crc kubenswrapper[4594]: I1129 05:28:45.369422 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:45 crc kubenswrapper[4594]: I1129 05:28:45.369446 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:45 crc kubenswrapper[4594]: I1129 05:28:45.369459 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:45Z","lastTransitionTime":"2025-11-29T05:28:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:45 crc kubenswrapper[4594]: I1129 05:28:45.471397 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:45 crc kubenswrapper[4594]: I1129 05:28:45.471437 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:45 crc kubenswrapper[4594]: I1129 05:28:45.471449 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:45 crc kubenswrapper[4594]: I1129 05:28:45.471483 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:45 crc kubenswrapper[4594]: I1129 05:28:45.471493 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:45Z","lastTransitionTime":"2025-11-29T05:28:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:45 crc kubenswrapper[4594]: I1129 05:28:45.573419 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:45 crc kubenswrapper[4594]: I1129 05:28:45.573481 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:45 crc kubenswrapper[4594]: I1129 05:28:45.573492 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:45 crc kubenswrapper[4594]: I1129 05:28:45.573517 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:45 crc kubenswrapper[4594]: I1129 05:28:45.573540 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:45Z","lastTransitionTime":"2025-11-29T05:28:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:45 crc kubenswrapper[4594]: I1129 05:28:45.676557 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:45 crc kubenswrapper[4594]: I1129 05:28:45.676610 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:45 crc kubenswrapper[4594]: I1129 05:28:45.676625 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:45 crc kubenswrapper[4594]: I1129 05:28:45.676648 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:45 crc kubenswrapper[4594]: I1129 05:28:45.676661 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:45Z","lastTransitionTime":"2025-11-29T05:28:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:45 crc kubenswrapper[4594]: I1129 05:28:45.778361 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:45 crc kubenswrapper[4594]: I1129 05:28:45.778396 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:45 crc kubenswrapper[4594]: I1129 05:28:45.778405 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:45 crc kubenswrapper[4594]: I1129 05:28:45.778417 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:45 crc kubenswrapper[4594]: I1129 05:28:45.778428 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:45Z","lastTransitionTime":"2025-11-29T05:28:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:45 crc kubenswrapper[4594]: I1129 05:28:45.880158 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:45 crc kubenswrapper[4594]: I1129 05:28:45.880190 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:45 crc kubenswrapper[4594]: I1129 05:28:45.880200 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:45 crc kubenswrapper[4594]: I1129 05:28:45.880234 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:45 crc kubenswrapper[4594]: I1129 05:28:45.880243 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:45Z","lastTransitionTime":"2025-11-29T05:28:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:45 crc kubenswrapper[4594]: I1129 05:28:45.914450 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 05:28:45 crc kubenswrapper[4594]: I1129 05:28:45.914514 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 05:28:45 crc kubenswrapper[4594]: I1129 05:28:45.914540 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 05:28:45 crc kubenswrapper[4594]: E1129 05:28:45.914585 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 05:29:17.914568591 +0000 UTC m=+82.155077811 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 05:28:45 crc kubenswrapper[4594]: E1129 05:28:45.915048 4594 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 29 05:28:45 crc kubenswrapper[4594]: E1129 05:28:45.915224 4594 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 29 05:28:45 crc kubenswrapper[4594]: E1129 05:28:45.915245 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-29 05:29:17.915183123 +0000 UTC m=+82.155692342 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 29 05:28:45 crc kubenswrapper[4594]: E1129 05:28:45.915487 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-11-29 05:29:17.915449031 +0000 UTC m=+82.155958241 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 29 05:28:45 crc kubenswrapper[4594]: I1129 05:28:45.983127 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:45 crc kubenswrapper[4594]: I1129 05:28:45.983175 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:45 crc kubenswrapper[4594]: I1129 05:28:45.983184 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:45 crc kubenswrapper[4594]: I1129 05:28:45.983200 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:45 crc kubenswrapper[4594]: I1129 05:28:45.983212 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:45Z","lastTransitionTime":"2025-11-29T05:28:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:46 crc kubenswrapper[4594]: I1129 05:28:46.015826 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 05:28:46 crc kubenswrapper[4594]: I1129 05:28:46.015935 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 05:28:46 crc kubenswrapper[4594]: E1129 05:28:46.016058 4594 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 29 05:28:46 crc kubenswrapper[4594]: E1129 05:28:46.016104 4594 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 29 05:28:46 crc kubenswrapper[4594]: E1129 05:28:46.016126 4594 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 29 05:28:46 crc kubenswrapper[4594]: E1129 05:28:46.016127 4594 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 29 05:28:46 crc 
kubenswrapper[4594]: E1129 05:28:46.016168 4594 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 29 05:28:46 crc kubenswrapper[4594]: E1129 05:28:46.016181 4594 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 29 05:28:46 crc kubenswrapper[4594]: E1129 05:28:46.016210 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-29 05:29:18.016186897 +0000 UTC m=+82.256696137 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 29 05:28:46 crc kubenswrapper[4594]: E1129 05:28:46.016236 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-29 05:29:18.01622572 +0000 UTC m=+82.256734960 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 29 05:28:46 crc kubenswrapper[4594]: I1129 05:28:46.083468 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 05:28:46 crc kubenswrapper[4594]: E1129 05:28:46.083581 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 05:28:46 crc kubenswrapper[4594]: I1129 05:28:46.083713 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 05:28:46 crc kubenswrapper[4594]: E1129 05:28:46.083815 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 05:28:46 crc kubenswrapper[4594]: I1129 05:28:46.083945 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 05:28:46 crc kubenswrapper[4594]: E1129 05:28:46.084049 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 05:28:46 crc kubenswrapper[4594]: I1129 05:28:46.085134 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:46 crc kubenswrapper[4594]: I1129 05:28:46.085166 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:46 crc kubenswrapper[4594]: I1129 05:28:46.085174 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:46 crc kubenswrapper[4594]: I1129 05:28:46.085187 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:46 crc kubenswrapper[4594]: I1129 05:28:46.085195 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:46Z","lastTransitionTime":"2025-11-29T05:28:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:46 crc kubenswrapper[4594]: I1129 05:28:46.095092 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3be916e6-7f0f-45ae-8fe5-85d2c7e459f0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf23cd4a300a49fabdf8fcaacfda760563c8e53cf70de7a4c07be717fd76b660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a960cfe07ae
731cc2d07957dba63be2897328d662dd2672bd61344c8baac0228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b416f8ad36a09f58e68b1e23784b344534ccfb013a3272c0754602431a3a3737\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://261ab2af67dd877cead51f204629b59233865717d4805c3d4d80c1a5d51e2021\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:27:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:46Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:46 crc kubenswrapper[4594]: I1129 05:28:46.104937 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba18db1b4391204c0fe395488d2726d914dc28c88b9cf449d83c9c181a7d04a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:46Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:46 crc kubenswrapper[4594]: I1129 05:28:46.114976 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"509844d4-3ee1-4059-84bb-6e90200f50c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5e396522be1cb7984f98fd416f9a6aeda1e6009e002e9c942a6cec0d4a4c22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4qlc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb25cb1d2d2a3aa84bc6fca3087cdf54b3b8e75bc258441b2895f37594df44e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4qlc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ggz4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:46Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:46 crc kubenswrapper[4594]: I1129 05:28:46.123300 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-26zjk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"77af4f9f-196d-436d-9fb6-69168bcb5f8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70467de0fada47ce38faab4cc04c54942aa5f7c0cacf29b1d21b64a11e6162c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxrks\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-26zjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:46Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:46 crc kubenswrapper[4594]: I1129 05:28:46.134019 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1629901e-7133-4eb0-b634-5e458da9f205\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://583ca6d59501822022d5d15a450d4403592d248bce7be4b22b1c7baef4e93000\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b8d3be68869fef8b377ce1d84e3edf447496a867c40e88d013ef0d1c5629de2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6655c8a5c56e4d2b3e8fbd88308d04ded2b41f7243bf90cb2dbbd6ad93f3153f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a87a0320d47b3753baeb865c7ef9e024c65a96555fdd71c0f2a23c8c0b80d351\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ba49c3a694bc245ce18b54b74156e6e7aaccd4fcf707264a278f86f52943392\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T05:28:13Z\\\",\\\"message\\\":\\\"le observer\\\\nW1129 05:28:13.316789 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1129 05:28:13.316951 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1129 05:28:13.320123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1464513827/tls.crt::/tmp/serving-cert-1464513827/tls.key\\\\\\\"\\\\nI1129 05:28:13.540439 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1129 05:28:13.545990 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1129 05:28:13.546014 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1129 05:28:13.546050 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1129 05:28:13.546055 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1129 05:28:13.551788 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1129 05:28:13.551806 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 05:28:13.551809 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 05:28:13.551814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1129 05:28:13.551817 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1129 05:28:13.551820 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1129 05:28:13.551823 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1129 05:28:13.551915 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1129 05:28:13.553356 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e070e0ea87ae0cfa8fcc6aadd13e7ae3c2ab3eb1ea3daf6178c9d35796ba3ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T0
5:27:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:27:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:46Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:46 crc kubenswrapper[4594]: I1129 05:28:46.144971 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c84b578286ef0da9c863bedb4ebec224e34007c7f59cead26dac786d7e4b5e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1975ee2ce0af02676f9925944a31b218cb17cdf4dfb85c6edbfd124ac58fd22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:46Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:46 crc kubenswrapper[4594]: I1129 05:28:46.155479 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-87h4n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"080d2d67-a474-4974-94f3-81c5007e0a57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b58d5af901448b3e10a9a902daa31aecebe095bdae0e620c991323b99c52b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdnbj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-87h4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:46Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:46 crc kubenswrapper[4594]: I1129 05:28:46.164544 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-lzr56" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"217088b9-a48b-40c7-8d83-f9ff0eb24908\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8gwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8gwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:31Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lzr56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:46Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:46 crc 
kubenswrapper[4594]: I1129 05:28:46.173936 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://867c23355f828dc4cc15d632f90c52174e9b9be07f3ec24f05c904709620de6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:46Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:46 crc kubenswrapper[4594]: I1129 05:28:46.184457 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:46Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:46 crc kubenswrapper[4594]: I1129 05:28:46.186959 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:46 crc kubenswrapper[4594]: I1129 05:28:46.187050 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:46 crc kubenswrapper[4594]: I1129 05:28:46.187126 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:46 crc kubenswrapper[4594]: I1129 05:28:46.187204 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:46 crc kubenswrapper[4594]: I1129 05:28:46.187302 4594 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:46Z","lastTransitionTime":"2025-11-29T05:28:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:46 crc kubenswrapper[4594]: I1129 05:28:46.194177 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7plzz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5052790-d231-4f97-802c-c7de3cd72561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c23b1472cd349ae6a56137165463429204ffbaf2c1b477d3ba9d07d2f2799f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qglsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:
28:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7plzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:46Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:46 crc kubenswrapper[4594]: I1129 05:28:46.203176 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vn6fz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"240291e8-d7ec-4fc7-919f-e082a6890e2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1142d4b616972d87b8ac7b83cacdc37c3fd02df7cb4adfcabbaaaa18c5d2b8d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-
proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x25zr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af579fe3e5cce8fdb2dff72f206f5504582b8eee906db5180191ef91b990c0a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x25zr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vn6fz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": 
tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:46Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:46 crc kubenswrapper[4594]: I1129 05:28:46.222894 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08de5891-72ca-488c-80b3-6b54c8c1a66e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9f96824f977ec455622772e91ed8e6534dc93a71868e2b5e2aa9302ff9be135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cf2a15e6478f9f461a9927f570db4a81a8d752c8be46c7970b4956ae2031a38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a67e6b494618e268d6f5d75507c1cfe0e8e99c0e3cfff7548756a64f4f3a3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77543b04b619c48bf800a3d9dda1a969e4e32aa113a49287717a9e83f9cff96f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78fd1c6bdf08a557f0c2223c538a946709b9a225032c92ea75277140441a59c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db25dbee0954a026ecc2945dda40eb48c9caf78d31c44013055e9765024794d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54ed3d74941dcb3f33ee5231512aac9b31eaf9c55171df9201d2bd6da265954a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54ed3d74941dcb3f33ee5231512aac9b31eaf9c55171df9201d2bd6da265954a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T05:28:41Z\\\",\\\"message\\\":\\\"332 obj_retry.go:365] Adding new object: *v1.Pod openshift-dns/node-resolver-87h4n\\\\nI1129 05:28:41.803817 6332 default_network_controller.go:776] Recording success event on pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nF1129 05:28:41.803749 6332 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: 
failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:41Z is after 2025-08-24T17:21:41Z]\\\\nI1129 05:28:41.803846 6332 services_controller.go:356] Processing sync for service openshift-network-diagnostics/network-check-target for network=default\\\\nI1129 05:28:41.803719 6332 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/iptables-alerter\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-lp4zm_openshift-ovn-kubernetes(08de5891-72ca-488c-80b3-6b54c8c1a66e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cc8b34a52032d4716652fc9edcc35dd797309bf6886242671c068e65399d004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b86e524b600a831fe6b8bb71a513172f974b457d39363da90cd6152139eae898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b86e524b600a831fe6
b8bb71a513172f974b457d39363da90cd6152139eae898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lp4zm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:46Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:46 crc kubenswrapper[4594]: I1129 05:28:46.234027 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zwqcq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad2c26bf-c1c8-44b5-b4ed-d487072d358b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9db8528eef5425d9af2b8f11f244556f195b844e239a2137eef24c40fa158d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9e8c27b9b7a4a13a1f6ae5461acd535cc6784f4158213697307e4061f1e222\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b9e8c27b9b7a4a13a1f6ae5461acd535cc6784f4158213697307e4061f1e222\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa32ebafd1f717a0d3ba44387b7b36827e9e12589812bdc33b1dfb84bd41772\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fa32ebafd1f717a0d3ba44387b7b36827e9e12589812bdc33b1dfb84bd41772\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec1085c6e05fe233979922ace4946592ed4a7afe00a0f1c3a256c41a0964d26a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1085c6e05fe233979922ace4946592ed4a7afe00a0f1c3a256c41a0964d26a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://819e8
c7a8ea3f0f7a082da904dd9cc01b5d7e7e6520cafe67920a190063947ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://819e8c7a8ea3f0f7a082da904dd9cc01b5d7e7e6520cafe67920a190063947ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1602d1ea5e426042c520a4f98ed23530fc8f94f67f242a639c32a29817495d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1602d1ea5e426042c520a4f98ed23530fc8f94f67f242a639c32a29817495d73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:22Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81866e9b3e8415fc1e5af0eec4a7222deab6b282b1758dae13759a073192a554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81866e9b3e8415fc1e5af0eec4a7222deab6b282b1758dae13759a073192a554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zwqcq\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:46Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:46 crc kubenswrapper[4594]: I1129 05:28:46.242793 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cbcda3d-f02f-4e3e-9560-a1dcb0e9b659\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95cdf0c76e877968d93b7b86fcdf319cfd2072ceb45903b275dfbe38db4a4e92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c7130e6d58c27093d1efa4f7260aba78681945df2d956da7878075ba5eb1d05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe6aa22ca88502e90dc3ea058d420a497838102af8aa1afc78edc4a52bed8a6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d72ff5cb72ae536180cf8ebbc
6af9c3a01bc453e5d917c8c7060fbd987de73c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d72ff5cb72ae536180cf8ebbc6af9c3a01bc453e5d917c8c7060fbd987de73c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:27:56Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:27:56Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:46Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:46 crc kubenswrapper[4594]: I1129 05:28:46.253646 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:46Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:46 crc kubenswrapper[4594]: I1129 05:28:46.263978 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:46Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:46 crc kubenswrapper[4594]: I1129 05:28:46.289389 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:46 crc kubenswrapper[4594]: I1129 05:28:46.289438 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:46 crc kubenswrapper[4594]: I1129 05:28:46.289455 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:46 crc 
kubenswrapper[4594]: I1129 05:28:46.289478 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:46 crc kubenswrapper[4594]: I1129 05:28:46.289498 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:46Z","lastTransitionTime":"2025-11-29T05:28:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:46 crc kubenswrapper[4594]: I1129 05:28:46.391563 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:46 crc kubenswrapper[4594]: I1129 05:28:46.391601 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:46 crc kubenswrapper[4594]: I1129 05:28:46.391613 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:46 crc kubenswrapper[4594]: I1129 05:28:46.391632 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:46 crc kubenswrapper[4594]: I1129 05:28:46.391645 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:46Z","lastTransitionTime":"2025-11-29T05:28:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:46 crc kubenswrapper[4594]: I1129 05:28:46.493978 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:46 crc kubenswrapper[4594]: I1129 05:28:46.494018 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:46 crc kubenswrapper[4594]: I1129 05:28:46.494048 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:46 crc kubenswrapper[4594]: I1129 05:28:46.494067 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:46 crc kubenswrapper[4594]: I1129 05:28:46.494081 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:46Z","lastTransitionTime":"2025-11-29T05:28:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:46 crc kubenswrapper[4594]: I1129 05:28:46.596463 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:46 crc kubenswrapper[4594]: I1129 05:28:46.596500 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:46 crc kubenswrapper[4594]: I1129 05:28:46.596509 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:46 crc kubenswrapper[4594]: I1129 05:28:46.596522 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:46 crc kubenswrapper[4594]: I1129 05:28:46.596534 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:46Z","lastTransitionTime":"2025-11-29T05:28:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:46 crc kubenswrapper[4594]: I1129 05:28:46.698715 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:46 crc kubenswrapper[4594]: I1129 05:28:46.698752 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:46 crc kubenswrapper[4594]: I1129 05:28:46.698763 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:46 crc kubenswrapper[4594]: I1129 05:28:46.698781 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:46 crc kubenswrapper[4594]: I1129 05:28:46.698810 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:46Z","lastTransitionTime":"2025-11-29T05:28:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:46 crc kubenswrapper[4594]: I1129 05:28:46.800583 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:46 crc kubenswrapper[4594]: I1129 05:28:46.800626 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:46 crc kubenswrapper[4594]: I1129 05:28:46.800636 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:46 crc kubenswrapper[4594]: I1129 05:28:46.800648 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:46 crc kubenswrapper[4594]: I1129 05:28:46.800656 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:46Z","lastTransitionTime":"2025-11-29T05:28:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:46 crc kubenswrapper[4594]: I1129 05:28:46.823989 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/217088b9-a48b-40c7-8d83-f9ff0eb24908-metrics-certs\") pod \"network-metrics-daemon-lzr56\" (UID: \"217088b9-a48b-40c7-8d83-f9ff0eb24908\") " pod="openshift-multus/network-metrics-daemon-lzr56" Nov 29 05:28:46 crc kubenswrapper[4594]: E1129 05:28:46.824134 4594 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 29 05:28:46 crc kubenswrapper[4594]: E1129 05:28:46.824189 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/217088b9-a48b-40c7-8d83-f9ff0eb24908-metrics-certs podName:217088b9-a48b-40c7-8d83-f9ff0eb24908 nodeName:}" failed. No retries permitted until 2025-11-29 05:29:02.824172339 +0000 UTC m=+67.064681550 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/217088b9-a48b-40c7-8d83-f9ff0eb24908-metrics-certs") pod "network-metrics-daemon-lzr56" (UID: "217088b9-a48b-40c7-8d83-f9ff0eb24908") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 29 05:28:46 crc kubenswrapper[4594]: I1129 05:28:46.903021 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:46 crc kubenswrapper[4594]: I1129 05:28:46.903059 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:46 crc kubenswrapper[4594]: I1129 05:28:46.903069 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:46 crc kubenswrapper[4594]: I1129 05:28:46.903081 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:46 crc kubenswrapper[4594]: I1129 05:28:46.903091 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:46Z","lastTransitionTime":"2025-11-29T05:28:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:47 crc kubenswrapper[4594]: I1129 05:28:47.005522 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:47 crc kubenswrapper[4594]: I1129 05:28:47.005550 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:47 crc kubenswrapper[4594]: I1129 05:28:47.005561 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:47 crc kubenswrapper[4594]: I1129 05:28:47.005577 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:47 crc kubenswrapper[4594]: I1129 05:28:47.005587 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:47Z","lastTransitionTime":"2025-11-29T05:28:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:47 crc kubenswrapper[4594]: I1129 05:28:47.083297 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lzr56" Nov 29 05:28:47 crc kubenswrapper[4594]: E1129 05:28:47.084141 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lzr56" podUID="217088b9-a48b-40c7-8d83-f9ff0eb24908" Nov 29 05:28:47 crc kubenswrapper[4594]: I1129 05:28:47.108006 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:47 crc kubenswrapper[4594]: I1129 05:28:47.108056 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:47 crc kubenswrapper[4594]: I1129 05:28:47.108068 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:47 crc kubenswrapper[4594]: I1129 05:28:47.108088 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:47 crc kubenswrapper[4594]: I1129 05:28:47.108104 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:47Z","lastTransitionTime":"2025-11-29T05:28:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:47 crc kubenswrapper[4594]: I1129 05:28:47.196038 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:47 crc kubenswrapper[4594]: I1129 05:28:47.196064 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:47 crc kubenswrapper[4594]: I1129 05:28:47.196073 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:47 crc kubenswrapper[4594]: I1129 05:28:47.196108 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:47 crc kubenswrapper[4594]: I1129 05:28:47.196119 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:47Z","lastTransitionTime":"2025-11-29T05:28:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:47 crc kubenswrapper[4594]: E1129 05:28:47.206302 4594 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:28:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:28:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:28:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:28:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"278d67da-c900-4140-8f95-1590a0940f46\\\",\\\"systemUUID\\\":\\\"64455698-ce5a-4229-a093-dc9c12114354\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:47Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:47 crc kubenswrapper[4594]: I1129 05:28:47.209225 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:47 crc kubenswrapper[4594]: I1129 05:28:47.209286 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:47 crc kubenswrapper[4594]: I1129 05:28:47.209297 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:47 crc kubenswrapper[4594]: I1129 05:28:47.209313 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:47 crc kubenswrapper[4594]: I1129 05:28:47.209324 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:47Z","lastTransitionTime":"2025-11-29T05:28:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:47 crc kubenswrapper[4594]: E1129 05:28:47.218370 4594 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:28:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:28:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:28:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:28:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"278d67da-c900-4140-8f95-1590a0940f46\\\",\\\"systemUUID\\\":\\\"64455698-ce5a-4229-a093-dc9c12114354\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:47Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:47 crc kubenswrapper[4594]: I1129 05:28:47.221773 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:47 crc kubenswrapper[4594]: I1129 05:28:47.221807 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:47 crc kubenswrapper[4594]: I1129 05:28:47.221820 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:47 crc kubenswrapper[4594]: I1129 05:28:47.221834 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:47 crc kubenswrapper[4594]: I1129 05:28:47.221851 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:47Z","lastTransitionTime":"2025-11-29T05:28:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:47 crc kubenswrapper[4594]: E1129 05:28:47.230769 4594 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:28:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:28:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:28:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:28:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"278d67da-c900-4140-8f95-1590a0940f46\\\",\\\"systemUUID\\\":\\\"64455698-ce5a-4229-a093-dc9c12114354\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:47Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:47 crc kubenswrapper[4594]: I1129 05:28:47.233332 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:47 crc kubenswrapper[4594]: I1129 05:28:47.233384 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:47 crc kubenswrapper[4594]: I1129 05:28:47.233397 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:47 crc kubenswrapper[4594]: I1129 05:28:47.233419 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:47 crc kubenswrapper[4594]: I1129 05:28:47.233431 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:47Z","lastTransitionTime":"2025-11-29T05:28:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:47 crc kubenswrapper[4594]: E1129 05:28:47.243149 4594 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:28:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:28:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:28:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:28:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"278d67da-c900-4140-8f95-1590a0940f46\\\",\\\"systemUUID\\\":\\\"64455698-ce5a-4229-a093-dc9c12114354\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:47Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:47 crc kubenswrapper[4594]: I1129 05:28:47.246018 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:47 crc kubenswrapper[4594]: I1129 05:28:47.246099 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:47 crc kubenswrapper[4594]: I1129 05:28:47.246149 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:47 crc kubenswrapper[4594]: I1129 05:28:47.246198 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:47 crc kubenswrapper[4594]: I1129 05:28:47.246273 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:47Z","lastTransitionTime":"2025-11-29T05:28:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:47 crc kubenswrapper[4594]: E1129 05:28:47.256784 4594 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:28:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:28:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:28:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:28:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"278d67da-c900-4140-8f95-1590a0940f46\\\",\\\"systemUUID\\\":\\\"64455698-ce5a-4229-a093-dc9c12114354\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:47Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:47 crc kubenswrapper[4594]: E1129 05:28:47.257028 4594 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 29 05:28:47 crc kubenswrapper[4594]: I1129 05:28:47.258364 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:47 crc kubenswrapper[4594]: I1129 05:28:47.258459 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:47 crc kubenswrapper[4594]: I1129 05:28:47.258518 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:47 crc kubenswrapper[4594]: I1129 05:28:47.258569 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:47 crc kubenswrapper[4594]: I1129 05:28:47.258624 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:47Z","lastTransitionTime":"2025-11-29T05:28:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:47 crc kubenswrapper[4594]: I1129 05:28:47.360202 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:47 crc kubenswrapper[4594]: I1129 05:28:47.360326 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:47 crc kubenswrapper[4594]: I1129 05:28:47.360409 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:47 crc kubenswrapper[4594]: I1129 05:28:47.360468 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:47 crc kubenswrapper[4594]: I1129 05:28:47.360522 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:47Z","lastTransitionTime":"2025-11-29T05:28:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:47 crc kubenswrapper[4594]: I1129 05:28:47.461916 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:47 crc kubenswrapper[4594]: I1129 05:28:47.462025 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:47 crc kubenswrapper[4594]: I1129 05:28:47.462111 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:47 crc kubenswrapper[4594]: I1129 05:28:47.462203 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:47 crc kubenswrapper[4594]: I1129 05:28:47.462296 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:47Z","lastTransitionTime":"2025-11-29T05:28:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:47 crc kubenswrapper[4594]: I1129 05:28:47.564424 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:47 crc kubenswrapper[4594]: I1129 05:28:47.564444 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:47 crc kubenswrapper[4594]: I1129 05:28:47.564453 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:47 crc kubenswrapper[4594]: I1129 05:28:47.564473 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:47 crc kubenswrapper[4594]: I1129 05:28:47.564480 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:47Z","lastTransitionTime":"2025-11-29T05:28:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:47 crc kubenswrapper[4594]: I1129 05:28:47.666808 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:47 crc kubenswrapper[4594]: I1129 05:28:47.666872 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:47 crc kubenswrapper[4594]: I1129 05:28:47.666884 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:47 crc kubenswrapper[4594]: I1129 05:28:47.666898 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:47 crc kubenswrapper[4594]: I1129 05:28:47.666907 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:47Z","lastTransitionTime":"2025-11-29T05:28:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:47 crc kubenswrapper[4594]: I1129 05:28:47.768700 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:47 crc kubenswrapper[4594]: I1129 05:28:47.768725 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:47 crc kubenswrapper[4594]: I1129 05:28:47.768736 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:47 crc kubenswrapper[4594]: I1129 05:28:47.768746 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:47 crc kubenswrapper[4594]: I1129 05:28:47.768756 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:47Z","lastTransitionTime":"2025-11-29T05:28:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:47 crc kubenswrapper[4594]: I1129 05:28:47.877358 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:47 crc kubenswrapper[4594]: I1129 05:28:47.877398 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:47 crc kubenswrapper[4594]: I1129 05:28:47.877409 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:47 crc kubenswrapper[4594]: I1129 05:28:47.877427 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:47 crc kubenswrapper[4594]: I1129 05:28:47.877440 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:47Z","lastTransitionTime":"2025-11-29T05:28:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:47 crc kubenswrapper[4594]: I1129 05:28:47.980026 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:47 crc kubenswrapper[4594]: I1129 05:28:47.980087 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:47 crc kubenswrapper[4594]: I1129 05:28:47.980099 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:47 crc kubenswrapper[4594]: I1129 05:28:47.980122 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:47 crc kubenswrapper[4594]: I1129 05:28:47.980138 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:47Z","lastTransitionTime":"2025-11-29T05:28:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:48 crc kubenswrapper[4594]: I1129 05:28:48.082228 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:48 crc kubenswrapper[4594]: I1129 05:28:48.082285 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:48 crc kubenswrapper[4594]: I1129 05:28:48.082297 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:48 crc kubenswrapper[4594]: I1129 05:28:48.082314 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:48 crc kubenswrapper[4594]: I1129 05:28:48.082325 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:48Z","lastTransitionTime":"2025-11-29T05:28:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:48 crc kubenswrapper[4594]: I1129 05:28:48.082565 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 05:28:48 crc kubenswrapper[4594]: I1129 05:28:48.082596 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 05:28:48 crc kubenswrapper[4594]: E1129 05:28:48.082688 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 05:28:48 crc kubenswrapper[4594]: I1129 05:28:48.082702 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 05:28:48 crc kubenswrapper[4594]: E1129 05:28:48.082831 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 05:28:48 crc kubenswrapper[4594]: E1129 05:28:48.082957 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 05:28:48 crc kubenswrapper[4594]: I1129 05:28:48.184466 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:48 crc kubenswrapper[4594]: I1129 05:28:48.184504 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:48 crc kubenswrapper[4594]: I1129 05:28:48.184517 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:48 crc kubenswrapper[4594]: I1129 05:28:48.184533 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:48 crc kubenswrapper[4594]: I1129 05:28:48.184548 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:48Z","lastTransitionTime":"2025-11-29T05:28:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:48 crc kubenswrapper[4594]: I1129 05:28:48.287046 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:48 crc kubenswrapper[4594]: I1129 05:28:48.287334 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:48 crc kubenswrapper[4594]: I1129 05:28:48.287347 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:48 crc kubenswrapper[4594]: I1129 05:28:48.287363 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:48 crc kubenswrapper[4594]: I1129 05:28:48.287377 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:48Z","lastTransitionTime":"2025-11-29T05:28:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:48 crc kubenswrapper[4594]: I1129 05:28:48.389482 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:48 crc kubenswrapper[4594]: I1129 05:28:48.389520 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:48 crc kubenswrapper[4594]: I1129 05:28:48.389532 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:48 crc kubenswrapper[4594]: I1129 05:28:48.389551 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:48 crc kubenswrapper[4594]: I1129 05:28:48.389564 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:48Z","lastTransitionTime":"2025-11-29T05:28:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:48 crc kubenswrapper[4594]: I1129 05:28:48.492051 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:48 crc kubenswrapper[4594]: I1129 05:28:48.492100 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:48 crc kubenswrapper[4594]: I1129 05:28:48.492109 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:48 crc kubenswrapper[4594]: I1129 05:28:48.492129 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:48 crc kubenswrapper[4594]: I1129 05:28:48.492141 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:48Z","lastTransitionTime":"2025-11-29T05:28:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:48 crc kubenswrapper[4594]: I1129 05:28:48.594225 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:48 crc kubenswrapper[4594]: I1129 05:28:48.594302 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:48 crc kubenswrapper[4594]: I1129 05:28:48.594316 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:48 crc kubenswrapper[4594]: I1129 05:28:48.594340 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:48 crc kubenswrapper[4594]: I1129 05:28:48.594353 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:48Z","lastTransitionTime":"2025-11-29T05:28:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:48 crc kubenswrapper[4594]: I1129 05:28:48.696727 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:48 crc kubenswrapper[4594]: I1129 05:28:48.696769 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:48 crc kubenswrapper[4594]: I1129 05:28:48.696782 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:48 crc kubenswrapper[4594]: I1129 05:28:48.696805 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:48 crc kubenswrapper[4594]: I1129 05:28:48.696818 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:48Z","lastTransitionTime":"2025-11-29T05:28:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:48 crc kubenswrapper[4594]: I1129 05:28:48.798576 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:48 crc kubenswrapper[4594]: I1129 05:28:48.798616 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:48 crc kubenswrapper[4594]: I1129 05:28:48.798626 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:48 crc kubenswrapper[4594]: I1129 05:28:48.798643 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:48 crc kubenswrapper[4594]: I1129 05:28:48.798662 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:48Z","lastTransitionTime":"2025-11-29T05:28:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:48 crc kubenswrapper[4594]: I1129 05:28:48.900464 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:48 crc kubenswrapper[4594]: I1129 05:28:48.900505 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:48 crc kubenswrapper[4594]: I1129 05:28:48.900516 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:48 crc kubenswrapper[4594]: I1129 05:28:48.900531 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:48 crc kubenswrapper[4594]: I1129 05:28:48.900544 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:48Z","lastTransitionTime":"2025-11-29T05:28:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:49 crc kubenswrapper[4594]: I1129 05:28:49.002941 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:49 crc kubenswrapper[4594]: I1129 05:28:49.002977 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:49 crc kubenswrapper[4594]: I1129 05:28:49.002988 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:49 crc kubenswrapper[4594]: I1129 05:28:49.002999 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:49 crc kubenswrapper[4594]: I1129 05:28:49.003011 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:49Z","lastTransitionTime":"2025-11-29T05:28:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:49 crc kubenswrapper[4594]: I1129 05:28:49.083504 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lzr56" Nov 29 05:28:49 crc kubenswrapper[4594]: E1129 05:28:49.083653 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lzr56" podUID="217088b9-a48b-40c7-8d83-f9ff0eb24908" Nov 29 05:28:49 crc kubenswrapper[4594]: I1129 05:28:49.105501 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:49 crc kubenswrapper[4594]: I1129 05:28:49.105543 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:49 crc kubenswrapper[4594]: I1129 05:28:49.105553 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:49 crc kubenswrapper[4594]: I1129 05:28:49.105568 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:49 crc kubenswrapper[4594]: I1129 05:28:49.105579 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:49Z","lastTransitionTime":"2025-11-29T05:28:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:49 crc kubenswrapper[4594]: I1129 05:28:49.208066 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:49 crc kubenswrapper[4594]: I1129 05:28:49.208110 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:49 crc kubenswrapper[4594]: I1129 05:28:49.208120 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:49 crc kubenswrapper[4594]: I1129 05:28:49.208136 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:49 crc kubenswrapper[4594]: I1129 05:28:49.208146 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:49Z","lastTransitionTime":"2025-11-29T05:28:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:49 crc kubenswrapper[4594]: I1129 05:28:49.310917 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:49 crc kubenswrapper[4594]: I1129 05:28:49.310969 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:49 crc kubenswrapper[4594]: I1129 05:28:49.310980 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:49 crc kubenswrapper[4594]: I1129 05:28:49.311001 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:49 crc kubenswrapper[4594]: I1129 05:28:49.311016 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:49Z","lastTransitionTime":"2025-11-29T05:28:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:49 crc kubenswrapper[4594]: I1129 05:28:49.412899 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:49 crc kubenswrapper[4594]: I1129 05:28:49.412939 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:49 crc kubenswrapper[4594]: I1129 05:28:49.412950 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:49 crc kubenswrapper[4594]: I1129 05:28:49.412965 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:49 crc kubenswrapper[4594]: I1129 05:28:49.412976 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:49Z","lastTransitionTime":"2025-11-29T05:28:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:49 crc kubenswrapper[4594]: I1129 05:28:49.515512 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:49 crc kubenswrapper[4594]: I1129 05:28:49.515557 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:49 crc kubenswrapper[4594]: I1129 05:28:49.515566 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:49 crc kubenswrapper[4594]: I1129 05:28:49.515583 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:49 crc kubenswrapper[4594]: I1129 05:28:49.515597 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:49Z","lastTransitionTime":"2025-11-29T05:28:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:49 crc kubenswrapper[4594]: I1129 05:28:49.617322 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:49 crc kubenswrapper[4594]: I1129 05:28:49.617355 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:49 crc kubenswrapper[4594]: I1129 05:28:49.617368 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:49 crc kubenswrapper[4594]: I1129 05:28:49.617381 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:49 crc kubenswrapper[4594]: I1129 05:28:49.617390 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:49Z","lastTransitionTime":"2025-11-29T05:28:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:49 crc kubenswrapper[4594]: I1129 05:28:49.719227 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:49 crc kubenswrapper[4594]: I1129 05:28:49.719286 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:49 crc kubenswrapper[4594]: I1129 05:28:49.719296 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:49 crc kubenswrapper[4594]: I1129 05:28:49.719313 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:49 crc kubenswrapper[4594]: I1129 05:28:49.719324 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:49Z","lastTransitionTime":"2025-11-29T05:28:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:49 crc kubenswrapper[4594]: I1129 05:28:49.822387 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:49 crc kubenswrapper[4594]: I1129 05:28:49.822427 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:49 crc kubenswrapper[4594]: I1129 05:28:49.822437 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:49 crc kubenswrapper[4594]: I1129 05:28:49.822453 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:49 crc kubenswrapper[4594]: I1129 05:28:49.822463 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:49Z","lastTransitionTime":"2025-11-29T05:28:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:49 crc kubenswrapper[4594]: I1129 05:28:49.924824 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:49 crc kubenswrapper[4594]: I1129 05:28:49.924904 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:49 crc kubenswrapper[4594]: I1129 05:28:49.924916 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:49 crc kubenswrapper[4594]: I1129 05:28:49.924936 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:49 crc kubenswrapper[4594]: I1129 05:28:49.924948 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:49Z","lastTransitionTime":"2025-11-29T05:28:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:50 crc kubenswrapper[4594]: I1129 05:28:50.026970 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:50 crc kubenswrapper[4594]: I1129 05:28:50.026993 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:50 crc kubenswrapper[4594]: I1129 05:28:50.027002 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:50 crc kubenswrapper[4594]: I1129 05:28:50.027017 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:50 crc kubenswrapper[4594]: I1129 05:28:50.027043 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:50Z","lastTransitionTime":"2025-11-29T05:28:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:50 crc kubenswrapper[4594]: I1129 05:28:50.083453 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 05:28:50 crc kubenswrapper[4594]: I1129 05:28:50.083466 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 05:28:50 crc kubenswrapper[4594]: E1129 05:28:50.083558 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 05:28:50 crc kubenswrapper[4594]: I1129 05:28:50.083618 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 05:28:50 crc kubenswrapper[4594]: E1129 05:28:50.083695 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 05:28:50 crc kubenswrapper[4594]: E1129 05:28:50.083779 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 05:28:50 crc kubenswrapper[4594]: I1129 05:28:50.129218 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:50 crc kubenswrapper[4594]: I1129 05:28:50.129277 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:50 crc kubenswrapper[4594]: I1129 05:28:50.129287 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:50 crc kubenswrapper[4594]: I1129 05:28:50.129302 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:50 crc kubenswrapper[4594]: I1129 05:28:50.129314 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:50Z","lastTransitionTime":"2025-11-29T05:28:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:50 crc kubenswrapper[4594]: I1129 05:28:50.231581 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:50 crc kubenswrapper[4594]: I1129 05:28:50.231608 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:50 crc kubenswrapper[4594]: I1129 05:28:50.231620 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:50 crc kubenswrapper[4594]: I1129 05:28:50.231650 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:50 crc kubenswrapper[4594]: I1129 05:28:50.231658 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:50Z","lastTransitionTime":"2025-11-29T05:28:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:50 crc kubenswrapper[4594]: I1129 05:28:50.333896 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:50 crc kubenswrapper[4594]: I1129 05:28:50.333956 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:50 crc kubenswrapper[4594]: I1129 05:28:50.333967 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:50 crc kubenswrapper[4594]: I1129 05:28:50.333978 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:50 crc kubenswrapper[4594]: I1129 05:28:50.333986 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:50Z","lastTransitionTime":"2025-11-29T05:28:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:50 crc kubenswrapper[4594]: I1129 05:28:50.436331 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:50 crc kubenswrapper[4594]: I1129 05:28:50.436365 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:50 crc kubenswrapper[4594]: I1129 05:28:50.436377 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:50 crc kubenswrapper[4594]: I1129 05:28:50.436391 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:50 crc kubenswrapper[4594]: I1129 05:28:50.436401 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:50Z","lastTransitionTime":"2025-11-29T05:28:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:50 crc kubenswrapper[4594]: I1129 05:28:50.538467 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:50 crc kubenswrapper[4594]: I1129 05:28:50.538521 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:50 crc kubenswrapper[4594]: I1129 05:28:50.538534 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:50 crc kubenswrapper[4594]: I1129 05:28:50.538546 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:50 crc kubenswrapper[4594]: I1129 05:28:50.538557 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:50Z","lastTransitionTime":"2025-11-29T05:28:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:50 crc kubenswrapper[4594]: I1129 05:28:50.640737 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:50 crc kubenswrapper[4594]: I1129 05:28:50.640777 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:50 crc kubenswrapper[4594]: I1129 05:28:50.640786 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:50 crc kubenswrapper[4594]: I1129 05:28:50.640800 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:50 crc kubenswrapper[4594]: I1129 05:28:50.640811 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:50Z","lastTransitionTime":"2025-11-29T05:28:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:50 crc kubenswrapper[4594]: I1129 05:28:50.742363 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:50 crc kubenswrapper[4594]: I1129 05:28:50.742404 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:50 crc kubenswrapper[4594]: I1129 05:28:50.742415 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:50 crc kubenswrapper[4594]: I1129 05:28:50.742436 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:50 crc kubenswrapper[4594]: I1129 05:28:50.742450 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:50Z","lastTransitionTime":"2025-11-29T05:28:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:50 crc kubenswrapper[4594]: I1129 05:28:50.845123 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:50 crc kubenswrapper[4594]: I1129 05:28:50.845162 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:50 crc kubenswrapper[4594]: I1129 05:28:50.845171 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:50 crc kubenswrapper[4594]: I1129 05:28:50.845185 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:50 crc kubenswrapper[4594]: I1129 05:28:50.845198 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:50Z","lastTransitionTime":"2025-11-29T05:28:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:50 crc kubenswrapper[4594]: I1129 05:28:50.947098 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:50 crc kubenswrapper[4594]: I1129 05:28:50.947131 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:50 crc kubenswrapper[4594]: I1129 05:28:50.947141 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:50 crc kubenswrapper[4594]: I1129 05:28:50.947152 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:50 crc kubenswrapper[4594]: I1129 05:28:50.947161 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:50Z","lastTransitionTime":"2025-11-29T05:28:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:51 crc kubenswrapper[4594]: I1129 05:28:51.049448 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:51 crc kubenswrapper[4594]: I1129 05:28:51.049478 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:51 crc kubenswrapper[4594]: I1129 05:28:51.049487 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:51 crc kubenswrapper[4594]: I1129 05:28:51.049497 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:51 crc kubenswrapper[4594]: I1129 05:28:51.049505 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:51Z","lastTransitionTime":"2025-11-29T05:28:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:51 crc kubenswrapper[4594]: I1129 05:28:51.082590 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lzr56" Nov 29 05:28:51 crc kubenswrapper[4594]: E1129 05:28:51.082703 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lzr56" podUID="217088b9-a48b-40c7-8d83-f9ff0eb24908" Nov 29 05:28:51 crc kubenswrapper[4594]: I1129 05:28:51.151690 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:51 crc kubenswrapper[4594]: I1129 05:28:51.151726 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:51 crc kubenswrapper[4594]: I1129 05:28:51.151737 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:51 crc kubenswrapper[4594]: I1129 05:28:51.151749 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:51 crc kubenswrapper[4594]: I1129 05:28:51.151758 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:51Z","lastTransitionTime":"2025-11-29T05:28:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:51 crc kubenswrapper[4594]: I1129 05:28:51.253977 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:51 crc kubenswrapper[4594]: I1129 05:28:51.254017 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:51 crc kubenswrapper[4594]: I1129 05:28:51.254026 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:51 crc kubenswrapper[4594]: I1129 05:28:51.254043 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:51 crc kubenswrapper[4594]: I1129 05:28:51.254053 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:51Z","lastTransitionTime":"2025-11-29T05:28:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:51 crc kubenswrapper[4594]: I1129 05:28:51.355808 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:51 crc kubenswrapper[4594]: I1129 05:28:51.355854 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:51 crc kubenswrapper[4594]: I1129 05:28:51.355863 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:51 crc kubenswrapper[4594]: I1129 05:28:51.355877 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:51 crc kubenswrapper[4594]: I1129 05:28:51.355885 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:51Z","lastTransitionTime":"2025-11-29T05:28:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:51 crc kubenswrapper[4594]: I1129 05:28:51.457330 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:51 crc kubenswrapper[4594]: I1129 05:28:51.457371 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:51 crc kubenswrapper[4594]: I1129 05:28:51.457380 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:51 crc kubenswrapper[4594]: I1129 05:28:51.457394 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:51 crc kubenswrapper[4594]: I1129 05:28:51.457405 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:51Z","lastTransitionTime":"2025-11-29T05:28:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:51 crc kubenswrapper[4594]: I1129 05:28:51.559207 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:51 crc kubenswrapper[4594]: I1129 05:28:51.559279 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:51 crc kubenswrapper[4594]: I1129 05:28:51.559291 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:51 crc kubenswrapper[4594]: I1129 05:28:51.559310 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:51 crc kubenswrapper[4594]: I1129 05:28:51.559321 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:51Z","lastTransitionTime":"2025-11-29T05:28:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:51 crc kubenswrapper[4594]: I1129 05:28:51.661272 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:51 crc kubenswrapper[4594]: I1129 05:28:51.661307 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:51 crc kubenswrapper[4594]: I1129 05:28:51.661319 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:51 crc kubenswrapper[4594]: I1129 05:28:51.661332 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:51 crc kubenswrapper[4594]: I1129 05:28:51.661343 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:51Z","lastTransitionTime":"2025-11-29T05:28:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:51 crc kubenswrapper[4594]: I1129 05:28:51.763020 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:51 crc kubenswrapper[4594]: I1129 05:28:51.763048 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:51 crc kubenswrapper[4594]: I1129 05:28:51.763057 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:51 crc kubenswrapper[4594]: I1129 05:28:51.763069 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:51 crc kubenswrapper[4594]: I1129 05:28:51.763111 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:51Z","lastTransitionTime":"2025-11-29T05:28:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:51 crc kubenswrapper[4594]: I1129 05:28:51.865319 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:51 crc kubenswrapper[4594]: I1129 05:28:51.865362 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:51 crc kubenswrapper[4594]: I1129 05:28:51.865373 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:51 crc kubenswrapper[4594]: I1129 05:28:51.865388 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:51 crc kubenswrapper[4594]: I1129 05:28:51.865399 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:51Z","lastTransitionTime":"2025-11-29T05:28:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:51 crc kubenswrapper[4594]: I1129 05:28:51.967417 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:51 crc kubenswrapper[4594]: I1129 05:28:51.967454 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:51 crc kubenswrapper[4594]: I1129 05:28:51.967464 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:51 crc kubenswrapper[4594]: I1129 05:28:51.967479 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:51 crc kubenswrapper[4594]: I1129 05:28:51.967495 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:51Z","lastTransitionTime":"2025-11-29T05:28:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:52 crc kubenswrapper[4594]: I1129 05:28:52.070046 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:52 crc kubenswrapper[4594]: I1129 05:28:52.070081 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:52 crc kubenswrapper[4594]: I1129 05:28:52.070090 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:52 crc kubenswrapper[4594]: I1129 05:28:52.070099 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:52 crc kubenswrapper[4594]: I1129 05:28:52.070109 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:52Z","lastTransitionTime":"2025-11-29T05:28:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:52 crc kubenswrapper[4594]: I1129 05:28:52.084319 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 05:28:52 crc kubenswrapper[4594]: I1129 05:28:52.084390 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 05:28:52 crc kubenswrapper[4594]: E1129 05:28:52.084454 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 05:28:52 crc kubenswrapper[4594]: I1129 05:28:52.084481 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 05:28:52 crc kubenswrapper[4594]: E1129 05:28:52.084563 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 05:28:52 crc kubenswrapper[4594]: E1129 05:28:52.084647 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 05:28:52 crc kubenswrapper[4594]: I1129 05:28:52.171947 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:52 crc kubenswrapper[4594]: I1129 05:28:52.171979 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:52 crc kubenswrapper[4594]: I1129 05:28:52.171990 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:52 crc kubenswrapper[4594]: I1129 05:28:52.172007 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:52 crc kubenswrapper[4594]: I1129 05:28:52.172020 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:52Z","lastTransitionTime":"2025-11-29T05:28:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:52 crc kubenswrapper[4594]: I1129 05:28:52.274136 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:52 crc kubenswrapper[4594]: I1129 05:28:52.274170 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:52 crc kubenswrapper[4594]: I1129 05:28:52.274181 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:52 crc kubenswrapper[4594]: I1129 05:28:52.274196 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:52 crc kubenswrapper[4594]: I1129 05:28:52.274207 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:52Z","lastTransitionTime":"2025-11-29T05:28:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:52 crc kubenswrapper[4594]: I1129 05:28:52.375799 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:52 crc kubenswrapper[4594]: I1129 05:28:52.375845 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:52 crc kubenswrapper[4594]: I1129 05:28:52.375856 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:52 crc kubenswrapper[4594]: I1129 05:28:52.375870 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:52 crc kubenswrapper[4594]: I1129 05:28:52.375879 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:52Z","lastTransitionTime":"2025-11-29T05:28:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:52 crc kubenswrapper[4594]: I1129 05:28:52.477284 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:52 crc kubenswrapper[4594]: I1129 05:28:52.477414 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:52 crc kubenswrapper[4594]: I1129 05:28:52.477502 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:52 crc kubenswrapper[4594]: I1129 05:28:52.477579 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:52 crc kubenswrapper[4594]: I1129 05:28:52.477642 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:52Z","lastTransitionTime":"2025-11-29T05:28:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:52 crc kubenswrapper[4594]: I1129 05:28:52.579652 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:52 crc kubenswrapper[4594]: I1129 05:28:52.579766 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:52 crc kubenswrapper[4594]: I1129 05:28:52.579952 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:52 crc kubenswrapper[4594]: I1129 05:28:52.580045 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:52 crc kubenswrapper[4594]: I1129 05:28:52.580107 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:52Z","lastTransitionTime":"2025-11-29T05:28:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:52 crc kubenswrapper[4594]: I1129 05:28:52.682568 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:52 crc kubenswrapper[4594]: I1129 05:28:52.682606 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:52 crc kubenswrapper[4594]: I1129 05:28:52.682616 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:52 crc kubenswrapper[4594]: I1129 05:28:52.682632 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:52 crc kubenswrapper[4594]: I1129 05:28:52.682645 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:52Z","lastTransitionTime":"2025-11-29T05:28:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:52 crc kubenswrapper[4594]: I1129 05:28:52.785244 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:52 crc kubenswrapper[4594]: I1129 05:28:52.785289 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:52 crc kubenswrapper[4594]: I1129 05:28:52.785299 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:52 crc kubenswrapper[4594]: I1129 05:28:52.785310 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:52 crc kubenswrapper[4594]: I1129 05:28:52.785320 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:52Z","lastTransitionTime":"2025-11-29T05:28:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:52 crc kubenswrapper[4594]: I1129 05:28:52.887590 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:52 crc kubenswrapper[4594]: I1129 05:28:52.887615 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:52 crc kubenswrapper[4594]: I1129 05:28:52.887625 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:52 crc kubenswrapper[4594]: I1129 05:28:52.887636 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:52 crc kubenswrapper[4594]: I1129 05:28:52.887646 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:52Z","lastTransitionTime":"2025-11-29T05:28:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:52 crc kubenswrapper[4594]: I1129 05:28:52.989464 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:52 crc kubenswrapper[4594]: I1129 05:28:52.989494 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:52 crc kubenswrapper[4594]: I1129 05:28:52.989506 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:52 crc kubenswrapper[4594]: I1129 05:28:52.989518 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:52 crc kubenswrapper[4594]: I1129 05:28:52.989526 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:52Z","lastTransitionTime":"2025-11-29T05:28:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:53 crc kubenswrapper[4594]: I1129 05:28:53.082954 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lzr56" Nov 29 05:28:53 crc kubenswrapper[4594]: E1129 05:28:53.083065 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lzr56" podUID="217088b9-a48b-40c7-8d83-f9ff0eb24908" Nov 29 05:28:53 crc kubenswrapper[4594]: I1129 05:28:53.091431 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:53 crc kubenswrapper[4594]: I1129 05:28:53.091471 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:53 crc kubenswrapper[4594]: I1129 05:28:53.091482 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:53 crc kubenswrapper[4594]: I1129 05:28:53.091497 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:53 crc kubenswrapper[4594]: I1129 05:28:53.091507 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:53Z","lastTransitionTime":"2025-11-29T05:28:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:53 crc kubenswrapper[4594]: I1129 05:28:53.194045 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:53 crc kubenswrapper[4594]: I1129 05:28:53.194099 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:53 crc kubenswrapper[4594]: I1129 05:28:53.194109 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:53 crc kubenswrapper[4594]: I1129 05:28:53.194128 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:53 crc kubenswrapper[4594]: I1129 05:28:53.194139 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:53Z","lastTransitionTime":"2025-11-29T05:28:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:53 crc kubenswrapper[4594]: I1129 05:28:53.296210 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:53 crc kubenswrapper[4594]: I1129 05:28:53.296279 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:53 crc kubenswrapper[4594]: I1129 05:28:53.296293 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:53 crc kubenswrapper[4594]: I1129 05:28:53.296310 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:53 crc kubenswrapper[4594]: I1129 05:28:53.296325 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:53Z","lastTransitionTime":"2025-11-29T05:28:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:53 crc kubenswrapper[4594]: I1129 05:28:53.398288 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:53 crc kubenswrapper[4594]: I1129 05:28:53.398326 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:53 crc kubenswrapper[4594]: I1129 05:28:53.398334 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:53 crc kubenswrapper[4594]: I1129 05:28:53.398346 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:53 crc kubenswrapper[4594]: I1129 05:28:53.398357 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:53Z","lastTransitionTime":"2025-11-29T05:28:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:53 crc kubenswrapper[4594]: I1129 05:28:53.500657 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:53 crc kubenswrapper[4594]: I1129 05:28:53.500697 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:53 crc kubenswrapper[4594]: I1129 05:28:53.500706 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:53 crc kubenswrapper[4594]: I1129 05:28:53.500724 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:53 crc kubenswrapper[4594]: I1129 05:28:53.500735 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:53Z","lastTransitionTime":"2025-11-29T05:28:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:53 crc kubenswrapper[4594]: I1129 05:28:53.603725 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:53 crc kubenswrapper[4594]: I1129 05:28:53.603764 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:53 crc kubenswrapper[4594]: I1129 05:28:53.603774 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:53 crc kubenswrapper[4594]: I1129 05:28:53.603790 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:53 crc kubenswrapper[4594]: I1129 05:28:53.603801 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:53Z","lastTransitionTime":"2025-11-29T05:28:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:53 crc kubenswrapper[4594]: I1129 05:28:53.706160 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:53 crc kubenswrapper[4594]: I1129 05:28:53.706193 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:53 crc kubenswrapper[4594]: I1129 05:28:53.706203 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:53 crc kubenswrapper[4594]: I1129 05:28:53.706214 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:53 crc kubenswrapper[4594]: I1129 05:28:53.706224 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:53Z","lastTransitionTime":"2025-11-29T05:28:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:53 crc kubenswrapper[4594]: I1129 05:28:53.809243 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:53 crc kubenswrapper[4594]: I1129 05:28:53.809304 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:53 crc kubenswrapper[4594]: I1129 05:28:53.809318 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:53 crc kubenswrapper[4594]: I1129 05:28:53.809337 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:53 crc kubenswrapper[4594]: I1129 05:28:53.809351 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:53Z","lastTransitionTime":"2025-11-29T05:28:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:53 crc kubenswrapper[4594]: I1129 05:28:53.911531 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:53 crc kubenswrapper[4594]: I1129 05:28:53.911584 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:53 crc kubenswrapper[4594]: I1129 05:28:53.911594 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:53 crc kubenswrapper[4594]: I1129 05:28:53.911615 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:53 crc kubenswrapper[4594]: I1129 05:28:53.911630 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:53Z","lastTransitionTime":"2025-11-29T05:28:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:54 crc kubenswrapper[4594]: I1129 05:28:54.013709 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:54 crc kubenswrapper[4594]: I1129 05:28:54.013752 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:54 crc kubenswrapper[4594]: I1129 05:28:54.013764 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:54 crc kubenswrapper[4594]: I1129 05:28:54.013782 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:54 crc kubenswrapper[4594]: I1129 05:28:54.013794 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:54Z","lastTransitionTime":"2025-11-29T05:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:54 crc kubenswrapper[4594]: I1129 05:28:54.083311 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 05:28:54 crc kubenswrapper[4594]: I1129 05:28:54.083359 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 05:28:54 crc kubenswrapper[4594]: E1129 05:28:54.083465 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 05:28:54 crc kubenswrapper[4594]: I1129 05:28:54.083493 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 05:28:54 crc kubenswrapper[4594]: E1129 05:28:54.083554 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 05:28:54 crc kubenswrapper[4594]: E1129 05:28:54.083648 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 05:28:54 crc kubenswrapper[4594]: I1129 05:28:54.115153 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:54 crc kubenswrapper[4594]: I1129 05:28:54.115178 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:54 crc kubenswrapper[4594]: I1129 05:28:54.115189 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:54 crc kubenswrapper[4594]: I1129 05:28:54.115201 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:54 crc kubenswrapper[4594]: I1129 05:28:54.115211 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:54Z","lastTransitionTime":"2025-11-29T05:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:54 crc kubenswrapper[4594]: I1129 05:28:54.217182 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:54 crc kubenswrapper[4594]: I1129 05:28:54.217217 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:54 crc kubenswrapper[4594]: I1129 05:28:54.217228 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:54 crc kubenswrapper[4594]: I1129 05:28:54.217242 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:54 crc kubenswrapper[4594]: I1129 05:28:54.217278 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:54Z","lastTransitionTime":"2025-11-29T05:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:54 crc kubenswrapper[4594]: I1129 05:28:54.319180 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:54 crc kubenswrapper[4594]: I1129 05:28:54.319216 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:54 crc kubenswrapper[4594]: I1129 05:28:54.319229 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:54 crc kubenswrapper[4594]: I1129 05:28:54.319244 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:54 crc kubenswrapper[4594]: I1129 05:28:54.319279 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:54Z","lastTransitionTime":"2025-11-29T05:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:54 crc kubenswrapper[4594]: I1129 05:28:54.421084 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:54 crc kubenswrapper[4594]: I1129 05:28:54.421114 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:54 crc kubenswrapper[4594]: I1129 05:28:54.421123 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:54 crc kubenswrapper[4594]: I1129 05:28:54.421134 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:54 crc kubenswrapper[4594]: I1129 05:28:54.421143 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:54Z","lastTransitionTime":"2025-11-29T05:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:54 crc kubenswrapper[4594]: I1129 05:28:54.522810 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:54 crc kubenswrapper[4594]: I1129 05:28:54.522847 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:54 crc kubenswrapper[4594]: I1129 05:28:54.522857 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:54 crc kubenswrapper[4594]: I1129 05:28:54.522886 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:54 crc kubenswrapper[4594]: I1129 05:28:54.522895 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:54Z","lastTransitionTime":"2025-11-29T05:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:54 crc kubenswrapper[4594]: I1129 05:28:54.624746 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:54 crc kubenswrapper[4594]: I1129 05:28:54.624788 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:54 crc kubenswrapper[4594]: I1129 05:28:54.624797 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:54 crc kubenswrapper[4594]: I1129 05:28:54.624827 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:54 crc kubenswrapper[4594]: I1129 05:28:54.624838 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:54Z","lastTransitionTime":"2025-11-29T05:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:54 crc kubenswrapper[4594]: I1129 05:28:54.726779 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:54 crc kubenswrapper[4594]: I1129 05:28:54.726858 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:54 crc kubenswrapper[4594]: I1129 05:28:54.726873 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:54 crc kubenswrapper[4594]: I1129 05:28:54.726894 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:54 crc kubenswrapper[4594]: I1129 05:28:54.726908 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:54Z","lastTransitionTime":"2025-11-29T05:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:54 crc kubenswrapper[4594]: I1129 05:28:54.828390 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:54 crc kubenswrapper[4594]: I1129 05:28:54.828420 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:54 crc kubenswrapper[4594]: I1129 05:28:54.828429 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:54 crc kubenswrapper[4594]: I1129 05:28:54.828440 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:54 crc kubenswrapper[4594]: I1129 05:28:54.828451 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:54Z","lastTransitionTime":"2025-11-29T05:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:54 crc kubenswrapper[4594]: I1129 05:28:54.930407 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:54 crc kubenswrapper[4594]: I1129 05:28:54.930471 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:54 crc kubenswrapper[4594]: I1129 05:28:54.930481 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:54 crc kubenswrapper[4594]: I1129 05:28:54.930496 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:54 crc kubenswrapper[4594]: I1129 05:28:54.930507 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:54Z","lastTransitionTime":"2025-11-29T05:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:55 crc kubenswrapper[4594]: I1129 05:28:55.032089 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:55 crc kubenswrapper[4594]: I1129 05:28:55.032141 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:55 crc kubenswrapper[4594]: I1129 05:28:55.032150 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:55 crc kubenswrapper[4594]: I1129 05:28:55.032161 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:55 crc kubenswrapper[4594]: I1129 05:28:55.032169 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:55Z","lastTransitionTime":"2025-11-29T05:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:55 crc kubenswrapper[4594]: I1129 05:28:55.082887 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lzr56" Nov 29 05:28:55 crc kubenswrapper[4594]: E1129 05:28:55.083007 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lzr56" podUID="217088b9-a48b-40c7-8d83-f9ff0eb24908" Nov 29 05:28:55 crc kubenswrapper[4594]: I1129 05:28:55.133525 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:55 crc kubenswrapper[4594]: I1129 05:28:55.133595 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:55 crc kubenswrapper[4594]: I1129 05:28:55.133604 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:55 crc kubenswrapper[4594]: I1129 05:28:55.133613 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:55 crc kubenswrapper[4594]: I1129 05:28:55.133623 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:55Z","lastTransitionTime":"2025-11-29T05:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:55 crc kubenswrapper[4594]: I1129 05:28:55.235714 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:55 crc kubenswrapper[4594]: I1129 05:28:55.235785 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:55 crc kubenswrapper[4594]: I1129 05:28:55.235809 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:55 crc kubenswrapper[4594]: I1129 05:28:55.235836 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:55 crc kubenswrapper[4594]: I1129 05:28:55.235849 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:55Z","lastTransitionTime":"2025-11-29T05:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:55 crc kubenswrapper[4594]: I1129 05:28:55.339135 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:55 crc kubenswrapper[4594]: I1129 05:28:55.339181 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:55 crc kubenswrapper[4594]: I1129 05:28:55.339193 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:55 crc kubenswrapper[4594]: I1129 05:28:55.339209 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:55 crc kubenswrapper[4594]: I1129 05:28:55.339224 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:55Z","lastTransitionTime":"2025-11-29T05:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:55 crc kubenswrapper[4594]: I1129 05:28:55.442010 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:55 crc kubenswrapper[4594]: I1129 05:28:55.442049 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:55 crc kubenswrapper[4594]: I1129 05:28:55.442059 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:55 crc kubenswrapper[4594]: I1129 05:28:55.442074 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:55 crc kubenswrapper[4594]: I1129 05:28:55.442084 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:55Z","lastTransitionTime":"2025-11-29T05:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:55 crc kubenswrapper[4594]: I1129 05:28:55.544675 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:55 crc kubenswrapper[4594]: I1129 05:28:55.544730 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:55 crc kubenswrapper[4594]: I1129 05:28:55.544740 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:55 crc kubenswrapper[4594]: I1129 05:28:55.544763 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:55 crc kubenswrapper[4594]: I1129 05:28:55.544777 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:55Z","lastTransitionTime":"2025-11-29T05:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:55 crc kubenswrapper[4594]: I1129 05:28:55.646762 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:55 crc kubenswrapper[4594]: I1129 05:28:55.646803 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:55 crc kubenswrapper[4594]: I1129 05:28:55.646813 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:55 crc kubenswrapper[4594]: I1129 05:28:55.646825 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:55 crc kubenswrapper[4594]: I1129 05:28:55.646834 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:55Z","lastTransitionTime":"2025-11-29T05:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:55 crc kubenswrapper[4594]: I1129 05:28:55.748239 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:55 crc kubenswrapper[4594]: I1129 05:28:55.748291 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:55 crc kubenswrapper[4594]: I1129 05:28:55.748299 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:55 crc kubenswrapper[4594]: I1129 05:28:55.748314 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:55 crc kubenswrapper[4594]: I1129 05:28:55.748322 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:55Z","lastTransitionTime":"2025-11-29T05:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:55 crc kubenswrapper[4594]: I1129 05:28:55.850112 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:55 crc kubenswrapper[4594]: I1129 05:28:55.850148 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:55 crc kubenswrapper[4594]: I1129 05:28:55.850157 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:55 crc kubenswrapper[4594]: I1129 05:28:55.850170 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:55 crc kubenswrapper[4594]: I1129 05:28:55.850178 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:55Z","lastTransitionTime":"2025-11-29T05:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:55 crc kubenswrapper[4594]: I1129 05:28:55.952237 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:55 crc kubenswrapper[4594]: I1129 05:28:55.952310 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:55 crc kubenswrapper[4594]: I1129 05:28:55.952321 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:55 crc kubenswrapper[4594]: I1129 05:28:55.952344 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:55 crc kubenswrapper[4594]: I1129 05:28:55.952358 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:55Z","lastTransitionTime":"2025-11-29T05:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:56 crc kubenswrapper[4594]: I1129 05:28:56.053788 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:56 crc kubenswrapper[4594]: I1129 05:28:56.053839 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:56 crc kubenswrapper[4594]: I1129 05:28:56.053849 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:56 crc kubenswrapper[4594]: I1129 05:28:56.053862 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:56 crc kubenswrapper[4594]: I1129 05:28:56.053872 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:56Z","lastTransitionTime":"2025-11-29T05:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:56 crc kubenswrapper[4594]: I1129 05:28:56.082870 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 05:28:56 crc kubenswrapper[4594]: I1129 05:28:56.082928 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 05:28:56 crc kubenswrapper[4594]: I1129 05:28:56.083006 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 05:28:56 crc kubenswrapper[4594]: E1129 05:28:56.083123 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 05:28:56 crc kubenswrapper[4594]: E1129 05:28:56.083294 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 05:28:56 crc kubenswrapper[4594]: E1129 05:28:56.083510 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 05:28:56 crc kubenswrapper[4594]: I1129 05:28:56.095883 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cbcda3d-f02f-4e3e-9560-a1dcb0e9b659\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95cdf0c76e877968d93b7b86fcdf319cfd2072ceb45903b275dfbe38db4a4e92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-di
r\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c7130e6d58c27093d1efa4f7260aba78681945df2d956da7878075ba5eb1d05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe6aa22ca88502e90dc3ea058d420a497838102af8aa1afc78edc4a52bed8a6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d72ff5cb72ae536180cf8ebbc6af9c3a01bc453e5d917c8c7060fbd987de73c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d72ff5cb72ae536180cf8ebbc6af9c3a01bc453e5d917c8c7060fbd987de73c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:27:56Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:27:56Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:56Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:56 crc kubenswrapper[4594]: I1129 05:28:56.104911 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:56Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:56 crc kubenswrapper[4594]: I1129 05:28:56.113909 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:56Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:56 crc kubenswrapper[4594]: I1129 05:28:56.128501 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08de5891-72ca-488c-80b3-6b54c8c1a66e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9f96824f977ec455622772e91ed8e6534dc93a71868e2b5e2aa9302ff9be135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cf2a15e6478f9f461a9927f570db4a81a8d752c8be46c7970b4956ae2031a38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a67e6b494618e268d6f5d75507c1cfe0e8e99c0e3cfff7548756a64f4f3a3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77543b04b619c48bf800a3d9dda1a969e4e32aa113a49287717a9e83f9cff96f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78fd1c6bdf08a557f0c2223c538a946709b9a225032c92ea75277140441a59c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db25dbee0954a026ecc2945dda40eb48c9caf78d31c44013055e9765024794d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54ed3d74941dcb3f33ee5231512aac9b31eaf9c55171df9201d2bd6da265954a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54ed3d74941dcb3f33ee5231512aac9b31eaf9c55171df9201d2bd6da265954a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T05:28:41Z\\\",\\\"message\\\":\\\"332 obj_retry.go:365] Adding new object: *v1.Pod 
openshift-dns/node-resolver-87h4n\\\\nI1129 05:28:41.803817 6332 default_network_controller.go:776] Recording success event on pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nF1129 05:28:41.803749 6332 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:41Z is after 2025-08-24T17:21:41Z]\\\\nI1129 05:28:41.803846 6332 services_controller.go:356] Processing sync for service openshift-network-diagnostics/network-check-target for network=default\\\\nI1129 05:28:41.803719 6332 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/iptables-alerter\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-lp4zm_openshift-ovn-kubernetes(08de5891-72ca-488c-80b3-6b54c8c1a66e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cc8b34a52032d4716652fc9edcc35dd797309bf6886242671c068e65399d004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b86e524b600a831fe6b8bb71a513172f974b457d39363da90cd6152139eae898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b86e524b600a831fe6
b8bb71a513172f974b457d39363da90cd6152139eae898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lp4zm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:56Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:56 crc kubenswrapper[4594]: I1129 05:28:56.139094 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zwqcq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad2c26bf-c1c8-44b5-b4ed-d487072d358b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9db8528eef5425d9af2b8f11f244556f195b844e239a2137eef24c40fa158d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9e8c27b9b7a4a13a1f6ae5461acd535cc6784f4158213697307e4061f1e222\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b9e8c27b9b7a4a13a1f6ae5461acd535cc6784f4158213697307e4061f1e222\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa32ebafd1f717a0d3ba44387b7b36827e9e12589812bdc33b1dfb84bd41772\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fa32ebafd1f717a0d3ba44387b7b36827e9e12589812bdc33b1dfb84bd41772\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec1085c6e05fe233979922ace4946592ed4a7afe00a0f1c3a256c41a0964d26a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1085c6e05fe233979922ace4946592ed4a7afe00a0f1c3a256c41a0964d26a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://819e8
c7a8ea3f0f7a082da904dd9cc01b5d7e7e6520cafe67920a190063947ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://819e8c7a8ea3f0f7a082da904dd9cc01b5d7e7e6520cafe67920a190063947ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1602d1ea5e426042c520a4f98ed23530fc8f94f67f242a639c32a29817495d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1602d1ea5e426042c520a4f98ed23530fc8f94f67f242a639c32a29817495d73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:22Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81866e9b3e8415fc1e5af0eec4a7222deab6b282b1758dae13759a073192a554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81866e9b3e8415fc1e5af0eec4a7222deab6b282b1758dae13759a073192a554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zwqcq\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:56Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:56 crc kubenswrapper[4594]: I1129 05:28:56.149234 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3be916e6-7f0f-45ae-8fe5-85d2c7e459f0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf23cd4a300a49fabdf8fcaacfda760563c8e53cf70de7a4c07be717fd76b660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a960cfe07ae731cc2d07957dba63be2897328d662dd2672bd61344c8baac0228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b416f8ad36a09f58e68b1e23784b344534ccfb013a3272c0754602431a3a3737\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://261ab2af67dd877cead51f204629b59233865717d4805c3d4d80c1a5d51e2021\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:27:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:56Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:56 crc kubenswrapper[4594]: I1129 05:28:56.155829 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:56 crc kubenswrapper[4594]: I1129 05:28:56.155859 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:56 crc kubenswrapper[4594]: I1129 05:28:56.155871 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:56 crc kubenswrapper[4594]: I1129 05:28:56.155885 
4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:56 crc kubenswrapper[4594]: I1129 05:28:56.155899 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:56Z","lastTransitionTime":"2025-11-29T05:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:56 crc kubenswrapper[4594]: I1129 05:28:56.160116 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba18db1b4391204c0fe395488d2726d914dc28c88b9cf449d83c9c181a7d04a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"sta
rted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:56Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:56 crc kubenswrapper[4594]: I1129 05:28:56.168339 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"509844d4-3ee1-4059-84bb-6e90200f50c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5e396522be1cb7984f98fd416f9a6aeda1e6009e002e9c942a6cec0d4a4c22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4qlc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb25cb1d2d2a3aa84bc6fca3087cdf54b3b8e75b
c258441b2895f37594df44e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4qlc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ggz4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:56Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:56 crc kubenswrapper[4594]: I1129 05:28:56.175926 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-26zjk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"77af4f9f-196d-436d-9fb6-69168bcb5f8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70467de0fada47ce38faab4cc04c54942aa5f7c0cacf29b1d21b64a11e6162c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxrks\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-26zjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:56Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:56 crc kubenswrapper[4594]: I1129 05:28:56.185906 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1629901e-7133-4eb0-b634-5e458da9f205\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://583ca6d59501822022d5d15a450d4403592d248bce7be4b22b1c7baef4e93000\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b8d3be68869fef8b377ce1d84e3edf447496a867c40e88d013ef0d1c5629de2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6655c8a5c56e4d2b3e8fbd88308d04ded2b41f7243bf90cb2dbbd6ad93f3153f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a87a0320d47b3753baeb865c7ef9e024c65a96555fdd71c0f2a23c8c0b80d351\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ba49c3a694bc245ce18b54b74156e6e7aaccd4fcf707264a278f86f52943392\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T05:28:13Z\\\",\\\"message\\\":\\\"le observer\\\\nW1129 05:28:13.316789 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1129 05:28:13.316951 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1129 05:28:13.320123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1464513827/tls.crt::/tmp/serving-cert-1464513827/tls.key\\\\\\\"\\\\nI1129 05:28:13.540439 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1129 05:28:13.545990 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1129 05:28:13.546014 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1129 05:28:13.546050 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1129 05:28:13.546055 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1129 05:28:13.551788 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1129 05:28:13.551806 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 05:28:13.551809 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 05:28:13.551814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1129 05:28:13.551817 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1129 05:28:13.551820 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1129 05:28:13.551823 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1129 05:28:13.551915 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1129 05:28:13.553356 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e070e0ea87ae0cfa8fcc6aadd13e7ae3c2ab3eb1ea3daf6178c9d35796ba3ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T0
5:27:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:27:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:56Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:56 crc kubenswrapper[4594]: I1129 05:28:56.194626 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c84b578286ef0da9c863bedb4ebec224e34007c7f59cead26dac786d7e4b5e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1975ee2ce0af02676f9925944a31b218cb17cdf4dfb85c6edbfd124ac58fd22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:56Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:56 crc kubenswrapper[4594]: I1129 05:28:56.202038 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-87h4n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"080d2d67-a474-4974-94f3-81c5007e0a57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b58d5af901448b3e10a9a902daa31aecebe095bdae0e620c991323b99c52b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdnbj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-87h4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:56Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:56 crc kubenswrapper[4594]: I1129 05:28:56.209939 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://867c23355f828dc4cc15d632f90c52174e9b9be07f3ec24f05c904709620de6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-al
erter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:56Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:56 crc kubenswrapper[4594]: I1129 05:28:56.218238 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:56Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:56 crc kubenswrapper[4594]: I1129 05:28:56.227142 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7plzz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5052790-d231-4f97-802c-c7de3cd72561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c23b1472cd349ae6a56137165463429204ffbaf2c1b477d3ba9d07d2f2799f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qglsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7plzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:56Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:56 crc kubenswrapper[4594]: I1129 05:28:56.235371 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vn6fz" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"240291e8-d7ec-4fc7-919f-e082a6890e2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1142d4b616972d87b8ac7b83cacdc37c3fd02df7cb4adfcabbaaaa18c5d2b8d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x25zr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af579fe3e
5cce8fdb2dff72f206f5504582b8eee906db5180191ef91b990c0a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x25zr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vn6fz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:56Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:56 crc kubenswrapper[4594]: I1129 05:28:56.242845 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-lzr56" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"217088b9-a48b-40c7-8d83-f9ff0eb24908\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8gwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8gwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:31Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lzr56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:56Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:56 crc 
kubenswrapper[4594]: I1129 05:28:56.258248 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:56 crc kubenswrapper[4594]: I1129 05:28:56.258294 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:56 crc kubenswrapper[4594]: I1129 05:28:56.258306 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:56 crc kubenswrapper[4594]: I1129 05:28:56.258319 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:56 crc kubenswrapper[4594]: I1129 05:28:56.258329 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:56Z","lastTransitionTime":"2025-11-29T05:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:56 crc kubenswrapper[4594]: I1129 05:28:56.360199 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:56 crc kubenswrapper[4594]: I1129 05:28:56.360231 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:56 crc kubenswrapper[4594]: I1129 05:28:56.360241 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:56 crc kubenswrapper[4594]: I1129 05:28:56.360282 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:56 crc kubenswrapper[4594]: I1129 05:28:56.360293 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:56Z","lastTransitionTime":"2025-11-29T05:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:56 crc kubenswrapper[4594]: I1129 05:28:56.461973 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:56 crc kubenswrapper[4594]: I1129 05:28:56.462006 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:56 crc kubenswrapper[4594]: I1129 05:28:56.462016 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:56 crc kubenswrapper[4594]: I1129 05:28:56.462034 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:56 crc kubenswrapper[4594]: I1129 05:28:56.462044 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:56Z","lastTransitionTime":"2025-11-29T05:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:56 crc kubenswrapper[4594]: I1129 05:28:56.563723 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:56 crc kubenswrapper[4594]: I1129 05:28:56.563765 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:56 crc kubenswrapper[4594]: I1129 05:28:56.563779 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:56 crc kubenswrapper[4594]: I1129 05:28:56.563809 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:56 crc kubenswrapper[4594]: I1129 05:28:56.563823 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:56Z","lastTransitionTime":"2025-11-29T05:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:56 crc kubenswrapper[4594]: I1129 05:28:56.666268 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:56 crc kubenswrapper[4594]: I1129 05:28:56.666314 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:56 crc kubenswrapper[4594]: I1129 05:28:56.666326 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:56 crc kubenswrapper[4594]: I1129 05:28:56.666344 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:56 crc kubenswrapper[4594]: I1129 05:28:56.666356 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:56Z","lastTransitionTime":"2025-11-29T05:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:56 crc kubenswrapper[4594]: I1129 05:28:56.768022 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:56 crc kubenswrapper[4594]: I1129 05:28:56.768059 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:56 crc kubenswrapper[4594]: I1129 05:28:56.768070 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:56 crc kubenswrapper[4594]: I1129 05:28:56.768083 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:56 crc kubenswrapper[4594]: I1129 05:28:56.768095 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:56Z","lastTransitionTime":"2025-11-29T05:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:56 crc kubenswrapper[4594]: I1129 05:28:56.870077 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:56 crc kubenswrapper[4594]: I1129 05:28:56.870113 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:56 crc kubenswrapper[4594]: I1129 05:28:56.870124 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:56 crc kubenswrapper[4594]: I1129 05:28:56.870141 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:56 crc kubenswrapper[4594]: I1129 05:28:56.870150 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:56Z","lastTransitionTime":"2025-11-29T05:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:56 crc kubenswrapper[4594]: I1129 05:28:56.971657 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:56 crc kubenswrapper[4594]: I1129 05:28:56.971697 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:56 crc kubenswrapper[4594]: I1129 05:28:56.971710 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:56 crc kubenswrapper[4594]: I1129 05:28:56.971724 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:56 crc kubenswrapper[4594]: I1129 05:28:56.971736 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:56Z","lastTransitionTime":"2025-11-29T05:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:57 crc kubenswrapper[4594]: I1129 05:28:57.074248 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:57 crc kubenswrapper[4594]: I1129 05:28:57.074305 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:57 crc kubenswrapper[4594]: I1129 05:28:57.074318 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:57 crc kubenswrapper[4594]: I1129 05:28:57.074340 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:57 crc kubenswrapper[4594]: I1129 05:28:57.074355 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:57Z","lastTransitionTime":"2025-11-29T05:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:57 crc kubenswrapper[4594]: I1129 05:28:57.082701 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lzr56" Nov 29 05:28:57 crc kubenswrapper[4594]: E1129 05:28:57.083118 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lzr56" podUID="217088b9-a48b-40c7-8d83-f9ff0eb24908" Nov 29 05:28:57 crc kubenswrapper[4594]: I1129 05:28:57.083440 4594 scope.go:117] "RemoveContainer" containerID="54ed3d74941dcb3f33ee5231512aac9b31eaf9c55171df9201d2bd6da265954a" Nov 29 05:28:57 crc kubenswrapper[4594]: E1129 05:28:57.083713 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-lp4zm_openshift-ovn-kubernetes(08de5891-72ca-488c-80b3-6b54c8c1a66e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" podUID="08de5891-72ca-488c-80b3-6b54c8c1a66e" Nov 29 05:28:57 crc kubenswrapper[4594]: I1129 05:28:57.176592 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:57 crc kubenswrapper[4594]: I1129 05:28:57.176629 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:57 crc kubenswrapper[4594]: I1129 05:28:57.176642 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:57 crc kubenswrapper[4594]: I1129 05:28:57.176660 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:57 crc kubenswrapper[4594]: I1129 05:28:57.176671 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:57Z","lastTransitionTime":"2025-11-29T05:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:57 crc kubenswrapper[4594]: I1129 05:28:57.279162 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:57 crc kubenswrapper[4594]: I1129 05:28:57.279193 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:57 crc kubenswrapper[4594]: I1129 05:28:57.279203 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:57 crc kubenswrapper[4594]: I1129 05:28:57.279218 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:57 crc kubenswrapper[4594]: I1129 05:28:57.279228 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:57Z","lastTransitionTime":"2025-11-29T05:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:57 crc kubenswrapper[4594]: I1129 05:28:57.307467 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:57 crc kubenswrapper[4594]: I1129 05:28:57.307511 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:57 crc kubenswrapper[4594]: I1129 05:28:57.307521 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:57 crc kubenswrapper[4594]: I1129 05:28:57.307536 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:57 crc kubenswrapper[4594]: I1129 05:28:57.307546 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:57Z","lastTransitionTime":"2025-11-29T05:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:57 crc kubenswrapper[4594]: E1129 05:28:57.320753 4594 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:28:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:28:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:28:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:28:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"278d67da-c900-4140-8f95-1590a0940f46\\\",\\\"systemUUID\\\":\\\"64455698-ce5a-4229-a093-dc9c12114354\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:57Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:57 crc kubenswrapper[4594]: I1129 05:28:57.324292 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:57 crc kubenswrapper[4594]: I1129 05:28:57.324334 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:57 crc kubenswrapper[4594]: I1129 05:28:57.324346 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:57 crc kubenswrapper[4594]: I1129 05:28:57.324360 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:57 crc kubenswrapper[4594]: I1129 05:28:57.324368 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:57Z","lastTransitionTime":"2025-11-29T05:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:57 crc kubenswrapper[4594]: I1129 05:28:57.340690 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:57 crc kubenswrapper[4594]: I1129 05:28:57.340753 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:57 crc kubenswrapper[4594]: I1129 05:28:57.340766 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:57 crc kubenswrapper[4594]: I1129 05:28:57.340781 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:57 crc kubenswrapper[4594]: I1129 05:28:57.340803 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:57Z","lastTransitionTime":"2025-11-29T05:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:57 crc kubenswrapper[4594]: I1129 05:28:57.353247 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:57 crc kubenswrapper[4594]: I1129 05:28:57.353309 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:57 crc kubenswrapper[4594]: I1129 05:28:57.353322 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:57 crc kubenswrapper[4594]: I1129 05:28:57.353338 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:57 crc kubenswrapper[4594]: I1129 05:28:57.353349 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:57Z","lastTransitionTime":"2025-11-29T05:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:57 crc kubenswrapper[4594]: E1129 05:28:57.362139 4594 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:28:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:28:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:28:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:28:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"278d67da-c900-4140-8f95-1590a0940f46\\\",\\\"systemUUID\\\":\\\"64455698-ce5a-4229-a093-dc9c12114354\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:57Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:57 crc kubenswrapper[4594]: I1129 05:28:57.364654 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:57 crc kubenswrapper[4594]: I1129 05:28:57.364688 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:57 crc kubenswrapper[4594]: I1129 05:28:57.364701 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:57 crc kubenswrapper[4594]: I1129 05:28:57.364717 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:57 crc kubenswrapper[4594]: I1129 05:28:57.364726 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:57Z","lastTransitionTime":"2025-11-29T05:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:57 crc kubenswrapper[4594]: E1129 05:28:57.373100 4594 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:28:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:28:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:28:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:28:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"278d67da-c900-4140-8f95-1590a0940f46\\\",\\\"systemUUID\\\":\\\"64455698-ce5a-4229-a093-dc9c12114354\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:57Z is after 2025-08-24T17:21:41Z" Nov 29 05:28:57 crc kubenswrapper[4594]: E1129 05:28:57.373232 4594 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 29 05:28:57 crc kubenswrapper[4594]: I1129 05:28:57.381301 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:57 crc kubenswrapper[4594]: I1129 05:28:57.381334 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:57 crc kubenswrapper[4594]: I1129 05:28:57.381343 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:57 crc kubenswrapper[4594]: I1129 05:28:57.381356 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:57 crc kubenswrapper[4594]: I1129 05:28:57.381369 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:57Z","lastTransitionTime":"2025-11-29T05:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:57 crc kubenswrapper[4594]: I1129 05:28:57.483125 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:57 crc kubenswrapper[4594]: I1129 05:28:57.483153 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:57 crc kubenswrapper[4594]: I1129 05:28:57.483164 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:57 crc kubenswrapper[4594]: I1129 05:28:57.483176 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:57 crc kubenswrapper[4594]: I1129 05:28:57.483185 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:57Z","lastTransitionTime":"2025-11-29T05:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:57 crc kubenswrapper[4594]: I1129 05:28:57.585513 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:57 crc kubenswrapper[4594]: I1129 05:28:57.585561 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:57 crc kubenswrapper[4594]: I1129 05:28:57.585573 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:57 crc kubenswrapper[4594]: I1129 05:28:57.585594 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:57 crc kubenswrapper[4594]: I1129 05:28:57.585611 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:57Z","lastTransitionTime":"2025-11-29T05:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:57 crc kubenswrapper[4594]: I1129 05:28:57.688496 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:57 crc kubenswrapper[4594]: I1129 05:28:57.688525 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:57 crc kubenswrapper[4594]: I1129 05:28:57.688534 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:57 crc kubenswrapper[4594]: I1129 05:28:57.688549 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:57 crc kubenswrapper[4594]: I1129 05:28:57.688558 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:57Z","lastTransitionTime":"2025-11-29T05:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:57 crc kubenswrapper[4594]: I1129 05:28:57.790958 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:57 crc kubenswrapper[4594]: I1129 05:28:57.791047 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:57 crc kubenswrapper[4594]: I1129 05:28:57.791064 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:57 crc kubenswrapper[4594]: I1129 05:28:57.791082 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:57 crc kubenswrapper[4594]: I1129 05:28:57.791096 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:57Z","lastTransitionTime":"2025-11-29T05:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:57 crc kubenswrapper[4594]: I1129 05:28:57.893020 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:57 crc kubenswrapper[4594]: I1129 05:28:57.893051 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:57 crc kubenswrapper[4594]: I1129 05:28:57.893062 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:57 crc kubenswrapper[4594]: I1129 05:28:57.893075 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:57 crc kubenswrapper[4594]: I1129 05:28:57.893083 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:57Z","lastTransitionTime":"2025-11-29T05:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:57 crc kubenswrapper[4594]: I1129 05:28:57.994631 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:57 crc kubenswrapper[4594]: I1129 05:28:57.994668 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:57 crc kubenswrapper[4594]: I1129 05:28:57.994678 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:57 crc kubenswrapper[4594]: I1129 05:28:57.994690 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:57 crc kubenswrapper[4594]: I1129 05:28:57.994701 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:57Z","lastTransitionTime":"2025-11-29T05:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:58 crc kubenswrapper[4594]: I1129 05:28:58.085143 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 05:28:58 crc kubenswrapper[4594]: I1129 05:28:58.085182 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 05:28:58 crc kubenswrapper[4594]: E1129 05:28:58.085477 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 05:28:58 crc kubenswrapper[4594]: I1129 05:28:58.085220 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 05:28:58 crc kubenswrapper[4594]: E1129 05:28:58.085610 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 05:28:58 crc kubenswrapper[4594]: E1129 05:28:58.085738 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 05:28:58 crc kubenswrapper[4594]: I1129 05:28:58.097078 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:58 crc kubenswrapper[4594]: I1129 05:28:58.097110 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:58 crc kubenswrapper[4594]: I1129 05:28:58.097120 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:58 crc kubenswrapper[4594]: I1129 05:28:58.097134 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:58 crc kubenswrapper[4594]: I1129 05:28:58.097144 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:58Z","lastTransitionTime":"2025-11-29T05:28:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:58 crc kubenswrapper[4594]: I1129 05:28:58.199138 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:58 crc kubenswrapper[4594]: I1129 05:28:58.199162 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:58 crc kubenswrapper[4594]: I1129 05:28:58.199172 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:58 crc kubenswrapper[4594]: I1129 05:28:58.199184 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:58 crc kubenswrapper[4594]: I1129 05:28:58.199193 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:58Z","lastTransitionTime":"2025-11-29T05:28:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:58 crc kubenswrapper[4594]: I1129 05:28:58.300907 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:58 crc kubenswrapper[4594]: I1129 05:28:58.300936 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:58 crc kubenswrapper[4594]: I1129 05:28:58.300946 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:58 crc kubenswrapper[4594]: I1129 05:28:58.300958 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:58 crc kubenswrapper[4594]: I1129 05:28:58.300970 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:58Z","lastTransitionTime":"2025-11-29T05:28:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:58 crc kubenswrapper[4594]: I1129 05:28:58.402709 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:58 crc kubenswrapper[4594]: I1129 05:28:58.402737 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:58 crc kubenswrapper[4594]: I1129 05:28:58.402746 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:58 crc kubenswrapper[4594]: I1129 05:28:58.402758 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:58 crc kubenswrapper[4594]: I1129 05:28:58.402767 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:58Z","lastTransitionTime":"2025-11-29T05:28:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:58 crc kubenswrapper[4594]: I1129 05:28:58.504673 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:58 crc kubenswrapper[4594]: I1129 05:28:58.504726 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:58 crc kubenswrapper[4594]: I1129 05:28:58.504736 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:58 crc kubenswrapper[4594]: I1129 05:28:58.504747 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:58 crc kubenswrapper[4594]: I1129 05:28:58.504755 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:58Z","lastTransitionTime":"2025-11-29T05:28:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:58 crc kubenswrapper[4594]: I1129 05:28:58.606535 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:58 crc kubenswrapper[4594]: I1129 05:28:58.606575 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:58 crc kubenswrapper[4594]: I1129 05:28:58.606586 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:58 crc kubenswrapper[4594]: I1129 05:28:58.606601 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:58 crc kubenswrapper[4594]: I1129 05:28:58.606613 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:58Z","lastTransitionTime":"2025-11-29T05:28:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:58 crc kubenswrapper[4594]: I1129 05:28:58.708492 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:58 crc kubenswrapper[4594]: I1129 05:28:58.708516 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:58 crc kubenswrapper[4594]: I1129 05:28:58.708525 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:58 crc kubenswrapper[4594]: I1129 05:28:58.708534 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:58 crc kubenswrapper[4594]: I1129 05:28:58.708541 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:58Z","lastTransitionTime":"2025-11-29T05:28:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:58 crc kubenswrapper[4594]: I1129 05:28:58.810369 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:58 crc kubenswrapper[4594]: I1129 05:28:58.810403 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:58 crc kubenswrapper[4594]: I1129 05:28:58.810412 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:58 crc kubenswrapper[4594]: I1129 05:28:58.810425 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:58 crc kubenswrapper[4594]: I1129 05:28:58.810434 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:58Z","lastTransitionTime":"2025-11-29T05:28:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:58 crc kubenswrapper[4594]: I1129 05:28:58.911991 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:58 crc kubenswrapper[4594]: I1129 05:28:58.912025 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:58 crc kubenswrapper[4594]: I1129 05:28:58.912035 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:58 crc kubenswrapper[4594]: I1129 05:28:58.912048 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:58 crc kubenswrapper[4594]: I1129 05:28:58.912060 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:58Z","lastTransitionTime":"2025-11-29T05:28:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:59 crc kubenswrapper[4594]: I1129 05:28:59.014283 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:59 crc kubenswrapper[4594]: I1129 05:28:59.014324 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:59 crc kubenswrapper[4594]: I1129 05:28:59.014335 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:59 crc kubenswrapper[4594]: I1129 05:28:59.014353 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:59 crc kubenswrapper[4594]: I1129 05:28:59.014364 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:59Z","lastTransitionTime":"2025-11-29T05:28:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:28:59 crc kubenswrapper[4594]: I1129 05:28:59.083287 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lzr56" Nov 29 05:28:59 crc kubenswrapper[4594]: E1129 05:28:59.083535 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lzr56" podUID="217088b9-a48b-40c7-8d83-f9ff0eb24908" Nov 29 05:28:59 crc kubenswrapper[4594]: I1129 05:28:59.116932 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:59 crc kubenswrapper[4594]: I1129 05:28:59.116982 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:59 crc kubenswrapper[4594]: I1129 05:28:59.116994 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:59 crc kubenswrapper[4594]: I1129 05:28:59.117020 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:59 crc kubenswrapper[4594]: I1129 05:28:59.117031 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:59Z","lastTransitionTime":"2025-11-29T05:28:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:59 crc kubenswrapper[4594]: I1129 05:28:59.219311 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:59 crc kubenswrapper[4594]: I1129 05:28:59.219357 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:59 crc kubenswrapper[4594]: I1129 05:28:59.219369 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:59 crc kubenswrapper[4594]: I1129 05:28:59.219389 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:59 crc kubenswrapper[4594]: I1129 05:28:59.219401 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:59Z","lastTransitionTime":"2025-11-29T05:28:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:59 crc kubenswrapper[4594]: I1129 05:28:59.322360 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:59 crc kubenswrapper[4594]: I1129 05:28:59.322411 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:59 crc kubenswrapper[4594]: I1129 05:28:59.322422 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:59 crc kubenswrapper[4594]: I1129 05:28:59.322440 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:59 crc kubenswrapper[4594]: I1129 05:28:59.322453 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:59Z","lastTransitionTime":"2025-11-29T05:28:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:59 crc kubenswrapper[4594]: I1129 05:28:59.425109 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:59 crc kubenswrapper[4594]: I1129 05:28:59.425148 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:59 crc kubenswrapper[4594]: I1129 05:28:59.425159 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:59 crc kubenswrapper[4594]: I1129 05:28:59.425175 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:59 crc kubenswrapper[4594]: I1129 05:28:59.425185 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:59Z","lastTransitionTime":"2025-11-29T05:28:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:59 crc kubenswrapper[4594]: I1129 05:28:59.527421 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:59 crc kubenswrapper[4594]: I1129 05:28:59.527454 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:59 crc kubenswrapper[4594]: I1129 05:28:59.527465 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:59 crc kubenswrapper[4594]: I1129 05:28:59.527478 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:59 crc kubenswrapper[4594]: I1129 05:28:59.527490 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:59Z","lastTransitionTime":"2025-11-29T05:28:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:59 crc kubenswrapper[4594]: I1129 05:28:59.630050 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:59 crc kubenswrapper[4594]: I1129 05:28:59.630087 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:59 crc kubenswrapper[4594]: I1129 05:28:59.630098 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:59 crc kubenswrapper[4594]: I1129 05:28:59.630113 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:59 crc kubenswrapper[4594]: I1129 05:28:59.630124 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:59Z","lastTransitionTime":"2025-11-29T05:28:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:59 crc kubenswrapper[4594]: I1129 05:28:59.731920 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:59 crc kubenswrapper[4594]: I1129 05:28:59.731956 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:59 crc kubenswrapper[4594]: I1129 05:28:59.731966 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:59 crc kubenswrapper[4594]: I1129 05:28:59.731976 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:59 crc kubenswrapper[4594]: I1129 05:28:59.731984 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:59Z","lastTransitionTime":"2025-11-29T05:28:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:59 crc kubenswrapper[4594]: I1129 05:28:59.834322 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:59 crc kubenswrapper[4594]: I1129 05:28:59.834359 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:59 crc kubenswrapper[4594]: I1129 05:28:59.834371 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:59 crc kubenswrapper[4594]: I1129 05:28:59.834388 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:59 crc kubenswrapper[4594]: I1129 05:28:59.834404 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:59Z","lastTransitionTime":"2025-11-29T05:28:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:28:59 crc kubenswrapper[4594]: I1129 05:28:59.936448 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:28:59 crc kubenswrapper[4594]: I1129 05:28:59.936483 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:28:59 crc kubenswrapper[4594]: I1129 05:28:59.936493 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:28:59 crc kubenswrapper[4594]: I1129 05:28:59.936505 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:28:59 crc kubenswrapper[4594]: I1129 05:28:59.936535 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:28:59Z","lastTransitionTime":"2025-11-29T05:28:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:00 crc kubenswrapper[4594]: I1129 05:29:00.038196 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:00 crc kubenswrapper[4594]: I1129 05:29:00.038231 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:00 crc kubenswrapper[4594]: I1129 05:29:00.038241 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:00 crc kubenswrapper[4594]: I1129 05:29:00.038275 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:00 crc kubenswrapper[4594]: I1129 05:29:00.038287 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:00Z","lastTransitionTime":"2025-11-29T05:29:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:29:00 crc kubenswrapper[4594]: I1129 05:29:00.083190 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 05:29:00 crc kubenswrapper[4594]: E1129 05:29:00.083301 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 05:29:00 crc kubenswrapper[4594]: I1129 05:29:00.083418 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 05:29:00 crc kubenswrapper[4594]: I1129 05:29:00.083451 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 05:29:00 crc kubenswrapper[4594]: E1129 05:29:00.083598 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 05:29:00 crc kubenswrapper[4594]: E1129 05:29:00.083740 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 05:29:00 crc kubenswrapper[4594]: I1129 05:29:00.139970 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:00 crc kubenswrapper[4594]: I1129 05:29:00.140010 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:00 crc kubenswrapper[4594]: I1129 05:29:00.140022 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:00 crc kubenswrapper[4594]: I1129 05:29:00.140043 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:00 crc kubenswrapper[4594]: I1129 05:29:00.140059 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:00Z","lastTransitionTime":"2025-11-29T05:29:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:00 crc kubenswrapper[4594]: I1129 05:29:00.241948 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:00 crc kubenswrapper[4594]: I1129 05:29:00.241978 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:00 crc kubenswrapper[4594]: I1129 05:29:00.241987 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:00 crc kubenswrapper[4594]: I1129 05:29:00.242000 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:00 crc kubenswrapper[4594]: I1129 05:29:00.242009 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:00Z","lastTransitionTime":"2025-11-29T05:29:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:00 crc kubenswrapper[4594]: I1129 05:29:00.344077 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:00 crc kubenswrapper[4594]: I1129 05:29:00.344206 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:00 crc kubenswrapper[4594]: I1129 05:29:00.344293 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:00 crc kubenswrapper[4594]: I1129 05:29:00.344365 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:00 crc kubenswrapper[4594]: I1129 05:29:00.344436 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:00Z","lastTransitionTime":"2025-11-29T05:29:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:00 crc kubenswrapper[4594]: I1129 05:29:00.446113 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:00 crc kubenswrapper[4594]: I1129 05:29:00.446136 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:00 crc kubenswrapper[4594]: I1129 05:29:00.446145 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:00 crc kubenswrapper[4594]: I1129 05:29:00.446159 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:00 crc kubenswrapper[4594]: I1129 05:29:00.446169 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:00Z","lastTransitionTime":"2025-11-29T05:29:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:00 crc kubenswrapper[4594]: I1129 05:29:00.547240 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:00 crc kubenswrapper[4594]: I1129 05:29:00.547329 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:00 crc kubenswrapper[4594]: I1129 05:29:00.547341 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:00 crc kubenswrapper[4594]: I1129 05:29:00.547354 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:00 crc kubenswrapper[4594]: I1129 05:29:00.547391 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:00Z","lastTransitionTime":"2025-11-29T05:29:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:00 crc kubenswrapper[4594]: I1129 05:29:00.649109 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:00 crc kubenswrapper[4594]: I1129 05:29:00.649137 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:00 crc kubenswrapper[4594]: I1129 05:29:00.649146 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:00 crc kubenswrapper[4594]: I1129 05:29:00.649156 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:00 crc kubenswrapper[4594]: I1129 05:29:00.649164 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:00Z","lastTransitionTime":"2025-11-29T05:29:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:00 crc kubenswrapper[4594]: I1129 05:29:00.750719 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:00 crc kubenswrapper[4594]: I1129 05:29:00.750762 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:00 crc kubenswrapper[4594]: I1129 05:29:00.750781 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:00 crc kubenswrapper[4594]: I1129 05:29:00.750795 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:00 crc kubenswrapper[4594]: I1129 05:29:00.750806 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:00Z","lastTransitionTime":"2025-11-29T05:29:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:00 crc kubenswrapper[4594]: I1129 05:29:00.852747 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:00 crc kubenswrapper[4594]: I1129 05:29:00.852786 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:00 crc kubenswrapper[4594]: I1129 05:29:00.852797 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:00 crc kubenswrapper[4594]: I1129 05:29:00.852809 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:00 crc kubenswrapper[4594]: I1129 05:29:00.852818 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:00Z","lastTransitionTime":"2025-11-29T05:29:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:00 crc kubenswrapper[4594]: I1129 05:29:00.954739 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:00 crc kubenswrapper[4594]: I1129 05:29:00.954880 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:00 crc kubenswrapper[4594]: I1129 05:29:00.955049 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:00 crc kubenswrapper[4594]: I1129 05:29:00.955122 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:00 crc kubenswrapper[4594]: I1129 05:29:00.955347 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:00Z","lastTransitionTime":"2025-11-29T05:29:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:01 crc kubenswrapper[4594]: I1129 05:29:01.057288 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:01 crc kubenswrapper[4594]: I1129 05:29:01.057316 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:01 crc kubenswrapper[4594]: I1129 05:29:01.057325 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:01 crc kubenswrapper[4594]: I1129 05:29:01.057338 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:01 crc kubenswrapper[4594]: I1129 05:29:01.057346 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:01Z","lastTransitionTime":"2025-11-29T05:29:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:29:01 crc kubenswrapper[4594]: I1129 05:29:01.082993 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lzr56" Nov 29 05:29:01 crc kubenswrapper[4594]: E1129 05:29:01.083240 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lzr56" podUID="217088b9-a48b-40c7-8d83-f9ff0eb24908" Nov 29 05:29:01 crc kubenswrapper[4594]: I1129 05:29:01.159269 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:01 crc kubenswrapper[4594]: I1129 05:29:01.159302 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:01 crc kubenswrapper[4594]: I1129 05:29:01.159316 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:01 crc kubenswrapper[4594]: I1129 05:29:01.159329 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:01 crc kubenswrapper[4594]: I1129 05:29:01.159340 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:01Z","lastTransitionTime":"2025-11-29T05:29:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:01 crc kubenswrapper[4594]: I1129 05:29:01.262034 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:01 crc kubenswrapper[4594]: I1129 05:29:01.262280 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:01 crc kubenswrapper[4594]: I1129 05:29:01.262370 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:01 crc kubenswrapper[4594]: I1129 05:29:01.262464 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:01 crc kubenswrapper[4594]: I1129 05:29:01.262573 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:01Z","lastTransitionTime":"2025-11-29T05:29:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:01 crc kubenswrapper[4594]: I1129 05:29:01.364572 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:01 crc kubenswrapper[4594]: I1129 05:29:01.364619 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:01 crc kubenswrapper[4594]: I1129 05:29:01.364631 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:01 crc kubenswrapper[4594]: I1129 05:29:01.364646 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:01 crc kubenswrapper[4594]: I1129 05:29:01.364656 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:01Z","lastTransitionTime":"2025-11-29T05:29:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:01 crc kubenswrapper[4594]: I1129 05:29:01.465943 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:01 crc kubenswrapper[4594]: I1129 05:29:01.465967 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:01 crc kubenswrapper[4594]: I1129 05:29:01.465976 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:01 crc kubenswrapper[4594]: I1129 05:29:01.465989 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:01 crc kubenswrapper[4594]: I1129 05:29:01.465998 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:01Z","lastTransitionTime":"2025-11-29T05:29:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:01 crc kubenswrapper[4594]: I1129 05:29:01.567482 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:01 crc kubenswrapper[4594]: I1129 05:29:01.567883 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:01 crc kubenswrapper[4594]: I1129 05:29:01.567964 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:01 crc kubenswrapper[4594]: I1129 05:29:01.568031 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:01 crc kubenswrapper[4594]: I1129 05:29:01.568091 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:01Z","lastTransitionTime":"2025-11-29T05:29:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:01 crc kubenswrapper[4594]: I1129 05:29:01.670105 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:01 crc kubenswrapper[4594]: I1129 05:29:01.670144 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:01 crc kubenswrapper[4594]: I1129 05:29:01.670155 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:01 crc kubenswrapper[4594]: I1129 05:29:01.670170 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:01 crc kubenswrapper[4594]: I1129 05:29:01.670180 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:01Z","lastTransitionTime":"2025-11-29T05:29:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:01 crc kubenswrapper[4594]: I1129 05:29:01.771755 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:01 crc kubenswrapper[4594]: I1129 05:29:01.771952 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:01 crc kubenswrapper[4594]: I1129 05:29:01.772087 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:01 crc kubenswrapper[4594]: I1129 05:29:01.772168 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:01 crc kubenswrapper[4594]: I1129 05:29:01.772223 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:01Z","lastTransitionTime":"2025-11-29T05:29:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:01 crc kubenswrapper[4594]: I1129 05:29:01.874272 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:01 crc kubenswrapper[4594]: I1129 05:29:01.874307 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:01 crc kubenswrapper[4594]: I1129 05:29:01.874317 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:01 crc kubenswrapper[4594]: I1129 05:29:01.874333 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:01 crc kubenswrapper[4594]: I1129 05:29:01.874344 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:01Z","lastTransitionTime":"2025-11-29T05:29:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:01 crc kubenswrapper[4594]: I1129 05:29:01.977224 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:01 crc kubenswrapper[4594]: I1129 05:29:01.977267 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:01 crc kubenswrapper[4594]: I1129 05:29:01.977279 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:01 crc kubenswrapper[4594]: I1129 05:29:01.977295 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:01 crc kubenswrapper[4594]: I1129 05:29:01.977305 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:01Z","lastTransitionTime":"2025-11-29T05:29:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:02 crc kubenswrapper[4594]: I1129 05:29:02.080025 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:02 crc kubenswrapper[4594]: I1129 05:29:02.080170 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:02 crc kubenswrapper[4594]: I1129 05:29:02.080238 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:02 crc kubenswrapper[4594]: I1129 05:29:02.080326 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:02 crc kubenswrapper[4594]: I1129 05:29:02.080398 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:02Z","lastTransitionTime":"2025-11-29T05:29:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:29:02 crc kubenswrapper[4594]: I1129 05:29:02.083451 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 05:29:02 crc kubenswrapper[4594]: E1129 05:29:02.083598 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 05:29:02 crc kubenswrapper[4594]: I1129 05:29:02.083647 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 05:29:02 crc kubenswrapper[4594]: I1129 05:29:02.083665 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 05:29:02 crc kubenswrapper[4594]: E1129 05:29:02.083745 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 05:29:02 crc kubenswrapper[4594]: E1129 05:29:02.083866 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 05:29:02 crc kubenswrapper[4594]: I1129 05:29:02.182595 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:02 crc kubenswrapper[4594]: I1129 05:29:02.182621 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:02 crc kubenswrapper[4594]: I1129 05:29:02.182630 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:02 crc kubenswrapper[4594]: I1129 05:29:02.182686 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:02 crc kubenswrapper[4594]: I1129 05:29:02.182697 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:02Z","lastTransitionTime":"2025-11-29T05:29:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:02 crc kubenswrapper[4594]: I1129 05:29:02.284601 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:02 crc kubenswrapper[4594]: I1129 05:29:02.284735 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:02 crc kubenswrapper[4594]: I1129 05:29:02.284822 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:02 crc kubenswrapper[4594]: I1129 05:29:02.284880 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:02 crc kubenswrapper[4594]: I1129 05:29:02.284931 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:02Z","lastTransitionTime":"2025-11-29T05:29:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:02 crc kubenswrapper[4594]: I1129 05:29:02.386207 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:02 crc kubenswrapper[4594]: I1129 05:29:02.386240 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:02 crc kubenswrapper[4594]: I1129 05:29:02.386266 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:02 crc kubenswrapper[4594]: I1129 05:29:02.386279 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:02 crc kubenswrapper[4594]: I1129 05:29:02.386287 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:02Z","lastTransitionTime":"2025-11-29T05:29:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:02 crc kubenswrapper[4594]: I1129 05:29:02.487742 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:02 crc kubenswrapper[4594]: I1129 05:29:02.487790 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:02 crc kubenswrapper[4594]: I1129 05:29:02.487800 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:02 crc kubenswrapper[4594]: I1129 05:29:02.487818 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:02 crc kubenswrapper[4594]: I1129 05:29:02.487831 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:02Z","lastTransitionTime":"2025-11-29T05:29:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:02 crc kubenswrapper[4594]: I1129 05:29:02.589861 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:02 crc kubenswrapper[4594]: I1129 05:29:02.589905 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:02 crc kubenswrapper[4594]: I1129 05:29:02.589920 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:02 crc kubenswrapper[4594]: I1129 05:29:02.589942 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:02 crc kubenswrapper[4594]: I1129 05:29:02.589957 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:02Z","lastTransitionTime":"2025-11-29T05:29:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:02 crc kubenswrapper[4594]: I1129 05:29:02.692229 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:02 crc kubenswrapper[4594]: I1129 05:29:02.692283 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:02 crc kubenswrapper[4594]: I1129 05:29:02.692295 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:02 crc kubenswrapper[4594]: I1129 05:29:02.692313 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:02 crc kubenswrapper[4594]: I1129 05:29:02.692326 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:02Z","lastTransitionTime":"2025-11-29T05:29:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:02 crc kubenswrapper[4594]: I1129 05:29:02.793898 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:02 crc kubenswrapper[4594]: I1129 05:29:02.793932 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:02 crc kubenswrapper[4594]: I1129 05:29:02.793945 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:02 crc kubenswrapper[4594]: I1129 05:29:02.793959 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:02 crc kubenswrapper[4594]: I1129 05:29:02.793971 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:02Z","lastTransitionTime":"2025-11-29T05:29:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:02 crc kubenswrapper[4594]: I1129 05:29:02.857531 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/217088b9-a48b-40c7-8d83-f9ff0eb24908-metrics-certs\") pod \"network-metrics-daemon-lzr56\" (UID: \"217088b9-a48b-40c7-8d83-f9ff0eb24908\") " pod="openshift-multus/network-metrics-daemon-lzr56" Nov 29 05:29:02 crc kubenswrapper[4594]: E1129 05:29:02.857646 4594 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 29 05:29:02 crc kubenswrapper[4594]: E1129 05:29:02.857714 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/217088b9-a48b-40c7-8d83-f9ff0eb24908-metrics-certs podName:217088b9-a48b-40c7-8d83-f9ff0eb24908 nodeName:}" failed. No retries permitted until 2025-11-29 05:29:34.857697661 +0000 UTC m=+99.098206880 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/217088b9-a48b-40c7-8d83-f9ff0eb24908-metrics-certs") pod "network-metrics-daemon-lzr56" (UID: "217088b9-a48b-40c7-8d83-f9ff0eb24908") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 29 05:29:02 crc kubenswrapper[4594]: I1129 05:29:02.896047 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:02 crc kubenswrapper[4594]: I1129 05:29:02.896086 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:02 crc kubenswrapper[4594]: I1129 05:29:02.896098 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:02 crc kubenswrapper[4594]: I1129 05:29:02.896111 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:02 crc kubenswrapper[4594]: I1129 05:29:02.896121 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:02Z","lastTransitionTime":"2025-11-29T05:29:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:02 crc kubenswrapper[4594]: I1129 05:29:02.998221 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:02 crc kubenswrapper[4594]: I1129 05:29:02.998284 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:02 crc kubenswrapper[4594]: I1129 05:29:02.998297 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:02 crc kubenswrapper[4594]: I1129 05:29:02.998309 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:02 crc kubenswrapper[4594]: I1129 05:29:02.998318 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:02Z","lastTransitionTime":"2025-11-29T05:29:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:29:03 crc kubenswrapper[4594]: I1129 05:29:03.083503 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lzr56" Nov 29 05:29:03 crc kubenswrapper[4594]: E1129 05:29:03.083626 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lzr56" podUID="217088b9-a48b-40c7-8d83-f9ff0eb24908" Nov 29 05:29:03 crc kubenswrapper[4594]: I1129 05:29:03.099955 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:03 crc kubenswrapper[4594]: I1129 05:29:03.099990 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:03 crc kubenswrapper[4594]: I1129 05:29:03.100000 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:03 crc kubenswrapper[4594]: I1129 05:29:03.100014 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:03 crc kubenswrapper[4594]: I1129 05:29:03.100023 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:03Z","lastTransitionTime":"2025-11-29T05:29:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:03 crc kubenswrapper[4594]: I1129 05:29:03.201191 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:03 crc kubenswrapper[4594]: I1129 05:29:03.201228 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:03 crc kubenswrapper[4594]: I1129 05:29:03.201240 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:03 crc kubenswrapper[4594]: I1129 05:29:03.201318 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:03 crc kubenswrapper[4594]: I1129 05:29:03.201342 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:03Z","lastTransitionTime":"2025-11-29T05:29:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:03 crc kubenswrapper[4594]: I1129 05:29:03.303168 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:03 crc kubenswrapper[4594]: I1129 05:29:03.303191 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:03 crc kubenswrapper[4594]: I1129 05:29:03.303199 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:03 crc kubenswrapper[4594]: I1129 05:29:03.303209 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:03 crc kubenswrapper[4594]: I1129 05:29:03.303216 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:03Z","lastTransitionTime":"2025-11-29T05:29:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:03 crc kubenswrapper[4594]: I1129 05:29:03.404863 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:03 crc kubenswrapper[4594]: I1129 05:29:03.405013 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:03 crc kubenswrapper[4594]: I1129 05:29:03.405169 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:03 crc kubenswrapper[4594]: I1129 05:29:03.405348 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:03 crc kubenswrapper[4594]: I1129 05:29:03.405498 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:03Z","lastTransitionTime":"2025-11-29T05:29:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:03 crc kubenswrapper[4594]: I1129 05:29:03.507277 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:03 crc kubenswrapper[4594]: I1129 05:29:03.507393 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:03 crc kubenswrapper[4594]: I1129 05:29:03.507465 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:03 crc kubenswrapper[4594]: I1129 05:29:03.507536 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:03 crc kubenswrapper[4594]: I1129 05:29:03.507600 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:03Z","lastTransitionTime":"2025-11-29T05:29:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:03 crc kubenswrapper[4594]: I1129 05:29:03.609477 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:03 crc kubenswrapper[4594]: I1129 05:29:03.609537 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:03 crc kubenswrapper[4594]: I1129 05:29:03.609549 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:03 crc kubenswrapper[4594]: I1129 05:29:03.609571 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:03 crc kubenswrapper[4594]: I1129 05:29:03.609583 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:03Z","lastTransitionTime":"2025-11-29T05:29:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:03 crc kubenswrapper[4594]: I1129 05:29:03.711085 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:03 crc kubenswrapper[4594]: I1129 05:29:03.711117 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:03 crc kubenswrapper[4594]: I1129 05:29:03.711126 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:03 crc kubenswrapper[4594]: I1129 05:29:03.711136 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:03 crc kubenswrapper[4594]: I1129 05:29:03.711145 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:03Z","lastTransitionTime":"2025-11-29T05:29:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:03 crc kubenswrapper[4594]: I1129 05:29:03.812924 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:03 crc kubenswrapper[4594]: I1129 05:29:03.812957 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:03 crc kubenswrapper[4594]: I1129 05:29:03.812965 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:03 crc kubenswrapper[4594]: I1129 05:29:03.812977 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:03 crc kubenswrapper[4594]: I1129 05:29:03.812987 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:03Z","lastTransitionTime":"2025-11-29T05:29:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:03 crc kubenswrapper[4594]: I1129 05:29:03.914869 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:03 crc kubenswrapper[4594]: I1129 05:29:03.914907 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:03 crc kubenswrapper[4594]: I1129 05:29:03.914916 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:03 crc kubenswrapper[4594]: I1129 05:29:03.914933 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:03 crc kubenswrapper[4594]: I1129 05:29:03.914945 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:03Z","lastTransitionTime":"2025-11-29T05:29:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:04 crc kubenswrapper[4594]: I1129 05:29:04.016199 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:04 crc kubenswrapper[4594]: I1129 05:29:04.016226 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:04 crc kubenswrapper[4594]: I1129 05:29:04.016236 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:04 crc kubenswrapper[4594]: I1129 05:29:04.016247 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:04 crc kubenswrapper[4594]: I1129 05:29:04.016275 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:04Z","lastTransitionTime":"2025-11-29T05:29:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:29:04 crc kubenswrapper[4594]: I1129 05:29:04.082927 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 05:29:04 crc kubenswrapper[4594]: I1129 05:29:04.082951 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 05:29:04 crc kubenswrapper[4594]: E1129 05:29:04.083036 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 05:29:04 crc kubenswrapper[4594]: I1129 05:29:04.083060 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 05:29:04 crc kubenswrapper[4594]: E1129 05:29:04.083223 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 05:29:04 crc kubenswrapper[4594]: E1129 05:29:04.083278 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 05:29:04 crc kubenswrapper[4594]: I1129 05:29:04.118098 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:04 crc kubenswrapper[4594]: I1129 05:29:04.118133 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:04 crc kubenswrapper[4594]: I1129 05:29:04.118146 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:04 crc kubenswrapper[4594]: I1129 05:29:04.118160 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:04 crc kubenswrapper[4594]: I1129 05:29:04.118174 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:04Z","lastTransitionTime":"2025-11-29T05:29:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:04 crc kubenswrapper[4594]: I1129 05:29:04.219517 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:04 crc kubenswrapper[4594]: I1129 05:29:04.219554 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:04 crc kubenswrapper[4594]: I1129 05:29:04.219567 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:04 crc kubenswrapper[4594]: I1129 05:29:04.219583 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:04 crc kubenswrapper[4594]: I1129 05:29:04.219595 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:04Z","lastTransitionTime":"2025-11-29T05:29:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:04 crc kubenswrapper[4594]: I1129 05:29:04.324139 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:04 crc kubenswrapper[4594]: I1129 05:29:04.324180 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:04 crc kubenswrapper[4594]: I1129 05:29:04.324194 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:04 crc kubenswrapper[4594]: I1129 05:29:04.324208 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:04 crc kubenswrapper[4594]: I1129 05:29:04.324224 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:04Z","lastTransitionTime":"2025-11-29T05:29:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:04 crc kubenswrapper[4594]: I1129 05:29:04.426231 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:04 crc kubenswrapper[4594]: I1129 05:29:04.426275 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:04 crc kubenswrapper[4594]: I1129 05:29:04.426288 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:04 crc kubenswrapper[4594]: I1129 05:29:04.426301 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:04 crc kubenswrapper[4594]: I1129 05:29:04.426311 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:04Z","lastTransitionTime":"2025-11-29T05:29:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:04 crc kubenswrapper[4594]: I1129 05:29:04.528501 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:04 crc kubenswrapper[4594]: I1129 05:29:04.528528 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:04 crc kubenswrapper[4594]: I1129 05:29:04.528537 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:04 crc kubenswrapper[4594]: I1129 05:29:04.528550 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:04 crc kubenswrapper[4594]: I1129 05:29:04.528561 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:04Z","lastTransitionTime":"2025-11-29T05:29:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:04 crc kubenswrapper[4594]: I1129 05:29:04.630310 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:04 crc kubenswrapper[4594]: I1129 05:29:04.630397 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:04 crc kubenswrapper[4594]: I1129 05:29:04.630413 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:04 crc kubenswrapper[4594]: I1129 05:29:04.630428 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:04 crc kubenswrapper[4594]: I1129 05:29:04.630438 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:04Z","lastTransitionTime":"2025-11-29T05:29:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:04 crc kubenswrapper[4594]: I1129 05:29:04.732066 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:04 crc kubenswrapper[4594]: I1129 05:29:04.732135 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:04 crc kubenswrapper[4594]: I1129 05:29:04.732147 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:04 crc kubenswrapper[4594]: I1129 05:29:04.732165 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:04 crc kubenswrapper[4594]: I1129 05:29:04.732201 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:04Z","lastTransitionTime":"2025-11-29T05:29:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:04 crc kubenswrapper[4594]: I1129 05:29:04.833866 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:04 crc kubenswrapper[4594]: I1129 05:29:04.833898 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:04 crc kubenswrapper[4594]: I1129 05:29:04.833910 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:04 crc kubenswrapper[4594]: I1129 05:29:04.833923 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:04 crc kubenswrapper[4594]: I1129 05:29:04.833934 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:04Z","lastTransitionTime":"2025-11-29T05:29:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:04 crc kubenswrapper[4594]: I1129 05:29:04.935369 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:04 crc kubenswrapper[4594]: I1129 05:29:04.935397 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:04 crc kubenswrapper[4594]: I1129 05:29:04.935407 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:04 crc kubenswrapper[4594]: I1129 05:29:04.935419 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:04 crc kubenswrapper[4594]: I1129 05:29:04.935429 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:04Z","lastTransitionTime":"2025-11-29T05:29:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:05 crc kubenswrapper[4594]: I1129 05:29:05.037282 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:05 crc kubenswrapper[4594]: I1129 05:29:05.037340 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:05 crc kubenswrapper[4594]: I1129 05:29:05.037351 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:05 crc kubenswrapper[4594]: I1129 05:29:05.037362 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:05 crc kubenswrapper[4594]: I1129 05:29:05.037371 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:05Z","lastTransitionTime":"2025-11-29T05:29:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:29:05 crc kubenswrapper[4594]: I1129 05:29:05.082964 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lzr56" Nov 29 05:29:05 crc kubenswrapper[4594]: E1129 05:29:05.083076 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lzr56" podUID="217088b9-a48b-40c7-8d83-f9ff0eb24908" Nov 29 05:29:05 crc kubenswrapper[4594]: I1129 05:29:05.138525 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:05 crc kubenswrapper[4594]: I1129 05:29:05.138562 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:05 crc kubenswrapper[4594]: I1129 05:29:05.138572 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:05 crc kubenswrapper[4594]: I1129 05:29:05.138584 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:05 crc kubenswrapper[4594]: I1129 05:29:05.138595 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:05Z","lastTransitionTime":"2025-11-29T05:29:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:05 crc kubenswrapper[4594]: I1129 05:29:05.240098 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:05 crc kubenswrapper[4594]: I1129 05:29:05.240124 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:05 crc kubenswrapper[4594]: I1129 05:29:05.240134 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:05 crc kubenswrapper[4594]: I1129 05:29:05.240146 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:05 crc kubenswrapper[4594]: I1129 05:29:05.240154 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:05Z","lastTransitionTime":"2025-11-29T05:29:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:05 crc kubenswrapper[4594]: I1129 05:29:05.341597 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:05 crc kubenswrapper[4594]: I1129 05:29:05.341618 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:05 crc kubenswrapper[4594]: I1129 05:29:05.341627 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:05 crc kubenswrapper[4594]: I1129 05:29:05.341639 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:05 crc kubenswrapper[4594]: I1129 05:29:05.341647 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:05Z","lastTransitionTime":"2025-11-29T05:29:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:05 crc kubenswrapper[4594]: I1129 05:29:05.420340 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7plzz_e5052790-d231-4f97-802c-c7de3cd72561/kube-multus/0.log" Nov 29 05:29:05 crc kubenswrapper[4594]: I1129 05:29:05.420381 4594 generic.go:334] "Generic (PLEG): container finished" podID="e5052790-d231-4f97-802c-c7de3cd72561" containerID="4c23b1472cd349ae6a56137165463429204ffbaf2c1b477d3ba9d07d2f2799f9" exitCode=1 Nov 29 05:29:05 crc kubenswrapper[4594]: I1129 05:29:05.420407 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7plzz" event={"ID":"e5052790-d231-4f97-802c-c7de3cd72561","Type":"ContainerDied","Data":"4c23b1472cd349ae6a56137165463429204ffbaf2c1b477d3ba9d07d2f2799f9"} Nov 29 05:29:05 crc kubenswrapper[4594]: I1129 05:29:05.420685 4594 scope.go:117] "RemoveContainer" containerID="4c23b1472cd349ae6a56137165463429204ffbaf2c1b477d3ba9d07d2f2799f9" Nov 29 05:29:05 crc kubenswrapper[4594]: I1129 05:29:05.429548 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://867c23355f828dc4cc15d632f90c52174e9b9be07f3ec24f05c904709620de6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-29T05:29:05Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:05 crc kubenswrapper[4594]: I1129 05:29:05.437664 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:05Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:05 crc kubenswrapper[4594]: I1129 05:29:05.442851 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:05 crc kubenswrapper[4594]: I1129 05:29:05.442900 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:05 crc kubenswrapper[4594]: I1129 05:29:05.442912 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:05 crc kubenswrapper[4594]: I1129 05:29:05.442923 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:05 crc kubenswrapper[4594]: I1129 05:29:05.442933 4594 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:05Z","lastTransitionTime":"2025-11-29T05:29:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:29:05 crc kubenswrapper[4594]: I1129 05:29:05.446889 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7plzz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5052790-d231-4f97-802c-c7de3cd72561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:29:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:29:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c23b1472cd349ae6a56137165463429204ffbaf2c1b477d3ba9d07d2f2799f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c23b1472cd349ae6a56137165463429204ffbaf2c1b477d3ba9d07d2f2799f9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T05:29:04Z\\\",\\\"message\\\":\\\"2025-11-29T05:28:19+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_c08f590a-ec87-48f6-839c-a6dd639e3ff9\\\\n2025-11-29T05:28:19+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c08f590a-ec87-48f6-839c-a6dd639e3ff9 to /host/opt/cni/bin/\\\\n2025-11-29T05:28:19Z [verbose] multus-daemon started\\\\n2025-11-29T05:28:19Z [verbose] Readiness Indicator file check\\\\n2025-11-29T05:29:04Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qglsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7plzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:05Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:05 crc kubenswrapper[4594]: I1129 05:29:05.455300 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vn6fz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"240291e8-d7ec-4fc7-919f-e082a6890e2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1142d4b616972d87b8ac7b83cacdc37c3fd02df7cb4adfcabbaaaa18c5d2b8d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"na
me\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x25zr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af579fe3e5cce8fdb2dff72f206f5504582b8eee906db5180191ef91b990c0a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x25zr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vn6fz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:05Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:05 crc kubenswrapper[4594]: I1129 05:29:05.462783 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-lzr56" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"217088b9-a48b-40c7-8d83-f9ff0eb24908\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8gwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8gwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:31Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lzr56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:05Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:05 crc 
kubenswrapper[4594]: I1129 05:29:05.472324 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zwqcq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad2c26bf-c1c8-44b5-b4ed-d487072d358b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9db8528eef5425d9af2b8f11f244556f195b844e239a2137eef24c40fa158d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.12
6.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9e8c27b9b7a4a13a1f6ae5461acd535cc6784f4158213697307e4061f1e222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b9e8c27b9b7a4a13a1f6ae5461acd535cc6784f4158213697307e4061f1e222\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa32ebafd1f717a0d3ba44387b7b36827e9e12589812bdc33b1dfb84bd41772\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fa32ebafd1f717a0d3ba44387b7b
36827e9e12589812bdc33b1dfb84bd41772\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec1085c6e05fe233979922ace4946592ed4a7afe00a0f1c3a256c41a0964d26a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1085c6e05fe233979922ace4946592ed4a7afe00a0f1c3a256c41a0964d26a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://819e8c7a8ea3f0f7a082da904dd9cc01b5d7e7e6520cafe67920a190063947ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://819e8c7a8ea3f0f7a082da904dd9cc01b5d7e7e6520cafe67920a190063947ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1602d1ea5e426042c520a4f98ed23530fc8f94f67f242a639c32a29817495d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\
":{\\\"containerID\\\":\\\"cri-o://1602d1ea5e426042c520a4f98ed23530fc8f94f67f242a639c32a29817495d73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81866e9b3e8415fc1e5af0eec4a7222deab6b282b1758dae13759a073192a554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81866e9b3e8415fc1e5af0eec4a7222deab6b282b1758dae13759a073192a554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zwqcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:05Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:05 crc kubenswrapper[4594]: I1129 05:29:05.479912 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cbcda3d-f02f-4e3e-9560-a1dcb0e9b659\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95cdf0c76e877968d93b7b86fcdf319cfd2072ceb45903b275dfbe38db4a4e92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-s
cheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c7130e6d58c27093d1efa4f7260aba78681945df2d956da7878075ba5eb1d05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe6aa22ca88502e90dc3ea058d420a497838102af8aa1afc78edc4a52bed8a6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d72ff5cb72ae536180cf8ebbc6af9c3a01bc453e5d917c8c7060fbd987de73c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d72ff5cb72ae536180cf8ebbc6af9c3a01bc453e5d917c8c7060fbd987de73c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:27:56Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:27:56Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:05Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:05 crc kubenswrapper[4594]: I1129 05:29:05.489739 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:05Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:05 crc kubenswrapper[4594]: I1129 05:29:05.498445 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:05Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:05 crc kubenswrapper[4594]: I1129 05:29:05.511860 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"08de5891-72ca-488c-80b3-6b54c8c1a66e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9f96824f977ec455622772e91ed8e6534dc93a71868e2b5e2aa9302ff9be135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cf2a15e6478f9f461a9927f570db4a81a8d752c8be46c7970b4956ae2031a38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a67e6b494618e268d6f5d75507c1cfe0e8e99c0e3cfff7548756a64f4f3a3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77543b04b619c48bf800a3d9dda1a969e4e32aa113a49287717a9e83f9cff96f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78fd1c6bdf08a557f0c2223c538a946709b9a225032c92ea75277140441a59c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db25dbee0954a026ecc2945dda40eb48c9caf78d31c44013055e9765024794d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54ed3d74941dcb3f33ee5231512aac9b31eaf9c55171df9201d2bd6da265954a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54ed3d74941dcb3f33ee5231512aac9b31eaf9c55171df9201d2bd6da265954a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T05:28:41Z\\\",\\\"message\\\":\\\"332 obj_retry.go:365] Adding new object: *v1.Pod openshift-dns/node-resolver-87h4n\\\\nI1129 05:28:41.803817 6332 default_network_controller.go:776] Recording success event on pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nF1129 05:28:41.803749 6332 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: 
failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:41Z is after 2025-08-24T17:21:41Z]\\\\nI1129 05:28:41.803846 6332 services_controller.go:356] Processing sync for service openshift-network-diagnostics/network-check-target for network=default\\\\nI1129 05:28:41.803719 6332 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/iptables-alerter\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-lp4zm_openshift-ovn-kubernetes(08de5891-72ca-488c-80b3-6b54c8c1a66e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cc8b34a52032d4716652fc9edcc35dd797309bf6886242671c068e65399d004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b86e524b600a831fe6b8bb71a513172f974b457d39363da90cd6152139eae898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b86e524b600a831fe6
b8bb71a513172f974b457d39363da90cd6152139eae898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lp4zm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:05Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:05 crc kubenswrapper[4594]: I1129 05:29:05.520245 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3be916e6-7f0f-45ae-8fe5-85d2c7e459f0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf23cd4a300a49fabdf8fcaacfda760563c8e53cf70de7a4c07be717fd76b660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a960cfe07ae731cc2d07957dba63be2897328d662dd2672bd61344c8baac0228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b416f8ad36a09f58e68b1e23784b344534ccfb013a3272c0754602431a3a3737\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://261ab2af67dd877cead51f204629b59233865717d4805c3d4d80c1a5d51e2021\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:27:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:05Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:05 crc kubenswrapper[4594]: I1129 05:29:05.528541 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba18db1b4391204c0fe395488d2726d914dc28c88b9cf449d83c9c181a7d04a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:05Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:05 crc kubenswrapper[4594]: I1129 05:29:05.535504 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"509844d4-3ee1-4059-84bb-6e90200f50c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5e396522be1cb7984f98fd416f9a6aeda1e6009e002e9c942a6cec0d4a4c22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4qlc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb25cb1d2d2a3aa84bc6fca3087cdf54b3b8e75bc258441b2895f37594df44e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4qlc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ggz4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:05Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:05 crc kubenswrapper[4594]: I1129 05:29:05.542031 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-26zjk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"77af4f9f-196d-436d-9fb6-69168bcb5f8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70467de0fada47ce38faab4cc04c54942aa5f7c0cacf29b1d21b64a11e6162c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxrks\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-26zjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:05Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:05 crc kubenswrapper[4594]: I1129 05:29:05.544925 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:05 crc kubenswrapper[4594]: I1129 05:29:05.544949 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:05 crc kubenswrapper[4594]: I1129 05:29:05.544958 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:05 crc kubenswrapper[4594]: I1129 05:29:05.544973 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:05 crc kubenswrapper[4594]: I1129 05:29:05.544984 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:05Z","lastTransitionTime":"2025-11-29T05:29:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:05 crc kubenswrapper[4594]: I1129 05:29:05.550359 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1629901e-7133-4eb0-b634-5e458da9f205\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://583ca6d59501822022d5d15a450d4403592d248bce7be4b22b1c7baef4e93000\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b8d3be68869fef8b377ce1d84e3edf447496a867c40e88d013ef0d1c5629de2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6655c8a5c56e4d2b3e8fbd88308d04ded2b41f7243bf90cb2dbbd6ad93f3153f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a87a0320d47b3753baeb865c7ef9e024c65a96555fdd71c0f2a23c8c0b80d351\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ba49c3a694bc245ce18b54b74156e6e7aaccd4fcf707264a278f86f52943392\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T05:28:13Z\\\",\\\"message\\\":\\\"le observer\\\\nW1129 05:28:13.316789 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1129 05:28:13.316951 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1129 05:28:13.320123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1464513827/tls.crt::/tmp/serving-cert-1464513827/tls.key\\\\\\\"\\\\nI1129 05:28:13.540439 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1129 05:28:13.545990 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1129 05:28:13.546014 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1129 05:28:13.546050 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1129 05:28:13.546055 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1129 05:28:13.551788 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1129 05:28:13.551806 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 05:28:13.551809 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 05:28:13.551814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1129 05:28:13.551817 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1129 05:28:13.551820 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1129 05:28:13.551823 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1129 05:28:13.551915 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1129 05:28:13.553356 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e070e0ea87ae0cfa8fcc6aadd13e7ae3c2ab3eb1ea3daf6178c9d35796ba3ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:27:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:05Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:05 crc kubenswrapper[4594]: I1129 05:29:05.557897 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c84b578286ef0da9c863bedb4ebec224e34007c7f59cead26dac786d7e4b5e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1975ee2ce0af02676f9925944a31b218cb17cdf4dfb85c6edbfd124ac58fd22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:05Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:05 crc kubenswrapper[4594]: I1129 05:29:05.564317 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-87h4n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"080d2d67-a474-4974-94f3-81c5007e0a57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b58d5af901448b3e10a9a902daa31aecebe095bdae0e620c991323b99c52b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdnbj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-87h4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:05Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:05 crc kubenswrapper[4594]: I1129 05:29:05.646388 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:05 crc kubenswrapper[4594]: I1129 05:29:05.646413 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:05 crc kubenswrapper[4594]: I1129 05:29:05.646421 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:05 crc kubenswrapper[4594]: I1129 05:29:05.646432 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:05 crc kubenswrapper[4594]: I1129 05:29:05.646442 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:05Z","lastTransitionTime":"2025-11-29T05:29:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:05 crc kubenswrapper[4594]: I1129 05:29:05.748995 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:05 crc kubenswrapper[4594]: I1129 05:29:05.749027 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:05 crc kubenswrapper[4594]: I1129 05:29:05.749037 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:05 crc kubenswrapper[4594]: I1129 05:29:05.749052 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:05 crc kubenswrapper[4594]: I1129 05:29:05.749060 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:05Z","lastTransitionTime":"2025-11-29T05:29:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:05 crc kubenswrapper[4594]: I1129 05:29:05.850531 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:05 crc kubenswrapper[4594]: I1129 05:29:05.850578 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:05 crc kubenswrapper[4594]: I1129 05:29:05.850590 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:05 crc kubenswrapper[4594]: I1129 05:29:05.850603 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:05 crc kubenswrapper[4594]: I1129 05:29:05.850613 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:05Z","lastTransitionTime":"2025-11-29T05:29:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:05 crc kubenswrapper[4594]: I1129 05:29:05.952539 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:05 crc kubenswrapper[4594]: I1129 05:29:05.952785 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:05 crc kubenswrapper[4594]: I1129 05:29:05.952794 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:05 crc kubenswrapper[4594]: I1129 05:29:05.952804 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:05 crc kubenswrapper[4594]: I1129 05:29:05.952814 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:05Z","lastTransitionTime":"2025-11-29T05:29:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:06 crc kubenswrapper[4594]: I1129 05:29:06.054917 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:06 crc kubenswrapper[4594]: I1129 05:29:06.054949 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:06 crc kubenswrapper[4594]: I1129 05:29:06.054958 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:06 crc kubenswrapper[4594]: I1129 05:29:06.054971 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:06 crc kubenswrapper[4594]: I1129 05:29:06.054979 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:06Z","lastTransitionTime":"2025-11-29T05:29:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:29:06 crc kubenswrapper[4594]: I1129 05:29:06.083437 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 05:29:06 crc kubenswrapper[4594]: I1129 05:29:06.083454 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 05:29:06 crc kubenswrapper[4594]: E1129 05:29:06.083544 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 05:29:06 crc kubenswrapper[4594]: I1129 05:29:06.083554 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 05:29:06 crc kubenswrapper[4594]: E1129 05:29:06.083654 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 05:29:06 crc kubenswrapper[4594]: E1129 05:29:06.083804 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 05:29:06 crc kubenswrapper[4594]: I1129 05:29:06.092821 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:06Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:06 crc kubenswrapper[4594]: I1129 05:29:06.101438 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7plzz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5052790-d231-4f97-802c-c7de3cd72561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:29:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:29:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c23b1472cd349ae6a56137165463429204ffbaf2c1b477d3ba9d07d2f2799f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c23b1472cd349ae6a56137165463429204ffbaf2c1b477d3ba9d07d2f2799f9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T05:29:04Z\\\",\\\"message\\\":\\\"2025-11-29T05:28:19+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_c08f590a-ec87-48f6-839c-a6dd639e3ff9\\\\n2025-11-29T05:28:19+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c08f590a-ec87-48f6-839c-a6dd639e3ff9 to /host/opt/cni/bin/\\\\n2025-11-29T05:28:19Z [verbose] multus-daemon started\\\\n2025-11-29T05:28:19Z [verbose] Readiness Indicator file check\\\\n2025-11-29T05:29:04Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qglsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7plzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:06Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:06 crc kubenswrapper[4594]: I1129 05:29:06.109318 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vn6fz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"240291e8-d7ec-4fc7-919f-e082a6890e2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1142d4b616972d87b8ac7b83cacdc37c3fd02df7cb4adfcabbaaaa18c5d2b8d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x25zr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af579fe3e5cce8fdb2dff72f206f5504582b8
eee906db5180191ef91b990c0a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x25zr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vn6fz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:06Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:06 crc kubenswrapper[4594]: I1129 05:29:06.116754 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-lzr56" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"217088b9-a48b-40c7-8d83-f9ff0eb24908\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8gwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8gwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:31Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lzr56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:06Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:06 crc 
kubenswrapper[4594]: I1129 05:29:06.124137 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://867c23355f828dc4cc15d632f90c52174e9b9be07f3ec24f05c904709620de6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:06Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:06 crc kubenswrapper[4594]: I1129 05:29:06.131748 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cbcda3d-f02f-4e3e-9560-a1dcb0e9b659\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95cdf0c76e877968d93b7b86fcdf319cfd2072ceb45903b275dfbe38db4a4e92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c7130e6d58c27093d1efa4f7260aba78681945df2d956da7878075ba5eb1d05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe6aa22ca88502e90dc3ea058d420a497838102af8aa1afc78edc4a52bed8a6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d72ff5cb72ae536180cf8ebbc6af9c3a0
1bc453e5d917c8c7060fbd987de73c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d72ff5cb72ae536180cf8ebbc6af9c3a01bc453e5d917c8c7060fbd987de73c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:27:56Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:27:56Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:06Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:06 crc kubenswrapper[4594]: I1129 05:29:06.139563 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:06Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:06 crc kubenswrapper[4594]: I1129 05:29:06.150795 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:06Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:06 crc kubenswrapper[4594]: I1129 05:29:06.156022 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:06 crc kubenswrapper[4594]: I1129 05:29:06.156048 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:06 crc kubenswrapper[4594]: I1129 05:29:06.156058 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:06 crc 
kubenswrapper[4594]: I1129 05:29:06.156073 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:06 crc kubenswrapper[4594]: I1129 05:29:06.156084 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:06Z","lastTransitionTime":"2025-11-29T05:29:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:29:06 crc kubenswrapper[4594]: I1129 05:29:06.163236 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08de5891-72ca-488c-80b3-6b54c8c1a66e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9f96824f977ec455622772e91ed8e6534dc93a71868e2b5e2aa9302ff9be135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cf2a15e6478f9f461a9927f570db4a81a8d752c8be46c7970b4956ae2031a38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a67e6b494618e268d6f5d75507c1cfe0e8e99c0e3cfff7548756a64f4f3a3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77543b04b619c48bf800a3d9dda1a969e4e32aa113a49287717a9e83f9cff96f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78fd1c6bdf08a557f0c2223c538a946709b9a225032c92ea75277140441a59c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db25dbee0954a026ecc2945dda40eb48c9caf78d31c44013055e9765024794d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54ed3d74941dcb3f33ee5231512aac9b31eaf9c55171df9201d2bd6da265954a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54ed3d74941dcb3f33ee5231512aac9b31eaf9c55171df9201d2bd6da265954a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T05:28:41Z\\\",\\\"message\\\":\\\"332 obj_retry.go:365] Adding new object: *v1.Pod openshift-dns/node-resolver-87h4n\\\\nI1129 05:28:41.803817 6332 default_network_controller.go:776] Recording success event on pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nF1129 05:28:41.803749 6332 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: 
failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:41Z is after 2025-08-24T17:21:41Z]\\\\nI1129 05:28:41.803846 6332 services_controller.go:356] Processing sync for service openshift-network-diagnostics/network-check-target for network=default\\\\nI1129 05:28:41.803719 6332 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/iptables-alerter\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-lp4zm_openshift-ovn-kubernetes(08de5891-72ca-488c-80b3-6b54c8c1a66e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cc8b34a52032d4716652fc9edcc35dd797309bf6886242671c068e65399d004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b86e524b600a831fe6b8bb71a513172f974b457d39363da90cd6152139eae898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b86e524b600a831fe6
b8bb71a513172f974b457d39363da90cd6152139eae898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lp4zm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:06Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:06 crc kubenswrapper[4594]: I1129 05:29:06.173249 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zwqcq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad2c26bf-c1c8-44b5-b4ed-d487072d358b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9db8528eef5425d9af2b8f11f244556f195b844e239a2137eef24c40fa158d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9e8c27b9b7a4a13a1f6ae5461acd535cc6784f4158213697307e4061f1e222\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b9e8c27b9b7a4a13a1f6ae5461acd535cc6784f4158213697307e4061f1e222\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa32ebafd1f717a0d3ba44387b7b36827e9e12589812bdc33b1dfb84bd41772\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fa32ebafd1f717a0d3ba44387b7b36827e9e12589812bdc33b1dfb84bd41772\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec1085c6e05fe233979922ace4946592ed4a7afe00a0f1c3a256c41a0964d26a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1085c6e05fe233979922ace4946592ed4a7afe00a0f1c3a256c41a0964d26a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://819e8
c7a8ea3f0f7a082da904dd9cc01b5d7e7e6520cafe67920a190063947ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://819e8c7a8ea3f0f7a082da904dd9cc01b5d7e7e6520cafe67920a190063947ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1602d1ea5e426042c520a4f98ed23530fc8f94f67f242a639c32a29817495d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1602d1ea5e426042c520a4f98ed23530fc8f94f67f242a639c32a29817495d73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:22Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81866e9b3e8415fc1e5af0eec4a7222deab6b282b1758dae13759a073192a554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81866e9b3e8415fc1e5af0eec4a7222deab6b282b1758dae13759a073192a554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zwqcq\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:06Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:06 crc kubenswrapper[4594]: I1129 05:29:06.182325 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba18db1b4391204c0fe395488d2726d914dc28c88b9cf449d83c9c181a7d04a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\
\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:06Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:06 crc kubenswrapper[4594]: I1129 05:29:06.189446 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"509844d4-3ee1-4059-84bb-6e90200f50c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5e396522be1cb7984f98fd416f9a6aeda1e6009e002e9c942a6cec0d4a4c22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318b
deaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4qlc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb25cb1d2d2a3aa84bc6fca3087cdf54b3b8e75bc258441b2895f37594df44e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4qlc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ggz4n\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:06Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:06 crc kubenswrapper[4594]: I1129 05:29:06.195683 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-26zjk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77af4f9f-196d-436d-9fb6-69168bcb5f8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70467de0fada47ce38faab4cc04c54942aa5f7c0cacf29b1d21b64a11e6162c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\
\":\\\"2025-11-29T05:28:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxrks\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-26zjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:06Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:06 crc kubenswrapper[4594]: I1129 05:29:06.203395 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3be916e6-7f0f-45ae-8fe5-85d2c7e459f0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf23cd4a300a49fabdf8fcaacfda760563c8e53cf70de7a4c07be717fd76b660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a960cfe07ae731cc2d07957dba63be2897328d662dd2672bd61344c8baac0228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b416f8ad36a09f58e68b1e23784b344534ccfb013a3272c0754602431a3a3737\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://261ab2af67dd877cead51f204629b59233865717d4805c3d4d80c1a5d51e2021\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:27:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:06Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:06 crc kubenswrapper[4594]: I1129 05:29:06.211145 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c84b578286ef0da9c863bedb4ebec224e34007c7f59cead26dac786d7e4b5e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1975ee2ce0af02676f9925944a31b218cb17cdf4dfb85c6edbfd124ac58fd22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:06Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:06 crc kubenswrapper[4594]: I1129 05:29:06.217470 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-87h4n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"080d2d67-a474-4974-94f3-81c5007e0a57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b58d5af901448b3e10a9a902daa31aecebe095bdae0e620c991323b99c52b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdnbj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-87h4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:06Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:06 crc kubenswrapper[4594]: I1129 05:29:06.225872 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1629901e-7133-4eb0-b634-5e458da9f205\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://583ca6d59501822022d5d15a450d4403592d248bce7be4b22b1c7baef4e93000\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b8d3be68869fef8b377ce1d84e3edf447496a867c40e88d013ef0d1c5629de2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6655c8a5c56e4d2b3e8fbd88308d04ded2b41f7243bf90cb2dbbd6ad93f3153f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a87a0320d47b3753baeb865c7ef9e024c65a96555fdd71c0f2a23c8c0b80d351\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ba49c3a694bc245ce18b54b74156e6e7aaccd4fcf707264a278f86f52943392\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T05:28:13Z\\\",\\\"message\\\":\\\"le observer\\\\nW1129 05:28:13.316789 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1129 05:28:13.316951 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1129 05:28:13.320123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1464513827/tls.crt::/tmp/serving-cert-1464513827/tls.key\\\\\\\"\\\\nI1129 05:28:13.540439 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1129 05:28:13.545990 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1129 05:28:13.546014 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1129 05:28:13.546050 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1129 05:28:13.546055 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1129 05:28:13.551788 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1129 05:28:13.551806 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' 
detected.\\\\nW1129 05:28:13.551809 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 05:28:13.551814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1129 05:28:13.551817 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1129 05:28:13.551820 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1129 05:28:13.551823 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1129 05:28:13.551915 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1129 05:28:13.553356 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e070e0ea87ae0cfa8fcc6aadd13e7ae3c2ab3eb1ea3daf6178c9d35796ba3ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\
"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:27:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:06Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:06 crc kubenswrapper[4594]: I1129 05:29:06.258044 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:06 crc kubenswrapper[4594]: I1129 05:29:06.258067 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:06 crc kubenswrapper[4594]: I1129 05:29:06.258077 4594 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Nov 29 05:29:06 crc kubenswrapper[4594]: I1129 05:29:06.258090 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:06 crc kubenswrapper[4594]: I1129 05:29:06.258098 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:06Z","lastTransitionTime":"2025-11-29T05:29:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:29:06 crc kubenswrapper[4594]: I1129 05:29:06.359361 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:06 crc kubenswrapper[4594]: I1129 05:29:06.359414 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:06 crc kubenswrapper[4594]: I1129 05:29:06.359431 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:06 crc kubenswrapper[4594]: I1129 05:29:06.359653 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:06 crc kubenswrapper[4594]: I1129 05:29:06.359691 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:06Z","lastTransitionTime":"2025-11-29T05:29:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:06 crc kubenswrapper[4594]: I1129 05:29:06.424543 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7plzz_e5052790-d231-4f97-802c-c7de3cd72561/kube-multus/0.log" Nov 29 05:29:06 crc kubenswrapper[4594]: I1129 05:29:06.424589 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7plzz" event={"ID":"e5052790-d231-4f97-802c-c7de3cd72561","Type":"ContainerStarted","Data":"abc4e229bc1c5698fec5a7e0da8decb0167770496e7f8dce7a9a9276721ee26c"} Nov 29 05:29:06 crc kubenswrapper[4594]: I1129 05:29:06.435297 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3be916e6-7f0f-45ae-8fe5-85d2c7e459f0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf23cd4a300a49fabdf8fcaacfda760563c8e53cf70de7a4c07be717fd76b660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a960cfe07ae731cc2d07957dba63be2897328d662dd2672bd61344c8baac0228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b416f8ad36a09f58e68b1e23784b344534ccfb013a3272c0754602431a3a3737\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/
static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://261ab2af67dd877cead51f204629b59233865717d4805c3d4d80c1a5d51e2021\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:27:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:06Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:06 crc kubenswrapper[4594]: I1129 05:29:06.443830 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba18db1b4391204c0fe395488d2726d914dc28c88b9cf449d83c9c181a7d04a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:06Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:06 crc kubenswrapper[4594]: I1129 05:29:06.451031 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"509844d4-3ee1-4059-84bb-6e90200f50c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5e396522be1cb7984f98fd416f9a6aeda1e6009e002e9c942a6cec0d4a4c22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4qlc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb25cb1d2d2a3aa84bc6fca3087cdf54b3b8e75bc258441b2895f37594df44e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4qlc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ggz4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:06Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:06 crc kubenswrapper[4594]: I1129 05:29:06.457549 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-26zjk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"77af4f9f-196d-436d-9fb6-69168bcb5f8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70467de0fada47ce38faab4cc04c54942aa5f7c0cacf29b1d21b64a11e6162c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxrks\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-26zjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:06Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:06 crc kubenswrapper[4594]: I1129 05:29:06.461903 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:06 crc kubenswrapper[4594]: I1129 05:29:06.461926 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:06 crc kubenswrapper[4594]: I1129 05:29:06.461935 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:06 crc kubenswrapper[4594]: I1129 05:29:06.461946 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:06 crc kubenswrapper[4594]: I1129 05:29:06.461955 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:06Z","lastTransitionTime":"2025-11-29T05:29:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:06 crc kubenswrapper[4594]: I1129 05:29:06.466310 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1629901e-7133-4eb0-b634-5e458da9f205\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://583ca6d59501822022d5d15a450d4403592d248bce7be4b22b1c7baef4e93000\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b8d3be68869fef8b377ce1d84e3edf447496a867c40e88d013ef0d1c5629de2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6655c8a5c56e4d2b3e8fbd88308d04ded2b41f7243bf90cb2dbbd6ad93f3153f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a87a0320d47b3753baeb865c7ef9e024c65a96555fdd71c0f2a23c8c0b80d351\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ba49c3a694bc245ce18b54b74156e6e7aaccd4fcf707264a278f86f52943392\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T05:28:13Z\\\",\\\"message\\\":\\\"le observer\\\\nW1129 05:28:13.316789 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1129 05:28:13.316951 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1129 05:28:13.320123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1464513827/tls.crt::/tmp/serving-cert-1464513827/tls.key\\\\\\\"\\\\nI1129 05:28:13.540439 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1129 05:28:13.545990 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1129 05:28:13.546014 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1129 05:28:13.546050 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1129 05:28:13.546055 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1129 05:28:13.551788 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1129 05:28:13.551806 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 05:28:13.551809 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 05:28:13.551814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1129 05:28:13.551817 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1129 05:28:13.551820 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1129 05:28:13.551823 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1129 05:28:13.551915 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1129 05:28:13.553356 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e070e0ea87ae0cfa8fcc6aadd13e7ae3c2ab3eb1ea3daf6178c9d35796ba3ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:27:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:06Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:06 crc kubenswrapper[4594]: I1129 05:29:06.474277 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c84b578286ef0da9c863bedb4ebec224e34007c7f59cead26dac786d7e4b5e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1975ee2ce0af02676f9925944a31b218cb17cdf4dfb85c6edbfd124ac58fd22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:06Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:06 crc kubenswrapper[4594]: I1129 05:29:06.481754 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-87h4n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"080d2d67-a474-4974-94f3-81c5007e0a57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b58d5af901448b3e10a9a902daa31aecebe095bdae0e620c991323b99c52b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdnbj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-87h4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:06Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:06 crc kubenswrapper[4594]: I1129 05:29:06.489105 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://867c23355f828dc4cc15d632f90c52174e9b9be07f3ec24f05c904709620de6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-al
erter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:06Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:06 crc kubenswrapper[4594]: I1129 05:29:06.496761 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:06Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:06 crc kubenswrapper[4594]: I1129 05:29:06.504655 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7plzz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5052790-d231-4f97-802c-c7de3cd72561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:29:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:29:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc4e229bc1c5698fec5a7e0da8decb0167770496e7f8dce7a9a9276721ee26c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c23b1472cd349ae6a56137165463429204ffbaf2c1b477d3ba9d07d2f2799f9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T05:29:04Z\\\",\\\"message\\\":\\\"2025-11-29T05:28:19+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_c08f590a-ec87-48f6-839c-a6dd639e3ff9\\\\n2025-11-29T05:28:19+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c08f590a-ec87-48f6-839c-a6dd639e3ff9 to /host/opt/cni/bin/\\\\n2025-11-29T05:28:19Z [verbose] multus-daemon started\\\\n2025-11-29T05:28:19Z [verbose] 
Readiness Indicator file check\\\\n2025-11-29T05:29:04Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:29:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qglsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7plzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:06Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:06 crc kubenswrapper[4594]: I1129 05:29:06.511929 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vn6fz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"240291e8-d7ec-4fc7-919f-e082a6890e2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1142d4b616972d87b8ac7b83cacdc37c3fd02df7cb4adfcabbaaaa18c5d2b8d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x25zr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af579fe3e5cce8fdb2dff72f206f5504582b8
eee906db5180191ef91b990c0a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x25zr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vn6fz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:06Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:06 crc kubenswrapper[4594]: I1129 05:29:06.519208 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-lzr56" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"217088b9-a48b-40c7-8d83-f9ff0eb24908\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8gwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8gwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:31Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lzr56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:06Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:06 crc 
kubenswrapper[4594]: I1129 05:29:06.526301 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cbcda3d-f02f-4e3e-9560-a1dcb0e9b659\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95cdf0c76e877968d93b7b86fcdf319cfd2072ceb45903b275dfbe38db4a4e92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c7130e6d58c27093d1efa4f7260aba78681945df2d956da7878075ba5eb1d05\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe6aa22ca88502e90dc3ea058d420a497838102af8aa1afc78edc4a52bed8a6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d72ff5cb72ae536180cf8ebbc6af9c3a01bc453e5d917c8c7060fbd987de73c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d72ff5cb72ae536180cf8ebbc6af9c3a01bc453e5d917c8c7060fbd987de73c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:27:56Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:27:56Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:06Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:06 crc kubenswrapper[4594]: I1129 05:29:06.534034 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:06Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:06 crc kubenswrapper[4594]: I1129 05:29:06.541597 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:06Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:06 crc kubenswrapper[4594]: I1129 05:29:06.553251 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08de5891-72ca-488c-80b3-6b54c8c1a66e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9f96824f977ec455622772e91ed8e6534dc93a71868e2b5e2aa9302ff9be135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cf2a15e6478f9f461a9927f570db4a81a8d752c8be46c7970b4956ae2031a38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a67e6b494618e268d6f5d75507c1cfe0e8e99c0e3cfff7548756a64f4f3a3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77543b04b619c48bf800a3d9dda1a969e4e32aa113a49287717a9e83f9cff96f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78fd1c6bdf08a557f0c2223c538a946709b9a225032c92ea75277140441a59c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db25dbee0954a026ecc2945dda40eb48c9caf78d31c44013055e9765024794d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54ed3d74941dcb3f33ee5231512aac9b31eaf9c55171df9201d2bd6da265954a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54ed3d74941dcb3f33ee5231512aac9b31eaf9c55171df9201d2bd6da265954a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T05:28:41Z\\\",\\\"message\\\":\\\"332 obj_retry.go:365] Adding new object: *v1.Pod 
openshift-dns/node-resolver-87h4n\\\\nI1129 05:28:41.803817 6332 default_network_controller.go:776] Recording success event on pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nF1129 05:28:41.803749 6332 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:41Z is after 2025-08-24T17:21:41Z]\\\\nI1129 05:28:41.803846 6332 services_controller.go:356] Processing sync for service openshift-network-diagnostics/network-check-target for network=default\\\\nI1129 05:28:41.803719 6332 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/iptables-alerter\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-lp4zm_openshift-ovn-kubernetes(08de5891-72ca-488c-80b3-6b54c8c1a66e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cc8b34a52032d4716652fc9edcc35dd797309bf6886242671c068e65399d004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b86e524b600a831fe6b8bb71a513172f974b457d39363da90cd6152139eae898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b86e524b600a831fe6
b8bb71a513172f974b457d39363da90cd6152139eae898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lp4zm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:06Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:06 crc kubenswrapper[4594]: I1129 05:29:06.562633 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zwqcq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad2c26bf-c1c8-44b5-b4ed-d487072d358b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9db8528eef5425d9af2b8f11f244556f195b844e239a2137eef24c40fa158d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9e8c27b9b7a4a13a1f6ae5461acd535cc6784f4158213697307e4061f1e222\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b9e8c27b9b7a4a13a1f6ae5461acd535cc6784f4158213697307e4061f1e222\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa32ebafd1f717a0d3ba44387b7b36827e9e12589812bdc33b1dfb84bd41772\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fa32ebafd1f717a0d3ba44387b7b36827e9e12589812bdc33b1dfb84bd41772\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec1085c6e05fe233979922ace4946592ed4a7afe00a0f1c3a256c41a0964d26a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1085c6e05fe233979922ace4946592ed4a7afe00a0f1c3a256c41a0964d26a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://819e8
c7a8ea3f0f7a082da904dd9cc01b5d7e7e6520cafe67920a190063947ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://819e8c7a8ea3f0f7a082da904dd9cc01b5d7e7e6520cafe67920a190063947ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1602d1ea5e426042c520a4f98ed23530fc8f94f67f242a639c32a29817495d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1602d1ea5e426042c520a4f98ed23530fc8f94f67f242a639c32a29817495d73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:22Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81866e9b3e8415fc1e5af0eec4a7222deab6b282b1758dae13759a073192a554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81866e9b3e8415fc1e5af0eec4a7222deab6b282b1758dae13759a073192a554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zwqcq\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:06Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:06 crc kubenswrapper[4594]: I1129 05:29:06.563759 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:06 crc kubenswrapper[4594]: I1129 05:29:06.563790 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:06 crc kubenswrapper[4594]: I1129 05:29:06.563800 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:06 crc kubenswrapper[4594]: I1129 05:29:06.563824 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:06 crc kubenswrapper[4594]: I1129 05:29:06.563835 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:06Z","lastTransitionTime":"2025-11-29T05:29:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:06 crc kubenswrapper[4594]: I1129 05:29:06.665832 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:06 crc kubenswrapper[4594]: I1129 05:29:06.665867 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:06 crc kubenswrapper[4594]: I1129 05:29:06.665877 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:06 crc kubenswrapper[4594]: I1129 05:29:06.665891 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:06 crc kubenswrapper[4594]: I1129 05:29:06.665902 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:06Z","lastTransitionTime":"2025-11-29T05:29:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:06 crc kubenswrapper[4594]: I1129 05:29:06.768304 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:06 crc kubenswrapper[4594]: I1129 05:29:06.768330 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:06 crc kubenswrapper[4594]: I1129 05:29:06.768340 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:06 crc kubenswrapper[4594]: I1129 05:29:06.768351 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:06 crc kubenswrapper[4594]: I1129 05:29:06.768359 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:06Z","lastTransitionTime":"2025-11-29T05:29:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:06 crc kubenswrapper[4594]: I1129 05:29:06.869938 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:06 crc kubenswrapper[4594]: I1129 05:29:06.869958 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:06 crc kubenswrapper[4594]: I1129 05:29:06.869966 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:06 crc kubenswrapper[4594]: I1129 05:29:06.869975 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:06 crc kubenswrapper[4594]: I1129 05:29:06.869983 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:06Z","lastTransitionTime":"2025-11-29T05:29:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:06 crc kubenswrapper[4594]: I1129 05:29:06.972282 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:06 crc kubenswrapper[4594]: I1129 05:29:06.972305 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:06 crc kubenswrapper[4594]: I1129 05:29:06.972315 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:06 crc kubenswrapper[4594]: I1129 05:29:06.972325 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:06 crc kubenswrapper[4594]: I1129 05:29:06.972337 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:06Z","lastTransitionTime":"2025-11-29T05:29:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:07 crc kubenswrapper[4594]: I1129 05:29:07.073731 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:07 crc kubenswrapper[4594]: I1129 05:29:07.073761 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:07 crc kubenswrapper[4594]: I1129 05:29:07.073769 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:07 crc kubenswrapper[4594]: I1129 05:29:07.073780 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:07 crc kubenswrapper[4594]: I1129 05:29:07.073787 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:07Z","lastTransitionTime":"2025-11-29T05:29:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:29:07 crc kubenswrapper[4594]: I1129 05:29:07.082961 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lzr56" Nov 29 05:29:07 crc kubenswrapper[4594]: E1129 05:29:07.083063 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lzr56" podUID="217088b9-a48b-40c7-8d83-f9ff0eb24908" Nov 29 05:29:07 crc kubenswrapper[4594]: I1129 05:29:07.174773 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:07 crc kubenswrapper[4594]: I1129 05:29:07.174795 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:07 crc kubenswrapper[4594]: I1129 05:29:07.174804 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:07 crc kubenswrapper[4594]: I1129 05:29:07.174827 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:07 crc kubenswrapper[4594]: I1129 05:29:07.174835 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:07Z","lastTransitionTime":"2025-11-29T05:29:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:07 crc kubenswrapper[4594]: I1129 05:29:07.277007 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:07 crc kubenswrapper[4594]: I1129 05:29:07.277033 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:07 crc kubenswrapper[4594]: I1129 05:29:07.277044 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:07 crc kubenswrapper[4594]: I1129 05:29:07.277056 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:07 crc kubenswrapper[4594]: I1129 05:29:07.277065 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:07Z","lastTransitionTime":"2025-11-29T05:29:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:07 crc kubenswrapper[4594]: I1129 05:29:07.379456 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:07 crc kubenswrapper[4594]: I1129 05:29:07.379477 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:07 crc kubenswrapper[4594]: I1129 05:29:07.379487 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:07 crc kubenswrapper[4594]: I1129 05:29:07.379499 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:07 crc kubenswrapper[4594]: I1129 05:29:07.379509 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:07Z","lastTransitionTime":"2025-11-29T05:29:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:07 crc kubenswrapper[4594]: I1129 05:29:07.481783 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:07 crc kubenswrapper[4594]: I1129 05:29:07.481835 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:07 crc kubenswrapper[4594]: I1129 05:29:07.481845 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:07 crc kubenswrapper[4594]: I1129 05:29:07.481856 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:07 crc kubenswrapper[4594]: I1129 05:29:07.481864 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:07Z","lastTransitionTime":"2025-11-29T05:29:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:07 crc kubenswrapper[4594]: I1129 05:29:07.583623 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:07 crc kubenswrapper[4594]: I1129 05:29:07.583667 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:07 crc kubenswrapper[4594]: I1129 05:29:07.583682 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:07 crc kubenswrapper[4594]: I1129 05:29:07.583696 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:07 crc kubenswrapper[4594]: I1129 05:29:07.583704 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:07Z","lastTransitionTime":"2025-11-29T05:29:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:07 crc kubenswrapper[4594]: I1129 05:29:07.684495 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:07 crc kubenswrapper[4594]: I1129 05:29:07.684527 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:07 crc kubenswrapper[4594]: I1129 05:29:07.684536 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:07 crc kubenswrapper[4594]: I1129 05:29:07.684550 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:07 crc kubenswrapper[4594]: I1129 05:29:07.684558 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:07Z","lastTransitionTime":"2025-11-29T05:29:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:07 crc kubenswrapper[4594]: E1129 05:29:07.694440 4594 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:29:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:29:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:29:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:29:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:29:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:29:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:29:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:29:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"278d67da-c900-4140-8f95-1590a0940f46\\\",\\\"systemUUID\\\":\\\"64455698-ce5a-4229-a093-dc9c12114354\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:07Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:07 crc kubenswrapper[4594]: I1129 05:29:07.697719 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:07 crc kubenswrapper[4594]: I1129 05:29:07.697766 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:07 crc kubenswrapper[4594]: I1129 05:29:07.697778 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:07 crc kubenswrapper[4594]: I1129 05:29:07.697790 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:07 crc kubenswrapper[4594]: I1129 05:29:07.697799 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:07Z","lastTransitionTime":"2025-11-29T05:29:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:07 crc kubenswrapper[4594]: E1129 05:29:07.705681 4594 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:29:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:29:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:29:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:29:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:29:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:29:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:29:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:29:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"278d67da-c900-4140-8f95-1590a0940f46\\\",\\\"systemUUID\\\":\\\"64455698-ce5a-4229-a093-dc9c12114354\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:07Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:07 crc kubenswrapper[4594]: I1129 05:29:07.708174 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:07 crc kubenswrapper[4594]: I1129 05:29:07.708218 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:07 crc kubenswrapper[4594]: I1129 05:29:07.708285 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:07 crc kubenswrapper[4594]: I1129 05:29:07.708301 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:07 crc kubenswrapper[4594]: I1129 05:29:07.708315 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:07Z","lastTransitionTime":"2025-11-29T05:29:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:07 crc kubenswrapper[4594]: E1129 05:29:07.716564 4594 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:29:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:29:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:29:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:29:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:29:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:29:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:29:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:29:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"278d67da-c900-4140-8f95-1590a0940f46\\\",\\\"systemUUID\\\":\\\"64455698-ce5a-4229-a093-dc9c12114354\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:07Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:07 crc kubenswrapper[4594]: I1129 05:29:07.718896 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:07 crc kubenswrapper[4594]: I1129 05:29:07.718950 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:07 crc kubenswrapper[4594]: I1129 05:29:07.718964 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:07 crc kubenswrapper[4594]: I1129 05:29:07.718974 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:07 crc kubenswrapper[4594]: I1129 05:29:07.718982 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:07Z","lastTransitionTime":"2025-11-29T05:29:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:07 crc kubenswrapper[4594]: E1129 05:29:07.727036 4594 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:29:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:29:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:29:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:29:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:29:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:29:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:29:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:29:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"278d67da-c900-4140-8f95-1590a0940f46\\\",\\\"systemUUID\\\":\\\"64455698-ce5a-4229-a093-dc9c12114354\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:07Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:07 crc kubenswrapper[4594]: I1129 05:29:07.729068 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:07 crc kubenswrapper[4594]: I1129 05:29:07.729101 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:07 crc kubenswrapper[4594]: I1129 05:29:07.729111 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:07 crc kubenswrapper[4594]: I1129 05:29:07.729126 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:07 crc kubenswrapper[4594]: I1129 05:29:07.729136 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:07Z","lastTransitionTime":"2025-11-29T05:29:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:07 crc kubenswrapper[4594]: E1129 05:29:07.737773 4594 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:29:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:29:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:29:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:29:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:29:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:29:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:29:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:29:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"278d67da-c900-4140-8f95-1590a0940f46\\\",\\\"systemUUID\\\":\\\"64455698-ce5a-4229-a093-dc9c12114354\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:07Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:07 crc kubenswrapper[4594]: E1129 05:29:07.737899 4594 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 29 05:29:07 crc kubenswrapper[4594]: I1129 05:29:07.739035 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:07 crc kubenswrapper[4594]: I1129 05:29:07.739067 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:07 crc kubenswrapper[4594]: I1129 05:29:07.739078 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:07 crc kubenswrapper[4594]: I1129 05:29:07.739091 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:07 crc kubenswrapper[4594]: I1129 05:29:07.739100 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:07Z","lastTransitionTime":"2025-11-29T05:29:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:07 crc kubenswrapper[4594]: I1129 05:29:07.846227 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:07 crc kubenswrapper[4594]: I1129 05:29:07.846394 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:07 crc kubenswrapper[4594]: I1129 05:29:07.846617 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:07 crc kubenswrapper[4594]: I1129 05:29:07.846806 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:07 crc kubenswrapper[4594]: I1129 05:29:07.846984 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:07Z","lastTransitionTime":"2025-11-29T05:29:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:07 crc kubenswrapper[4594]: I1129 05:29:07.949137 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:07 crc kubenswrapper[4594]: I1129 05:29:07.949241 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:07 crc kubenswrapper[4594]: I1129 05:29:07.949321 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:07 crc kubenswrapper[4594]: I1129 05:29:07.949387 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:07 crc kubenswrapper[4594]: I1129 05:29:07.949442 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:07Z","lastTransitionTime":"2025-11-29T05:29:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:08 crc kubenswrapper[4594]: I1129 05:29:08.051399 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:08 crc kubenswrapper[4594]: I1129 05:29:08.051449 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:08 crc kubenswrapper[4594]: I1129 05:29:08.051461 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:08 crc kubenswrapper[4594]: I1129 05:29:08.051475 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:08 crc kubenswrapper[4594]: I1129 05:29:08.051484 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:08Z","lastTransitionTime":"2025-11-29T05:29:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:29:08 crc kubenswrapper[4594]: I1129 05:29:08.082831 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 05:29:08 crc kubenswrapper[4594]: I1129 05:29:08.082920 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 05:29:08 crc kubenswrapper[4594]: E1129 05:29:08.083041 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 05:29:08 crc kubenswrapper[4594]: I1129 05:29:08.083068 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 05:29:08 crc kubenswrapper[4594]: E1129 05:29:08.083133 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 05:29:08 crc kubenswrapper[4594]: E1129 05:29:08.083218 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 05:29:08 crc kubenswrapper[4594]: I1129 05:29:08.153567 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:08 crc kubenswrapper[4594]: I1129 05:29:08.153641 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:08 crc kubenswrapper[4594]: I1129 05:29:08.153653 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:08 crc kubenswrapper[4594]: I1129 05:29:08.153671 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:08 crc kubenswrapper[4594]: I1129 05:29:08.153685 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:08Z","lastTransitionTime":"2025-11-29T05:29:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:08 crc kubenswrapper[4594]: I1129 05:29:08.255335 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:08 crc kubenswrapper[4594]: I1129 05:29:08.255368 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:08 crc kubenswrapper[4594]: I1129 05:29:08.255377 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:08 crc kubenswrapper[4594]: I1129 05:29:08.255389 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:08 crc kubenswrapper[4594]: I1129 05:29:08.255398 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:08Z","lastTransitionTime":"2025-11-29T05:29:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:08 crc kubenswrapper[4594]: I1129 05:29:08.357682 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:08 crc kubenswrapper[4594]: I1129 05:29:08.357705 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:08 crc kubenswrapper[4594]: I1129 05:29:08.357716 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:08 crc kubenswrapper[4594]: I1129 05:29:08.357728 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:08 crc kubenswrapper[4594]: I1129 05:29:08.357737 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:08Z","lastTransitionTime":"2025-11-29T05:29:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:08 crc kubenswrapper[4594]: I1129 05:29:08.459831 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:08 crc kubenswrapper[4594]: I1129 05:29:08.459866 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:08 crc kubenswrapper[4594]: I1129 05:29:08.459878 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:08 crc kubenswrapper[4594]: I1129 05:29:08.459890 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:08 crc kubenswrapper[4594]: I1129 05:29:08.459900 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:08Z","lastTransitionTime":"2025-11-29T05:29:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:08 crc kubenswrapper[4594]: I1129 05:29:08.561576 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:08 crc kubenswrapper[4594]: I1129 05:29:08.561611 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:08 crc kubenswrapper[4594]: I1129 05:29:08.561621 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:08 crc kubenswrapper[4594]: I1129 05:29:08.561634 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:08 crc kubenswrapper[4594]: I1129 05:29:08.561646 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:08Z","lastTransitionTime":"2025-11-29T05:29:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:08 crc kubenswrapper[4594]: I1129 05:29:08.663990 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:08 crc kubenswrapper[4594]: I1129 05:29:08.664016 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:08 crc kubenswrapper[4594]: I1129 05:29:08.664026 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:08 crc kubenswrapper[4594]: I1129 05:29:08.664039 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:08 crc kubenswrapper[4594]: I1129 05:29:08.664048 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:08Z","lastTransitionTime":"2025-11-29T05:29:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:08 crc kubenswrapper[4594]: I1129 05:29:08.765407 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:08 crc kubenswrapper[4594]: I1129 05:29:08.765439 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:08 crc kubenswrapper[4594]: I1129 05:29:08.765450 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:08 crc kubenswrapper[4594]: I1129 05:29:08.765460 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:08 crc kubenswrapper[4594]: I1129 05:29:08.765468 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:08Z","lastTransitionTime":"2025-11-29T05:29:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:08 crc kubenswrapper[4594]: I1129 05:29:08.867838 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:08 crc kubenswrapper[4594]: I1129 05:29:08.867879 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:08 crc kubenswrapper[4594]: I1129 05:29:08.867891 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:08 crc kubenswrapper[4594]: I1129 05:29:08.867908 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:08 crc kubenswrapper[4594]: I1129 05:29:08.867922 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:08Z","lastTransitionTime":"2025-11-29T05:29:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:08 crc kubenswrapper[4594]: I1129 05:29:08.969701 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:08 crc kubenswrapper[4594]: I1129 05:29:08.969732 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:08 crc kubenswrapper[4594]: I1129 05:29:08.969744 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:08 crc kubenswrapper[4594]: I1129 05:29:08.969755 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:08 crc kubenswrapper[4594]: I1129 05:29:08.969767 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:08Z","lastTransitionTime":"2025-11-29T05:29:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:09 crc kubenswrapper[4594]: I1129 05:29:09.071291 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:09 crc kubenswrapper[4594]: I1129 05:29:09.071322 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:09 crc kubenswrapper[4594]: I1129 05:29:09.071333 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:09 crc kubenswrapper[4594]: I1129 05:29:09.071344 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:09 crc kubenswrapper[4594]: I1129 05:29:09.071354 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:09Z","lastTransitionTime":"2025-11-29T05:29:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:29:09 crc kubenswrapper[4594]: I1129 05:29:09.082851 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lzr56" Nov 29 05:29:09 crc kubenswrapper[4594]: E1129 05:29:09.083006 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lzr56" podUID="217088b9-a48b-40c7-8d83-f9ff0eb24908" Nov 29 05:29:09 crc kubenswrapper[4594]: I1129 05:29:09.084865 4594 scope.go:117] "RemoveContainer" containerID="54ed3d74941dcb3f33ee5231512aac9b31eaf9c55171df9201d2bd6da265954a" Nov 29 05:29:09 crc kubenswrapper[4594]: I1129 05:29:09.173918 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:09 crc kubenswrapper[4594]: I1129 05:29:09.173949 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:09 crc kubenswrapper[4594]: I1129 05:29:09.173961 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:09 crc kubenswrapper[4594]: I1129 05:29:09.173974 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:09 crc kubenswrapper[4594]: I1129 05:29:09.173983 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:09Z","lastTransitionTime":"2025-11-29T05:29:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:09 crc kubenswrapper[4594]: I1129 05:29:09.275741 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:09 crc kubenswrapper[4594]: I1129 05:29:09.275785 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:09 crc kubenswrapper[4594]: I1129 05:29:09.275795 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:09 crc kubenswrapper[4594]: I1129 05:29:09.275808 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:09 crc kubenswrapper[4594]: I1129 05:29:09.275828 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:09Z","lastTransitionTime":"2025-11-29T05:29:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:09 crc kubenswrapper[4594]: I1129 05:29:09.378557 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:09 crc kubenswrapper[4594]: I1129 05:29:09.378609 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:09 crc kubenswrapper[4594]: I1129 05:29:09.378618 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:09 crc kubenswrapper[4594]: I1129 05:29:09.378643 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:09 crc kubenswrapper[4594]: I1129 05:29:09.378656 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:09Z","lastTransitionTime":"2025-11-29T05:29:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:09 crc kubenswrapper[4594]: I1129 05:29:09.433873 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lp4zm_08de5891-72ca-488c-80b3-6b54c8c1a66e/ovnkube-controller/2.log" Nov 29 05:29:09 crc kubenswrapper[4594]: I1129 05:29:09.436120 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" event={"ID":"08de5891-72ca-488c-80b3-6b54c8c1a66e","Type":"ContainerStarted","Data":"1fb430ec13486fa454ce5a026c9035daca3b18f8f31f2304bb6c5eb544778a85"} Nov 29 05:29:09 crc kubenswrapper[4594]: I1129 05:29:09.436458 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" Nov 29 05:29:09 crc kubenswrapper[4594]: I1129 05:29:09.446092 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c84b578286ef0da9c863bedb4ebec224e34007c7f59cead26dac786d7e4b5e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd
47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1975ee2ce0af02676f9925944a31b218cb17cdf4dfb85c6edbfd124ac58fd22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2025-11-29T05:29:09Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:09 crc kubenswrapper[4594]: I1129 05:29:09.453395 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-87h4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"080d2d67-a474-4974-94f3-81c5007e0a57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b58d5af901448b3e10a9a902daa31aecebe095bdae0e620c991323b99c52b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-gdnbj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-87h4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:09Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:09 crc kubenswrapper[4594]: I1129 05:29:09.462825 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1629901e-7133-4eb0-b634-5e458da9f205\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://583ca6d59501822022d5d15a450d4403592d248bce7be4b22b1c7baef4e93000\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825
771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b8d3be68869fef8b377ce1d84e3edf447496a867c40e88d013ef0d1c5629de2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6655c8a5c56e4d2b3e8fbd88308d04ded2b41f7243bf90cb2dbbd6ad93f3153f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"sta
rted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a87a0320d47b3753baeb865c7ef9e024c65a96555fdd71c0f2a23c8c0b80d351\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ba49c3a694bc245ce18b54b74156e6e7aaccd4fcf707264a278f86f52943392\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T05:28:13Z\\\",\\\"message\\\":\\\"le observer\\\\nW1129 05:28:13.316789 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1129 05:28:13.316951 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1129 05:28:13.320123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1464513827/tls.crt::/tmp/serving-cert-1464513827/tls.key\\\\\\\"\\\\nI1129 05:28:13.540439 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1129 05:28:13.545990 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1129 05:28:13.546014 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1129 05:28:13.546050 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1129 05:28:13.546055 1 maxinflight.go:120] 
\\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1129 05:28:13.551788 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1129 05:28:13.551806 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 05:28:13.551809 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 05:28:13.551814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1129 05:28:13.551817 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1129 05:28:13.551820 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1129 05:28:13.551823 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1129 05:28:13.551915 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1129 05:28:13.553356 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e070e0ea87ae0cfa8fcc6aadd13e7ae3c2ab3eb1ea3daf6178c9d35796ba3ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-11-29T05:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:27:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:09Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:09 crc kubenswrapper[4594]: I1129 05:29:09.471378 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7plzz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5052790-d231-4f97-802c-c7de3cd72561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:29:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:29:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc4e229bc1c5698fec5a7e0da8decb0167770496e7f8dce7a9a9276721ee26c\\\",\\\"image\\\":\\\"quay.io/openshift-release-de
v/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c23b1472cd349ae6a56137165463429204ffbaf2c1b477d3ba9d07d2f2799f9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T05:29:04Z\\\",\\\"message\\\":\\\"2025-11-29T05:28:19+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_c08f590a-ec87-48f6-839c-a6dd639e3ff9\\\\n2025-11-29T05:28:19+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c08f590a-ec87-48f6-839c-a6dd639e3ff9 to /host/opt/cni/bin/\\\\n2025-11-29T05:28:19Z [verbose] multus-daemon started\\\\n2025-11-29T05:28:19Z [verbose] Readiness Indicator file check\\\\n2025-11-29T05:29:04Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:29:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qglsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7plzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:09Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:09 crc kubenswrapper[4594]: I1129 05:29:09.478804 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vn6fz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"240291e8-d7ec-4fc7-919f-e082a6890e2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1142d4b616972d87b8ac7b83cacdc37c3fd02df7cb4adfcabbaaaa18c5d2b8d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf
09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x25zr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af579fe3e5cce8fdb2dff72f206f5504582b8eee906db5180191ef91b990c0a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x25zr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:30Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vn6fz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:09Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:09 crc kubenswrapper[4594]: I1129 05:29:09.480150 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:09 crc kubenswrapper[4594]: I1129 05:29:09.480183 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:09 crc kubenswrapper[4594]: I1129 05:29:09.480192 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:09 crc kubenswrapper[4594]: I1129 05:29:09.480207 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:09 crc kubenswrapper[4594]: I1129 05:29:09.480217 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:09Z","lastTransitionTime":"2025-11-29T05:29:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:09 crc kubenswrapper[4594]: I1129 05:29:09.485373 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-lzr56" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"217088b9-a48b-40c7-8d83-f9ff0eb24908\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8gwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8gwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:31Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lzr56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:09Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:09 crc 
kubenswrapper[4594]: I1129 05:29:09.493283 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://867c23355f828dc4cc15d632f90c52174e9b9be07f3ec24f05c904709620de6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:09Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:09 crc kubenswrapper[4594]: I1129 05:29:09.502719 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:09Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:09 crc kubenswrapper[4594]: I1129 05:29:09.511055 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:09Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:09 crc kubenswrapper[4594]: I1129 05:29:09.519340 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:09Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:09 crc kubenswrapper[4594]: I1129 05:29:09.535454 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08de5891-72ca-488c-80b3-6b54c8c1a66e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9f96824f977ec455622772e91ed8e6534dc93a71868e2b5e2aa9302ff9be135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cf2a15e6478f9f461a9927f570db4a81a8d752c8be46c7970b4956ae2031a38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a67e6b494618e268d6f5d75507c1cfe0e8e99c0e3cfff7548756a64f4f3a3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77543b04b619c48bf800a3d9dda1a969e4e32aa113a49287717a9e83f9cff96f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78fd1c6bdf08a557f0c2223c538a946709b9a225032c92ea75277140441a59c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db25dbee0954a026ecc2945dda40eb48c9caf78d31c44013055e9765024794d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fb430ec13486fa454ce5a026c9035daca3b18f8f31f2304bb6c5eb544778a85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54ed3d74941dcb3f33ee5231512aac9b31eaf9c55171df9201d2bd6da265954a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T05:28:41Z\\\",\\\"message\\\":\\\"332 obj_retry.go:365] Adding new object: *v1.Pod 
openshift-dns/node-resolver-87h4n\\\\nI1129 05:28:41.803817 6332 default_network_controller.go:776] Recording success event on pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nF1129 05:28:41.803749 6332 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:41Z is after 2025-08-24T17:21:41Z]\\\\nI1129 05:28:41.803846 6332 services_controller.go:356] Processing sync for service openshift-network-diagnostics/network-check-target for network=default\\\\nI1129 05:28:41.803719 6332 obj_retry.go:365] Adding new object: *v1.Pod 
openshift-network-operator/iptables-alerter\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:29:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\"
:\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cc8b34a52032d4716652fc9edcc35dd797309bf6886242671c068e65399d004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b86e524b600a831fe6b8bb71a513172f974b457d39363da90cd6152139eae898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b86e524b600a831fe6b8bb71a513172f974b457d39363da90cd6152139eae898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lp4zm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:09Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:09 crc kubenswrapper[4594]: I1129 05:29:09.545617 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zwqcq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad2c26bf-c1c8-44b5-b4ed-d487072d358b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9db8528eef5425d9af2b8f11f244556f195b844e239a2137eef24c40fa158d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9e8c27b9b7a4a13a1f6ae5461acd535cc6784f4158213697307e4061f1e222\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b9e8c27b9b7a4a13a1f6ae5461acd535cc6784f4158213697307e4061f1e222\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa32ebafd1f717a0d3ba44387b7b36827e9e12589812bdc33b1dfb84bd41772\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fa32ebafd1f717a0d3ba44387b7b36827e9e12589812bdc33b1dfb84bd41772\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec1085c6e05fe233979922ace4946592ed4a7afe00a0f1c3a256c41a0964d26a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1085c6e05fe233979922ace4946592ed4a7afe00a0f1c3a256c41a0964d26a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://819e8
c7a8ea3f0f7a082da904dd9cc01b5d7e7e6520cafe67920a190063947ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://819e8c7a8ea3f0f7a082da904dd9cc01b5d7e7e6520cafe67920a190063947ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1602d1ea5e426042c520a4f98ed23530fc8f94f67f242a639c32a29817495d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1602d1ea5e426042c520a4f98ed23530fc8f94f67f242a639c32a29817495d73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:22Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81866e9b3e8415fc1e5af0eec4a7222deab6b282b1758dae13759a073192a554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81866e9b3e8415fc1e5af0eec4a7222deab6b282b1758dae13759a073192a554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zwqcq\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:09Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:09 crc kubenswrapper[4594]: I1129 05:29:09.552892 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cbcda3d-f02f-4e3e-9560-a1dcb0e9b659\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95cdf0c76e877968d93b7b86fcdf319cfd2072ceb45903b275dfbe38db4a4e92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c7130e6d58c27093d1efa4f7260aba78681945df2d956da7878075ba5eb1d05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe6aa22ca88502e90dc3ea058d420a497838102af8aa1afc78edc4a52bed8a6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d72ff5cb72ae536180cf8ebbc
6af9c3a01bc453e5d917c8c7060fbd987de73c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d72ff5cb72ae536180cf8ebbc6af9c3a01bc453e5d917c8c7060fbd987de73c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:27:56Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:27:56Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:09Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:09 crc kubenswrapper[4594]: I1129 05:29:09.560043 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"509844d4-3ee1-4059-84bb-6e90200f50c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5e396522be1cb7984f98fd416f9a6aeda1e6009e002e9c942a6cec0d4a4c22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4qlc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb25cb1d2d2a3aa84bc6fca3087cdf54b3b8e75b
c258441b2895f37594df44e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4qlc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ggz4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:09Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:09 crc kubenswrapper[4594]: I1129 05:29:09.566668 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-26zjk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"77af4f9f-196d-436d-9fb6-69168bcb5f8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70467de0fada47ce38faab4cc04c54942aa5f7c0cacf29b1d21b64a11e6162c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxrks\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-26zjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:09Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:09 crc kubenswrapper[4594]: I1129 05:29:09.574493 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3be916e6-7f0f-45ae-8fe5-85d2c7e459f0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf23cd4a300a49fabdf8fcaacfda760563c8e53cf70de7a4c07be717fd76b660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a960cfe07ae731cc2d07957dba63be2897328d662dd2672bd61344c8baac0228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b416f8ad36a09f58e68b1e23784b344534ccfb013a3272c0754602431a3a3737\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://261ab2af67dd877cead51f204629b59233865717d4805c3d4d80c1a5d51e2021\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:27:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:09Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:09 crc kubenswrapper[4594]: I1129 05:29:09.582578 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:09 crc kubenswrapper[4594]: I1129 05:29:09.582609 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 29 05:29:09 crc kubenswrapper[4594]: I1129 05:29:09.582618 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:09 crc kubenswrapper[4594]: I1129 05:29:09.582635 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:09 crc kubenswrapper[4594]: I1129 05:29:09.582645 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:09Z","lastTransitionTime":"2025-11-29T05:29:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:29:09 crc kubenswrapper[4594]: I1129 05:29:09.582924 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba18db1b4391204c0fe395488d2726d914dc28c88b9cf449d83c9c181a7d04a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:09Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:09 crc kubenswrapper[4594]: I1129 05:29:09.684864 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:09 crc kubenswrapper[4594]: I1129 05:29:09.684903 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:09 crc kubenswrapper[4594]: I1129 05:29:09.684913 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:09 crc kubenswrapper[4594]: I1129 05:29:09.684928 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:09 crc kubenswrapper[4594]: I1129 05:29:09.684937 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:09Z","lastTransitionTime":"2025-11-29T05:29:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:09 crc kubenswrapper[4594]: I1129 05:29:09.786989 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:09 crc kubenswrapper[4594]: I1129 05:29:09.787014 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:09 crc kubenswrapper[4594]: I1129 05:29:09.787023 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:09 crc kubenswrapper[4594]: I1129 05:29:09.787048 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:09 crc kubenswrapper[4594]: I1129 05:29:09.787057 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:09Z","lastTransitionTime":"2025-11-29T05:29:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:09 crc kubenswrapper[4594]: I1129 05:29:09.889010 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:09 crc kubenswrapper[4594]: I1129 05:29:09.889041 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:09 crc kubenswrapper[4594]: I1129 05:29:09.889049 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:09 crc kubenswrapper[4594]: I1129 05:29:09.889062 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:09 crc kubenswrapper[4594]: I1129 05:29:09.889072 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:09Z","lastTransitionTime":"2025-11-29T05:29:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:09 crc kubenswrapper[4594]: I1129 05:29:09.991250 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:09 crc kubenswrapper[4594]: I1129 05:29:09.991302 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:09 crc kubenswrapper[4594]: I1129 05:29:09.991311 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:09 crc kubenswrapper[4594]: I1129 05:29:09.991324 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:09 crc kubenswrapper[4594]: I1129 05:29:09.991333 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:09Z","lastTransitionTime":"2025-11-29T05:29:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:29:10 crc kubenswrapper[4594]: I1129 05:29:10.082645 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 05:29:10 crc kubenswrapper[4594]: I1129 05:29:10.082674 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 05:29:10 crc kubenswrapper[4594]: I1129 05:29:10.082646 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 05:29:10 crc kubenswrapper[4594]: E1129 05:29:10.082757 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 05:29:10 crc kubenswrapper[4594]: E1129 05:29:10.082862 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 05:29:10 crc kubenswrapper[4594]: E1129 05:29:10.082990 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 05:29:10 crc kubenswrapper[4594]: I1129 05:29:10.093099 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:10 crc kubenswrapper[4594]: I1129 05:29:10.093129 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:10 crc kubenswrapper[4594]: I1129 05:29:10.093137 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:10 crc kubenswrapper[4594]: I1129 05:29:10.093147 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:10 crc kubenswrapper[4594]: I1129 05:29:10.093154 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:10Z","lastTransitionTime":"2025-11-29T05:29:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:10 crc kubenswrapper[4594]: I1129 05:29:10.194312 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:10 crc kubenswrapper[4594]: I1129 05:29:10.194352 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:10 crc kubenswrapper[4594]: I1129 05:29:10.194363 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:10 crc kubenswrapper[4594]: I1129 05:29:10.194373 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:10 crc kubenswrapper[4594]: I1129 05:29:10.194380 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:10Z","lastTransitionTime":"2025-11-29T05:29:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:10 crc kubenswrapper[4594]: I1129 05:29:10.296416 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:10 crc kubenswrapper[4594]: I1129 05:29:10.296444 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:10 crc kubenswrapper[4594]: I1129 05:29:10.296454 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:10 crc kubenswrapper[4594]: I1129 05:29:10.296465 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:10 crc kubenswrapper[4594]: I1129 05:29:10.296473 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:10Z","lastTransitionTime":"2025-11-29T05:29:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:10 crc kubenswrapper[4594]: I1129 05:29:10.398376 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:10 crc kubenswrapper[4594]: I1129 05:29:10.398405 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:10 crc kubenswrapper[4594]: I1129 05:29:10.398413 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:10 crc kubenswrapper[4594]: I1129 05:29:10.398423 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:10 crc kubenswrapper[4594]: I1129 05:29:10.398431 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:10Z","lastTransitionTime":"2025-11-29T05:29:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:10 crc kubenswrapper[4594]: I1129 05:29:10.439953 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lp4zm_08de5891-72ca-488c-80b3-6b54c8c1a66e/ovnkube-controller/3.log" Nov 29 05:29:10 crc kubenswrapper[4594]: I1129 05:29:10.440432 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lp4zm_08de5891-72ca-488c-80b3-6b54c8c1a66e/ovnkube-controller/2.log" Nov 29 05:29:10 crc kubenswrapper[4594]: I1129 05:29:10.442548 4594 generic.go:334] "Generic (PLEG): container finished" podID="08de5891-72ca-488c-80b3-6b54c8c1a66e" containerID="1fb430ec13486fa454ce5a026c9035daca3b18f8f31f2304bb6c5eb544778a85" exitCode=1 Nov 29 05:29:10 crc kubenswrapper[4594]: I1129 05:29:10.442578 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" event={"ID":"08de5891-72ca-488c-80b3-6b54c8c1a66e","Type":"ContainerDied","Data":"1fb430ec13486fa454ce5a026c9035daca3b18f8f31f2304bb6c5eb544778a85"} Nov 29 05:29:10 crc kubenswrapper[4594]: I1129 05:29:10.442605 4594 scope.go:117] "RemoveContainer" containerID="54ed3d74941dcb3f33ee5231512aac9b31eaf9c55171df9201d2bd6da265954a" Nov 29 05:29:10 crc kubenswrapper[4594]: I1129 05:29:10.443097 4594 scope.go:117] "RemoveContainer" containerID="1fb430ec13486fa454ce5a026c9035daca3b18f8f31f2304bb6c5eb544778a85" Nov 29 05:29:10 crc kubenswrapper[4594]: E1129 05:29:10.443240 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-lp4zm_openshift-ovn-kubernetes(08de5891-72ca-488c-80b3-6b54c8c1a66e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" podUID="08de5891-72ca-488c-80b3-6b54c8c1a66e" Nov 29 05:29:10 crc kubenswrapper[4594]: I1129 05:29:10.453066 4594 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1629901e-7133-4eb0-b634-5e458da9f205\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://583ca6d59501822022d5d15a450d4403592d248bce7be4b22b1c7baef4e93000\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b8d3be68869fef8b377ce1d84e3edf447496a867c40e88d013ef0d1c5629de2\\\",\\\"image\\\":\\\"quay.io/crcont/o
penshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6655c8a5c56e4d2b3e8fbd88308d04ded2b41f7243bf90cb2dbbd6ad93f3153f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a87a0320d47b3753baeb865c7ef9e024c65a96555fdd71c0f2a23c8c0b80d351\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ba49c3a694bc245ce18b54b74156e6e7aaccd4fcf7
07264a278f86f52943392\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T05:28:13Z\\\",\\\"message\\\":\\\"le observer\\\\nW1129 05:28:13.316789 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1129 05:28:13.316951 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1129 05:28:13.320123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1464513827/tls.crt::/tmp/serving-cert-1464513827/tls.key\\\\\\\"\\\\nI1129 05:28:13.540439 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1129 05:28:13.545990 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1129 05:28:13.546014 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1129 05:28:13.546050 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1129 05:28:13.546055 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1129 05:28:13.551788 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1129 05:28:13.551806 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 05:28:13.551809 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 05:28:13.551814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1129 05:28:13.551817 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1129 05:28:13.551820 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1129 05:28:13.551823 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' 
detected.\\\\nI1129 05:28:13.551915 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1129 05:28:13.553356 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e070e0ea87ae0cfa8fcc6aadd13e7ae3c2ab3eb1ea3daf6178c9d35796ba3ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"s
tate\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:27:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:10Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:10 crc kubenswrapper[4594]: I1129 05:29:10.461915 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c84b578286ef0da9c863bedb4ebec224e34007c7f59cead26dac786d7e4b5e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1975ee2ce0af02676f9925944a31b218cb17cdf4dfb85c6edbfd124ac58fd22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:10Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:10 crc kubenswrapper[4594]: I1129 05:29:10.468988 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-87h4n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"080d2d67-a474-4974-94f3-81c5007e0a57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b58d5af901448b3e10a9a902daa31aecebe095bdae0e620c991323b99c52b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdnbj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-87h4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:10Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:10 crc kubenswrapper[4594]: I1129 05:29:10.476665 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://867c23355f828dc4cc15d632f90c52174e9b9be07f3ec24f05c904709620de6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-al
erter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:10Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:10 crc kubenswrapper[4594]: I1129 05:29:10.484139 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:10Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:10 crc kubenswrapper[4594]: I1129 05:29:10.491887 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7plzz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5052790-d231-4f97-802c-c7de3cd72561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:29:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:29:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc4e229bc1c5698fec5a7e0da8decb0167770496e7f8dce7a9a9276721ee26c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c23b1472cd349ae6a56137165463429204ffbaf2c1b477d3ba9d07d2f2799f9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T05:29:04Z\\\",\\\"message\\\":\\\"2025-11-29T05:28:19+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_c08f590a-ec87-48f6-839c-a6dd639e3ff9\\\\n2025-11-29T05:28:19+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c08f590a-ec87-48f6-839c-a6dd639e3ff9 to /host/opt/cni/bin/\\\\n2025-11-29T05:28:19Z [verbose] multus-daemon started\\\\n2025-11-29T05:28:19Z [verbose] 
Readiness Indicator file check\\\\n2025-11-29T05:29:04Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:29:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qglsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7plzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:10Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:10 crc kubenswrapper[4594]: I1129 05:29:10.499099 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vn6fz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"240291e8-d7ec-4fc7-919f-e082a6890e2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1142d4b616972d87b8ac7b83cacdc37c3fd02df7cb4adfcabbaaaa18c5d2b8d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x25zr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af579fe3e5cce8fdb2dff72f206f5504582b8
eee906db5180191ef91b990c0a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x25zr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vn6fz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:10Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:10 crc kubenswrapper[4594]: I1129 05:29:10.500109 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:10 crc kubenswrapper[4594]: I1129 05:29:10.500139 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:10 crc kubenswrapper[4594]: I1129 05:29:10.500148 4594 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:10 crc kubenswrapper[4594]: I1129 05:29:10.500162 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:10 crc kubenswrapper[4594]: I1129 05:29:10.500172 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:10Z","lastTransitionTime":"2025-11-29T05:29:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:29:10 crc kubenswrapper[4594]: I1129 05:29:10.506432 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-lzr56" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"217088b9-a48b-40c7-8d83-f9ff0eb24908\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8gwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8gwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:31Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lzr56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:10Z is after 2025-08-24T17:21:41Z" Nov 
29 05:29:10 crc kubenswrapper[4594]: I1129 05:29:10.513788 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cbcda3d-f02f-4e3e-9560-a1dcb0e9b659\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95cdf0c76e877968d93b7b86fcdf319cfd2072ceb45903b275dfbe38db4a4e92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c7130e6d58c27093d1efa4f7260aba78681945df2d956da7878075ba5eb1d05\\\",\\
\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe6aa22ca88502e90dc3ea058d420a497838102af8aa1afc78edc4a52bed8a6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d72ff5cb72ae536180cf8ebbc6af9c3a01bc453e5d917c8c7060fbd987de73c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b
89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d72ff5cb72ae536180cf8ebbc6af9c3a01bc453e5d917c8c7060fbd987de73c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:27:56Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:27:56Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:10Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:10 crc kubenswrapper[4594]: I1129 05:29:10.523866 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:10Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:10 crc kubenswrapper[4594]: I1129 05:29:10.531980 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:10Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:10 crc kubenswrapper[4594]: I1129 05:29:10.543802 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08de5891-72ca-488c-80b3-6b54c8c1a66e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9f96824f977ec455622772e91ed8e6534dc93a71868e2b5e2aa9302ff9be135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cf2a15e6478f9f461a9927f570db4a81a8d752c8be46c7970b4956ae2031a38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a67e6b494618e268d6f5d75507c1cfe0e8e99c0e3cfff7548756a64f4f3a3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77543b04b619c48bf800a3d9dda1a969e4e32aa113a49287717a9e83f9cff96f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78fd1c6bdf08a557f0c2223c538a946709b9a225032c92ea75277140441a59c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db25dbee0954a026ecc2945dda40eb48c9caf78d31c44013055e9765024794d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fb430ec13486fa454ce5a026c9035daca3b18f8f31f2304bb6c5eb544778a85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54ed3d74941dcb3f33ee5231512aac9b31eaf9c55171df9201d2bd6da265954a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T05:28:41Z\\\",\\\"message\\\":\\\"332 obj_retry.go:365] Adding new object: *v1.Pod 
openshift-dns/node-resolver-87h4n\\\\nI1129 05:28:41.803817 6332 default_network_controller.go:776] Recording success event on pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nF1129 05:28:41.803749 6332 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:28:41Z is after 2025-08-24T17:21:41Z]\\\\nI1129 05:28:41.803846 6332 services_controller.go:356] Processing sync for service openshift-network-diagnostics/network-check-target for network=default\\\\nI1129 05:28:41.803719 6332 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/iptables-alerter\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fb430ec13486fa454ce5a026c9035daca3b18f8f31f2304bb6c5eb544778a85\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T05:29:09Z\\\",\\\"message\\\":\\\"nsuring zone local for Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vn6fz in node crc\\\\nI1129 05:29:09.714087 6708 ovn.go:134] Ensuring zone local for Pod openshift-kube-apiserver/kube-apiserver-crc in node crc\\\\nI1129 
05:29:09.714092 6708 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-apiserver/kube-apiserver-crc after 0 failed attempt(s)\\\\nI1129 05:29:09.714099 6708 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1129 05:29:09.714102 6708 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1129 05:29:09.714109 6708 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/network-metrics-daemon-lzr56\\\\nI1129 05:29:09.714113 6708 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/network-metrics-daemon-lzr56\\\\nI1129 05:29:09.714118 6708 ovn.go:134] Ensuring zone local for Pod openshift-multus/network-metrics-daemon-lzr56 in node crc\\\\nI1129 05:29:09.714137 6708 base_network_controller_pods.go:477] [default/openshift-multus/network-metrics-daemon-lzr56] creating logical port openshift-multus_network-metrics-daemon-lzr56 for pod on switch crc\\\\nF1129 05:29:09.714159 6708 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T05:29:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cc8b34a52032d4716652fc9edcc35dd797309bf6886242671c068e65399d004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b86e524b600a831fe6b8bb71a513172f974b457d39363da90cd6152139eae898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b86e524b600a831fe6b8bb71a513172f974b457d39363da90cd6152139eae
898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lp4zm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:10Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:10 crc kubenswrapper[4594]: I1129 05:29:10.553018 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zwqcq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad2c26bf-c1c8-44b5-b4ed-d487072d358b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9db8528eef5425d9af2b8f11f244556f195b844e239a2137eef24c40fa158d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9e8c27b9b7a4a13a1f6ae5461acd535cc6784f4158213697307e4061f1e222\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b9e8c27b9b7a4a13a1f6ae5461acd535cc6784f4158213697307e4061f1e222\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa32ebafd1f717a0d3ba44387b7b36827e9e12589812bdc33b1dfb84bd41772\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fa32ebafd1f717a0d3ba44387b7b36827e9e12589812bdc33b1dfb84bd41772\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec1085c6e05fe233979922ace4946592ed4a7afe00a0f1c3a256c41a0964d26a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1085c6e05fe233979922ace4946592ed4a7afe00a0f1c3a256c41a0964d26a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://819e8
c7a8ea3f0f7a082da904dd9cc01b5d7e7e6520cafe67920a190063947ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://819e8c7a8ea3f0f7a082da904dd9cc01b5d7e7e6520cafe67920a190063947ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1602d1ea5e426042c520a4f98ed23530fc8f94f67f242a639c32a29817495d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1602d1ea5e426042c520a4f98ed23530fc8f94f67f242a639c32a29817495d73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:22Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81866e9b3e8415fc1e5af0eec4a7222deab6b282b1758dae13759a073192a554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81866e9b3e8415fc1e5af0eec4a7222deab6b282b1758dae13759a073192a554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zwqcq\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:10Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:10 crc kubenswrapper[4594]: I1129 05:29:10.561302 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3be916e6-7f0f-45ae-8fe5-85d2c7e459f0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf23cd4a300a49fabdf8fcaacfda760563c8e53cf70de7a4c07be717fd76b660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a960cfe07ae731cc2d07957dba63be2897328d662dd2672bd61344c8baac0228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b416f8ad36a09f58e68b1e23784b344534ccfb013a3272c0754602431a3a3737\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://261ab2af67dd877cead51f204629b59233865717d4805c3d4d80c1a5d51e2021\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:27:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:10Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:10 crc kubenswrapper[4594]: I1129 05:29:10.570442 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba18db1b4391204c0fe395488d2726d914dc28c88b9cf449d83c9c181a7d04a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:10Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:10 crc kubenswrapper[4594]: I1129 05:29:10.578775 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"509844d4-3ee1-4059-84bb-6e90200f50c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5e396522be1cb7984f98fd416f9a6aeda1e6009e002e9c942a6cec0d4a4c22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4qlc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb25cb1d2d2a3aa84bc6fca3087cdf54b3b8e75bc258441b2895f37594df44e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4qlc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ggz4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:10Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:10 crc kubenswrapper[4594]: I1129 05:29:10.585595 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-26zjk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"77af4f9f-196d-436d-9fb6-69168bcb5f8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70467de0fada47ce38faab4cc04c54942aa5f7c0cacf29b1d21b64a11e6162c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxrks\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-26zjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:10Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:10 crc kubenswrapper[4594]: I1129 05:29:10.602060 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:10 crc kubenswrapper[4594]: I1129 05:29:10.602092 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:10 crc kubenswrapper[4594]: I1129 05:29:10.602102 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:10 crc kubenswrapper[4594]: I1129 05:29:10.602118 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:10 crc kubenswrapper[4594]: I1129 05:29:10.602128 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:10Z","lastTransitionTime":"2025-11-29T05:29:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:10 crc kubenswrapper[4594]: I1129 05:29:10.706221 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:10 crc kubenswrapper[4594]: I1129 05:29:10.706297 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:10 crc kubenswrapper[4594]: I1129 05:29:10.706308 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:10 crc kubenswrapper[4594]: I1129 05:29:10.706320 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:10 crc kubenswrapper[4594]: I1129 05:29:10.706332 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:10Z","lastTransitionTime":"2025-11-29T05:29:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:10 crc kubenswrapper[4594]: I1129 05:29:10.808201 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:10 crc kubenswrapper[4594]: I1129 05:29:10.808237 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:10 crc kubenswrapper[4594]: I1129 05:29:10.808246 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:10 crc kubenswrapper[4594]: I1129 05:29:10.808279 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:10 crc kubenswrapper[4594]: I1129 05:29:10.808290 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:10Z","lastTransitionTime":"2025-11-29T05:29:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:10 crc kubenswrapper[4594]: I1129 05:29:10.909511 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:10 crc kubenswrapper[4594]: I1129 05:29:10.909544 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:10 crc kubenswrapper[4594]: I1129 05:29:10.909553 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:10 crc kubenswrapper[4594]: I1129 05:29:10.909562 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:10 crc kubenswrapper[4594]: I1129 05:29:10.909570 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:10Z","lastTransitionTime":"2025-11-29T05:29:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:11 crc kubenswrapper[4594]: I1129 05:29:11.011668 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:11 crc kubenswrapper[4594]: I1129 05:29:11.011698 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:11 crc kubenswrapper[4594]: I1129 05:29:11.011707 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:11 crc kubenswrapper[4594]: I1129 05:29:11.011719 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:11 crc kubenswrapper[4594]: I1129 05:29:11.011727 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:11Z","lastTransitionTime":"2025-11-29T05:29:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:29:11 crc kubenswrapper[4594]: I1129 05:29:11.082985 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lzr56" Nov 29 05:29:11 crc kubenswrapper[4594]: E1129 05:29:11.083085 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lzr56" podUID="217088b9-a48b-40c7-8d83-f9ff0eb24908" Nov 29 05:29:11 crc kubenswrapper[4594]: I1129 05:29:11.113616 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:11 crc kubenswrapper[4594]: I1129 05:29:11.113647 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:11 crc kubenswrapper[4594]: I1129 05:29:11.113656 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:11 crc kubenswrapper[4594]: I1129 05:29:11.113666 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:11 crc kubenswrapper[4594]: I1129 05:29:11.113674 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:11Z","lastTransitionTime":"2025-11-29T05:29:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:11 crc kubenswrapper[4594]: I1129 05:29:11.215328 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:11 crc kubenswrapper[4594]: I1129 05:29:11.215359 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:11 crc kubenswrapper[4594]: I1129 05:29:11.215367 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:11 crc kubenswrapper[4594]: I1129 05:29:11.215378 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:11 crc kubenswrapper[4594]: I1129 05:29:11.215385 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:11Z","lastTransitionTime":"2025-11-29T05:29:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:11 crc kubenswrapper[4594]: I1129 05:29:11.317216 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:11 crc kubenswrapper[4594]: I1129 05:29:11.317245 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:11 crc kubenswrapper[4594]: I1129 05:29:11.317271 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:11 crc kubenswrapper[4594]: I1129 05:29:11.317282 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:11 crc kubenswrapper[4594]: I1129 05:29:11.317289 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:11Z","lastTransitionTime":"2025-11-29T05:29:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:11 crc kubenswrapper[4594]: I1129 05:29:11.418882 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:11 crc kubenswrapper[4594]: I1129 05:29:11.418906 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:11 crc kubenswrapper[4594]: I1129 05:29:11.418915 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:11 crc kubenswrapper[4594]: I1129 05:29:11.418925 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:11 crc kubenswrapper[4594]: I1129 05:29:11.418932 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:11Z","lastTransitionTime":"2025-11-29T05:29:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:11 crc kubenswrapper[4594]: I1129 05:29:11.446851 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lp4zm_08de5891-72ca-488c-80b3-6b54c8c1a66e/ovnkube-controller/3.log" Nov 29 05:29:11 crc kubenswrapper[4594]: I1129 05:29:11.449722 4594 scope.go:117] "RemoveContainer" containerID="1fb430ec13486fa454ce5a026c9035daca3b18f8f31f2304bb6c5eb544778a85" Nov 29 05:29:11 crc kubenswrapper[4594]: E1129 05:29:11.449910 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-lp4zm_openshift-ovn-kubernetes(08de5891-72ca-488c-80b3-6b54c8c1a66e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" podUID="08de5891-72ca-488c-80b3-6b54c8c1a66e" Nov 29 05:29:11 crc kubenswrapper[4594]: I1129 05:29:11.459089 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:11Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:11 crc kubenswrapper[4594]: I1129 05:29:11.468187 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7plzz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5052790-d231-4f97-802c-c7de3cd72561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:29:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:29:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc4e229bc1c5698fec5a7e0da8decb0167770496e7f8dce7a9a9276721ee26c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c23b1472cd349ae6a56137165463429204ffbaf2c1b477d3ba9d07d2f2799f9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T05:29:04Z\\\",\\\"message\\\":\\\"2025-11-29T05:28:19+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_c08f590a-ec87-48f6-839c-a6dd639e3ff9\\\\n2025-11-29T05:28:19+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c08f590a-ec87-48f6-839c-a6dd639e3ff9 to /host/opt/cni/bin/\\\\n2025-11-29T05:28:19Z [verbose] multus-daemon started\\\\n2025-11-29T05:28:19Z [verbose] 
Readiness Indicator file check\\\\n2025-11-29T05:29:04Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:29:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qglsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7plzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:11Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:11 crc kubenswrapper[4594]: I1129 05:29:11.475480 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vn6fz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"240291e8-d7ec-4fc7-919f-e082a6890e2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1142d4b616972d87b8ac7b83cacdc37c3fd02df7cb4adfcabbaaaa18c5d2b8d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x25zr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af579fe3e5cce8fdb2dff72f206f5504582b8
eee906db5180191ef91b990c0a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x25zr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vn6fz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:11Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:11 crc kubenswrapper[4594]: I1129 05:29:11.482278 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-lzr56" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"217088b9-a48b-40c7-8d83-f9ff0eb24908\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8gwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8gwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:31Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lzr56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:11Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:11 crc 
kubenswrapper[4594]: I1129 05:29:11.489597 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://867c23355f828dc4cc15d632f90c52174e9b9be07f3ec24f05c904709620de6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:11Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:11 crc kubenswrapper[4594]: I1129 05:29:11.497461 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cbcda3d-f02f-4e3e-9560-a1dcb0e9b659\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95cdf0c76e877968d93b7b86fcdf319cfd2072ceb45903b275dfbe38db4a4e92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c7130e6d58c27093d1efa4f7260aba78681945df2d956da7878075ba5eb1d05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe6aa22ca88502e90dc3ea058d420a497838102af8aa1afc78edc4a52bed8a6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d72ff5cb72ae536180cf8ebbc6af9c3a0
1bc453e5d917c8c7060fbd987de73c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d72ff5cb72ae536180cf8ebbc6af9c3a01bc453e5d917c8c7060fbd987de73c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:27:56Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:27:56Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:11Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:11 crc kubenswrapper[4594]: I1129 05:29:11.505373 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:11Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:11 crc kubenswrapper[4594]: I1129 05:29:11.512928 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:11Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:11 crc kubenswrapper[4594]: I1129 05:29:11.520678 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:11 crc kubenswrapper[4594]: I1129 05:29:11.520706 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:11 crc kubenswrapper[4594]: I1129 05:29:11.520717 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:11 crc 
kubenswrapper[4594]: I1129 05:29:11.520729 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:11 crc kubenswrapper[4594]: I1129 05:29:11.520738 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:11Z","lastTransitionTime":"2025-11-29T05:29:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:29:11 crc kubenswrapper[4594]: I1129 05:29:11.525311 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08de5891-72ca-488c-80b3-6b54c8c1a66e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9f96824f977ec455622772e91ed8e6534dc93a71868e2b5e2aa9302ff9be135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cf2a15e6478f9f461a9927f570db4a81a8d752c8be46c7970b4956ae2031a38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a67e6b494618e268d6f5d75507c1cfe0e8e99c0e3cfff7548756a64f4f3a3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77543b04b619c48bf800a3d9dda1a969e4e32aa113a49287717a9e83f9cff96f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78fd1c6bdf08a557f0c2223c538a946709b9a225032c92ea75277140441a59c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db25dbee0954a026ecc2945dda40eb48c9caf78d31c44013055e9765024794d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fb430ec13486fa454ce5a026c9035daca3b18f8f31f2304bb6c5eb544778a85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fb430ec13486fa454ce5a026c9035daca3b18f8f31f2304bb6c5eb544778a85\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T05:29:09Z\\\",\\\"message\\\":\\\"nsuring zone local for Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vn6fz in node crc\\\\nI1129 05:29:09.714087 6708 ovn.go:134] Ensuring zone local for Pod openshift-kube-apiserver/kube-apiserver-crc in node crc\\\\nI1129 05:29:09.714092 6708 obj_retry.go:386] Retry successful for *v1.Pod 
openshift-kube-apiserver/kube-apiserver-crc after 0 failed attempt(s)\\\\nI1129 05:29:09.714099 6708 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1129 05:29:09.714102 6708 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1129 05:29:09.714109 6708 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/network-metrics-daemon-lzr56\\\\nI1129 05:29:09.714113 6708 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/network-metrics-daemon-lzr56\\\\nI1129 05:29:09.714118 6708 ovn.go:134] Ensuring zone local for Pod openshift-multus/network-metrics-daemon-lzr56 in node crc\\\\nI1129 05:29:09.714137 6708 base_network_controller_pods.go:477] [default/openshift-multus/network-metrics-daemon-lzr56] creating logical port openshift-multus_network-metrics-daemon-lzr56 for pod on switch crc\\\\nF1129 05:29:09.714159 6708 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T05:29:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-lp4zm_openshift-ovn-kubernetes(08de5891-72ca-488c-80b3-6b54c8c1a66e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cc8b34a52032d4716652fc9edcc35dd797309bf6886242671c068e65399d004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b86e524b600a831fe6b8bb71a513172f974b457d39363da90cd6152139eae898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b86e524b600a831fe6
b8bb71a513172f974b457d39363da90cd6152139eae898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lp4zm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:11Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:11 crc kubenswrapper[4594]: I1129 05:29:11.534644 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zwqcq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad2c26bf-c1c8-44b5-b4ed-d487072d358b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9db8528eef5425d9af2b8f11f244556f195b844e239a2137eef24c40fa158d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9e8c27b9b7a4a13a1f6ae5461acd535cc6784f4158213697307e4061f1e222\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b9e8c27b9b7a4a13a1f6ae5461acd535cc6784f4158213697307e4061f1e222\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa32ebafd1f717a0d3ba44387b7b36827e9e12589812bdc33b1dfb84bd41772\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fa32ebafd1f717a0d3ba44387b7b36827e9e12589812bdc33b1dfb84bd41772\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec1085c6e05fe233979922ace4946592ed4a7afe00a0f1c3a256c41a0964d26a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1085c6e05fe233979922ace4946592ed4a7afe00a0f1c3a256c41a0964d26a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://819e8
c7a8ea3f0f7a082da904dd9cc01b5d7e7e6520cafe67920a190063947ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://819e8c7a8ea3f0f7a082da904dd9cc01b5d7e7e6520cafe67920a190063947ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1602d1ea5e426042c520a4f98ed23530fc8f94f67f242a639c32a29817495d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1602d1ea5e426042c520a4f98ed23530fc8f94f67f242a639c32a29817495d73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:22Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81866e9b3e8415fc1e5af0eec4a7222deab6b282b1758dae13759a073192a554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81866e9b3e8415fc1e5af0eec4a7222deab6b282b1758dae13759a073192a554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zwqcq\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:11Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:11 crc kubenswrapper[4594]: I1129 05:29:11.543085 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba18db1b4391204c0fe395488d2726d914dc28c88b9cf449d83c9c181a7d04a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\
\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:11Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:11 crc kubenswrapper[4594]: I1129 05:29:11.550088 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"509844d4-3ee1-4059-84bb-6e90200f50c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5e396522be1cb7984f98fd416f9a6aeda1e6009e002e9c942a6cec0d4a4c22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318b
deaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4qlc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb25cb1d2d2a3aa84bc6fca3087cdf54b3b8e75bc258441b2895f37594df44e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4qlc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ggz4n\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:11Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:11 crc kubenswrapper[4594]: I1129 05:29:11.556598 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-26zjk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77af4f9f-196d-436d-9fb6-69168bcb5f8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70467de0fada47ce38faab4cc04c54942aa5f7c0cacf29b1d21b64a11e6162c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\
\":\\\"2025-11-29T05:28:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxrks\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-26zjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:11Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:11 crc kubenswrapper[4594]: I1129 05:29:11.564550 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3be916e6-7f0f-45ae-8fe5-85d2c7e459f0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf23cd4a300a49fabdf8fcaacfda760563c8e53cf70de7a4c07be717fd76b660\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a960cfe07ae731cc2d07957dba63be2897328d662dd2672bd61344c8baac0228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b416f8ad36a09f58e68b1e23784b344534ccfb013a3272c0754602431a3a3737\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://261ab2af67dd877cead51f204629b59233865717d4805c3d4d80c1a5d51e2021\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:27:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:11Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:11 crc kubenswrapper[4594]: I1129 05:29:11.572397 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c84b578286ef0da9c863bedb4ebec224e34007c7f59cead26dac786d7e4b5e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1975ee2ce0af02676f9925944a31b218cb17cdf4dfb85c6edbfd124ac58fd22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:11Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:11 crc kubenswrapper[4594]: I1129 05:29:11.579047 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-87h4n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"080d2d67-a474-4974-94f3-81c5007e0a57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b58d5af901448b3e10a9a902daa31aecebe095bdae0e620c991323b99c52b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdnbj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-87h4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:11Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:11 crc kubenswrapper[4594]: I1129 05:29:11.587552 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1629901e-7133-4eb0-b634-5e458da9f205\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://583ca6d59501822022d5d15a450d4403592d248bce7be4b22b1c7baef4e93000\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b8d3be68869fef8b377ce1d84e3edf447496a867c40e88d013ef0d1c5629de2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6655c8a5c56e4d2b3e8fbd88308d04ded2b41f7243bf90cb2dbbd6ad93f3153f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a87a0320d47b3753baeb865c7ef9e024c65a96555fdd71c0f2a23c8c0b80d351\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ba49c3a694bc245ce18b54b74156e6e7aaccd4fcf707264a278f86f52943392\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T05:28:13Z\\\",\\\"message\\\":\\\"le observer\\\\nW1129 05:28:13.316789 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1129 05:28:13.316951 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1129 05:28:13.320123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1464513827/tls.crt::/tmp/serving-cert-1464513827/tls.key\\\\\\\"\\\\nI1129 05:28:13.540439 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1129 05:28:13.545990 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1129 05:28:13.546014 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1129 05:28:13.546050 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1129 05:28:13.546055 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1129 05:28:13.551788 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1129 05:28:13.551806 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' 
detected.\\\\nW1129 05:28:13.551809 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 05:28:13.551814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1129 05:28:13.551817 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1129 05:28:13.551820 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1129 05:28:13.551823 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1129 05:28:13.551915 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1129 05:28:13.553356 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e070e0ea87ae0cfa8fcc6aadd13e7ae3c2ab3eb1ea3daf6178c9d35796ba3ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\
"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:27:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:11Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:11 crc kubenswrapper[4594]: I1129 05:29:11.622814 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:11 crc kubenswrapper[4594]: I1129 05:29:11.622872 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:11 crc kubenswrapper[4594]: I1129 05:29:11.622881 4594 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Nov 29 05:29:11 crc kubenswrapper[4594]: I1129 05:29:11.622898 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:11 crc kubenswrapper[4594]: I1129 05:29:11.622910 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:11Z","lastTransitionTime":"2025-11-29T05:29:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:29:11 crc kubenswrapper[4594]: I1129 05:29:11.724580 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:11 crc kubenswrapper[4594]: I1129 05:29:11.724611 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:11 crc kubenswrapper[4594]: I1129 05:29:11.724620 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:11 crc kubenswrapper[4594]: I1129 05:29:11.724630 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:11 crc kubenswrapper[4594]: I1129 05:29:11.724640 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:11Z","lastTransitionTime":"2025-11-29T05:29:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:11 crc kubenswrapper[4594]: I1129 05:29:11.826413 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:11 crc kubenswrapper[4594]: I1129 05:29:11.826465 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:11 crc kubenswrapper[4594]: I1129 05:29:11.826505 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:11 crc kubenswrapper[4594]: I1129 05:29:11.826543 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:11 crc kubenswrapper[4594]: I1129 05:29:11.826558 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:11Z","lastTransitionTime":"2025-11-29T05:29:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:11 crc kubenswrapper[4594]: I1129 05:29:11.927846 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:11 crc kubenswrapper[4594]: I1129 05:29:11.927880 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:11 crc kubenswrapper[4594]: I1129 05:29:11.927890 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:11 crc kubenswrapper[4594]: I1129 05:29:11.927903 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:11 crc kubenswrapper[4594]: I1129 05:29:11.927911 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:11Z","lastTransitionTime":"2025-11-29T05:29:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:12 crc kubenswrapper[4594]: I1129 05:29:12.030270 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:12 crc kubenswrapper[4594]: I1129 05:29:12.030303 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:12 crc kubenswrapper[4594]: I1129 05:29:12.030313 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:12 crc kubenswrapper[4594]: I1129 05:29:12.030324 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:12 crc kubenswrapper[4594]: I1129 05:29:12.030332 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:12Z","lastTransitionTime":"2025-11-29T05:29:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:29:12 crc kubenswrapper[4594]: I1129 05:29:12.083307 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 05:29:12 crc kubenswrapper[4594]: I1129 05:29:12.083355 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 05:29:12 crc kubenswrapper[4594]: E1129 05:29:12.083402 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 05:29:12 crc kubenswrapper[4594]: I1129 05:29:12.083319 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 05:29:12 crc kubenswrapper[4594]: E1129 05:29:12.083501 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 05:29:12 crc kubenswrapper[4594]: E1129 05:29:12.083581 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 05:29:12 crc kubenswrapper[4594]: I1129 05:29:12.131689 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:12 crc kubenswrapper[4594]: I1129 05:29:12.131730 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:12 crc kubenswrapper[4594]: I1129 05:29:12.131743 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:12 crc kubenswrapper[4594]: I1129 05:29:12.131758 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:12 crc kubenswrapper[4594]: I1129 05:29:12.131767 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:12Z","lastTransitionTime":"2025-11-29T05:29:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:12 crc kubenswrapper[4594]: I1129 05:29:12.233924 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:12 crc kubenswrapper[4594]: I1129 05:29:12.233956 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:12 crc kubenswrapper[4594]: I1129 05:29:12.233965 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:12 crc kubenswrapper[4594]: I1129 05:29:12.233977 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:12 crc kubenswrapper[4594]: I1129 05:29:12.233985 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:12Z","lastTransitionTime":"2025-11-29T05:29:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:12 crc kubenswrapper[4594]: I1129 05:29:12.335577 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:12 crc kubenswrapper[4594]: I1129 05:29:12.335601 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:12 crc kubenswrapper[4594]: I1129 05:29:12.335609 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:12 crc kubenswrapper[4594]: I1129 05:29:12.335621 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:12 crc kubenswrapper[4594]: I1129 05:29:12.335632 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:12Z","lastTransitionTime":"2025-11-29T05:29:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:12 crc kubenswrapper[4594]: I1129 05:29:12.436745 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:12 crc kubenswrapper[4594]: I1129 05:29:12.436770 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:12 crc kubenswrapper[4594]: I1129 05:29:12.436812 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:12 crc kubenswrapper[4594]: I1129 05:29:12.436854 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:12 crc kubenswrapper[4594]: I1129 05:29:12.436866 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:12Z","lastTransitionTime":"2025-11-29T05:29:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:12 crc kubenswrapper[4594]: I1129 05:29:12.539029 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:12 crc kubenswrapper[4594]: I1129 05:29:12.539050 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:12 crc kubenswrapper[4594]: I1129 05:29:12.539067 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:12 crc kubenswrapper[4594]: I1129 05:29:12.539079 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:12 crc kubenswrapper[4594]: I1129 05:29:12.539086 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:12Z","lastTransitionTime":"2025-11-29T05:29:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:12 crc kubenswrapper[4594]: I1129 05:29:12.641048 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:12 crc kubenswrapper[4594]: I1129 05:29:12.641074 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:12 crc kubenswrapper[4594]: I1129 05:29:12.641083 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:12 crc kubenswrapper[4594]: I1129 05:29:12.641095 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:12 crc kubenswrapper[4594]: I1129 05:29:12.641103 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:12Z","lastTransitionTime":"2025-11-29T05:29:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:12 crc kubenswrapper[4594]: I1129 05:29:12.743241 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:12 crc kubenswrapper[4594]: I1129 05:29:12.743289 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:12 crc kubenswrapper[4594]: I1129 05:29:12.743299 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:12 crc kubenswrapper[4594]: I1129 05:29:12.743310 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:12 crc kubenswrapper[4594]: I1129 05:29:12.743319 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:12Z","lastTransitionTime":"2025-11-29T05:29:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:12 crc kubenswrapper[4594]: I1129 05:29:12.845055 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:12 crc kubenswrapper[4594]: I1129 05:29:12.845084 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:12 crc kubenswrapper[4594]: I1129 05:29:12.845093 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:12 crc kubenswrapper[4594]: I1129 05:29:12.845105 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:12 crc kubenswrapper[4594]: I1129 05:29:12.845113 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:12Z","lastTransitionTime":"2025-11-29T05:29:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:12 crc kubenswrapper[4594]: I1129 05:29:12.946172 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:12 crc kubenswrapper[4594]: I1129 05:29:12.946201 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:12 crc kubenswrapper[4594]: I1129 05:29:12.946211 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:12 crc kubenswrapper[4594]: I1129 05:29:12.946323 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:12 crc kubenswrapper[4594]: I1129 05:29:12.946332 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:12Z","lastTransitionTime":"2025-11-29T05:29:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:13 crc kubenswrapper[4594]: I1129 05:29:13.048042 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:13 crc kubenswrapper[4594]: I1129 05:29:13.048077 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:13 crc kubenswrapper[4594]: I1129 05:29:13.048087 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:13 crc kubenswrapper[4594]: I1129 05:29:13.048100 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:13 crc kubenswrapper[4594]: I1129 05:29:13.048108 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:13Z","lastTransitionTime":"2025-11-29T05:29:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:29:13 crc kubenswrapper[4594]: I1129 05:29:13.083386 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lzr56" Nov 29 05:29:13 crc kubenswrapper[4594]: E1129 05:29:13.083480 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lzr56" podUID="217088b9-a48b-40c7-8d83-f9ff0eb24908" Nov 29 05:29:13 crc kubenswrapper[4594]: I1129 05:29:13.149233 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:13 crc kubenswrapper[4594]: I1129 05:29:13.149289 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:13 crc kubenswrapper[4594]: I1129 05:29:13.149298 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:13 crc kubenswrapper[4594]: I1129 05:29:13.149307 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:13 crc kubenswrapper[4594]: I1129 05:29:13.149315 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:13Z","lastTransitionTime":"2025-11-29T05:29:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:13 crc kubenswrapper[4594]: I1129 05:29:13.251902 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:13 crc kubenswrapper[4594]: I1129 05:29:13.251936 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:13 crc kubenswrapper[4594]: I1129 05:29:13.251944 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:13 crc kubenswrapper[4594]: I1129 05:29:13.251954 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:13 crc kubenswrapper[4594]: I1129 05:29:13.251962 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:13Z","lastTransitionTime":"2025-11-29T05:29:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:13 crc kubenswrapper[4594]: I1129 05:29:13.353459 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:13 crc kubenswrapper[4594]: I1129 05:29:13.353493 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:13 crc kubenswrapper[4594]: I1129 05:29:13.353503 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:13 crc kubenswrapper[4594]: I1129 05:29:13.353515 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:13 crc kubenswrapper[4594]: I1129 05:29:13.353523 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:13Z","lastTransitionTime":"2025-11-29T05:29:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:13 crc kubenswrapper[4594]: I1129 05:29:13.455163 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:13 crc kubenswrapper[4594]: I1129 05:29:13.455196 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:13 crc kubenswrapper[4594]: I1129 05:29:13.455207 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:13 crc kubenswrapper[4594]: I1129 05:29:13.455220 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:13 crc kubenswrapper[4594]: I1129 05:29:13.455231 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:13Z","lastTransitionTime":"2025-11-29T05:29:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:13 crc kubenswrapper[4594]: I1129 05:29:13.557424 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:13 crc kubenswrapper[4594]: I1129 05:29:13.557481 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:13 crc kubenswrapper[4594]: I1129 05:29:13.557491 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:13 crc kubenswrapper[4594]: I1129 05:29:13.557505 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:13 crc kubenswrapper[4594]: I1129 05:29:13.557515 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:13Z","lastTransitionTime":"2025-11-29T05:29:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:13 crc kubenswrapper[4594]: I1129 05:29:13.659309 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:13 crc kubenswrapper[4594]: I1129 05:29:13.659341 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:13 crc kubenswrapper[4594]: I1129 05:29:13.659350 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:13 crc kubenswrapper[4594]: I1129 05:29:13.659362 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:13 crc kubenswrapper[4594]: I1129 05:29:13.659372 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:13Z","lastTransitionTime":"2025-11-29T05:29:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:13 crc kubenswrapper[4594]: I1129 05:29:13.761289 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:13 crc kubenswrapper[4594]: I1129 05:29:13.761319 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:13 crc kubenswrapper[4594]: I1129 05:29:13.761329 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:13 crc kubenswrapper[4594]: I1129 05:29:13.761338 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:13 crc kubenswrapper[4594]: I1129 05:29:13.761345 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:13Z","lastTransitionTime":"2025-11-29T05:29:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:13 crc kubenswrapper[4594]: I1129 05:29:13.863490 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:13 crc kubenswrapper[4594]: I1129 05:29:13.863516 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:13 crc kubenswrapper[4594]: I1129 05:29:13.863524 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:13 crc kubenswrapper[4594]: I1129 05:29:13.863535 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:13 crc kubenswrapper[4594]: I1129 05:29:13.863543 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:13Z","lastTransitionTime":"2025-11-29T05:29:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:13 crc kubenswrapper[4594]: I1129 05:29:13.964800 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:13 crc kubenswrapper[4594]: I1129 05:29:13.964822 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:13 crc kubenswrapper[4594]: I1129 05:29:13.964838 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:13 crc kubenswrapper[4594]: I1129 05:29:13.964847 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:13 crc kubenswrapper[4594]: I1129 05:29:13.964854 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:13Z","lastTransitionTime":"2025-11-29T05:29:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:14 crc kubenswrapper[4594]: I1129 05:29:14.066850 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:14 crc kubenswrapper[4594]: I1129 05:29:14.066880 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:14 crc kubenswrapper[4594]: I1129 05:29:14.066889 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:14 crc kubenswrapper[4594]: I1129 05:29:14.066899 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:14 crc kubenswrapper[4594]: I1129 05:29:14.066907 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:14Z","lastTransitionTime":"2025-11-29T05:29:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:29:14 crc kubenswrapper[4594]: I1129 05:29:14.083428 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 05:29:14 crc kubenswrapper[4594]: I1129 05:29:14.083457 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 05:29:14 crc kubenswrapper[4594]: E1129 05:29:14.083571 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 05:29:14 crc kubenswrapper[4594]: I1129 05:29:14.083587 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 05:29:14 crc kubenswrapper[4594]: E1129 05:29:14.083694 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 05:29:14 crc kubenswrapper[4594]: E1129 05:29:14.083775 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 05:29:14 crc kubenswrapper[4594]: I1129 05:29:14.168245 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:14 crc kubenswrapper[4594]: I1129 05:29:14.168314 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:14 crc kubenswrapper[4594]: I1129 05:29:14.168325 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:14 crc kubenswrapper[4594]: I1129 05:29:14.168339 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:14 crc kubenswrapper[4594]: I1129 05:29:14.168349 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:14Z","lastTransitionTime":"2025-11-29T05:29:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:14 crc kubenswrapper[4594]: I1129 05:29:14.270293 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:14 crc kubenswrapper[4594]: I1129 05:29:14.270327 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:14 crc kubenswrapper[4594]: I1129 05:29:14.270335 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:14 crc kubenswrapper[4594]: I1129 05:29:14.270347 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:14 crc kubenswrapper[4594]: I1129 05:29:14.270358 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:14Z","lastTransitionTime":"2025-11-29T05:29:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:14 crc kubenswrapper[4594]: I1129 05:29:14.372402 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:14 crc kubenswrapper[4594]: I1129 05:29:14.372433 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:14 crc kubenswrapper[4594]: I1129 05:29:14.372444 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:14 crc kubenswrapper[4594]: I1129 05:29:14.372455 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:14 crc kubenswrapper[4594]: I1129 05:29:14.372463 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:14Z","lastTransitionTime":"2025-11-29T05:29:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:14 crc kubenswrapper[4594]: I1129 05:29:14.474439 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:14 crc kubenswrapper[4594]: I1129 05:29:14.474496 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:14 crc kubenswrapper[4594]: I1129 05:29:14.474505 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:14 crc kubenswrapper[4594]: I1129 05:29:14.474520 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:14 crc kubenswrapper[4594]: I1129 05:29:14.474552 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:14Z","lastTransitionTime":"2025-11-29T05:29:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:14 crc kubenswrapper[4594]: I1129 05:29:14.576858 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:14 crc kubenswrapper[4594]: I1129 05:29:14.576890 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:14 crc kubenswrapper[4594]: I1129 05:29:14.576899 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:14 crc kubenswrapper[4594]: I1129 05:29:14.576909 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:14 crc kubenswrapper[4594]: I1129 05:29:14.576917 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:14Z","lastTransitionTime":"2025-11-29T05:29:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:14 crc kubenswrapper[4594]: I1129 05:29:14.678974 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:14 crc kubenswrapper[4594]: I1129 05:29:14.679010 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:14 crc kubenswrapper[4594]: I1129 05:29:14.679019 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:14 crc kubenswrapper[4594]: I1129 05:29:14.679031 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:14 crc kubenswrapper[4594]: I1129 05:29:14.679040 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:14Z","lastTransitionTime":"2025-11-29T05:29:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:14 crc kubenswrapper[4594]: I1129 05:29:14.781031 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:14 crc kubenswrapper[4594]: I1129 05:29:14.781131 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:14 crc kubenswrapper[4594]: I1129 05:29:14.781194 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:14 crc kubenswrapper[4594]: I1129 05:29:14.781282 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:14 crc kubenswrapper[4594]: I1129 05:29:14.781353 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:14Z","lastTransitionTime":"2025-11-29T05:29:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:14 crc kubenswrapper[4594]: I1129 05:29:14.882517 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:14 crc kubenswrapper[4594]: I1129 05:29:14.882542 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:14 crc kubenswrapper[4594]: I1129 05:29:14.882550 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:14 crc kubenswrapper[4594]: I1129 05:29:14.882559 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:14 crc kubenswrapper[4594]: I1129 05:29:14.882567 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:14Z","lastTransitionTime":"2025-11-29T05:29:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:14 crc kubenswrapper[4594]: I1129 05:29:14.983662 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:14 crc kubenswrapper[4594]: I1129 05:29:14.983685 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:14 crc kubenswrapper[4594]: I1129 05:29:14.983691 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:14 crc kubenswrapper[4594]: I1129 05:29:14.983701 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:14 crc kubenswrapper[4594]: I1129 05:29:14.983707 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:14Z","lastTransitionTime":"2025-11-29T05:29:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:29:15 crc kubenswrapper[4594]: I1129 05:29:15.082798 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lzr56" Nov 29 05:29:15 crc kubenswrapper[4594]: E1129 05:29:15.082907 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lzr56" podUID="217088b9-a48b-40c7-8d83-f9ff0eb24908" Nov 29 05:29:15 crc kubenswrapper[4594]: I1129 05:29:15.084804 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:15 crc kubenswrapper[4594]: I1129 05:29:15.084838 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:15 crc kubenswrapper[4594]: I1129 05:29:15.084848 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:15 crc kubenswrapper[4594]: I1129 05:29:15.084860 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:15 crc kubenswrapper[4594]: I1129 05:29:15.084869 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:15Z","lastTransitionTime":"2025-11-29T05:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:15 crc kubenswrapper[4594]: I1129 05:29:15.186079 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:15 crc kubenswrapper[4594]: I1129 05:29:15.186111 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:15 crc kubenswrapper[4594]: I1129 05:29:15.186121 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:15 crc kubenswrapper[4594]: I1129 05:29:15.186133 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:15 crc kubenswrapper[4594]: I1129 05:29:15.186144 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:15Z","lastTransitionTime":"2025-11-29T05:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:15 crc kubenswrapper[4594]: I1129 05:29:15.291762 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:15 crc kubenswrapper[4594]: I1129 05:29:15.291799 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:15 crc kubenswrapper[4594]: I1129 05:29:15.291809 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:15 crc kubenswrapper[4594]: I1129 05:29:15.291820 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:15 crc kubenswrapper[4594]: I1129 05:29:15.291837 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:15Z","lastTransitionTime":"2025-11-29T05:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:15 crc kubenswrapper[4594]: I1129 05:29:15.393749 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:15 crc kubenswrapper[4594]: I1129 05:29:15.393798 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:15 crc kubenswrapper[4594]: I1129 05:29:15.393808 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:15 crc kubenswrapper[4594]: I1129 05:29:15.393820 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:15 crc kubenswrapper[4594]: I1129 05:29:15.393839 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:15Z","lastTransitionTime":"2025-11-29T05:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:15 crc kubenswrapper[4594]: I1129 05:29:15.495367 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:15 crc kubenswrapper[4594]: I1129 05:29:15.495397 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:15 crc kubenswrapper[4594]: I1129 05:29:15.495427 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:15 crc kubenswrapper[4594]: I1129 05:29:15.495438 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:15 crc kubenswrapper[4594]: I1129 05:29:15.495446 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:15Z","lastTransitionTime":"2025-11-29T05:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:15 crc kubenswrapper[4594]: I1129 05:29:15.597431 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:15 crc kubenswrapper[4594]: I1129 05:29:15.597469 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:15 crc kubenswrapper[4594]: I1129 05:29:15.597480 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:15 crc kubenswrapper[4594]: I1129 05:29:15.597489 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:15 crc kubenswrapper[4594]: I1129 05:29:15.597498 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:15Z","lastTransitionTime":"2025-11-29T05:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:15 crc kubenswrapper[4594]: I1129 05:29:15.699451 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:15 crc kubenswrapper[4594]: I1129 05:29:15.699586 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:15 crc kubenswrapper[4594]: I1129 05:29:15.699650 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:15 crc kubenswrapper[4594]: I1129 05:29:15.699722 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:15 crc kubenswrapper[4594]: I1129 05:29:15.699803 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:15Z","lastTransitionTime":"2025-11-29T05:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:15 crc kubenswrapper[4594]: I1129 05:29:15.801495 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:15 crc kubenswrapper[4594]: I1129 05:29:15.801554 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:15 crc kubenswrapper[4594]: I1129 05:29:15.801564 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:15 crc kubenswrapper[4594]: I1129 05:29:15.801588 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:15 crc kubenswrapper[4594]: I1129 05:29:15.801603 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:15Z","lastTransitionTime":"2025-11-29T05:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:15 crc kubenswrapper[4594]: I1129 05:29:15.903791 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:15 crc kubenswrapper[4594]: I1129 05:29:15.904181 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:15 crc kubenswrapper[4594]: I1129 05:29:15.904296 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:15 crc kubenswrapper[4594]: I1129 05:29:15.904394 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:15 crc kubenswrapper[4594]: I1129 05:29:15.904485 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:15Z","lastTransitionTime":"2025-11-29T05:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:16 crc kubenswrapper[4594]: I1129 05:29:16.006156 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:16 crc kubenswrapper[4594]: I1129 05:29:16.006187 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:16 crc kubenswrapper[4594]: I1129 05:29:16.006197 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:16 crc kubenswrapper[4594]: I1129 05:29:16.006210 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:16 crc kubenswrapper[4594]: I1129 05:29:16.006221 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:16Z","lastTransitionTime":"2025-11-29T05:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:29:16 crc kubenswrapper[4594]: I1129 05:29:16.083348 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 05:29:16 crc kubenswrapper[4594]: I1129 05:29:16.083502 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 05:29:16 crc kubenswrapper[4594]: E1129 05:29:16.083588 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 05:29:16 crc kubenswrapper[4594]: I1129 05:29:16.083535 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 05:29:16 crc kubenswrapper[4594]: E1129 05:29:16.083708 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 05:29:16 crc kubenswrapper[4594]: E1129 05:29:16.083934 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 05:29:16 crc kubenswrapper[4594]: I1129 05:29:16.092548 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Nov 29 05:29:16 crc kubenswrapper[4594]: I1129 05:29:16.095348 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba18db1b4391204c0fe395488d2726d914dc28c88b9cf449d83c9c181a7d04a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/se
rving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:16Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:16 crc kubenswrapper[4594]: I1129 05:29:16.102856 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"509844d4-3ee1-4059-84bb-6e90200f50c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5e396522be1cb7984f98fd416f9a6aeda1e6009e002e9c942a6cec0d4a4c22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66
438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4qlc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb25cb1d2d2a3aa84bc6fca3087cdf54b3b8e75bc258441b2895f37594df44e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4qlc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ggz4n\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:16Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:16 crc kubenswrapper[4594]: I1129 05:29:16.108489 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:16 crc kubenswrapper[4594]: I1129 05:29:16.108528 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:16 crc kubenswrapper[4594]: I1129 05:29:16.108540 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:16 crc kubenswrapper[4594]: I1129 05:29:16.108552 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:16 crc kubenswrapper[4594]: I1129 05:29:16.108560 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:16Z","lastTransitionTime":"2025-11-29T05:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:16 crc kubenswrapper[4594]: I1129 05:29:16.109908 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-26zjk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77af4f9f-196d-436d-9fb6-69168bcb5f8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70467de0fada47ce38faab4cc04c54942aa5f7c0cacf29b1d21b64a11e6162c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxrks\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-26zjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:16Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:16 crc kubenswrapper[4594]: I1129 05:29:16.119015 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3be916e6-7f0f-45ae-8fe5-85d2c7e459f0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf23cd4a300a49fabdf8fcaacfda760563c8e53cf70de7a4c07be717fd76b660\\\",\\\"image\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a960cfe07ae731cc2d07957dba63be2897328d662dd2672bd61344c8baac0228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b416f8ad36a09f58e68b1e23784b344534ccfb013a3272c0754602431a3a3737\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-sy
ncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://261ab2af67dd877cead51f204629b59233865717d4805c3d4d80c1a5d51e2021\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:27:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:16Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:16 crc kubenswrapper[4594]: I1129 05:29:16.128156 4594 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c84b578286ef0da9c863bedb4ebec224e34007c7f59cead26dac786d7e4b5e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1975ee2ce0af02676f9925944a31b218cb17cdf4dfb85c6edbfd124ac58fd22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa
41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:16Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:16 crc kubenswrapper[4594]: I1129 05:29:16.134954 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-87h4n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"080d2d67-a474-4974-94f3-81c5007e0a57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b58d5af901448b3e10a9a902daa31aecebe095bdae0e620c991323b99c52b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdnbj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-87h4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:16Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:16 crc kubenswrapper[4594]: I1129 05:29:16.144133 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1629901e-7133-4eb0-b634-5e458da9f205\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://583ca6d59501822022d5d15a450d4403592d248bce7be4b22b1c7baef4e93000\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b8d3be68869fef8b377ce1d84e3edf447496a867c40e88d013ef0d1c5629de2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6655c8a5c56e4d2b3e8fbd88308d04ded2b41f7243bf90cb2dbbd6ad93f3153f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a87a0320d47b3753baeb865c7ef9e024c65a96555fdd71c0f2a23c8c0b80d351\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ba49c3a694bc245ce18b54b74156e6e7aaccd4fcf707264a278f86f52943392\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T05:28:13Z\\\",\\\"message\\\":\\\"le observer\\\\nW1129 05:28:13.316789 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1129 05:28:13.316951 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1129 05:28:13.320123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1464513827/tls.crt::/tmp/serving-cert-1464513827/tls.key\\\\\\\"\\\\nI1129 05:28:13.540439 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1129 05:28:13.545990 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1129 05:28:13.546014 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1129 05:28:13.546050 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1129 05:28:13.546055 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1129 05:28:13.551788 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1129 05:28:13.551806 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' 
detected.\\\\nW1129 05:28:13.551809 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 05:28:13.551814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1129 05:28:13.551817 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1129 05:28:13.551820 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1129 05:28:13.551823 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1129 05:28:13.551915 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1129 05:28:13.553356 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e070e0ea87ae0cfa8fcc6aadd13e7ae3c2ab3eb1ea3daf6178c9d35796ba3ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\
"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:27:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:16Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:16 crc kubenswrapper[4594]: I1129 05:29:16.154106 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:16Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:16 crc kubenswrapper[4594]: I1129 05:29:16.163176 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7plzz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5052790-d231-4f97-802c-c7de3cd72561\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:29:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:29:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc4e229bc1c5698fec5a7e0da8decb0167770496e7f8dce7a9a9276721ee26c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c23b1472cd349ae6a56137165463429204ffbaf2c1b477d3ba9d07d2f2799f9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T05:29:04Z\\\",\\\"message\\\":\\\"2025-11-29T05:28:19+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_c08f590a-ec87-48f6-839c-a6dd639e3ff9\\\\n2025-11-29T05:28:19+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c08f590a-ec87-48f6-839c-a6dd639e3ff9 to /host/opt/cni/bin/\\\\n2025-11-29T05:28:19Z [verbose] multus-daemon started\\\\n2025-11-29T05:28:19Z [verbose] 
Readiness Indicator file check\\\\n2025-11-29T05:29:04Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:29:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qglsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7plzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:16Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:16 crc kubenswrapper[4594]: I1129 05:29:16.172643 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vn6fz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"240291e8-d7ec-4fc7-919f-e082a6890e2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1142d4b616972d87b8ac7b83cacdc37c3fd02df7cb4adfcabbaaaa18c5d2b8d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x25zr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af579fe3e5cce8fdb2dff72f206f5504582b8
eee906db5180191ef91b990c0a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x25zr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vn6fz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:16Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:16 crc kubenswrapper[4594]: I1129 05:29:16.179726 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-lzr56" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"217088b9-a48b-40c7-8d83-f9ff0eb24908\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8gwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8gwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:31Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lzr56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:16Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:16 crc 
kubenswrapper[4594]: I1129 05:29:16.187726 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://867c23355f828dc4cc15d632f90c52174e9b9be07f3ec24f05c904709620de6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:16Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:16 crc kubenswrapper[4594]: I1129 05:29:16.196013 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cbcda3d-f02f-4e3e-9560-a1dcb0e9b659\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95cdf0c76e877968d93b7b86fcdf319cfd2072ceb45903b275dfbe38db4a4e92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c7130e6d58c27093d1efa4f7260aba78681945df2d956da7878075ba5eb1d05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe6aa22ca88502e90dc3ea058d420a497838102af8aa1afc78edc4a52bed8a6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d72ff5cb72ae536180cf8ebbc6af9c3a0
1bc453e5d917c8c7060fbd987de73c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d72ff5cb72ae536180cf8ebbc6af9c3a01bc453e5d917c8c7060fbd987de73c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:27:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:27:56Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:27:56Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:16Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:16 crc kubenswrapper[4594]: I1129 05:29:16.204708 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:16Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:16 crc kubenswrapper[4594]: I1129 05:29:16.214905 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:16Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:16 crc kubenswrapper[4594]: I1129 05:29:16.215081 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:16 crc kubenswrapper[4594]: I1129 05:29:16.215105 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:16 crc kubenswrapper[4594]: I1129 05:29:16.215114 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:16 crc 
kubenswrapper[4594]: I1129 05:29:16.215135 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:16 crc kubenswrapper[4594]: I1129 05:29:16.215146 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:16Z","lastTransitionTime":"2025-11-29T05:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:29:16 crc kubenswrapper[4594]: I1129 05:29:16.226681 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08de5891-72ca-488c-80b3-6b54c8c1a66e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9f96824f977ec455622772e91ed8e6534dc93a71868e2b5e2aa9302ff9be135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cf2a15e6478f9f461a9927f570db4a81a8d752c8be46c7970b4956ae2031a38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a67e6b494618e268d6f5d75507c1cfe0e8e99c0e3cfff7548756a64f4f3a3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77543b04b619c48bf800a3d9dda1a969e4e32aa113a49287717a9e83f9cff96f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78fd1c6bdf08a557f0c2223c538a946709b9a225032c92ea75277140441a59c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db25dbee0954a026ecc2945dda40eb48c9caf78d31c44013055e9765024794d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fb430ec13486fa454ce5a026c9035daca3b18f8f31f2304bb6c5eb544778a85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fb430ec13486fa454ce5a026c9035daca3b18f8f31f2304bb6c5eb544778a85\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T05:29:09Z\\\",\\\"message\\\":\\\"nsuring zone local for Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vn6fz in node crc\\\\nI1129 05:29:09.714087 6708 ovn.go:134] Ensuring zone local for Pod openshift-kube-apiserver/kube-apiserver-crc in node crc\\\\nI1129 05:29:09.714092 6708 obj_retry.go:386] Retry successful for *v1.Pod 
openshift-kube-apiserver/kube-apiserver-crc after 0 failed attempt(s)\\\\nI1129 05:29:09.714099 6708 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1129 05:29:09.714102 6708 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1129 05:29:09.714109 6708 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/network-metrics-daemon-lzr56\\\\nI1129 05:29:09.714113 6708 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/network-metrics-daemon-lzr56\\\\nI1129 05:29:09.714118 6708 ovn.go:134] Ensuring zone local for Pod openshift-multus/network-metrics-daemon-lzr56 in node crc\\\\nI1129 05:29:09.714137 6708 base_network_controller_pods.go:477] [default/openshift-multus/network-metrics-daemon-lzr56] creating logical port openshift-multus_network-metrics-daemon-lzr56 for pod on switch crc\\\\nF1129 05:29:09.714159 6708 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T05:29:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-lp4zm_openshift-ovn-kubernetes(08de5891-72ca-488c-80b3-6b54c8c1a66e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cc8b34a52032d4716652fc9edcc35dd797309bf6886242671c068e65399d004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b86e524b600a831fe6b8bb71a513172f974b457d39363da90cd6152139eae898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b86e524b600a831fe6
b8bb71a513172f974b457d39363da90cd6152139eae898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjws8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lp4zm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:16Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:16 crc kubenswrapper[4594]: I1129 05:29:16.236445 4594 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zwqcq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad2c26bf-c1c8-44b5-b4ed-d487072d358b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T05:28:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9db8528eef5425d9af2b8f11f244556f195b844e239a2137eef24c40fa158d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T05:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9e8c27b9b7a4a13a1f6ae5461acd535cc6784f4158213697307e4061f1e222\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b9e8c27b9b7a4a13a1f6ae5461acd535cc6784f4158213697307e4061f1e222\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa32ebafd1f717a0d3ba44387b7b36827e9e12589812bdc33b1dfb84bd41772\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fa32ebafd1f717a0d3ba44387b7b36827e9e12589812bdc33b1dfb84bd41772\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:19Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec1085c6e05fe233979922ace4946592ed4a7afe00a0f1c3a256c41a0964d26a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1085c6e05fe233979922ace4946592ed4a7afe00a0f1c3a256c41a0964d26a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://819e8
c7a8ea3f0f7a082da904dd9cc01b5d7e7e6520cafe67920a190063947ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://819e8c7a8ea3f0f7a082da904dd9cc01b5d7e7e6520cafe67920a190063947ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1602d1ea5e426042c520a4f98ed23530fc8f94f67f242a639c32a29817495d73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1602d1ea5e426042c520a4f98ed23530fc8f94f67f242a639c32a29817495d73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:22Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81866e9b3e8415fc1e5af0eec4a7222deab6b282b1758dae13759a073192a554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81866e9b3e8415fc1e5af0eec4a7222deab6b282b1758dae13759a073192a554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T05:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T05:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnd6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T05:28:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zwqcq\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:16Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:16 crc kubenswrapper[4594]: I1129 05:29:16.316670 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:16 crc kubenswrapper[4594]: I1129 05:29:16.316700 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:16 crc kubenswrapper[4594]: I1129 05:29:16.316710 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:16 crc kubenswrapper[4594]: I1129 05:29:16.316723 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:16 crc kubenswrapper[4594]: I1129 05:29:16.316733 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:16Z","lastTransitionTime":"2025-11-29T05:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:16 crc kubenswrapper[4594]: I1129 05:29:16.418972 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:16 crc kubenswrapper[4594]: I1129 05:29:16.419006 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:16 crc kubenswrapper[4594]: I1129 05:29:16.419016 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:16 crc kubenswrapper[4594]: I1129 05:29:16.419029 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:16 crc kubenswrapper[4594]: I1129 05:29:16.419039 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:16Z","lastTransitionTime":"2025-11-29T05:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:16 crc kubenswrapper[4594]: I1129 05:29:16.521214 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:16 crc kubenswrapper[4594]: I1129 05:29:16.521342 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:16 crc kubenswrapper[4594]: I1129 05:29:16.521416 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:16 crc kubenswrapper[4594]: I1129 05:29:16.521501 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:16 crc kubenswrapper[4594]: I1129 05:29:16.521556 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:16Z","lastTransitionTime":"2025-11-29T05:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:16 crc kubenswrapper[4594]: I1129 05:29:16.623116 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:16 crc kubenswrapper[4594]: I1129 05:29:16.623144 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:16 crc kubenswrapper[4594]: I1129 05:29:16.623153 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:16 crc kubenswrapper[4594]: I1129 05:29:16.623167 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:16 crc kubenswrapper[4594]: I1129 05:29:16.623195 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:16Z","lastTransitionTime":"2025-11-29T05:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:16 crc kubenswrapper[4594]: I1129 05:29:16.724512 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:16 crc kubenswrapper[4594]: I1129 05:29:16.724552 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:16 crc kubenswrapper[4594]: I1129 05:29:16.724561 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:16 crc kubenswrapper[4594]: I1129 05:29:16.724579 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:16 crc kubenswrapper[4594]: I1129 05:29:16.724589 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:16Z","lastTransitionTime":"2025-11-29T05:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:16 crc kubenswrapper[4594]: I1129 05:29:16.826651 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:16 crc kubenswrapper[4594]: I1129 05:29:16.826843 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:16 crc kubenswrapper[4594]: I1129 05:29:16.826924 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:16 crc kubenswrapper[4594]: I1129 05:29:16.826991 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:16 crc kubenswrapper[4594]: I1129 05:29:16.827053 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:16Z","lastTransitionTime":"2025-11-29T05:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:16 crc kubenswrapper[4594]: I1129 05:29:16.928554 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:16 crc kubenswrapper[4594]: I1129 05:29:16.928892 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:16 crc kubenswrapper[4594]: I1129 05:29:16.928974 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:16 crc kubenswrapper[4594]: I1129 05:29:16.929087 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:16 crc kubenswrapper[4594]: I1129 05:29:16.929141 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:16Z","lastTransitionTime":"2025-11-29T05:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:17 crc kubenswrapper[4594]: I1129 05:29:17.030759 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:17 crc kubenswrapper[4594]: I1129 05:29:17.030926 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:17 crc kubenswrapper[4594]: I1129 05:29:17.030992 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:17 crc kubenswrapper[4594]: I1129 05:29:17.031063 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:17 crc kubenswrapper[4594]: I1129 05:29:17.031136 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:17Z","lastTransitionTime":"2025-11-29T05:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:29:17 crc kubenswrapper[4594]: I1129 05:29:17.082899 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lzr56" Nov 29 05:29:17 crc kubenswrapper[4594]: E1129 05:29:17.083023 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lzr56" podUID="217088b9-a48b-40c7-8d83-f9ff0eb24908" Nov 29 05:29:17 crc kubenswrapper[4594]: I1129 05:29:17.132585 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:17 crc kubenswrapper[4594]: I1129 05:29:17.132617 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:17 crc kubenswrapper[4594]: I1129 05:29:17.132627 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:17 crc kubenswrapper[4594]: I1129 05:29:17.132639 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:17 crc kubenswrapper[4594]: I1129 05:29:17.132648 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:17Z","lastTransitionTime":"2025-11-29T05:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:17 crc kubenswrapper[4594]: I1129 05:29:17.234823 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:17 crc kubenswrapper[4594]: I1129 05:29:17.234867 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:17 crc kubenswrapper[4594]: I1129 05:29:17.234878 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:17 crc kubenswrapper[4594]: I1129 05:29:17.234891 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:17 crc kubenswrapper[4594]: I1129 05:29:17.234901 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:17Z","lastTransitionTime":"2025-11-29T05:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:17 crc kubenswrapper[4594]: I1129 05:29:17.336920 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:17 crc kubenswrapper[4594]: I1129 05:29:17.337051 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:17 crc kubenswrapper[4594]: I1129 05:29:17.337131 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:17 crc kubenswrapper[4594]: I1129 05:29:17.337244 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:17 crc kubenswrapper[4594]: I1129 05:29:17.337368 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:17Z","lastTransitionTime":"2025-11-29T05:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:17 crc kubenswrapper[4594]: I1129 05:29:17.438648 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:17 crc kubenswrapper[4594]: I1129 05:29:17.438802 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:17 crc kubenswrapper[4594]: I1129 05:29:17.438877 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:17 crc kubenswrapper[4594]: I1129 05:29:17.438948 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:17 crc kubenswrapper[4594]: I1129 05:29:17.439005 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:17Z","lastTransitionTime":"2025-11-29T05:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:17 crc kubenswrapper[4594]: I1129 05:29:17.540069 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:17 crc kubenswrapper[4594]: I1129 05:29:17.540103 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:17 crc kubenswrapper[4594]: I1129 05:29:17.540113 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:17 crc kubenswrapper[4594]: I1129 05:29:17.540127 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:17 crc kubenswrapper[4594]: I1129 05:29:17.540137 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:17Z","lastTransitionTime":"2025-11-29T05:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:17 crc kubenswrapper[4594]: I1129 05:29:17.641881 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:17 crc kubenswrapper[4594]: I1129 05:29:17.641923 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:17 crc kubenswrapper[4594]: I1129 05:29:17.641936 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:17 crc kubenswrapper[4594]: I1129 05:29:17.641952 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:17 crc kubenswrapper[4594]: I1129 05:29:17.641964 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:17Z","lastTransitionTime":"2025-11-29T05:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:17 crc kubenswrapper[4594]: I1129 05:29:17.743767 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:17 crc kubenswrapper[4594]: I1129 05:29:17.743800 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:17 crc kubenswrapper[4594]: I1129 05:29:17.743809 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:17 crc kubenswrapper[4594]: I1129 05:29:17.743821 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:17 crc kubenswrapper[4594]: I1129 05:29:17.743830 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:17Z","lastTransitionTime":"2025-11-29T05:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:17 crc kubenswrapper[4594]: I1129 05:29:17.845704 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:17 crc kubenswrapper[4594]: I1129 05:29:17.845736 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:17 crc kubenswrapper[4594]: I1129 05:29:17.845744 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:17 crc kubenswrapper[4594]: I1129 05:29:17.845758 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:17 crc kubenswrapper[4594]: I1129 05:29:17.845768 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:17Z","lastTransitionTime":"2025-11-29T05:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:17 crc kubenswrapper[4594]: I1129 05:29:17.947938 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:17 crc kubenswrapper[4594]: I1129 05:29:17.947969 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:17 crc kubenswrapper[4594]: I1129 05:29:17.947977 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:17 crc kubenswrapper[4594]: I1129 05:29:17.947989 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:17 crc kubenswrapper[4594]: I1129 05:29:17.947998 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:17Z","lastTransitionTime":"2025-11-29T05:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:17 crc kubenswrapper[4594]: I1129 05:29:17.980319 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 05:29:17 crc kubenswrapper[4594]: I1129 05:29:17.980373 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 05:29:17 crc kubenswrapper[4594]: I1129 05:29:17.980398 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 05:29:17 crc kubenswrapper[4594]: E1129 05:29:17.980511 4594 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 29 05:29:17 crc kubenswrapper[4594]: E1129 05:29:17.980531 4594 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 29 05:29:17 crc kubenswrapper[4594]: E1129 05:29:17.980561 4594 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 05:30:21.980533238 +0000 UTC m=+146.221042488 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 05:29:17 crc kubenswrapper[4594]: E1129 05:29:17.980603 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-29 05:30:21.98058754 +0000 UTC m=+146.221096790 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 29 05:29:17 crc kubenswrapper[4594]: E1129 05:29:17.980625 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-29 05:30:21.980614701 +0000 UTC m=+146.221123941 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 29 05:29:18 crc kubenswrapper[4594]: I1129 05:29:18.050206 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:18 crc kubenswrapper[4594]: I1129 05:29:18.050323 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:18 crc kubenswrapper[4594]: I1129 05:29:18.050397 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:18 crc kubenswrapper[4594]: I1129 05:29:18.050463 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:18 crc kubenswrapper[4594]: I1129 05:29:18.050526 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:18Z","lastTransitionTime":"2025-11-29T05:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:18 crc kubenswrapper[4594]: I1129 05:29:18.081061 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 05:29:18 crc kubenswrapper[4594]: I1129 05:29:18.081119 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 05:29:18 crc kubenswrapper[4594]: E1129 05:29:18.081225 4594 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 29 05:29:18 crc kubenswrapper[4594]: E1129 05:29:18.081240 4594 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 29 05:29:18 crc kubenswrapper[4594]: E1129 05:29:18.081277 4594 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 29 05:29:18 crc kubenswrapper[4594]: E1129 05:29:18.081289 4594 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 29 
05:29:18 crc kubenswrapper[4594]: E1129 05:29:18.081321 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-29 05:30:22.081310499 +0000 UTC m=+146.321819720 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 29 05:29:18 crc kubenswrapper[4594]: E1129 05:29:18.081244 4594 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 29 05:29:18 crc kubenswrapper[4594]: E1129 05:29:18.081356 4594 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 29 05:29:18 crc kubenswrapper[4594]: E1129 05:29:18.081383 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-29 05:30:22.081375331 +0000 UTC m=+146.321884551 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 29 05:29:18 crc kubenswrapper[4594]: I1129 05:29:18.083434 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 05:29:18 crc kubenswrapper[4594]: E1129 05:29:18.083518 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 05:29:18 crc kubenswrapper[4594]: I1129 05:29:18.083604 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 05:29:18 crc kubenswrapper[4594]: E1129 05:29:18.083664 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 05:29:18 crc kubenswrapper[4594]: I1129 05:29:18.083742 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 05:29:18 crc kubenswrapper[4594]: E1129 05:29:18.083880 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 05:29:18 crc kubenswrapper[4594]: I1129 05:29:18.114648 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:18 crc kubenswrapper[4594]: I1129 05:29:18.114827 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:18 crc kubenswrapper[4594]: I1129 05:29:18.114845 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:18 crc kubenswrapper[4594]: I1129 05:29:18.114855 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:18 crc kubenswrapper[4594]: I1129 05:29:18.114863 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:18Z","lastTransitionTime":"2025-11-29T05:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:18 crc kubenswrapper[4594]: E1129 05:29:18.128346 4594 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:29:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:29:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:29:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:29:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:29:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:29:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:29:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:29:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"278d67da-c900-4140-8f95-1590a0940f46\\\",\\\"systemUUID\\\":\\\"64455698-ce5a-4229-a093-dc9c12114354\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:18Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:18 crc kubenswrapper[4594]: I1129 05:29:18.131559 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:18 crc kubenswrapper[4594]: I1129 05:29:18.131581 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:18 crc kubenswrapper[4594]: I1129 05:29:18.131589 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:18 crc kubenswrapper[4594]: I1129 05:29:18.131603 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:18 crc kubenswrapper[4594]: I1129 05:29:18.131613 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:18Z","lastTransitionTime":"2025-11-29T05:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:18 crc kubenswrapper[4594]: E1129 05:29:18.155697 4594 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:29:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:29:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:29:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:29:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:29:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:29:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:29:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:29:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"278d67da-c900-4140-8f95-1590a0940f46\\\",\\\"systemUUID\\\":\\\"64455698-ce5a-4229-a093-dc9c12114354\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:18Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:18 crc kubenswrapper[4594]: I1129 05:29:18.158765 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:18 crc kubenswrapper[4594]: I1129 05:29:18.158904 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:18 crc kubenswrapper[4594]: I1129 05:29:18.158979 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:18 crc kubenswrapper[4594]: I1129 05:29:18.159048 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:18 crc kubenswrapper[4594]: I1129 05:29:18.159114 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:18Z","lastTransitionTime":"2025-11-29T05:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:18 crc kubenswrapper[4594]: E1129 05:29:18.169529 4594 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:29:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:29:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:29:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:29:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:29:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:29:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:29:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:29:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"278d67da-c900-4140-8f95-1590a0940f46\\\",\\\"systemUUID\\\":\\\"64455698-ce5a-4229-a093-dc9c12114354\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:18Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:18 crc kubenswrapper[4594]: I1129 05:29:18.172810 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:18 crc kubenswrapper[4594]: I1129 05:29:18.172915 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:18 crc kubenswrapper[4594]: I1129 05:29:18.172984 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:18 crc kubenswrapper[4594]: I1129 05:29:18.173045 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:18 crc kubenswrapper[4594]: I1129 05:29:18.173109 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:18Z","lastTransitionTime":"2025-11-29T05:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:18 crc kubenswrapper[4594]: E1129 05:29:18.182036 4594 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:29:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:29:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:29:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:29:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:29:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:29:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:29:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:29:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"278d67da-c900-4140-8f95-1590a0940f46\\\",\\\"systemUUID\\\":\\\"64455698-ce5a-4229-a093-dc9c12114354\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:18Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:18 crc kubenswrapper[4594]: I1129 05:29:18.184287 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:18 crc kubenswrapper[4594]: I1129 05:29:18.184386 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:18 crc kubenswrapper[4594]: I1129 05:29:18.184459 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:18 crc kubenswrapper[4594]: I1129 05:29:18.184520 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:18 crc kubenswrapper[4594]: I1129 05:29:18.184582 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:18Z","lastTransitionTime":"2025-11-29T05:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:18 crc kubenswrapper[4594]: E1129 05:29:18.196646 4594 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:29:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:29:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:29:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:29:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:29:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:29:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:29:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T05:29:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"278d67da-c900-4140-8f95-1590a0940f46\\\",\\\"systemUUID\\\":\\\"64455698-ce5a-4229-a093-dc9c12114354\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T05:29:18Z is after 2025-08-24T17:21:41Z" Nov 29 05:29:18 crc kubenswrapper[4594]: E1129 05:29:18.196758 4594 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 29 05:29:18 crc kubenswrapper[4594]: I1129 05:29:18.197908 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:18 crc kubenswrapper[4594]: I1129 05:29:18.197939 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:18 crc kubenswrapper[4594]: I1129 05:29:18.197950 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:18 crc kubenswrapper[4594]: I1129 05:29:18.197968 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:18 crc kubenswrapper[4594]: I1129 05:29:18.197979 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:18Z","lastTransitionTime":"2025-11-29T05:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:18 crc kubenswrapper[4594]: I1129 05:29:18.300339 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:18 crc kubenswrapper[4594]: I1129 05:29:18.300447 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:18 crc kubenswrapper[4594]: I1129 05:29:18.300536 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:18 crc kubenswrapper[4594]: I1129 05:29:18.300604 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:18 crc kubenswrapper[4594]: I1129 05:29:18.300660 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:18Z","lastTransitionTime":"2025-11-29T05:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:18 crc kubenswrapper[4594]: I1129 05:29:18.402734 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:18 crc kubenswrapper[4594]: I1129 05:29:18.402775 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:18 crc kubenswrapper[4594]: I1129 05:29:18.402785 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:18 crc kubenswrapper[4594]: I1129 05:29:18.402802 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:18 crc kubenswrapper[4594]: I1129 05:29:18.402813 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:18Z","lastTransitionTime":"2025-11-29T05:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:18 crc kubenswrapper[4594]: I1129 05:29:18.504638 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:18 crc kubenswrapper[4594]: I1129 05:29:18.504744 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:18 crc kubenswrapper[4594]: I1129 05:29:18.504822 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:18 crc kubenswrapper[4594]: I1129 05:29:18.504907 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:18 crc kubenswrapper[4594]: I1129 05:29:18.504963 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:18Z","lastTransitionTime":"2025-11-29T05:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:18 crc kubenswrapper[4594]: I1129 05:29:18.606801 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:18 crc kubenswrapper[4594]: I1129 05:29:18.606911 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:18 crc kubenswrapper[4594]: I1129 05:29:18.606978 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:18 crc kubenswrapper[4594]: I1129 05:29:18.607051 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:18 crc kubenswrapper[4594]: I1129 05:29:18.607110 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:18Z","lastTransitionTime":"2025-11-29T05:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:18 crc kubenswrapper[4594]: I1129 05:29:18.709107 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:18 crc kubenswrapper[4594]: I1129 05:29:18.709145 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:18 crc kubenswrapper[4594]: I1129 05:29:18.709156 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:18 crc kubenswrapper[4594]: I1129 05:29:18.709176 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:18 crc kubenswrapper[4594]: I1129 05:29:18.709188 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:18Z","lastTransitionTime":"2025-11-29T05:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:18 crc kubenswrapper[4594]: I1129 05:29:18.811108 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:18 crc kubenswrapper[4594]: I1129 05:29:18.811133 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:18 crc kubenswrapper[4594]: I1129 05:29:18.811147 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:18 crc kubenswrapper[4594]: I1129 05:29:18.811158 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:18 crc kubenswrapper[4594]: I1129 05:29:18.811168 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:18Z","lastTransitionTime":"2025-11-29T05:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:18 crc kubenswrapper[4594]: I1129 05:29:18.913332 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:18 crc kubenswrapper[4594]: I1129 05:29:18.913387 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:18 crc kubenswrapper[4594]: I1129 05:29:18.913401 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:18 crc kubenswrapper[4594]: I1129 05:29:18.913420 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:18 crc kubenswrapper[4594]: I1129 05:29:18.913431 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:18Z","lastTransitionTime":"2025-11-29T05:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:19 crc kubenswrapper[4594]: I1129 05:29:19.015118 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:19 crc kubenswrapper[4594]: I1129 05:29:19.015179 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:19 crc kubenswrapper[4594]: I1129 05:29:19.015189 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:19 crc kubenswrapper[4594]: I1129 05:29:19.015200 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:19 crc kubenswrapper[4594]: I1129 05:29:19.015207 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:19Z","lastTransitionTime":"2025-11-29T05:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:29:19 crc kubenswrapper[4594]: I1129 05:29:19.082633 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lzr56" Nov 29 05:29:19 crc kubenswrapper[4594]: E1129 05:29:19.082941 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lzr56" podUID="217088b9-a48b-40c7-8d83-f9ff0eb24908" Nov 29 05:29:19 crc kubenswrapper[4594]: I1129 05:29:19.096341 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Nov 29 05:29:19 crc kubenswrapper[4594]: I1129 05:29:19.117496 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:19 crc kubenswrapper[4594]: I1129 05:29:19.117524 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:19 crc kubenswrapper[4594]: I1129 05:29:19.117551 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:19 crc kubenswrapper[4594]: I1129 05:29:19.117563 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:19 crc kubenswrapper[4594]: I1129 05:29:19.117570 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:19Z","lastTransitionTime":"2025-11-29T05:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:19 crc kubenswrapper[4594]: I1129 05:29:19.220082 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:19 crc kubenswrapper[4594]: I1129 05:29:19.220279 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:19 crc kubenswrapper[4594]: I1129 05:29:19.220362 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:19 crc kubenswrapper[4594]: I1129 05:29:19.220434 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:19 crc kubenswrapper[4594]: I1129 05:29:19.220491 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:19Z","lastTransitionTime":"2025-11-29T05:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:19 crc kubenswrapper[4594]: I1129 05:29:19.322380 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:19 crc kubenswrapper[4594]: I1129 05:29:19.322575 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:19 crc kubenswrapper[4594]: I1129 05:29:19.322653 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:19 crc kubenswrapper[4594]: I1129 05:29:19.322718 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:19 crc kubenswrapper[4594]: I1129 05:29:19.322778 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:19Z","lastTransitionTime":"2025-11-29T05:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:19 crc kubenswrapper[4594]: I1129 05:29:19.424348 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:19 crc kubenswrapper[4594]: I1129 05:29:19.424525 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:19 crc kubenswrapper[4594]: I1129 05:29:19.424596 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:19 crc kubenswrapper[4594]: I1129 05:29:19.424666 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:19 crc kubenswrapper[4594]: I1129 05:29:19.424727 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:19Z","lastTransitionTime":"2025-11-29T05:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:19 crc kubenswrapper[4594]: I1129 05:29:19.526217 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:19 crc kubenswrapper[4594]: I1129 05:29:19.526442 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:19 crc kubenswrapper[4594]: I1129 05:29:19.526507 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:19 crc kubenswrapper[4594]: I1129 05:29:19.526567 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:19 crc kubenswrapper[4594]: I1129 05:29:19.526619 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:19Z","lastTransitionTime":"2025-11-29T05:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:19 crc kubenswrapper[4594]: I1129 05:29:19.628560 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:19 crc kubenswrapper[4594]: I1129 05:29:19.628609 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:19 crc kubenswrapper[4594]: I1129 05:29:19.628623 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:19 crc kubenswrapper[4594]: I1129 05:29:19.628645 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:19 crc kubenswrapper[4594]: I1129 05:29:19.628657 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:19Z","lastTransitionTime":"2025-11-29T05:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:19 crc kubenswrapper[4594]: I1129 05:29:19.730360 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:19 crc kubenswrapper[4594]: I1129 05:29:19.730395 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:19 crc kubenswrapper[4594]: I1129 05:29:19.730406 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:19 crc kubenswrapper[4594]: I1129 05:29:19.730422 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:19 crc kubenswrapper[4594]: I1129 05:29:19.730433 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:19Z","lastTransitionTime":"2025-11-29T05:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:19 crc kubenswrapper[4594]: I1129 05:29:19.832394 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:19 crc kubenswrapper[4594]: I1129 05:29:19.832427 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:19 crc kubenswrapper[4594]: I1129 05:29:19.832437 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:19 crc kubenswrapper[4594]: I1129 05:29:19.832450 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:19 crc kubenswrapper[4594]: I1129 05:29:19.832458 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:19Z","lastTransitionTime":"2025-11-29T05:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:19 crc kubenswrapper[4594]: I1129 05:29:19.934017 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:19 crc kubenswrapper[4594]: I1129 05:29:19.934054 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:19 crc kubenswrapper[4594]: I1129 05:29:19.934064 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:19 crc kubenswrapper[4594]: I1129 05:29:19.934080 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:19 crc kubenswrapper[4594]: I1129 05:29:19.934091 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:19Z","lastTransitionTime":"2025-11-29T05:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:20 crc kubenswrapper[4594]: I1129 05:29:20.036279 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:20 crc kubenswrapper[4594]: I1129 05:29:20.036319 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:20 crc kubenswrapper[4594]: I1129 05:29:20.036330 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:20 crc kubenswrapper[4594]: I1129 05:29:20.036342 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:20 crc kubenswrapper[4594]: I1129 05:29:20.036357 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:20Z","lastTransitionTime":"2025-11-29T05:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:29:20 crc kubenswrapper[4594]: I1129 05:29:20.083236 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 05:29:20 crc kubenswrapper[4594]: I1129 05:29:20.083274 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 05:29:20 crc kubenswrapper[4594]: I1129 05:29:20.083301 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 05:29:20 crc kubenswrapper[4594]: E1129 05:29:20.083358 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 05:29:20 crc kubenswrapper[4594]: E1129 05:29:20.083421 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 05:29:20 crc kubenswrapper[4594]: E1129 05:29:20.083487 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 05:29:20 crc kubenswrapper[4594]: I1129 05:29:20.138807 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:20 crc kubenswrapper[4594]: I1129 05:29:20.138861 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:20 crc kubenswrapper[4594]: I1129 05:29:20.138870 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:20 crc kubenswrapper[4594]: I1129 05:29:20.138885 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:20 crc kubenswrapper[4594]: I1129 05:29:20.138894 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:20Z","lastTransitionTime":"2025-11-29T05:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:20 crc kubenswrapper[4594]: I1129 05:29:20.240807 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:20 crc kubenswrapper[4594]: I1129 05:29:20.240849 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:20 crc kubenswrapper[4594]: I1129 05:29:20.240860 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:20 crc kubenswrapper[4594]: I1129 05:29:20.240874 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:20 crc kubenswrapper[4594]: I1129 05:29:20.240882 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:20Z","lastTransitionTime":"2025-11-29T05:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:20 crc kubenswrapper[4594]: I1129 05:29:20.343771 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:20 crc kubenswrapper[4594]: I1129 05:29:20.343805 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:20 crc kubenswrapper[4594]: I1129 05:29:20.343814 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:20 crc kubenswrapper[4594]: I1129 05:29:20.343830 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:20 crc kubenswrapper[4594]: I1129 05:29:20.343839 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:20Z","lastTransitionTime":"2025-11-29T05:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:20 crc kubenswrapper[4594]: I1129 05:29:20.445628 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:20 crc kubenswrapper[4594]: I1129 05:29:20.445663 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:20 crc kubenswrapper[4594]: I1129 05:29:20.445711 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:20 crc kubenswrapper[4594]: I1129 05:29:20.445730 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:20 crc kubenswrapper[4594]: I1129 05:29:20.445741 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:20Z","lastTransitionTime":"2025-11-29T05:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:20 crc kubenswrapper[4594]: I1129 05:29:20.547031 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:20 crc kubenswrapper[4594]: I1129 05:29:20.547055 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:20 crc kubenswrapper[4594]: I1129 05:29:20.547063 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:20 crc kubenswrapper[4594]: I1129 05:29:20.547073 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:20 crc kubenswrapper[4594]: I1129 05:29:20.547080 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:20Z","lastTransitionTime":"2025-11-29T05:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:20 crc kubenswrapper[4594]: I1129 05:29:20.648483 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:20 crc kubenswrapper[4594]: I1129 05:29:20.648522 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:20 crc kubenswrapper[4594]: I1129 05:29:20.648532 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:20 crc kubenswrapper[4594]: I1129 05:29:20.648546 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:20 crc kubenswrapper[4594]: I1129 05:29:20.648554 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:20Z","lastTransitionTime":"2025-11-29T05:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:20 crc kubenswrapper[4594]: I1129 05:29:20.751473 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:20 crc kubenswrapper[4594]: I1129 05:29:20.751533 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:20 crc kubenswrapper[4594]: I1129 05:29:20.751542 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:20 crc kubenswrapper[4594]: I1129 05:29:20.751555 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:20 crc kubenswrapper[4594]: I1129 05:29:20.751564 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:20Z","lastTransitionTime":"2025-11-29T05:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:20 crc kubenswrapper[4594]: I1129 05:29:20.854185 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:20 crc kubenswrapper[4594]: I1129 05:29:20.854211 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:20 crc kubenswrapper[4594]: I1129 05:29:20.854219 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:20 crc kubenswrapper[4594]: I1129 05:29:20.854231 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:20 crc kubenswrapper[4594]: I1129 05:29:20.854239 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:20Z","lastTransitionTime":"2025-11-29T05:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:20 crc kubenswrapper[4594]: I1129 05:29:20.956061 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:20 crc kubenswrapper[4594]: I1129 05:29:20.956099 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:20 crc kubenswrapper[4594]: I1129 05:29:20.956122 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:20 crc kubenswrapper[4594]: I1129 05:29:20.956136 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:20 crc kubenswrapper[4594]: I1129 05:29:20.956146 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:20Z","lastTransitionTime":"2025-11-29T05:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:21 crc kubenswrapper[4594]: I1129 05:29:21.057783 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:21 crc kubenswrapper[4594]: I1129 05:29:21.057816 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:21 crc kubenswrapper[4594]: I1129 05:29:21.057826 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:21 crc kubenswrapper[4594]: I1129 05:29:21.057841 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:21 crc kubenswrapper[4594]: I1129 05:29:21.057862 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:21Z","lastTransitionTime":"2025-11-29T05:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:29:21 crc kubenswrapper[4594]: I1129 05:29:21.083216 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lzr56" Nov 29 05:29:21 crc kubenswrapper[4594]: E1129 05:29:21.083345 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lzr56" podUID="217088b9-a48b-40c7-8d83-f9ff0eb24908" Nov 29 05:29:21 crc kubenswrapper[4594]: I1129 05:29:21.159855 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:21 crc kubenswrapper[4594]: I1129 05:29:21.159883 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:21 crc kubenswrapper[4594]: I1129 05:29:21.159893 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:21 crc kubenswrapper[4594]: I1129 05:29:21.159902 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:21 crc kubenswrapper[4594]: I1129 05:29:21.159911 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:21Z","lastTransitionTime":"2025-11-29T05:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:21 crc kubenswrapper[4594]: I1129 05:29:21.261658 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:21 crc kubenswrapper[4594]: I1129 05:29:21.261691 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:21 crc kubenswrapper[4594]: I1129 05:29:21.261701 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:21 crc kubenswrapper[4594]: I1129 05:29:21.261715 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:21 crc kubenswrapper[4594]: I1129 05:29:21.261726 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:21Z","lastTransitionTime":"2025-11-29T05:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:21 crc kubenswrapper[4594]: I1129 05:29:21.363350 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:21 crc kubenswrapper[4594]: I1129 05:29:21.363383 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:21 crc kubenswrapper[4594]: I1129 05:29:21.363392 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:21 crc kubenswrapper[4594]: I1129 05:29:21.363407 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:21 crc kubenswrapper[4594]: I1129 05:29:21.363416 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:21Z","lastTransitionTime":"2025-11-29T05:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:21 crc kubenswrapper[4594]: I1129 05:29:21.465289 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:21 crc kubenswrapper[4594]: I1129 05:29:21.465313 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:21 crc kubenswrapper[4594]: I1129 05:29:21.465320 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:21 crc kubenswrapper[4594]: I1129 05:29:21.465332 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:21 crc kubenswrapper[4594]: I1129 05:29:21.465340 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:21Z","lastTransitionTime":"2025-11-29T05:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:21 crc kubenswrapper[4594]: I1129 05:29:21.567412 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:21 crc kubenswrapper[4594]: I1129 05:29:21.567441 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:21 crc kubenswrapper[4594]: I1129 05:29:21.567451 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:21 crc kubenswrapper[4594]: I1129 05:29:21.567481 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:21 crc kubenswrapper[4594]: I1129 05:29:21.567488 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:21Z","lastTransitionTime":"2025-11-29T05:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:21 crc kubenswrapper[4594]: I1129 05:29:21.669099 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:21 crc kubenswrapper[4594]: I1129 05:29:21.669125 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:21 crc kubenswrapper[4594]: I1129 05:29:21.669134 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:21 crc kubenswrapper[4594]: I1129 05:29:21.669145 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:21 crc kubenswrapper[4594]: I1129 05:29:21.669154 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:21Z","lastTransitionTime":"2025-11-29T05:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:21 crc kubenswrapper[4594]: I1129 05:29:21.770476 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:21 crc kubenswrapper[4594]: I1129 05:29:21.770506 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:21 crc kubenswrapper[4594]: I1129 05:29:21.770516 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:21 crc kubenswrapper[4594]: I1129 05:29:21.770544 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:21 crc kubenswrapper[4594]: I1129 05:29:21.770551 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:21Z","lastTransitionTime":"2025-11-29T05:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:21 crc kubenswrapper[4594]: I1129 05:29:21.872069 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:21 crc kubenswrapper[4594]: I1129 05:29:21.872106 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:21 crc kubenswrapper[4594]: I1129 05:29:21.872113 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:21 crc kubenswrapper[4594]: I1129 05:29:21.872126 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:21 crc kubenswrapper[4594]: I1129 05:29:21.872157 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:21Z","lastTransitionTime":"2025-11-29T05:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:21 crc kubenswrapper[4594]: I1129 05:29:21.973905 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:21 crc kubenswrapper[4594]: I1129 05:29:21.973939 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:21 crc kubenswrapper[4594]: I1129 05:29:21.973948 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:21 crc kubenswrapper[4594]: I1129 05:29:21.973957 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:21 crc kubenswrapper[4594]: I1129 05:29:21.973964 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:21Z","lastTransitionTime":"2025-11-29T05:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:22 crc kubenswrapper[4594]: I1129 05:29:22.076062 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:22 crc kubenswrapper[4594]: I1129 05:29:22.076092 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:22 crc kubenswrapper[4594]: I1129 05:29:22.076101 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:22 crc kubenswrapper[4594]: I1129 05:29:22.076112 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:22 crc kubenswrapper[4594]: I1129 05:29:22.076119 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:22Z","lastTransitionTime":"2025-11-29T05:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:29:22 crc kubenswrapper[4594]: I1129 05:29:22.083380 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 05:29:22 crc kubenswrapper[4594]: E1129 05:29:22.083461 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 05:29:22 crc kubenswrapper[4594]: I1129 05:29:22.083490 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 05:29:22 crc kubenswrapper[4594]: I1129 05:29:22.083573 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 05:29:22 crc kubenswrapper[4594]: E1129 05:29:22.083838 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 05:29:22 crc kubenswrapper[4594]: E1129 05:29:22.084146 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 05:29:22 crc kubenswrapper[4594]: I1129 05:29:22.177980 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:22 crc kubenswrapper[4594]: I1129 05:29:22.178009 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:22 crc kubenswrapper[4594]: I1129 05:29:22.178017 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:22 crc kubenswrapper[4594]: I1129 05:29:22.178027 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:22 crc kubenswrapper[4594]: I1129 05:29:22.178034 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:22Z","lastTransitionTime":"2025-11-29T05:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:22 crc kubenswrapper[4594]: I1129 05:29:22.280000 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:22 crc kubenswrapper[4594]: I1129 05:29:22.280101 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:22 crc kubenswrapper[4594]: I1129 05:29:22.280167 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:22 crc kubenswrapper[4594]: I1129 05:29:22.280236 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:22 crc kubenswrapper[4594]: I1129 05:29:22.280326 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:22Z","lastTransitionTime":"2025-11-29T05:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:22 crc kubenswrapper[4594]: I1129 05:29:22.381975 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:22 crc kubenswrapper[4594]: I1129 05:29:22.382009 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:22 crc kubenswrapper[4594]: I1129 05:29:22.382020 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:22 crc kubenswrapper[4594]: I1129 05:29:22.382032 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:22 crc kubenswrapper[4594]: I1129 05:29:22.382041 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:22Z","lastTransitionTime":"2025-11-29T05:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:22 crc kubenswrapper[4594]: I1129 05:29:22.484011 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:22 crc kubenswrapper[4594]: I1129 05:29:22.484039 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:22 crc kubenswrapper[4594]: I1129 05:29:22.484049 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:22 crc kubenswrapper[4594]: I1129 05:29:22.484059 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:22 crc kubenswrapper[4594]: I1129 05:29:22.484067 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:22Z","lastTransitionTime":"2025-11-29T05:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:22 crc kubenswrapper[4594]: I1129 05:29:22.585832 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:22 crc kubenswrapper[4594]: I1129 05:29:22.585873 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:22 crc kubenswrapper[4594]: I1129 05:29:22.585884 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:22 crc kubenswrapper[4594]: I1129 05:29:22.585897 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:22 crc kubenswrapper[4594]: I1129 05:29:22.585906 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:22Z","lastTransitionTime":"2025-11-29T05:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:22 crc kubenswrapper[4594]: I1129 05:29:22.688247 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:22 crc kubenswrapper[4594]: I1129 05:29:22.688633 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:22 crc kubenswrapper[4594]: I1129 05:29:22.688714 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:22 crc kubenswrapper[4594]: I1129 05:29:22.688787 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:22 crc kubenswrapper[4594]: I1129 05:29:22.688858 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:22Z","lastTransitionTime":"2025-11-29T05:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:22 crc kubenswrapper[4594]: I1129 05:29:22.790810 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:22 crc kubenswrapper[4594]: I1129 05:29:22.790920 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:22 crc kubenswrapper[4594]: I1129 05:29:22.791009 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:22 crc kubenswrapper[4594]: I1129 05:29:22.791089 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:22 crc kubenswrapper[4594]: I1129 05:29:22.791150 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:22Z","lastTransitionTime":"2025-11-29T05:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:22 crc kubenswrapper[4594]: I1129 05:29:22.892655 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:22 crc kubenswrapper[4594]: I1129 05:29:22.892692 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:22 crc kubenswrapper[4594]: I1129 05:29:22.892710 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:22 crc kubenswrapper[4594]: I1129 05:29:22.892723 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:22 crc kubenswrapper[4594]: I1129 05:29:22.892732 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:22Z","lastTransitionTime":"2025-11-29T05:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:22 crc kubenswrapper[4594]: I1129 05:29:22.993974 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:22 crc kubenswrapper[4594]: I1129 05:29:22.994011 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:22 crc kubenswrapper[4594]: I1129 05:29:22.994022 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:22 crc kubenswrapper[4594]: I1129 05:29:22.994036 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:22 crc kubenswrapper[4594]: I1129 05:29:22.994046 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:22Z","lastTransitionTime":"2025-11-29T05:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:29:23 crc kubenswrapper[4594]: I1129 05:29:23.083353 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lzr56" Nov 29 05:29:23 crc kubenswrapper[4594]: I1129 05:29:23.083371 4594 scope.go:117] "RemoveContainer" containerID="1fb430ec13486fa454ce5a026c9035daca3b18f8f31f2304bb6c5eb544778a85" Nov 29 05:29:23 crc kubenswrapper[4594]: E1129 05:29:23.083458 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lzr56" podUID="217088b9-a48b-40c7-8d83-f9ff0eb24908" Nov 29 05:29:23 crc kubenswrapper[4594]: E1129 05:29:23.083575 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-lp4zm_openshift-ovn-kubernetes(08de5891-72ca-488c-80b3-6b54c8c1a66e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" podUID="08de5891-72ca-488c-80b3-6b54c8c1a66e" Nov 29 05:29:23 crc kubenswrapper[4594]: I1129 05:29:23.095502 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:23 crc kubenswrapper[4594]: I1129 05:29:23.095532 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:23 crc kubenswrapper[4594]: I1129 05:29:23.095543 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:23 crc kubenswrapper[4594]: I1129 05:29:23.095554 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:23 crc kubenswrapper[4594]: I1129 05:29:23.095562 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:23Z","lastTransitionTime":"2025-11-29T05:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:23 crc kubenswrapper[4594]: I1129 05:29:23.197454 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:23 crc kubenswrapper[4594]: I1129 05:29:23.197500 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:23 crc kubenswrapper[4594]: I1129 05:29:23.197510 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:23 crc kubenswrapper[4594]: I1129 05:29:23.197521 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:23 crc kubenswrapper[4594]: I1129 05:29:23.197529 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:23Z","lastTransitionTime":"2025-11-29T05:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:23 crc kubenswrapper[4594]: I1129 05:29:23.299521 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:23 crc kubenswrapper[4594]: I1129 05:29:23.299571 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:23 crc kubenswrapper[4594]: I1129 05:29:23.299579 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:23 crc kubenswrapper[4594]: I1129 05:29:23.299589 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:23 crc kubenswrapper[4594]: I1129 05:29:23.299596 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:23Z","lastTransitionTime":"2025-11-29T05:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:23 crc kubenswrapper[4594]: I1129 05:29:23.401434 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:23 crc kubenswrapper[4594]: I1129 05:29:23.401475 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:23 crc kubenswrapper[4594]: I1129 05:29:23.401484 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:23 crc kubenswrapper[4594]: I1129 05:29:23.401495 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:23 crc kubenswrapper[4594]: I1129 05:29:23.401502 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:23Z","lastTransitionTime":"2025-11-29T05:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:23 crc kubenswrapper[4594]: I1129 05:29:23.502773 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:23 crc kubenswrapper[4594]: I1129 05:29:23.502801 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:23 crc kubenswrapper[4594]: I1129 05:29:23.502809 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:23 crc kubenswrapper[4594]: I1129 05:29:23.502818 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:23 crc kubenswrapper[4594]: I1129 05:29:23.502825 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:23Z","lastTransitionTime":"2025-11-29T05:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:23 crc kubenswrapper[4594]: I1129 05:29:23.604824 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:23 crc kubenswrapper[4594]: I1129 05:29:23.604858 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:23 crc kubenswrapper[4594]: I1129 05:29:23.604867 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:23 crc kubenswrapper[4594]: I1129 05:29:23.604878 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:23 crc kubenswrapper[4594]: I1129 05:29:23.604886 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:23Z","lastTransitionTime":"2025-11-29T05:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:23 crc kubenswrapper[4594]: I1129 05:29:23.707126 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:23 crc kubenswrapper[4594]: I1129 05:29:23.707150 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:23 crc kubenswrapper[4594]: I1129 05:29:23.707158 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:23 crc kubenswrapper[4594]: I1129 05:29:23.707167 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:23 crc kubenswrapper[4594]: I1129 05:29:23.707173 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:23Z","lastTransitionTime":"2025-11-29T05:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:23 crc kubenswrapper[4594]: I1129 05:29:23.808861 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:23 crc kubenswrapper[4594]: I1129 05:29:23.808897 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:23 crc kubenswrapper[4594]: I1129 05:29:23.808906 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:23 crc kubenswrapper[4594]: I1129 05:29:23.808915 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:23 crc kubenswrapper[4594]: I1129 05:29:23.808922 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:23Z","lastTransitionTime":"2025-11-29T05:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:23 crc kubenswrapper[4594]: I1129 05:29:23.910457 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:23 crc kubenswrapper[4594]: I1129 05:29:23.910480 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:23 crc kubenswrapper[4594]: I1129 05:29:23.910488 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:23 crc kubenswrapper[4594]: I1129 05:29:23.910498 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:23 crc kubenswrapper[4594]: I1129 05:29:23.910508 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:23Z","lastTransitionTime":"2025-11-29T05:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:24 crc kubenswrapper[4594]: I1129 05:29:24.012060 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:24 crc kubenswrapper[4594]: I1129 05:29:24.012084 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:24 crc kubenswrapper[4594]: I1129 05:29:24.012095 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:24 crc kubenswrapper[4594]: I1129 05:29:24.012106 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:24 crc kubenswrapper[4594]: I1129 05:29:24.012115 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:24Z","lastTransitionTime":"2025-11-29T05:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:29:24 crc kubenswrapper[4594]: I1129 05:29:24.083215 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 05:29:24 crc kubenswrapper[4594]: I1129 05:29:24.083404 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 05:29:24 crc kubenswrapper[4594]: E1129 05:29:24.083812 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 05:29:24 crc kubenswrapper[4594]: I1129 05:29:24.083586 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 05:29:24 crc kubenswrapper[4594]: E1129 05:29:24.084021 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 05:29:24 crc kubenswrapper[4594]: E1129 05:29:24.083548 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 05:29:24 crc kubenswrapper[4594]: I1129 05:29:24.113946 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:24 crc kubenswrapper[4594]: I1129 05:29:24.113971 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:24 crc kubenswrapper[4594]: I1129 05:29:24.113979 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:24 crc kubenswrapper[4594]: I1129 05:29:24.113988 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:24 crc kubenswrapper[4594]: I1129 05:29:24.113995 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:24Z","lastTransitionTime":"2025-11-29T05:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:24 crc kubenswrapper[4594]: I1129 05:29:24.215554 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:24 crc kubenswrapper[4594]: I1129 05:29:24.215581 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:24 crc kubenswrapper[4594]: I1129 05:29:24.215591 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:24 crc kubenswrapper[4594]: I1129 05:29:24.215601 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:24 crc kubenswrapper[4594]: I1129 05:29:24.215609 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:24Z","lastTransitionTime":"2025-11-29T05:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:24 crc kubenswrapper[4594]: I1129 05:29:24.317298 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:24 crc kubenswrapper[4594]: I1129 05:29:24.317320 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:24 crc kubenswrapper[4594]: I1129 05:29:24.317328 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:24 crc kubenswrapper[4594]: I1129 05:29:24.317337 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:24 crc kubenswrapper[4594]: I1129 05:29:24.317344 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:24Z","lastTransitionTime":"2025-11-29T05:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:24 crc kubenswrapper[4594]: I1129 05:29:24.418394 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:24 crc kubenswrapper[4594]: I1129 05:29:24.418421 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:24 crc kubenswrapper[4594]: I1129 05:29:24.418430 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:24 crc kubenswrapper[4594]: I1129 05:29:24.418439 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:24 crc kubenswrapper[4594]: I1129 05:29:24.418448 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:24Z","lastTransitionTime":"2025-11-29T05:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:24 crc kubenswrapper[4594]: I1129 05:29:24.519889 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:24 crc kubenswrapper[4594]: I1129 05:29:24.519914 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:24 crc kubenswrapper[4594]: I1129 05:29:24.519924 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:24 crc kubenswrapper[4594]: I1129 05:29:24.519936 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:24 crc kubenswrapper[4594]: I1129 05:29:24.519945 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:24Z","lastTransitionTime":"2025-11-29T05:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:24 crc kubenswrapper[4594]: I1129 05:29:24.622189 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:24 crc kubenswrapper[4594]: I1129 05:29:24.622239 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:24 crc kubenswrapper[4594]: I1129 05:29:24.622281 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:24 crc kubenswrapper[4594]: I1129 05:29:24.622301 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:24 crc kubenswrapper[4594]: I1129 05:29:24.622316 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:24Z","lastTransitionTime":"2025-11-29T05:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:24 crc kubenswrapper[4594]: I1129 05:29:24.723715 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:24 crc kubenswrapper[4594]: I1129 05:29:24.723737 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:24 crc kubenswrapper[4594]: I1129 05:29:24.723746 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:24 crc kubenswrapper[4594]: I1129 05:29:24.723757 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:24 crc kubenswrapper[4594]: I1129 05:29:24.723764 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:24Z","lastTransitionTime":"2025-11-29T05:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:24 crc kubenswrapper[4594]: I1129 05:29:24.825305 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:24 crc kubenswrapper[4594]: I1129 05:29:24.825344 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:24 crc kubenswrapper[4594]: I1129 05:29:24.825354 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:24 crc kubenswrapper[4594]: I1129 05:29:24.825368 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:24 crc kubenswrapper[4594]: I1129 05:29:24.825378 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:24Z","lastTransitionTime":"2025-11-29T05:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:24 crc kubenswrapper[4594]: I1129 05:29:24.927144 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:24 crc kubenswrapper[4594]: I1129 05:29:24.927168 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:24 crc kubenswrapper[4594]: I1129 05:29:24.927177 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:24 crc kubenswrapper[4594]: I1129 05:29:24.927187 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:24 crc kubenswrapper[4594]: I1129 05:29:24.927194 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:24Z","lastTransitionTime":"2025-11-29T05:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:25 crc kubenswrapper[4594]: I1129 05:29:25.029178 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:25 crc kubenswrapper[4594]: I1129 05:29:25.029207 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:25 crc kubenswrapper[4594]: I1129 05:29:25.029214 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:25 crc kubenswrapper[4594]: I1129 05:29:25.029250 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:25 crc kubenswrapper[4594]: I1129 05:29:25.029272 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:25Z","lastTransitionTime":"2025-11-29T05:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:29:25 crc kubenswrapper[4594]: I1129 05:29:25.083187 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lzr56" Nov 29 05:29:25 crc kubenswrapper[4594]: E1129 05:29:25.083285 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lzr56" podUID="217088b9-a48b-40c7-8d83-f9ff0eb24908" Nov 29 05:29:25 crc kubenswrapper[4594]: I1129 05:29:25.130847 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:25 crc kubenswrapper[4594]: I1129 05:29:25.130883 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:25 crc kubenswrapper[4594]: I1129 05:29:25.130892 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:25 crc kubenswrapper[4594]: I1129 05:29:25.130902 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:25 crc kubenswrapper[4594]: I1129 05:29:25.130908 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:25Z","lastTransitionTime":"2025-11-29T05:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:25 crc kubenswrapper[4594]: I1129 05:29:25.232535 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:25 crc kubenswrapper[4594]: I1129 05:29:25.232564 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:25 crc kubenswrapper[4594]: I1129 05:29:25.232573 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:25 crc kubenswrapper[4594]: I1129 05:29:25.232582 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:25 crc kubenswrapper[4594]: I1129 05:29:25.232590 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:25Z","lastTransitionTime":"2025-11-29T05:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:25 crc kubenswrapper[4594]: I1129 05:29:25.334504 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:25 crc kubenswrapper[4594]: I1129 05:29:25.334532 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:25 crc kubenswrapper[4594]: I1129 05:29:25.334539 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:25 crc kubenswrapper[4594]: I1129 05:29:25.334548 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:25 crc kubenswrapper[4594]: I1129 05:29:25.334554 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:25Z","lastTransitionTime":"2025-11-29T05:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:25 crc kubenswrapper[4594]: I1129 05:29:25.437126 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:25 crc kubenswrapper[4594]: I1129 05:29:25.437153 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:25 crc kubenswrapper[4594]: I1129 05:29:25.437161 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:25 crc kubenswrapper[4594]: I1129 05:29:25.437170 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:25 crc kubenswrapper[4594]: I1129 05:29:25.437178 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:25Z","lastTransitionTime":"2025-11-29T05:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:25 crc kubenswrapper[4594]: I1129 05:29:25.539060 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:25 crc kubenswrapper[4594]: I1129 05:29:25.539084 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:25 crc kubenswrapper[4594]: I1129 05:29:25.539093 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:25 crc kubenswrapper[4594]: I1129 05:29:25.539103 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:25 crc kubenswrapper[4594]: I1129 05:29:25.539111 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:25Z","lastTransitionTime":"2025-11-29T05:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:25 crc kubenswrapper[4594]: I1129 05:29:25.640474 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:25 crc kubenswrapper[4594]: I1129 05:29:25.640499 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:25 crc kubenswrapper[4594]: I1129 05:29:25.640505 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:25 crc kubenswrapper[4594]: I1129 05:29:25.640515 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:25 crc kubenswrapper[4594]: I1129 05:29:25.640522 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:25Z","lastTransitionTime":"2025-11-29T05:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:25 crc kubenswrapper[4594]: I1129 05:29:25.742028 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:25 crc kubenswrapper[4594]: I1129 05:29:25.742062 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:25 crc kubenswrapper[4594]: I1129 05:29:25.742072 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:25 crc kubenswrapper[4594]: I1129 05:29:25.742083 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:25 crc kubenswrapper[4594]: I1129 05:29:25.742092 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:25Z","lastTransitionTime":"2025-11-29T05:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:25 crc kubenswrapper[4594]: I1129 05:29:25.843587 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:25 crc kubenswrapper[4594]: I1129 05:29:25.843617 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:25 crc kubenswrapper[4594]: I1129 05:29:25.843628 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:25 crc kubenswrapper[4594]: I1129 05:29:25.843637 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:25 crc kubenswrapper[4594]: I1129 05:29:25.843645 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:25Z","lastTransitionTime":"2025-11-29T05:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:25 crc kubenswrapper[4594]: I1129 05:29:25.944594 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:25 crc kubenswrapper[4594]: I1129 05:29:25.944622 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:25 crc kubenswrapper[4594]: I1129 05:29:25.944630 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:25 crc kubenswrapper[4594]: I1129 05:29:25.944638 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:25 crc kubenswrapper[4594]: I1129 05:29:25.944645 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:25Z","lastTransitionTime":"2025-11-29T05:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:26 crc kubenswrapper[4594]: I1129 05:29:26.045943 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:26 crc kubenswrapper[4594]: I1129 05:29:26.045975 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:26 crc kubenswrapper[4594]: I1129 05:29:26.045984 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:26 crc kubenswrapper[4594]: I1129 05:29:26.045995 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:26 crc kubenswrapper[4594]: I1129 05:29:26.046003 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:26Z","lastTransitionTime":"2025-11-29T05:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:29:26 crc kubenswrapper[4594]: I1129 05:29:26.082599 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 05:29:26 crc kubenswrapper[4594]: I1129 05:29:26.082626 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 05:29:26 crc kubenswrapper[4594]: I1129 05:29:26.082631 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 05:29:26 crc kubenswrapper[4594]: E1129 05:29:26.082687 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 05:29:26 crc kubenswrapper[4594]: E1129 05:29:26.082827 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 05:29:26 crc kubenswrapper[4594]: E1129 05:29:26.083109 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 05:29:26 crc kubenswrapper[4594]: I1129 05:29:26.123392 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podStartSLOduration=69.123368699 podStartE2EDuration="1m9.123368699s" podCreationTimestamp="2025-11-29 05:28:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:29:26.115987059 +0000 UTC m=+90.356496280" watchObservedRunningTime="2025-11-29 05:29:26.123368699 +0000 UTC m=+90.363877919" Nov 29 05:29:26 crc kubenswrapper[4594]: I1129 05:29:26.132593 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=66.132580934 podStartE2EDuration="1m6.132580934s" podCreationTimestamp="2025-11-29 05:28:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:29:26.132244661 +0000 UTC m=+90.372753881" watchObservedRunningTime="2025-11-29 05:29:26.132580934 +0000 UTC m=+90.373090154" Nov 29 05:29:26 crc kubenswrapper[4594]: I1129 05:29:26.132728 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-26zjk" podStartSLOduration=69.132724114 podStartE2EDuration="1m9.132724114s" podCreationTimestamp="2025-11-29 05:28:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:29:26.123598301 +0000 UTC m=+90.364107522" watchObservedRunningTime="2025-11-29 05:29:26.132724114 +0000 UTC m=+90.373233334" Nov 29 05:29:26 crc kubenswrapper[4594]: I1129 05:29:26.147300 4594 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:26 crc kubenswrapper[4594]: I1129 05:29:26.147329 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:26 crc kubenswrapper[4594]: I1129 05:29:26.147339 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:26 crc kubenswrapper[4594]: I1129 05:29:26.147350 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:26 crc kubenswrapper[4594]: I1129 05:29:26.147359 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:26Z","lastTransitionTime":"2025-11-29T05:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:26 crc kubenswrapper[4594]: I1129 05:29:26.149419 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=10.149410983 podStartE2EDuration="10.149410983s" podCreationTimestamp="2025-11-29 05:29:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:29:26.139765542 +0000 UTC m=+90.380274772" watchObservedRunningTime="2025-11-29 05:29:26.149410983 +0000 UTC m=+90.389920203" Nov 29 05:29:26 crc kubenswrapper[4594]: I1129 05:29:26.158078 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-87h4n" podStartSLOduration=69.158069937 podStartE2EDuration="1m9.158069937s" podCreationTimestamp="2025-11-29 05:28:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:29:26.158052274 +0000 UTC m=+90.398561493" watchObservedRunningTime="2025-11-29 05:29:26.158069937 +0000 UTC m=+90.398579157" Nov 29 05:29:26 crc kubenswrapper[4594]: I1129 05:29:26.178155 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=72.17813666 podStartE2EDuration="1m12.17813666s" podCreationTimestamp="2025-11-29 05:28:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:29:26.170071204 +0000 UTC m=+90.410580424" watchObservedRunningTime="2025-11-29 05:29:26.17813666 +0000 UTC m=+90.418645880" Nov 29 05:29:26 crc kubenswrapper[4594]: I1129 05:29:26.188210 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-7plzz" podStartSLOduration=69.188196482 
podStartE2EDuration="1m9.188196482s" podCreationTimestamp="2025-11-29 05:28:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:29:26.187434858 +0000 UTC m=+90.427944078" watchObservedRunningTime="2025-11-29 05:29:26.188196482 +0000 UTC m=+90.428705702" Nov 29 05:29:26 crc kubenswrapper[4594]: I1129 05:29:26.195708 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vn6fz" podStartSLOduration=69.195691104 podStartE2EDuration="1m9.195691104s" podCreationTimestamp="2025-11-29 05:28:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:29:26.195356193 +0000 UTC m=+90.435865413" watchObservedRunningTime="2025-11-29 05:29:26.195691104 +0000 UTC m=+90.436200323" Nov 29 05:29:26 crc kubenswrapper[4594]: I1129 05:29:26.221315 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=42.221298689 podStartE2EDuration="42.221298689s" podCreationTimestamp="2025-11-29 05:28:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:29:26.220386471 +0000 UTC m=+90.460895691" watchObservedRunningTime="2025-11-29 05:29:26.221298689 +0000 UTC m=+90.461807909" Nov 29 05:29:26 crc kubenswrapper[4594]: I1129 05:29:26.249649 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:26 crc kubenswrapper[4594]: I1129 05:29:26.249684 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:26 crc kubenswrapper[4594]: I1129 05:29:26.249694 4594 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:26 crc kubenswrapper[4594]: I1129 05:29:26.249707 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:26 crc kubenswrapper[4594]: I1129 05:29:26.249718 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:26Z","lastTransitionTime":"2025-11-29T05:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:29:26 crc kubenswrapper[4594]: I1129 05:29:26.284104 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-zwqcq" podStartSLOduration=69.284084067 podStartE2EDuration="1m9.284084067s" podCreationTimestamp="2025-11-29 05:28:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:29:26.266190235 +0000 UTC m=+90.506699455" watchObservedRunningTime="2025-11-29 05:29:26.284084067 +0000 UTC m=+90.524593287" Nov 29 05:29:26 crc kubenswrapper[4594]: I1129 05:29:26.284221 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=7.284215184 podStartE2EDuration="7.284215184s" podCreationTimestamp="2025-11-29 05:29:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:29:26.282807473 +0000 UTC m=+90.523316693" watchObservedRunningTime="2025-11-29 05:29:26.284215184 +0000 UTC m=+90.524724404" Nov 29 05:29:26 crc kubenswrapper[4594]: I1129 05:29:26.351989 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Nov 29 05:29:26 crc kubenswrapper[4594]: I1129 05:29:26.352017 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:26 crc kubenswrapper[4594]: I1129 05:29:26.352026 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:26 crc kubenswrapper[4594]: I1129 05:29:26.352039 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:26 crc kubenswrapper[4594]: I1129 05:29:26.352052 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:26Z","lastTransitionTime":"2025-11-29T05:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:26 crc kubenswrapper[4594]: I1129 05:29:26.453939 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:26 crc kubenswrapper[4594]: I1129 05:29:26.453983 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:26 crc kubenswrapper[4594]: I1129 05:29:26.453993 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:26 crc kubenswrapper[4594]: I1129 05:29:26.454007 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:26 crc kubenswrapper[4594]: I1129 05:29:26.454016 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:26Z","lastTransitionTime":"2025-11-29T05:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:26 crc kubenswrapper[4594]: I1129 05:29:26.556468 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:26 crc kubenswrapper[4594]: I1129 05:29:26.556515 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:26 crc kubenswrapper[4594]: I1129 05:29:26.556525 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:26 crc kubenswrapper[4594]: I1129 05:29:26.556542 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:26 crc kubenswrapper[4594]: I1129 05:29:26.556551 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:26Z","lastTransitionTime":"2025-11-29T05:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:26 crc kubenswrapper[4594]: I1129 05:29:26.658573 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:26 crc kubenswrapper[4594]: I1129 05:29:26.658608 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:26 crc kubenswrapper[4594]: I1129 05:29:26.658619 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:26 crc kubenswrapper[4594]: I1129 05:29:26.658636 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:26 crc kubenswrapper[4594]: I1129 05:29:26.658647 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:26Z","lastTransitionTime":"2025-11-29T05:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:26 crc kubenswrapper[4594]: I1129 05:29:26.760778 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:26 crc kubenswrapper[4594]: I1129 05:29:26.760809 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:26 crc kubenswrapper[4594]: I1129 05:29:26.760818 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:26 crc kubenswrapper[4594]: I1129 05:29:26.760831 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:26 crc kubenswrapper[4594]: I1129 05:29:26.760839 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:26Z","lastTransitionTime":"2025-11-29T05:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:26 crc kubenswrapper[4594]: I1129 05:29:26.862811 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:26 crc kubenswrapper[4594]: I1129 05:29:26.862839 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:26 crc kubenswrapper[4594]: I1129 05:29:26.862850 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:26 crc kubenswrapper[4594]: I1129 05:29:26.862872 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:26 crc kubenswrapper[4594]: I1129 05:29:26.862881 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:26Z","lastTransitionTime":"2025-11-29T05:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:26 crc kubenswrapper[4594]: I1129 05:29:26.964520 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:26 crc kubenswrapper[4594]: I1129 05:29:26.964549 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:26 crc kubenswrapper[4594]: I1129 05:29:26.964587 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:26 crc kubenswrapper[4594]: I1129 05:29:26.964600 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:26 crc kubenswrapper[4594]: I1129 05:29:26.964609 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:26Z","lastTransitionTime":"2025-11-29T05:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:27 crc kubenswrapper[4594]: I1129 05:29:27.067049 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:27 crc kubenswrapper[4594]: I1129 05:29:27.067079 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:27 crc kubenswrapper[4594]: I1129 05:29:27.067089 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:27 crc kubenswrapper[4594]: I1129 05:29:27.067100 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:27 crc kubenswrapper[4594]: I1129 05:29:27.067108 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:27Z","lastTransitionTime":"2025-11-29T05:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:29:27 crc kubenswrapper[4594]: I1129 05:29:27.083003 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lzr56" Nov 29 05:29:27 crc kubenswrapper[4594]: E1129 05:29:27.083107 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lzr56" podUID="217088b9-a48b-40c7-8d83-f9ff0eb24908" Nov 29 05:29:27 crc kubenswrapper[4594]: I1129 05:29:27.168902 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:27 crc kubenswrapper[4594]: I1129 05:29:27.168927 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:27 crc kubenswrapper[4594]: I1129 05:29:27.168938 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:27 crc kubenswrapper[4594]: I1129 05:29:27.168951 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:27 crc kubenswrapper[4594]: I1129 05:29:27.168981 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:27Z","lastTransitionTime":"2025-11-29T05:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:27 crc kubenswrapper[4594]: I1129 05:29:27.271060 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:27 crc kubenswrapper[4594]: I1129 05:29:27.271090 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:27 crc kubenswrapper[4594]: I1129 05:29:27.271099 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:27 crc kubenswrapper[4594]: I1129 05:29:27.271112 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:27 crc kubenswrapper[4594]: I1129 05:29:27.271122 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:27Z","lastTransitionTime":"2025-11-29T05:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:27 crc kubenswrapper[4594]: I1129 05:29:27.372918 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:27 crc kubenswrapper[4594]: I1129 05:29:27.372954 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:27 crc kubenswrapper[4594]: I1129 05:29:27.372962 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:27 crc kubenswrapper[4594]: I1129 05:29:27.372976 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:27 crc kubenswrapper[4594]: I1129 05:29:27.372987 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:27Z","lastTransitionTime":"2025-11-29T05:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:27 crc kubenswrapper[4594]: I1129 05:29:27.474989 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:27 crc kubenswrapper[4594]: I1129 05:29:27.475023 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:27 crc kubenswrapper[4594]: I1129 05:29:27.475032 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:27 crc kubenswrapper[4594]: I1129 05:29:27.475046 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:27 crc kubenswrapper[4594]: I1129 05:29:27.475054 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:27Z","lastTransitionTime":"2025-11-29T05:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:27 crc kubenswrapper[4594]: I1129 05:29:27.576564 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:27 crc kubenswrapper[4594]: I1129 05:29:27.576619 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:27 crc kubenswrapper[4594]: I1129 05:29:27.576638 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:27 crc kubenswrapper[4594]: I1129 05:29:27.576651 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:27 crc kubenswrapper[4594]: I1129 05:29:27.576661 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:27Z","lastTransitionTime":"2025-11-29T05:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:27 crc kubenswrapper[4594]: I1129 05:29:27.678786 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:27 crc kubenswrapper[4594]: I1129 05:29:27.678849 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:27 crc kubenswrapper[4594]: I1129 05:29:27.678858 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:27 crc kubenswrapper[4594]: I1129 05:29:27.678876 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:27 crc kubenswrapper[4594]: I1129 05:29:27.678895 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:27Z","lastTransitionTime":"2025-11-29T05:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:27 crc kubenswrapper[4594]: I1129 05:29:27.780216 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:27 crc kubenswrapper[4594]: I1129 05:29:27.780269 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:27 crc kubenswrapper[4594]: I1129 05:29:27.780279 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:27 crc kubenswrapper[4594]: I1129 05:29:27.780290 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:27 crc kubenswrapper[4594]: I1129 05:29:27.780298 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:27Z","lastTransitionTime":"2025-11-29T05:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:27 crc kubenswrapper[4594]: I1129 05:29:27.882173 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:27 crc kubenswrapper[4594]: I1129 05:29:27.882209 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:27 crc kubenswrapper[4594]: I1129 05:29:27.882221 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:27 crc kubenswrapper[4594]: I1129 05:29:27.882233 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:27 crc kubenswrapper[4594]: I1129 05:29:27.882296 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:27Z","lastTransitionTime":"2025-11-29T05:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:27 crc kubenswrapper[4594]: I1129 05:29:27.984469 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:27 crc kubenswrapper[4594]: I1129 05:29:27.984500 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:27 crc kubenswrapper[4594]: I1129 05:29:27.984509 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:27 crc kubenswrapper[4594]: I1129 05:29:27.984521 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:27 crc kubenswrapper[4594]: I1129 05:29:27.984529 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:27Z","lastTransitionTime":"2025-11-29T05:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:29:28 crc kubenswrapper[4594]: I1129 05:29:28.083085 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 05:29:28 crc kubenswrapper[4594]: I1129 05:29:28.083136 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 05:29:28 crc kubenswrapper[4594]: I1129 05:29:28.083437 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 05:29:28 crc kubenswrapper[4594]: E1129 05:29:28.083526 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 05:29:28 crc kubenswrapper[4594]: E1129 05:29:28.083632 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 05:29:28 crc kubenswrapper[4594]: E1129 05:29:28.083686 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 05:29:28 crc kubenswrapper[4594]: I1129 05:29:28.085916 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:28 crc kubenswrapper[4594]: I1129 05:29:28.085943 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:28 crc kubenswrapper[4594]: I1129 05:29:28.085953 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:28 crc kubenswrapper[4594]: I1129 05:29:28.085964 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:28 crc kubenswrapper[4594]: I1129 05:29:28.085972 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:28Z","lastTransitionTime":"2025-11-29T05:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:28 crc kubenswrapper[4594]: I1129 05:29:28.187849 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:28 crc kubenswrapper[4594]: I1129 05:29:28.187885 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:28 crc kubenswrapper[4594]: I1129 05:29:28.187897 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:28 crc kubenswrapper[4594]: I1129 05:29:28.187909 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:28 crc kubenswrapper[4594]: I1129 05:29:28.187919 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:28Z","lastTransitionTime":"2025-11-29T05:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:28 crc kubenswrapper[4594]: I1129 05:29:28.290111 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:28 crc kubenswrapper[4594]: I1129 05:29:28.290146 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:28 crc kubenswrapper[4594]: I1129 05:29:28.290154 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:28 crc kubenswrapper[4594]: I1129 05:29:28.290174 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:28 crc kubenswrapper[4594]: I1129 05:29:28.290182 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:28Z","lastTransitionTime":"2025-11-29T05:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:28 crc kubenswrapper[4594]: I1129 05:29:28.391715 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:28 crc kubenswrapper[4594]: I1129 05:29:28.391753 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:28 crc kubenswrapper[4594]: I1129 05:29:28.391762 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:28 crc kubenswrapper[4594]: I1129 05:29:28.391775 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:28 crc kubenswrapper[4594]: I1129 05:29:28.391784 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:28Z","lastTransitionTime":"2025-11-29T05:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 05:29:28 crc kubenswrapper[4594]: I1129 05:29:28.392501 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 05:29:28 crc kubenswrapper[4594]: I1129 05:29:28.392530 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 05:29:28 crc kubenswrapper[4594]: I1129 05:29:28.392540 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 05:29:28 crc kubenswrapper[4594]: I1129 05:29:28.392551 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 05:29:28 crc kubenswrapper[4594]: I1129 05:29:28.392571 4594 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T05:29:28Z","lastTransitionTime":"2025-11-29T05:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 05:29:28 crc kubenswrapper[4594]: I1129 05:29:28.419356 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-lhbfk"] Nov 29 05:29:28 crc kubenswrapper[4594]: I1129 05:29:28.419727 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lhbfk" Nov 29 05:29:28 crc kubenswrapper[4594]: I1129 05:29:28.421233 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Nov 29 05:29:28 crc kubenswrapper[4594]: I1129 05:29:28.421940 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Nov 29 05:29:28 crc kubenswrapper[4594]: I1129 05:29:28.421974 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Nov 29 05:29:28 crc kubenswrapper[4594]: I1129 05:29:28.423392 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Nov 29 05:29:28 crc kubenswrapper[4594]: I1129 05:29:28.567831 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/f38a483c-eee1-4590-853d-cc73ebb6728b-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-lhbfk\" (UID: \"f38a483c-eee1-4590-853d-cc73ebb6728b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lhbfk" Nov 29 05:29:28 crc kubenswrapper[4594]: I1129 05:29:28.567892 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f38a483c-eee1-4590-853d-cc73ebb6728b-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-lhbfk\" (UID: \"f38a483c-eee1-4590-853d-cc73ebb6728b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lhbfk" Nov 29 05:29:28 crc kubenswrapper[4594]: I1129 05:29:28.567919 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/f38a483c-eee1-4590-853d-cc73ebb6728b-service-ca\") pod \"cluster-version-operator-5c965bbfc6-lhbfk\" (UID: \"f38a483c-eee1-4590-853d-cc73ebb6728b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lhbfk" Nov 29 05:29:28 crc kubenswrapper[4594]: I1129 05:29:28.567947 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f38a483c-eee1-4590-853d-cc73ebb6728b-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-lhbfk\" (UID: \"f38a483c-eee1-4590-853d-cc73ebb6728b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lhbfk" Nov 29 05:29:28 crc kubenswrapper[4594]: I1129 05:29:28.567984 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/f38a483c-eee1-4590-853d-cc73ebb6728b-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-lhbfk\" (UID: \"f38a483c-eee1-4590-853d-cc73ebb6728b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lhbfk" Nov 29 05:29:28 crc kubenswrapper[4594]: I1129 05:29:28.669247 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f38a483c-eee1-4590-853d-cc73ebb6728b-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-lhbfk\" (UID: \"f38a483c-eee1-4590-853d-cc73ebb6728b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lhbfk" Nov 29 05:29:28 crc kubenswrapper[4594]: I1129 05:29:28.669498 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f38a483c-eee1-4590-853d-cc73ebb6728b-service-ca\") pod \"cluster-version-operator-5c965bbfc6-lhbfk\" (UID: \"f38a483c-eee1-4590-853d-cc73ebb6728b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lhbfk" 
Nov 29 05:29:28 crc kubenswrapper[4594]: I1129 05:29:28.669527 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f38a483c-eee1-4590-853d-cc73ebb6728b-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-lhbfk\" (UID: \"f38a483c-eee1-4590-853d-cc73ebb6728b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lhbfk" Nov 29 05:29:28 crc kubenswrapper[4594]: I1129 05:29:28.669589 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/f38a483c-eee1-4590-853d-cc73ebb6728b-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-lhbfk\" (UID: \"f38a483c-eee1-4590-853d-cc73ebb6728b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lhbfk" Nov 29 05:29:28 crc kubenswrapper[4594]: I1129 05:29:28.669628 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/f38a483c-eee1-4590-853d-cc73ebb6728b-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-lhbfk\" (UID: \"f38a483c-eee1-4590-853d-cc73ebb6728b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lhbfk" Nov 29 05:29:28 crc kubenswrapper[4594]: I1129 05:29:28.669679 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/f38a483c-eee1-4590-853d-cc73ebb6728b-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-lhbfk\" (UID: \"f38a483c-eee1-4590-853d-cc73ebb6728b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lhbfk" Nov 29 05:29:28 crc kubenswrapper[4594]: I1129 05:29:28.670350 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f38a483c-eee1-4590-853d-cc73ebb6728b-service-ca\") pod 
\"cluster-version-operator-5c965bbfc6-lhbfk\" (UID: \"f38a483c-eee1-4590-853d-cc73ebb6728b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lhbfk" Nov 29 05:29:28 crc kubenswrapper[4594]: I1129 05:29:28.670360 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/f38a483c-eee1-4590-853d-cc73ebb6728b-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-lhbfk\" (UID: \"f38a483c-eee1-4590-853d-cc73ebb6728b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lhbfk" Nov 29 05:29:28 crc kubenswrapper[4594]: I1129 05:29:28.676440 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f38a483c-eee1-4590-853d-cc73ebb6728b-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-lhbfk\" (UID: \"f38a483c-eee1-4590-853d-cc73ebb6728b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lhbfk" Nov 29 05:29:28 crc kubenswrapper[4594]: I1129 05:29:28.685451 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f38a483c-eee1-4590-853d-cc73ebb6728b-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-lhbfk\" (UID: \"f38a483c-eee1-4590-853d-cc73ebb6728b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lhbfk" Nov 29 05:29:28 crc kubenswrapper[4594]: I1129 05:29:28.730643 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lhbfk" Nov 29 05:29:29 crc kubenswrapper[4594]: I1129 05:29:29.082556 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-lzr56" Nov 29 05:29:29 crc kubenswrapper[4594]: E1129 05:29:29.082691 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lzr56" podUID="217088b9-a48b-40c7-8d83-f9ff0eb24908" Nov 29 05:29:29 crc kubenswrapper[4594]: I1129 05:29:29.488077 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lhbfk" event={"ID":"f38a483c-eee1-4590-853d-cc73ebb6728b","Type":"ContainerStarted","Data":"a8088ca64a2f4a07ca75a64d453e5fcd8e6c08ea2d4942e924b16b719a42b0e2"} Nov 29 05:29:29 crc kubenswrapper[4594]: I1129 05:29:29.488128 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lhbfk" event={"ID":"f38a483c-eee1-4590-853d-cc73ebb6728b","Type":"ContainerStarted","Data":"a69435eede423c418c69abbaa4091ac62f9be15ff7498be3970cddc338c2f397"} Nov 29 05:29:29 crc kubenswrapper[4594]: I1129 05:29:29.498420 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lhbfk" podStartSLOduration=72.498407189 podStartE2EDuration="1m12.498407189s" podCreationTimestamp="2025-11-29 05:28:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:29:29.497469133 +0000 UTC m=+93.737978353" watchObservedRunningTime="2025-11-29 05:29:29.498407189 +0000 UTC m=+93.738916409" Nov 29 05:29:30 crc kubenswrapper[4594]: I1129 05:29:30.082819 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 05:29:30 crc kubenswrapper[4594]: E1129 05:29:30.083282 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 05:29:30 crc kubenswrapper[4594]: I1129 05:29:30.082858 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 05:29:30 crc kubenswrapper[4594]: I1129 05:29:30.082950 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 05:29:30 crc kubenswrapper[4594]: E1129 05:29:30.083450 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 05:29:30 crc kubenswrapper[4594]: E1129 05:29:30.083481 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 05:29:31 crc kubenswrapper[4594]: I1129 05:29:31.082949 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lzr56" Nov 29 05:29:31 crc kubenswrapper[4594]: E1129 05:29:31.083053 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lzr56" podUID="217088b9-a48b-40c7-8d83-f9ff0eb24908" Nov 29 05:29:32 crc kubenswrapper[4594]: I1129 05:29:32.083076 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 05:29:32 crc kubenswrapper[4594]: I1129 05:29:32.083117 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 05:29:32 crc kubenswrapper[4594]: E1129 05:29:32.083169 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 05:29:32 crc kubenswrapper[4594]: I1129 05:29:32.083178 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 05:29:32 crc kubenswrapper[4594]: E1129 05:29:32.083238 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 05:29:32 crc kubenswrapper[4594]: E1129 05:29:32.083367 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 05:29:33 crc kubenswrapper[4594]: I1129 05:29:33.082567 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lzr56" Nov 29 05:29:33 crc kubenswrapper[4594]: E1129 05:29:33.082728 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lzr56" podUID="217088b9-a48b-40c7-8d83-f9ff0eb24908" Nov 29 05:29:34 crc kubenswrapper[4594]: I1129 05:29:34.082984 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 05:29:34 crc kubenswrapper[4594]: E1129 05:29:34.083094 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 05:29:34 crc kubenswrapper[4594]: I1129 05:29:34.083121 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 05:29:34 crc kubenswrapper[4594]: I1129 05:29:34.083277 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 05:29:34 crc kubenswrapper[4594]: E1129 05:29:34.083602 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 05:29:34 crc kubenswrapper[4594]: E1129 05:29:34.083717 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 05:29:34 crc kubenswrapper[4594]: I1129 05:29:34.925808 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/217088b9-a48b-40c7-8d83-f9ff0eb24908-metrics-certs\") pod \"network-metrics-daemon-lzr56\" (UID: \"217088b9-a48b-40c7-8d83-f9ff0eb24908\") " pod="openshift-multus/network-metrics-daemon-lzr56" Nov 29 05:29:34 crc kubenswrapper[4594]: E1129 05:29:34.925985 4594 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 29 05:29:34 crc kubenswrapper[4594]: E1129 05:29:34.926053 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/217088b9-a48b-40c7-8d83-f9ff0eb24908-metrics-certs podName:217088b9-a48b-40c7-8d83-f9ff0eb24908 nodeName:}" failed. No retries permitted until 2025-11-29 05:30:38.92603349 +0000 UTC m=+163.166542710 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/217088b9-a48b-40c7-8d83-f9ff0eb24908-metrics-certs") pod "network-metrics-daemon-lzr56" (UID: "217088b9-a48b-40c7-8d83-f9ff0eb24908") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 29 05:29:35 crc kubenswrapper[4594]: I1129 05:29:35.082764 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lzr56" Nov 29 05:29:35 crc kubenswrapper[4594]: E1129 05:29:35.082875 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lzr56" podUID="217088b9-a48b-40c7-8d83-f9ff0eb24908" Nov 29 05:29:36 crc kubenswrapper[4594]: I1129 05:29:36.083378 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 05:29:36 crc kubenswrapper[4594]: E1129 05:29:36.083460 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 05:29:36 crc kubenswrapper[4594]: I1129 05:29:36.083624 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 05:29:36 crc kubenswrapper[4594]: E1129 05:29:36.084622 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 05:29:36 crc kubenswrapper[4594]: I1129 05:29:36.084737 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 05:29:36 crc kubenswrapper[4594]: E1129 05:29:36.085044 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 05:29:36 crc kubenswrapper[4594]: I1129 05:29:36.085150 4594 scope.go:117] "RemoveContainer" containerID="1fb430ec13486fa454ce5a026c9035daca3b18f8f31f2304bb6c5eb544778a85" Nov 29 05:29:36 crc kubenswrapper[4594]: E1129 05:29:36.085286 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-lp4zm_openshift-ovn-kubernetes(08de5891-72ca-488c-80b3-6b54c8c1a66e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" podUID="08de5891-72ca-488c-80b3-6b54c8c1a66e" Nov 29 05:29:37 crc kubenswrapper[4594]: I1129 05:29:37.082947 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lzr56" Nov 29 05:29:37 crc kubenswrapper[4594]: E1129 05:29:37.083039 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lzr56" podUID="217088b9-a48b-40c7-8d83-f9ff0eb24908" Nov 29 05:29:38 crc kubenswrapper[4594]: I1129 05:29:38.082498 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 05:29:38 crc kubenswrapper[4594]: I1129 05:29:38.082505 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 05:29:38 crc kubenswrapper[4594]: E1129 05:29:38.082575 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 05:29:38 crc kubenswrapper[4594]: I1129 05:29:38.082603 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 05:29:38 crc kubenswrapper[4594]: E1129 05:29:38.082658 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 05:29:38 crc kubenswrapper[4594]: E1129 05:29:38.082702 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 05:29:39 crc kubenswrapper[4594]: I1129 05:29:39.082483 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lzr56" Nov 29 05:29:39 crc kubenswrapper[4594]: E1129 05:29:39.082601 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lzr56" podUID="217088b9-a48b-40c7-8d83-f9ff0eb24908" Nov 29 05:29:40 crc kubenswrapper[4594]: I1129 05:29:40.083445 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 05:29:40 crc kubenswrapper[4594]: I1129 05:29:40.083522 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 05:29:40 crc kubenswrapper[4594]: I1129 05:29:40.083503 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 05:29:40 crc kubenswrapper[4594]: E1129 05:29:40.083720 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 05:29:40 crc kubenswrapper[4594]: E1129 05:29:40.083798 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 05:29:40 crc kubenswrapper[4594]: E1129 05:29:40.083880 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 05:29:41 crc kubenswrapper[4594]: I1129 05:29:41.083696 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lzr56" Nov 29 05:29:41 crc kubenswrapper[4594]: E1129 05:29:41.084427 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lzr56" podUID="217088b9-a48b-40c7-8d83-f9ff0eb24908" Nov 29 05:29:42 crc kubenswrapper[4594]: I1129 05:29:42.083497 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 05:29:42 crc kubenswrapper[4594]: I1129 05:29:42.083531 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 05:29:42 crc kubenswrapper[4594]: I1129 05:29:42.083587 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 05:29:42 crc kubenswrapper[4594]: E1129 05:29:42.083784 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 05:29:42 crc kubenswrapper[4594]: E1129 05:29:42.083863 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 05:29:42 crc kubenswrapper[4594]: E1129 05:29:42.083985 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 05:29:43 crc kubenswrapper[4594]: I1129 05:29:43.082760 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lzr56" Nov 29 05:29:43 crc kubenswrapper[4594]: E1129 05:29:43.082885 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lzr56" podUID="217088b9-a48b-40c7-8d83-f9ff0eb24908" Nov 29 05:29:44 crc kubenswrapper[4594]: I1129 05:29:44.083477 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 05:29:44 crc kubenswrapper[4594]: I1129 05:29:44.083516 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 05:29:44 crc kubenswrapper[4594]: E1129 05:29:44.083572 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 05:29:44 crc kubenswrapper[4594]: I1129 05:29:44.083516 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 05:29:44 crc kubenswrapper[4594]: E1129 05:29:44.083686 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 05:29:44 crc kubenswrapper[4594]: E1129 05:29:44.083623 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 05:29:45 crc kubenswrapper[4594]: I1129 05:29:45.083332 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lzr56" Nov 29 05:29:45 crc kubenswrapper[4594]: E1129 05:29:45.083450 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lzr56" podUID="217088b9-a48b-40c7-8d83-f9ff0eb24908" Nov 29 05:29:46 crc kubenswrapper[4594]: I1129 05:29:46.082845 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 05:29:46 crc kubenswrapper[4594]: I1129 05:29:46.082922 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 05:29:46 crc kubenswrapper[4594]: I1129 05:29:46.083054 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 05:29:46 crc kubenswrapper[4594]: E1129 05:29:46.083991 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 05:29:46 crc kubenswrapper[4594]: E1129 05:29:46.084120 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 05:29:46 crc kubenswrapper[4594]: E1129 05:29:46.084295 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 05:29:47 crc kubenswrapper[4594]: I1129 05:29:47.082656 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lzr56" Nov 29 05:29:47 crc kubenswrapper[4594]: E1129 05:29:47.082773 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lzr56" podUID="217088b9-a48b-40c7-8d83-f9ff0eb24908" Nov 29 05:29:48 crc kubenswrapper[4594]: I1129 05:29:48.083800 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 05:29:48 crc kubenswrapper[4594]: E1129 05:29:48.083911 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 05:29:48 crc kubenswrapper[4594]: I1129 05:29:48.084115 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 05:29:48 crc kubenswrapper[4594]: E1129 05:29:48.084166 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 05:29:48 crc kubenswrapper[4594]: I1129 05:29:48.084452 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 05:29:48 crc kubenswrapper[4594]: E1129 05:29:48.084897 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 05:29:49 crc kubenswrapper[4594]: I1129 05:29:49.083587 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lzr56" Nov 29 05:29:49 crc kubenswrapper[4594]: E1129 05:29:49.083761 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lzr56" podUID="217088b9-a48b-40c7-8d83-f9ff0eb24908" Nov 29 05:29:50 crc kubenswrapper[4594]: I1129 05:29:50.082659 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 05:29:50 crc kubenswrapper[4594]: I1129 05:29:50.082673 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 05:29:50 crc kubenswrapper[4594]: E1129 05:29:50.082917 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 05:29:50 crc kubenswrapper[4594]: I1129 05:29:50.082956 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 05:29:50 crc kubenswrapper[4594]: E1129 05:29:50.083030 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 05:29:50 crc kubenswrapper[4594]: E1129 05:29:50.083206 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 05:29:50 crc kubenswrapper[4594]: I1129 05:29:50.083793 4594 scope.go:117] "RemoveContainer" containerID="1fb430ec13486fa454ce5a026c9035daca3b18f8f31f2304bb6c5eb544778a85" Nov 29 05:29:50 crc kubenswrapper[4594]: I1129 05:29:50.549575 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lp4zm_08de5891-72ca-488c-80b3-6b54c8c1a66e/ovnkube-controller/3.log" Nov 29 05:29:50 crc kubenswrapper[4594]: I1129 05:29:50.553959 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" event={"ID":"08de5891-72ca-488c-80b3-6b54c8c1a66e","Type":"ContainerStarted","Data":"c128df511eae5df862fc536a509d044880a30a799281e2e5abda4ed38518be53"} Nov 29 05:29:50 crc kubenswrapper[4594]: I1129 05:29:50.554371 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" Nov 29 05:29:50 crc kubenswrapper[4594]: I1129 05:29:50.580615 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" podStartSLOduration=93.580601529 podStartE2EDuration="1m33.580601529s" podCreationTimestamp="2025-11-29 05:28:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:29:50.579719419 +0000 UTC m=+114.820228639" watchObservedRunningTime="2025-11-29 05:29:50.580601529 +0000 UTC m=+114.821110749" Nov 29 05:29:50 crc kubenswrapper[4594]: I1129 05:29:50.763977 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-lzr56"] Nov 29 05:29:50 crc kubenswrapper[4594]: I1129 05:29:50.764119 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-lzr56" Nov 29 05:29:50 crc kubenswrapper[4594]: E1129 05:29:50.764220 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lzr56" podUID="217088b9-a48b-40c7-8d83-f9ff0eb24908" Nov 29 05:29:51 crc kubenswrapper[4594]: I1129 05:29:51.558707 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7plzz_e5052790-d231-4f97-802c-c7de3cd72561/kube-multus/1.log" Nov 29 05:29:51 crc kubenswrapper[4594]: I1129 05:29:51.559162 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7plzz_e5052790-d231-4f97-802c-c7de3cd72561/kube-multus/0.log" Nov 29 05:29:51 crc kubenswrapper[4594]: I1129 05:29:51.559204 4594 generic.go:334] "Generic (PLEG): container finished" podID="e5052790-d231-4f97-802c-c7de3cd72561" containerID="abc4e229bc1c5698fec5a7e0da8decb0167770496e7f8dce7a9a9276721ee26c" exitCode=1 Nov 29 05:29:51 crc kubenswrapper[4594]: I1129 05:29:51.559300 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7plzz" event={"ID":"e5052790-d231-4f97-802c-c7de3cd72561","Type":"ContainerDied","Data":"abc4e229bc1c5698fec5a7e0da8decb0167770496e7f8dce7a9a9276721ee26c"} Nov 29 05:29:51 crc kubenswrapper[4594]: I1129 05:29:51.559366 4594 scope.go:117] "RemoveContainer" containerID="4c23b1472cd349ae6a56137165463429204ffbaf2c1b477d3ba9d07d2f2799f9" Nov 29 05:29:51 crc kubenswrapper[4594]: I1129 05:29:51.559856 4594 scope.go:117] "RemoveContainer" containerID="abc4e229bc1c5698fec5a7e0da8decb0167770496e7f8dce7a9a9276721ee26c" Nov 29 05:29:51 crc kubenswrapper[4594]: E1129 05:29:51.560014 4594 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-7plzz_openshift-multus(e5052790-d231-4f97-802c-c7de3cd72561)\"" pod="openshift-multus/multus-7plzz" podUID="e5052790-d231-4f97-802c-c7de3cd72561" Nov 29 05:29:52 crc kubenswrapper[4594]: I1129 05:29:52.083086 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 05:29:52 crc kubenswrapper[4594]: I1129 05:29:52.083131 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 05:29:52 crc kubenswrapper[4594]: I1129 05:29:52.083130 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 05:29:52 crc kubenswrapper[4594]: I1129 05:29:52.083225 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lzr56" Nov 29 05:29:52 crc kubenswrapper[4594]: E1129 05:29:52.083854 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 05:29:52 crc kubenswrapper[4594]: E1129 05:29:52.083998 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 05:29:52 crc kubenswrapper[4594]: E1129 05:29:52.084098 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lzr56" podUID="217088b9-a48b-40c7-8d83-f9ff0eb24908" Nov 29 05:29:52 crc kubenswrapper[4594]: E1129 05:29:52.083892 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 05:29:52 crc kubenswrapper[4594]: I1129 05:29:52.563626 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7plzz_e5052790-d231-4f97-802c-c7de3cd72561/kube-multus/1.log" Nov 29 05:29:54 crc kubenswrapper[4594]: I1129 05:29:54.083513 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lzr56" Nov 29 05:29:54 crc kubenswrapper[4594]: I1129 05:29:54.083582 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 05:29:54 crc kubenswrapper[4594]: I1129 05:29:54.083602 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 05:29:54 crc kubenswrapper[4594]: E1129 05:29:54.083637 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lzr56" podUID="217088b9-a48b-40c7-8d83-f9ff0eb24908" Nov 29 05:29:54 crc kubenswrapper[4594]: E1129 05:29:54.083698 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 05:29:54 crc kubenswrapper[4594]: I1129 05:29:54.083774 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 05:29:54 crc kubenswrapper[4594]: E1129 05:29:54.083855 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 05:29:54 crc kubenswrapper[4594]: E1129 05:29:54.083970 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 05:29:56 crc kubenswrapper[4594]: I1129 05:29:56.083034 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 05:29:56 crc kubenswrapper[4594]: I1129 05:29:56.083036 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lzr56" Nov 29 05:29:56 crc kubenswrapper[4594]: I1129 05:29:56.083086 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 05:29:56 crc kubenswrapper[4594]: I1129 05:29:56.083054 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 05:29:56 crc kubenswrapper[4594]: E1129 05:29:56.084217 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 05:29:56 crc kubenswrapper[4594]: E1129 05:29:56.084298 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lzr56" podUID="217088b9-a48b-40c7-8d83-f9ff0eb24908" Nov 29 05:29:56 crc kubenswrapper[4594]: E1129 05:29:56.084417 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 05:29:56 crc kubenswrapper[4594]: E1129 05:29:56.084505 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 05:29:56 crc kubenswrapper[4594]: E1129 05:29:56.144458 4594 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Nov 29 05:29:56 crc kubenswrapper[4594]: E1129 05:29:56.151030 4594 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Nov 29 05:29:58 crc kubenswrapper[4594]: I1129 05:29:58.083377 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 05:29:58 crc kubenswrapper[4594]: I1129 05:29:58.083479 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lzr56" Nov 29 05:29:58 crc kubenswrapper[4594]: E1129 05:29:58.083504 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 05:29:58 crc kubenswrapper[4594]: I1129 05:29:58.084180 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 05:29:58 crc kubenswrapper[4594]: I1129 05:29:58.084700 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 05:29:58 crc kubenswrapper[4594]: E1129 05:29:58.084705 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lzr56" podUID="217088b9-a48b-40c7-8d83-f9ff0eb24908" Nov 29 05:29:58 crc kubenswrapper[4594]: E1129 05:29:58.085135 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 05:29:58 crc kubenswrapper[4594]: E1129 05:29:58.085372 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 05:30:00 crc kubenswrapper[4594]: I1129 05:30:00.082687 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lzr56" Nov 29 05:30:00 crc kubenswrapper[4594]: I1129 05:30:00.082753 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 05:30:00 crc kubenswrapper[4594]: E1129 05:30:00.082807 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lzr56" podUID="217088b9-a48b-40c7-8d83-f9ff0eb24908" Nov 29 05:30:00 crc kubenswrapper[4594]: E1129 05:30:00.082925 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 05:30:00 crc kubenswrapper[4594]: I1129 05:30:00.082700 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 05:30:00 crc kubenswrapper[4594]: E1129 05:30:00.083040 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 05:30:00 crc kubenswrapper[4594]: I1129 05:30:00.083068 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 05:30:00 crc kubenswrapper[4594]: E1129 05:30:00.083122 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 05:30:01 crc kubenswrapper[4594]: E1129 05:30:01.152564 4594 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Nov 29 05:30:02 crc kubenswrapper[4594]: I1129 05:30:02.083427 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 05:30:02 crc kubenswrapper[4594]: I1129 05:30:02.083489 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lzr56" Nov 29 05:30:02 crc kubenswrapper[4594]: E1129 05:30:02.083563 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 05:30:02 crc kubenswrapper[4594]: I1129 05:30:02.083602 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 05:30:02 crc kubenswrapper[4594]: I1129 05:30:02.083695 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 05:30:02 crc kubenswrapper[4594]: E1129 05:30:02.083769 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lzr56" podUID="217088b9-a48b-40c7-8d83-f9ff0eb24908" Nov 29 05:30:02 crc kubenswrapper[4594]: E1129 05:30:02.083884 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 05:30:02 crc kubenswrapper[4594]: E1129 05:30:02.084074 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 05:30:04 crc kubenswrapper[4594]: I1129 05:30:04.082904 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 05:30:04 crc kubenswrapper[4594]: I1129 05:30:04.082936 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-lzr56" Nov 29 05:30:04 crc kubenswrapper[4594]: E1129 05:30:04.083053 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 05:30:04 crc kubenswrapper[4594]: I1129 05:30:04.083112 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 05:30:04 crc kubenswrapper[4594]: E1129 05:30:04.083271 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lzr56" podUID="217088b9-a48b-40c7-8d83-f9ff0eb24908" Nov 29 05:30:04 crc kubenswrapper[4594]: E1129 05:30:04.083342 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 05:30:04 crc kubenswrapper[4594]: I1129 05:30:04.083477 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 05:30:04 crc kubenswrapper[4594]: E1129 05:30:04.083558 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 05:30:05 crc kubenswrapper[4594]: I1129 05:30:05.083125 4594 scope.go:117] "RemoveContainer" containerID="abc4e229bc1c5698fec5a7e0da8decb0167770496e7f8dce7a9a9276721ee26c" Nov 29 05:30:05 crc kubenswrapper[4594]: I1129 05:30:05.601925 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7plzz_e5052790-d231-4f97-802c-c7de3cd72561/kube-multus/1.log" Nov 29 05:30:05 crc kubenswrapper[4594]: I1129 05:30:05.602318 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7plzz" event={"ID":"e5052790-d231-4f97-802c-c7de3cd72561","Type":"ContainerStarted","Data":"fe1439b8aee18c36fcc41d5a5289107a458927f46cfb33947018b6cd3f7bee68"} Nov 29 05:30:06 crc kubenswrapper[4594]: I1129 05:30:06.083013 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 05:30:06 crc kubenswrapper[4594]: I1129 05:30:06.083040 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 05:30:06 crc kubenswrapper[4594]: I1129 05:30:06.083086 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-lzr56" Nov 29 05:30:06 crc kubenswrapper[4594]: E1129 05:30:06.083876 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 05:30:06 crc kubenswrapper[4594]: I1129 05:30:06.083893 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 05:30:06 crc kubenswrapper[4594]: E1129 05:30:06.084028 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 05:30:06 crc kubenswrapper[4594]: E1129 05:30:06.084050 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 05:30:06 crc kubenswrapper[4594]: E1129 05:30:06.084092 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lzr56" podUID="217088b9-a48b-40c7-8d83-f9ff0eb24908" Nov 29 05:30:06 crc kubenswrapper[4594]: E1129 05:30:06.153006 4594 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Nov 29 05:30:08 crc kubenswrapper[4594]: I1129 05:30:08.083273 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 05:30:08 crc kubenswrapper[4594]: I1129 05:30:08.083313 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lzr56" Nov 29 05:30:08 crc kubenswrapper[4594]: E1129 05:30:08.083405 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 05:30:08 crc kubenswrapper[4594]: I1129 05:30:08.083610 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 05:30:08 crc kubenswrapper[4594]: I1129 05:30:08.083640 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 05:30:08 crc kubenswrapper[4594]: E1129 05:30:08.083687 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 05:30:08 crc kubenswrapper[4594]: E1129 05:30:08.083795 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 05:30:08 crc kubenswrapper[4594]: E1129 05:30:08.083941 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lzr56" podUID="217088b9-a48b-40c7-8d83-f9ff0eb24908" Nov 29 05:30:10 crc kubenswrapper[4594]: I1129 05:30:10.082709 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 05:30:10 crc kubenswrapper[4594]: I1129 05:30:10.082805 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 05:30:10 crc kubenswrapper[4594]: I1129 05:30:10.082823 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 05:30:10 crc kubenswrapper[4594]: I1129 05:30:10.082851 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lzr56" Nov 29 05:30:10 crc kubenswrapper[4594]: E1129 05:30:10.082897 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 05:30:10 crc kubenswrapper[4594]: E1129 05:30:10.083094 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 05:30:10 crc kubenswrapper[4594]: E1129 05:30:10.083211 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 05:30:10 crc kubenswrapper[4594]: E1129 05:30:10.083278 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lzr56" podUID="217088b9-a48b-40c7-8d83-f9ff0eb24908" Nov 29 05:30:12 crc kubenswrapper[4594]: I1129 05:30:12.082469 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 05:30:12 crc kubenswrapper[4594]: I1129 05:30:12.082634 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 05:30:12 crc kubenswrapper[4594]: I1129 05:30:12.082689 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 05:30:12 crc kubenswrapper[4594]: I1129 05:30:12.082892 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-lzr56" Nov 29 05:30:12 crc kubenswrapper[4594]: I1129 05:30:12.084203 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Nov 29 05:30:12 crc kubenswrapper[4594]: I1129 05:30:12.084454 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Nov 29 05:30:12 crc kubenswrapper[4594]: I1129 05:30:12.085234 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Nov 29 05:30:12 crc kubenswrapper[4594]: I1129 05:30:12.085633 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Nov 29 05:30:12 crc kubenswrapper[4594]: I1129 05:30:12.085850 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Nov 29 05:30:12 crc kubenswrapper[4594]: I1129 05:30:12.086540 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.641910 4594 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.667123 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-zvvn6"] Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.667503 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zvvn6" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.668527 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-8t4sg"] Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.668906 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8t4sg" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.669658 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.669858 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.669951 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.670025 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.670135 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.670002 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.670376 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.670461 4594 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-machine-api/machine-api-operator-5694c8668f-f9dqp"] Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.670569 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.676642 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.676757 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-f9dqp" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.676833 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.676649 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-x5tn8"] Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.677205 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.677395 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-scxzk"] Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.677745 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-scxzk" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.677788 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-x5tn8" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.678964 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-jgzzn"] Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.679626 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.679944 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.682348 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.684567 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.684785 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.685180 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.686303 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.686433 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.686571 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 
05:30:19.686859 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.686978 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.687109 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.687219 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.687538 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.687683 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.688005 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-jgzzn" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.688426 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.688672 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-g9dnn"] Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.689099 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4n7dc"] Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.689179 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g9dnn" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.689369 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4n7dc" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.689449 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-9ctf8"] Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.689727 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-9ctf8" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.690033 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-khv8v"] Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.690400 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-khv8v" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.690690 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.690784 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.690909 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.692243 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.692499 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Nov 29 05:30:19 
crc kubenswrapper[4594]: I1129 05:30:19.692624 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.692748 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.694441 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.694562 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.694782 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.694898 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.694981 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-trhms"] Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.695497 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-trhms" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.695870 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-gv8n5"] Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.696428 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.696572 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.696649 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.697787 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.697911 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.698146 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.698357 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-xg8s7"] Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.698411 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.698487 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.698572 4594 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.698597 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-xg8s7" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.698655 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.698720 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.698756 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-gv8n5" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.698794 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.698870 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.698942 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.699071 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.699140 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.699214 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.699300 4594 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-samples-operator"/"samples-operator-tls" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.699408 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.699496 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.700035 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.700139 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.700435 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.702589 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.702690 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4qlf7"] Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.702908 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-jsnt5"] Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.703097 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-rwbhx"] Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.703338 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Nov 29 05:30:19 crc 
kubenswrapper[4594]: I1129 05:30:19.703623 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.703816 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.703817 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4qlf7" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.704299 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-jsnt5" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.704843 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.704926 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.704988 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.705051 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.713470 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.713774 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.714022 4594 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-etcd-operator"/"etcd-ca-bundle" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.714187 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.714562 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.714702 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.715633 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2fqlt"] Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.716443 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-wpkbq"] Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.716525 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.717197 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.717962 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.718386 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.718671 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-rwbhx" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.718828 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.719048 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.719107 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.720204 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.720246 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.720457 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.720715 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.723492 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.723568 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.725643 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 
05:30:19.733954 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.734275 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.734399 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.734900 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.735726 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zqmhf"] Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.736026 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.736157 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zqmhf" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.736183 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.736536 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-wpkbq" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.739309 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gspdt"] Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.739720 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-p9km8"] Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.739322 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2fqlt" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.740202 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-gzzjt"] Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.740577 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gzzjt" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.740673 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gspdt" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.740762 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-p9km8" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.741206 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.743113 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.745382 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6mjfr"] Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.745701 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29406570-47pck"] Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.745964 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-8pvtn"] Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.746051 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.746227 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-8pvtn" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.747409 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.747974 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-wqr7v"] Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.749814 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6mjfr" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.749843 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29406570-47pck" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.750369 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wqr7v" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.751142 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-g9rtk"] Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.751540 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-g9rtk" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.751666 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.751931 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.752778 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nhk7j"] Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.753299 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nhk7j" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.753694 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-pjh6b"] Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.754079 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-pjh6b" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.754986 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-k9wmb"] Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.756472 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-k9wmb" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.758950 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-qkgqz"] Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.759245 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9n22d"] Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.759515 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-95k6p"] Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.759798 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-95k6p" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.760113 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-qkgqz" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.760269 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9n22d" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.767491 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-w5nnp"] Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.770781 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.769548 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zp2th"] Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.773423 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-w5nnp" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.782991 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-zvvn6"] Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.783039 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wrhzj"] Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.783632 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-scxzk"] Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.783762 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wrhzj" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.784052 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zp2th" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.786523 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-qdj7h"] Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.787773 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-chxbx"] Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.788378 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/873587ac-5d12-45c4-bfd5-42bc08d29f65-image-import-ca\") pod \"apiserver-76f77b778f-jgzzn\" (UID: \"873587ac-5d12-45c4-bfd5-42bc08d29f65\") " pod="openshift-apiserver/apiserver-76f77b778f-jgzzn" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.788422 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wn82r\" (UniqueName: \"kubernetes.io/projected/fd86a99b-7671-4dd9-88b9-334db1906b6b-kube-api-access-wn82r\") pod \"openshift-config-operator-7777fb866f-trhms\" (UID: \"fd86a99b-7671-4dd9-88b9-334db1906b6b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-trhms" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.788452 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/8b141ba2-87a7-4975-a1ed-054769e567bd-machine-approver-tls\") pod \"machine-approver-56656f9798-8t4sg\" (UID: \"8b141ba2-87a7-4975-a1ed-054769e567bd\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8t4sg" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.788473 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/81534f91-740b-487b-9149-e8565ccf9905-etcd-client\") pod \"apiserver-7bbb656c7d-g9dnn\" (UID: \"81534f91-740b-487b-9149-e8565ccf9905\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g9dnn" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.788494 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ad8c6194-f489-42f3-afb2-707a97b490ad-proxy-tls\") pod \"machine-config-controller-84d6567774-p9km8\" (UID: \"ad8c6194-f489-42f3-afb2-707a97b490ad\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-p9km8" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.788512 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/969b3fd9-7b46-4627-bda2-e2fb4a4f5203-config\") pod \"kube-controller-manager-operator-78b949d7b-zqmhf\" (UID: \"969b3fd9-7b46-4627-bda2-e2fb4a4f5203\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zqmhf" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.788531 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4g786\" (UniqueName: \"kubernetes.io/projected/cce54ab2-f931-4499-a0c1-8a2b2674aeb2-kube-api-access-4g786\") pod \"cluster-image-registry-operator-dc59b4c8b-4qlf7\" (UID: \"cce54ab2-f931-4499-a0c1-8a2b2674aeb2\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4qlf7" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.788549 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fc2b6738-957d-4636-a723-c11207f29087-audit-policies\") pod \"oauth-openshift-558db77b4-rwbhx\" (UID: \"fc2b6738-957d-4636-a723-c11207f29087\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-rwbhx" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.788567 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/fc2b6738-957d-4636-a723-c11207f29087-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-rwbhx\" (UID: \"fc2b6738-957d-4636-a723-c11207f29087\") " pod="openshift-authentication/oauth-openshift-558db77b4-rwbhx" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.788635 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ad8c6194-f489-42f3-afb2-707a97b490ad-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-p9km8\" (UID: \"ad8c6194-f489-42f3-afb2-707a97b490ad\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-p9km8" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.788700 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93df4c2b-9b0b-476e-a92f-55cb8e3ec8c3-config\") pod \"console-operator-58897d9998-wpkbq\" (UID: \"93df4c2b-9b0b-476e-a92f-55cb8e3ec8c3\") " pod="openshift-console-operator/console-operator-58897d9998-wpkbq" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.788758 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cce54ab2-f931-4499-a0c1-8a2b2674aeb2-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-4qlf7\" (UID: \"cce54ab2-f931-4499-a0c1-8a2b2674aeb2\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4qlf7" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.788792 4594 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/873587ac-5d12-45c4-bfd5-42bc08d29f65-node-pullsecrets\") pod \"apiserver-76f77b778f-jgzzn\" (UID: \"873587ac-5d12-45c4-bfd5-42bc08d29f65\") " pod="openshift-apiserver/apiserver-76f77b778f-jgzzn"
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.788824 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/911e02cc-6184-4c63-a36c-a51b54d0e7bf-serving-cert\") pod \"etcd-operator-b45778765-jsnt5\" (UID: \"911e02cc-6184-4c63-a36c-a51b54d0e7bf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-jsnt5"
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.788967 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-qdj7h"
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.789020 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/864d1c0b-3c85-4472-9d16-c8d5c574e02a-client-ca\") pod \"controller-manager-879f6c89f-scxzk\" (UID: \"864d1c0b-3c85-4472-9d16-c8d5c574e02a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-scxzk"
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.789061 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/637efb26-f3b6-4db6-b0bc-d1ea0dcd85a5-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-khv8v\" (UID: \"637efb26-f3b6-4db6-b0bc-d1ea0dcd85a5\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-khv8v"
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.789111 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thlm7\" (UniqueName: \"kubernetes.io/projected/93df4c2b-9b0b-476e-a92f-55cb8e3ec8c3-kube-api-access-thlm7\") pod \"console-operator-58897d9998-wpkbq\" (UID: \"93df4c2b-9b0b-476e-a92f-55cb8e3ec8c3\") " pod="openshift-console-operator/console-operator-58897d9998-wpkbq"
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.789146 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mvjv\" (UniqueName: \"kubernetes.io/projected/652260eb-911a-4b92-8cea-73c9e8f156c4-kube-api-access-7mvjv\") pod \"route-controller-manager-6576b87f9c-zvvn6\" (UID: \"652260eb-911a-4b92-8cea-73c9e8f156c4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zvvn6"
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.789205 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/12b5360d-755a-4cb5-9ef3-0c00550e3913-trusted-ca-bundle\") pod \"console-f9d7485db-9ctf8\" (UID: \"12b5360d-755a-4cb5-9ef3-0c00550e3913\") " pod="openshift-console/console-f9d7485db-9ctf8"
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.789234 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dzrs\" (UniqueName: \"kubernetes.io/projected/911e02cc-6184-4c63-a36c-a51b54d0e7bf-kube-api-access-2dzrs\") pod \"etcd-operator-b45778765-jsnt5\" (UID: \"911e02cc-6184-4c63-a36c-a51b54d0e7bf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-jsnt5"
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.789286 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxbv4\" (UniqueName: \"kubernetes.io/projected/864d1c0b-3c85-4472-9d16-c8d5c574e02a-kube-api-access-zxbv4\") pod \"controller-manager-879f6c89f-scxzk\" (UID: \"864d1c0b-3c85-4472-9d16-c8d5c574e02a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-scxzk"
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.789643 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.789981 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8b141ba2-87a7-4975-a1ed-054769e567bd-auth-proxy-config\") pod \"machine-approver-56656f9798-8t4sg\" (UID: \"8b141ba2-87a7-4975-a1ed-054769e567bd\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8t4sg"
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.790035 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8797308e-2c6a-45d4-a7e8-dae633de1103-service-ca-bundle\") pod \"authentication-operator-69f744f599-x5tn8\" (UID: \"8797308e-2c6a-45d4-a7e8-dae633de1103\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-x5tn8"
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.790066 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbpfn\" (UniqueName: \"kubernetes.io/projected/12b5360d-755a-4cb5-9ef3-0c00550e3913-kube-api-access-lbpfn\") pod \"console-f9d7485db-9ctf8\" (UID: \"12b5360d-755a-4cb5-9ef3-0c00550e3913\") " pod="openshift-console/console-f9d7485db-9ctf8"
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.790093 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/652260eb-911a-4b92-8cea-73c9e8f156c4-client-ca\") pod \"route-controller-manager-6576b87f9c-zvvn6\" (UID: \"652260eb-911a-4b92-8cea-73c9e8f156c4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zvvn6"
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.790135 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fc2b6738-957d-4636-a723-c11207f29087-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-rwbhx\" (UID: \"fc2b6738-957d-4636-a723-c11207f29087\") " pod="openshift-authentication/oauth-openshift-558db77b4-rwbhx"
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.790156 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/911e02cc-6184-4c63-a36c-a51b54d0e7bf-etcd-ca\") pod \"etcd-operator-b45778765-jsnt5\" (UID: \"911e02cc-6184-4c63-a36c-a51b54d0e7bf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-jsnt5"
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.790183 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fc2b6738-957d-4636-a723-c11207f29087-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-rwbhx\" (UID: \"fc2b6738-957d-4636-a723-c11207f29087\") " pod="openshift-authentication/oauth-openshift-558db77b4-rwbhx"
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.790216 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fx8hm\" (UniqueName: \"kubernetes.io/projected/6c685741-a472-49fe-8237-520bea0232ef-kube-api-access-fx8hm\") pod \"downloads-7954f5f757-xg8s7\" (UID: \"6c685741-a472-49fe-8237-520bea0232ef\") " pod="openshift-console/downloads-7954f5f757-xg8s7"
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.790242 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fc2b6738-957d-4636-a723-c11207f29087-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-rwbhx\" (UID: \"fc2b6738-957d-4636-a723-c11207f29087\") " pod="openshift-authentication/oauth-openshift-558db77b4-rwbhx"
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.790278 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/837571cc-8fd2-4102-9997-4520ddf6da08-metrics-tls\") pod \"ingress-operator-5b745b69d9-gzzjt\" (UID: \"837571cc-8fd2-4102-9997-4520ddf6da08\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gzzjt"
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.790303 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc2b6738-957d-4636-a723-c11207f29087-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-rwbhx\" (UID: \"fc2b6738-957d-4636-a723-c11207f29087\") " pod="openshift-authentication/oauth-openshift-558db77b4-rwbhx"
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.790339 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/837571cc-8fd2-4102-9997-4520ddf6da08-bound-sa-token\") pod \"ingress-operator-5b745b69d9-gzzjt\" (UID: \"837571cc-8fd2-4102-9997-4520ddf6da08\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gzzjt"
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.790354 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/864d1c0b-3c85-4472-9d16-c8d5c574e02a-config\") pod \"controller-manager-879f6c89f-scxzk\" (UID: \"864d1c0b-3c85-4472-9d16-c8d5c574e02a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-scxzk"
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.790393 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/12b5360d-755a-4cb5-9ef3-0c00550e3913-console-config\") pod \"console-f9d7485db-9ctf8\" (UID: \"12b5360d-755a-4cb5-9ef3-0c00550e3913\") " pod="openshift-console/console-f9d7485db-9ctf8"
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.790409 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fc2b6738-957d-4636-a723-c11207f29087-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-rwbhx\" (UID: \"fc2b6738-957d-4636-a723-c11207f29087\") " pod="openshift-authentication/oauth-openshift-558db77b4-rwbhx"
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.790543 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/873587ac-5d12-45c4-bfd5-42bc08d29f65-etcd-serving-ca\") pod \"apiserver-76f77b778f-jgzzn\" (UID: \"873587ac-5d12-45c4-bfd5-42bc08d29f65\") " pod="openshift-apiserver/apiserver-76f77b778f-jgzzn"
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.790572 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nftmj\" (UniqueName: \"kubernetes.io/projected/637efb26-f3b6-4db6-b0bc-d1ea0dcd85a5-kube-api-access-nftmj\") pod \"cluster-samples-operator-665b6dd947-khv8v\" (UID: \"637efb26-f3b6-4db6-b0bc-d1ea0dcd85a5\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-khv8v"
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.790592 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cm7xc\" (UniqueName: \"kubernetes.io/projected/1fea2129-7ad0-45d8-9447-315107ef1c0c-kube-api-access-cm7xc\") pod \"control-plane-machine-set-operator-78cbb6b69f-gspdt\" (UID: \"1fea2129-7ad0-45d8-9447-315107ef1c0c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gspdt"
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.790615 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8797308e-2c6a-45d4-a7e8-dae633de1103-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-x5tn8\" (UID: \"8797308e-2c6a-45d4-a7e8-dae633de1103\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-x5tn8"
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.790617 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-chxbx"
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.790655 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/12b5360d-755a-4cb5-9ef3-0c00550e3913-oauth-serving-cert\") pod \"console-f9d7485db-9ctf8\" (UID: \"12b5360d-755a-4cb5-9ef3-0c00550e3913\") " pod="openshift-console/console-f9d7485db-9ctf8"
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.790673 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57xjr\" (UniqueName: \"kubernetes.io/projected/8b141ba2-87a7-4975-a1ed-054769e567bd-kube-api-access-57xjr\") pod \"machine-approver-56656f9798-8t4sg\" (UID: \"8b141ba2-87a7-4975-a1ed-054769e567bd\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8t4sg"
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.790754 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k444s\" (UniqueName: \"kubernetes.io/projected/873587ac-5d12-45c4-bfd5-42bc08d29f65-kube-api-access-k444s\") pod \"apiserver-76f77b778f-jgzzn\" (UID: \"873587ac-5d12-45c4-bfd5-42bc08d29f65\") " pod="openshift-apiserver/apiserver-76f77b778f-jgzzn"
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.790787 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8797308e-2c6a-45d4-a7e8-dae633de1103-config\") pod \"authentication-operator-69f744f599-x5tn8\" (UID: \"8797308e-2c6a-45d4-a7e8-dae633de1103\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-x5tn8"
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.790815 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/12b5360d-755a-4cb5-9ef3-0c00550e3913-console-serving-cert\") pod \"console-f9d7485db-9ctf8\" (UID: \"12b5360d-755a-4cb5-9ef3-0c00550e3913\") " pod="openshift-console/console-f9d7485db-9ctf8"
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.790844 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/12b5360d-755a-4cb5-9ef3-0c00550e3913-console-oauth-config\") pod \"console-f9d7485db-9ctf8\" (UID: \"12b5360d-755a-4cb5-9ef3-0c00550e3913\") " pod="openshift-console/console-f9d7485db-9ctf8"
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.790881 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b141ba2-87a7-4975-a1ed-054769e567bd-config\") pod \"machine-approver-56656f9798-8t4sg\" (UID: \"8b141ba2-87a7-4975-a1ed-054769e567bd\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8t4sg"
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.790929 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/873587ac-5d12-45c4-bfd5-42bc08d29f65-trusted-ca-bundle\") pod \"apiserver-76f77b778f-jgzzn\" (UID: \"873587ac-5d12-45c4-bfd5-42bc08d29f65\") " pod="openshift-apiserver/apiserver-76f77b778f-jgzzn"
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.790950 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/873587ac-5d12-45c4-bfd5-42bc08d29f65-encryption-config\") pod \"apiserver-76f77b778f-jgzzn\" (UID: \"873587ac-5d12-45c4-bfd5-42bc08d29f65\") " pod="openshift-apiserver/apiserver-76f77b778f-jgzzn"
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.790966 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/911e02cc-6184-4c63-a36c-a51b54d0e7bf-etcd-service-ca\") pod \"etcd-operator-b45778765-jsnt5\" (UID: \"911e02cc-6184-4c63-a36c-a51b54d0e7bf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-jsnt5"
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.791005 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpjr7\" (UniqueName: \"kubernetes.io/projected/ad8c6194-f489-42f3-afb2-707a97b490ad-kube-api-access-vpjr7\") pod \"machine-config-controller-84d6567774-p9km8\" (UID: \"ad8c6194-f489-42f3-afb2-707a97b490ad\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-p9km8"
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.791051 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/93df4c2b-9b0b-476e-a92f-55cb8e3ec8c3-trusted-ca\") pod \"console-operator-58897d9998-wpkbq\" (UID: \"93df4c2b-9b0b-476e-a92f-55cb8e3ec8c3\") " pod="openshift-console-operator/console-operator-58897d9998-wpkbq"
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.791096 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/81534f91-740b-487b-9149-e8565ccf9905-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-g9dnn\" (UID: \"81534f91-740b-487b-9149-e8565ccf9905\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g9dnn"
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.791114 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/93df4c2b-9b0b-476e-a92f-55cb8e3ec8c3-serving-cert\") pod \"console-operator-58897d9998-wpkbq\" (UID: \"93df4c2b-9b0b-476e-a92f-55cb8e3ec8c3\") " pod="openshift-console-operator/console-operator-58897d9998-wpkbq"
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.791132 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/837571cc-8fd2-4102-9997-4520ddf6da08-trusted-ca\") pod \"ingress-operator-5b745b69d9-gzzjt\" (UID: \"837571cc-8fd2-4102-9997-4520ddf6da08\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gzzjt"
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.791149 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/388fd604-2764-4184-b305-ce8f8e31ffa1-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2fqlt\" (UID: \"388fd604-2764-4184-b305-ce8f8e31ffa1\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2fqlt"
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.791165 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/fd86a99b-7671-4dd9-88b9-334db1906b6b-available-featuregates\") pod \"openshift-config-operator-7777fb866f-trhms\" (UID: \"fd86a99b-7671-4dd9-88b9-334db1906b6b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-trhms"
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.791190 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/911e02cc-6184-4c63-a36c-a51b54d0e7bf-etcd-client\") pod \"etcd-operator-b45778765-jsnt5\" (UID: \"911e02cc-6184-4c63-a36c-a51b54d0e7bf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-jsnt5"
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.791275 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09980e91-b2e4-4a0e-bee7-dc101096f804-config\") pod \"machine-api-operator-5694c8668f-f9dqp\" (UID: \"09980e91-b2e4-4a0e-bee7-dc101096f804\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-f9dqp"
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.791298 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/81534f91-740b-487b-9149-e8565ccf9905-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-g9dnn\" (UID: \"81534f91-740b-487b-9149-e8565ccf9905\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g9dnn"
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.791329 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/81534f91-740b-487b-9149-e8565ccf9905-encryption-config\") pod \"apiserver-7bbb656c7d-g9dnn\" (UID: \"81534f91-740b-487b-9149-e8565ccf9905\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g9dnn"
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.791391 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/388fd604-2764-4184-b305-ce8f8e31ffa1-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2fqlt\" (UID: \"388fd604-2764-4184-b305-ce8f8e31ffa1\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2fqlt"
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.791412 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fc2b6738-957d-4636-a723-c11207f29087-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-rwbhx\" (UID: \"fc2b6738-957d-4636-a723-c11207f29087\") " pod="openshift-authentication/oauth-openshift-558db77b4-rwbhx"
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.791429 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fc2b6738-957d-4636-a723-c11207f29087-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-rwbhx\" (UID: \"fc2b6738-957d-4636-a723-c11207f29087\") " pod="openshift-authentication/oauth-openshift-558db77b4-rwbhx"
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.791447 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/864d1c0b-3c85-4472-9d16-c8d5c574e02a-serving-cert\") pod \"controller-manager-879f6c89f-scxzk\" (UID: \"864d1c0b-3c85-4472-9d16-c8d5c574e02a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-scxzk"
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.791469 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/873587ac-5d12-45c4-bfd5-42bc08d29f65-audit-dir\") pod \"apiserver-76f77b778f-jgzzn\" (UID: \"873587ac-5d12-45c4-bfd5-42bc08d29f65\") " pod="openshift-apiserver/apiserver-76f77b778f-jgzzn"
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.791569 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/09980e91-b2e4-4a0e-bee7-dc101096f804-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-f9dqp\" (UID: \"09980e91-b2e4-4a0e-bee7-dc101096f804\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-f9dqp"
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.791607 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/81534f91-740b-487b-9149-e8565ccf9905-audit-dir\") pod \"apiserver-7bbb656c7d-g9dnn\" (UID: \"81534f91-740b-487b-9149-e8565ccf9905\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g9dnn"
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.791665 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/12b5360d-755a-4cb5-9ef3-0c00550e3913-service-ca\") pod \"console-f9d7485db-9ctf8\" (UID: \"12b5360d-755a-4cb5-9ef3-0c00550e3913\") " pod="openshift-console/console-f9d7485db-9ctf8"
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.791752 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmkkq\" (UniqueName: \"kubernetes.io/projected/09980e91-b2e4-4a0e-bee7-dc101096f804-kube-api-access-dmkkq\") pod \"machine-api-operator-5694c8668f-f9dqp\" (UID: \"09980e91-b2e4-4a0e-bee7-dc101096f804\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-f9dqp"
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.792330 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/96330a2a-06f0-4396-bceb-49bf9e3f3407-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-4n7dc\" (UID: \"96330a2a-06f0-4396-bceb-49bf9e3f3407\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4n7dc"
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.792377 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/969b3fd9-7b46-4627-bda2-e2fb4a4f5203-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-zqmhf\" (UID: \"969b3fd9-7b46-4627-bda2-e2fb4a4f5203\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zqmhf"
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.792399 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcrl6\" (UniqueName: \"kubernetes.io/projected/837571cc-8fd2-4102-9997-4520ddf6da08-kube-api-access-tcrl6\") pod \"ingress-operator-5b745b69d9-gzzjt\" (UID: \"837571cc-8fd2-4102-9997-4520ddf6da08\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gzzjt"
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.792418 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/911e02cc-6184-4c63-a36c-a51b54d0e7bf-config\") pod \"etcd-operator-b45778765-jsnt5\" (UID: \"911e02cc-6184-4c63-a36c-a51b54d0e7bf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-jsnt5"
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.792472 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/1fea2129-7ad0-45d8-9447-315107ef1c0c-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-gspdt\" (UID: \"1fea2129-7ad0-45d8-9447-315107ef1c0c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gspdt"
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.792495 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/81534f91-740b-487b-9149-e8565ccf9905-audit-policies\") pod \"apiserver-7bbb656c7d-g9dnn\" (UID: \"81534f91-740b-487b-9149-e8565ccf9905\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g9dnn"
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.792512 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fc2b6738-957d-4636-a723-c11207f29087-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-rwbhx\" (UID: \"fc2b6738-957d-4636-a723-c11207f29087\") " pod="openshift-authentication/oauth-openshift-558db77b4-rwbhx"
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.792549 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cce54ab2-f931-4499-a0c1-8a2b2674aeb2-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-4qlf7\" (UID: \"cce54ab2-f931-4499-a0c1-8a2b2674aeb2\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4qlf7"
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.792573 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/873587ac-5d12-45c4-bfd5-42bc08d29f65-etcd-client\") pod \"apiserver-76f77b778f-jgzzn\" (UID: \"873587ac-5d12-45c4-bfd5-42bc08d29f65\") " pod="openshift-apiserver/apiserver-76f77b778f-jgzzn"
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.792735 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd86a99b-7671-4dd9-88b9-334db1906b6b-serving-cert\") pod \"openshift-config-operator-7777fb866f-trhms\" (UID: \"fd86a99b-7671-4dd9-88b9-334db1906b6b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-trhms"
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.792765 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8xmm\" (UniqueName: \"kubernetes.io/projected/81534f91-740b-487b-9149-e8565ccf9905-kube-api-access-f8xmm\") pod \"apiserver-7bbb656c7d-g9dnn\" (UID: \"81534f91-740b-487b-9149-e8565ccf9905\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g9dnn"
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.792783 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/388fd604-2764-4184-b305-ce8f8e31ffa1-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2fqlt\" (UID: \"388fd604-2764-4184-b305-ce8f8e31ffa1\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2fqlt"
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.792800 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/652260eb-911a-4b92-8cea-73c9e8f156c4-serving-cert\") pod \"route-controller-manager-6576b87f9c-zvvn6\" (UID: \"652260eb-911a-4b92-8cea-73c9e8f156c4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zvvn6"
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.792814 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fc2b6738-957d-4636-a723-c11207f29087-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-rwbhx\" (UID: \"fc2b6738-957d-4636-a723-c11207f29087\") " pod="openshift-authentication/oauth-openshift-558db77b4-rwbhx"
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.792835 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95f8s\" (UniqueName: \"kubernetes.io/projected/fc2b6738-957d-4636-a723-c11207f29087-kube-api-access-95f8s\") pod \"oauth-openshift-558db77b4-rwbhx\" (UID: \"fc2b6738-957d-4636-a723-c11207f29087\") " pod="openshift-authentication/oauth-openshift-558db77b4-rwbhx"
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.792856 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/cce54ab2-f931-4499-a0c1-8a2b2674aeb2-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-4qlf7\" (UID: \"cce54ab2-f931-4499-a0c1-8a2b2674aeb2\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4qlf7"
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.792886 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/81534f91-740b-487b-9149-e8565ccf9905-serving-cert\") pod \"apiserver-7bbb656c7d-g9dnn\" (UID: \"81534f91-740b-487b-9149-e8565ccf9905\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g9dnn"
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.792900 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/969b3fd9-7b46-4627-bda2-e2fb4a4f5203-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-zqmhf\" (UID: \"969b3fd9-7b46-4627-bda2-e2fb4a4f5203\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zqmhf"
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.792917 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fc2b6738-957d-4636-a723-c11207f29087-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-rwbhx\" (UID: \"fc2b6738-957d-4636-a723-c11207f29087\") " pod="openshift-authentication/oauth-openshift-558db77b4-rwbhx"
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.792934 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6b9af1ca-d5ac-4278-8e23-510ddfc2ea8c-metrics-tls\") pod \"dns-operator-744455d44c-gv8n5\" (UID: \"6b9af1ca-d5ac-4278-8e23-510ddfc2ea8c\") " pod="openshift-dns-operator/dns-operator-744455d44c-gv8n5"
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.792951 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/873587ac-5d12-45c4-bfd5-42bc08d29f65-audit\") pod \"apiserver-76f77b778f-jgzzn\" (UID: \"873587ac-5d12-45c4-bfd5-42bc08d29f65\") " pod="openshift-apiserver/apiserver-76f77b778f-jgzzn"
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.792968 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/873587ac-5d12-45c4-bfd5-42bc08d29f65-serving-cert\") pod \"apiserver-76f77b778f-jgzzn\" (UID: \"873587ac-5d12-45c4-bfd5-42bc08d29f65\") " pod="openshift-apiserver/apiserver-76f77b778f-jgzzn"
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.792986 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fc2b6738-957d-4636-a723-c11207f29087-audit-dir\") pod \"oauth-openshift-558db77b4-rwbhx\" (UID: \"fc2b6738-957d-4636-a723-c11207f29087\") " pod="openshift-authentication/oauth-openshift-558db77b4-rwbhx"
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.793006 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/09980e91-b2e4-4a0e-bee7-dc101096f804-images\") pod \"machine-api-operator-5694c8668f-f9dqp\" (UID: \"09980e91-b2e4-4a0e-bee7-dc101096f804\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-f9dqp"
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.793025 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4qxp\" (UniqueName: \"kubernetes.io/projected/8797308e-2c6a-45d4-a7e8-dae633de1103-kube-api-access-z4qxp\") pod \"authentication-operator-69f744f599-x5tn8\" (UID: \"8797308e-2c6a-45d4-a7e8-dae633de1103\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-x5tn8"
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.793041 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/652260eb-911a-4b92-8cea-73c9e8f156c4-config\") pod \"route-controller-manager-6576b87f9c-zvvn6\" (UID: \"652260eb-911a-4b92-8cea-73c9e8f156c4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zvvn6"
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.793055 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/873587ac-5d12-45c4-bfd5-42bc08d29f65-config\") pod \"apiserver-76f77b778f-jgzzn\" (UID: \"873587ac-5d12-45c4-bfd5-42bc08d29f65\") " pod="openshift-apiserver/apiserver-76f77b778f-jgzzn"
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.793076 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8797308e-2c6a-45d4-a7e8-dae633de1103-serving-cert\") pod \"authentication-operator-69f744f599-x5tn8\" (UID: \"8797308e-2c6a-45d4-a7e8-dae633de1103\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-x5tn8"
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.793095 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbfk5\" (UniqueName: \"kubernetes.io/projected/6b9af1ca-d5ac-4278-8e23-510ddfc2ea8c-kube-api-access-qbfk5\") pod \"dns-operator-744455d44c-gv8n5\" (UID: \"6b9af1ca-d5ac-4278-8e23-510ddfc2ea8c\") " pod="openshift-dns-operator/dns-operator-744455d44c-gv8n5"
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.793111 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/864d1c0b-3c85-4472-9d16-c8d5c574e02a-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-scxzk\" (UID: \"864d1c0b-3c85-4472-9d16-c8d5c574e02a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-scxzk"
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.793131 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96330a2a-06f0-4396-bceb-49bf9e3f3407-config\") pod \"openshift-apiserver-operator-796bbdcf4f-4n7dc\" (UID: \"96330a2a-06f0-4396-bceb-49bf9e3f3407\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4n7dc"
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.793145 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dffjf\" (UniqueName: \"kubernetes.io/projected/96330a2a-06f0-4396-bceb-49bf9e3f3407-kube-api-access-dffjf\") pod \"openshift-apiserver-operator-796bbdcf4f-4n7dc\" (UID: \"96330a2a-06f0-4396-bceb-49bf9e3f3407\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4n7dc"
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.799372 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-f9dqp"]
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.816370 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4n7dc"]
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.818187 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.818295 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-trhms"]
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.818331 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-xg8s7"]
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.819076 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-9ctf8"]
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.822544 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-jgzzn"]
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.824989 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api"
pods=["openshift-etcd-operator/etcd-operator-b45778765-jsnt5"] Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.825838 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-x5tn8"] Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.826564 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-wpkbq"] Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.827528 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4qlf7"] Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.828298 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-g9dnn"] Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.829073 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-8jv2k"] Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.829691 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-8jv2k" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.829898 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-gv8n5"] Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.830758 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6mjfr"] Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.831795 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.831863 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-p9km8"] Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.832773 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zqmhf"] Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.833264 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gspdt"] Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.834094 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2fqlt"] Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.834944 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-rwbhx"] Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.835780 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-khv8v"] Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.836627 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-g9rtk"] Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.837475 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-gzzjt"] Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.838402 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-qkgqz"] Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.839239 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-qdj7h"] Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.840106 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zp2th"] Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.840985 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9n22d"] Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.841898 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wrhzj"] Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.842883 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nhk7j"] Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.843552 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-g4pv8"] Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.844483 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-g4pv8" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.844878 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-wqr7v"] Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.845452 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-w5nnp"] Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.846344 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29406570-47pck"] Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.847237 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-k9wmb"] Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.848083 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-g4pv8"] Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.849533 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.850358 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-95k6p"] Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.851200 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-pjh6b"] Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.853723 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-chxbx"] Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.869040 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Nov 29 05:30:19 
crc kubenswrapper[4594]: I1129 05:30:19.878202 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-wq549"] Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.878789 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-wq549" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.879284 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-pmjzq"] Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.880243 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-pmjzq" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.882955 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-wq549"] Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.885611 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-pmjzq"] Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.889043 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.893722 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd86a99b-7671-4dd9-88b9-334db1906b6b-serving-cert\") pod \"openshift-config-operator-7777fb866f-trhms\" (UID: \"fd86a99b-7671-4dd9-88b9-334db1906b6b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-trhms" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.893752 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/e247f646-534e-4aca-8cd9-edadba75848b-profile-collector-cert\") pod 
\"olm-operator-6b444d44fb-6mjfr\" (UID: \"e247f646-534e-4aca-8cd9-edadba75848b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6mjfr" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.893782 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c3e4199d-2fe7-4039-8f59-b8bab2beebd0-metrics-certs\") pod \"router-default-5444994796-8pvtn\" (UID: \"c3e4199d-2fe7-4039-8f59-b8bab2beebd0\") " pod="openshift-ingress/router-default-5444994796-8pvtn" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.893802 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/388fd604-2764-4184-b305-ce8f8e31ffa1-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2fqlt\" (UID: \"388fd604-2764-4184-b305-ce8f8e31ffa1\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2fqlt" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.893822 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/652260eb-911a-4b92-8cea-73c9e8f156c4-serving-cert\") pod \"route-controller-manager-6576b87f9c-zvvn6\" (UID: \"652260eb-911a-4b92-8cea-73c9e8f156c4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zvvn6" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.893848 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95f8s\" (UniqueName: \"kubernetes.io/projected/fc2b6738-957d-4636-a723-c11207f29087-kube-api-access-95f8s\") pod \"oauth-openshift-558db77b4-rwbhx\" (UID: \"fc2b6738-957d-4636-a723-c11207f29087\") " pod="openshift-authentication/oauth-openshift-558db77b4-rwbhx" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.893869 4594 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b8a99cf8-64bd-4e83-b77e-c4358154f10e-auth-proxy-config\") pod \"machine-config-operator-74547568cd-wqr7v\" (UID: \"b8a99cf8-64bd-4e83-b77e-c4358154f10e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wqr7v" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.893895 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/81534f91-740b-487b-9149-e8565ccf9905-serving-cert\") pod \"apiserver-7bbb656c7d-g9dnn\" (UID: \"81534f91-740b-487b-9149-e8565ccf9905\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g9dnn" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.893911 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fc2b6738-957d-4636-a723-c11207f29087-audit-dir\") pod \"oauth-openshift-558db77b4-rwbhx\" (UID: \"fc2b6738-957d-4636-a723-c11207f29087\") " pod="openshift-authentication/oauth-openshift-558db77b4-rwbhx" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.893926 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/09980e91-b2e4-4a0e-bee7-dc101096f804-images\") pod \"machine-api-operator-5694c8668f-f9dqp\" (UID: \"09980e91-b2e4-4a0e-bee7-dc101096f804\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-f9dqp" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.893942 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4qxp\" (UniqueName: \"kubernetes.io/projected/8797308e-2c6a-45d4-a7e8-dae633de1103-kube-api-access-z4qxp\") pod \"authentication-operator-69f744f599-x5tn8\" (UID: \"8797308e-2c6a-45d4-a7e8-dae633de1103\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-x5tn8" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.893957 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/652260eb-911a-4b92-8cea-73c9e8f156c4-config\") pod \"route-controller-manager-6576b87f9c-zvvn6\" (UID: \"652260eb-911a-4b92-8cea-73c9e8f156c4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zvvn6" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.893974 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/873587ac-5d12-45c4-bfd5-42bc08d29f65-config\") pod \"apiserver-76f77b778f-jgzzn\" (UID: \"873587ac-5d12-45c4-bfd5-42bc08d29f65\") " pod="openshift-apiserver/apiserver-76f77b778f-jgzzn" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.893990 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8797308e-2c6a-45d4-a7e8-dae633de1103-serving-cert\") pod \"authentication-operator-69f744f599-x5tn8\" (UID: \"8797308e-2c6a-45d4-a7e8-dae633de1103\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-x5tn8" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.894007 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dffjf\" (UniqueName: \"kubernetes.io/projected/96330a2a-06f0-4396-bceb-49bf9e3f3407-kube-api-access-dffjf\") pod \"openshift-apiserver-operator-796bbdcf4f-4n7dc\" (UID: \"96330a2a-06f0-4396-bceb-49bf9e3f3407\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4n7dc" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.894024 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/864d1c0b-3c85-4472-9d16-c8d5c574e02a-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-scxzk\" (UID: \"864d1c0b-3c85-4472-9d16-c8d5c574e02a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-scxzk" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.894043 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96330a2a-06f0-4396-bceb-49bf9e3f3407-config\") pod \"openshift-apiserver-operator-796bbdcf4f-4n7dc\" (UID: \"96330a2a-06f0-4396-bceb-49bf9e3f3407\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4n7dc" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.894060 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b8a99cf8-64bd-4e83-b77e-c4358154f10e-images\") pod \"machine-config-operator-74547568cd-wqr7v\" (UID: \"b8a99cf8-64bd-4e83-b77e-c4358154f10e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wqr7v" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.894077 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/969b3fd9-7b46-4627-bda2-e2fb4a4f5203-config\") pod \"kube-controller-manager-operator-78b949d7b-zqmhf\" (UID: \"969b3fd9-7b46-4627-bda2-e2fb4a4f5203\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zqmhf" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.894091 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/873587ac-5d12-45c4-bfd5-42bc08d29f65-node-pullsecrets\") pod \"apiserver-76f77b778f-jgzzn\" (UID: \"873587ac-5d12-45c4-bfd5-42bc08d29f65\") " pod="openshift-apiserver/apiserver-76f77b778f-jgzzn" Nov 29 05:30:19 crc kubenswrapper[4594]: 
I1129 05:30:19.894092 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fc2b6738-957d-4636-a723-c11207f29087-audit-dir\") pod \"oauth-openshift-558db77b4-rwbhx\" (UID: \"fc2b6738-957d-4636-a723-c11207f29087\") " pod="openshift-authentication/oauth-openshift-558db77b4-rwbhx" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.894105 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/911e02cc-6184-4c63-a36c-a51b54d0e7bf-serving-cert\") pod \"etcd-operator-b45778765-jsnt5\" (UID: \"911e02cc-6184-4c63-a36c-a51b54d0e7bf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-jsnt5" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.894157 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5d58f2a9-3eb4-4bf3-a27c-4df9ddcd5791-node-bootstrap-token\") pod \"machine-config-server-8jv2k\" (UID: \"5d58f2a9-3eb4-4bf3-a27c-4df9ddcd5791\") " pod="openshift-machine-config-operator/machine-config-server-8jv2k" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.894184 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cce54ab2-f931-4499-a0c1-8a2b2674aeb2-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-4qlf7\" (UID: \"cce54ab2-f931-4499-a0c1-8a2b2674aeb2\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4qlf7" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.894206 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/864d1c0b-3c85-4472-9d16-c8d5c574e02a-client-ca\") pod \"controller-manager-879f6c89f-scxzk\" (UID: \"864d1c0b-3c85-4472-9d16-c8d5c574e02a\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-scxzk" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.894227 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wlh2\" (UniqueName: \"kubernetes.io/projected/d1bc9264-d190-49c0-9122-7cc123dd88e5-kube-api-access-8wlh2\") pod \"dns-default-g4pv8\" (UID: \"d1bc9264-d190-49c0-9122-7cc123dd88e5\") " pod="openshift-dns/dns-default-g4pv8" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.894243 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d1bc9264-d190-49c0-9122-7cc123dd88e5-metrics-tls\") pod \"dns-default-g4pv8\" (UID: \"d1bc9264-d190-49c0-9122-7cc123dd88e5\") " pod="openshift-dns/dns-default-g4pv8" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.894276 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dzrs\" (UniqueName: \"kubernetes.io/projected/911e02cc-6184-4c63-a36c-a51b54d0e7bf-kube-api-access-2dzrs\") pod \"etcd-operator-b45778765-jsnt5\" (UID: \"911e02cc-6184-4c63-a36c-a51b54d0e7bf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-jsnt5" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.894306 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbpfn\" (UniqueName: \"kubernetes.io/projected/12b5360d-755a-4cb5-9ef3-0c00550e3913-kube-api-access-lbpfn\") pod \"console-f9d7485db-9ctf8\" (UID: \"12b5360d-755a-4cb5-9ef3-0c00550e3913\") " pod="openshift-console/console-f9d7485db-9ctf8" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.894322 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d1bc9264-d190-49c0-9122-7cc123dd88e5-config-volume\") pod \"dns-default-g4pv8\" (UID: 
\"d1bc9264-d190-49c0-9122-7cc123dd88e5\") " pod="openshift-dns/dns-default-g4pv8" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.894341 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/911e02cc-6184-4c63-a36c-a51b54d0e7bf-etcd-ca\") pod \"etcd-operator-b45778765-jsnt5\" (UID: \"911e02cc-6184-4c63-a36c-a51b54d0e7bf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-jsnt5" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.894910 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fc2b6738-957d-4636-a723-c11207f29087-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-rwbhx\" (UID: \"fc2b6738-957d-4636-a723-c11207f29087\") " pod="openshift-authentication/oauth-openshift-558db77b4-rwbhx" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.894973 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c3e4199d-2fe7-4039-8f59-b8bab2beebd0-service-ca-bundle\") pod \"router-default-5444994796-8pvtn\" (UID: \"c3e4199d-2fe7-4039-8f59-b8bab2beebd0\") " pod="openshift-ingress/router-default-5444994796-8pvtn" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.895023 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fx8hm\" (UniqueName: \"kubernetes.io/projected/6c685741-a472-49fe-8237-520bea0232ef-kube-api-access-fx8hm\") pod \"downloads-7954f5f757-xg8s7\" (UID: \"6c685741-a472-49fe-8237-520bea0232ef\") " pod="openshift-console/downloads-7954f5f757-xg8s7" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.895057 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/fc2b6738-957d-4636-a723-c11207f29087-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-rwbhx\" (UID: \"fc2b6738-957d-4636-a723-c11207f29087\") " pod="openshift-authentication/oauth-openshift-558db77b4-rwbhx" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.895087 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/837571cc-8fd2-4102-9997-4520ddf6da08-bound-sa-token\") pod \"ingress-operator-5b745b69d9-gzzjt\" (UID: \"837571cc-8fd2-4102-9997-4520ddf6da08\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gzzjt" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.895114 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/864d1c0b-3c85-4472-9d16-c8d5c574e02a-config\") pod \"controller-manager-879f6c89f-scxzk\" (UID: \"864d1c0b-3c85-4472-9d16-c8d5c574e02a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-scxzk" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.895134 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rdsw\" (UniqueName: \"kubernetes.io/projected/5d58f2a9-3eb4-4bf3-a27c-4df9ddcd5791-kube-api-access-7rdsw\") pod \"machine-config-server-8jv2k\" (UID: \"5d58f2a9-3eb4-4bf3-a27c-4df9ddcd5791\") " pod="openshift-machine-config-operator/machine-config-server-8jv2k" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.895152 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/09980e91-b2e4-4a0e-bee7-dc101096f804-images\") pod \"machine-api-operator-5694c8668f-f9dqp\" (UID: \"09980e91-b2e4-4a0e-bee7-dc101096f804\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-f9dqp" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.895162 4594 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/873587ac-5d12-45c4-bfd5-42bc08d29f65-etcd-serving-ca\") pod \"apiserver-76f77b778f-jgzzn\" (UID: \"873587ac-5d12-45c4-bfd5-42bc08d29f65\") " pod="openshift-apiserver/apiserver-76f77b778f-jgzzn" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.895496 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/873587ac-5d12-45c4-bfd5-42bc08d29f65-config\") pod \"apiserver-76f77b778f-jgzzn\" (UID: \"873587ac-5d12-45c4-bfd5-42bc08d29f65\") " pod="openshift-apiserver/apiserver-76f77b778f-jgzzn" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.895699 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/873587ac-5d12-45c4-bfd5-42bc08d29f65-etcd-serving-ca\") pod \"apiserver-76f77b778f-jgzzn\" (UID: \"873587ac-5d12-45c4-bfd5-42bc08d29f65\") " pod="openshift-apiserver/apiserver-76f77b778f-jgzzn" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.896378 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96330a2a-06f0-4396-bceb-49bf9e3f3407-config\") pod \"openshift-apiserver-operator-796bbdcf4f-4n7dc\" (UID: \"96330a2a-06f0-4396-bceb-49bf9e3f3407\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4n7dc" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.896658 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/864d1c0b-3c85-4472-9d16-c8d5c574e02a-client-ca\") pod \"controller-manager-879f6c89f-scxzk\" (UID: \"864d1c0b-3c85-4472-9d16-c8d5c574e02a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-scxzk" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.896835 4594 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/864d1c0b-3c85-4472-9d16-c8d5c574e02a-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-scxzk\" (UID: \"864d1c0b-3c85-4472-9d16-c8d5c574e02a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-scxzk" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.896903 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/873587ac-5d12-45c4-bfd5-42bc08d29f65-node-pullsecrets\") pod \"apiserver-76f77b778f-jgzzn\" (UID: \"873587ac-5d12-45c4-bfd5-42bc08d29f65\") " pod="openshift-apiserver/apiserver-76f77b778f-jgzzn" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.897109 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7af45636-74c3-4a18-8297-ba3fdde46ace-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-9n22d\" (UID: \"7af45636-74c3-4a18-8297-ba3fdde46ace\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9n22d" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.897165 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8797308e-2c6a-45d4-a7e8-dae633de1103-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-x5tn8\" (UID: \"8797308e-2c6a-45d4-a7e8-dae633de1103\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-x5tn8" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.897198 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57xjr\" (UniqueName: \"kubernetes.io/projected/8b141ba2-87a7-4975-a1ed-054769e567bd-kube-api-access-57xjr\") pod \"machine-approver-56656f9798-8t4sg\" (UID: 
\"8b141ba2-87a7-4975-a1ed-054769e567bd\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8t4sg" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.897224 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc2b6738-957d-4636-a723-c11207f29087-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-rwbhx\" (UID: \"fc2b6738-957d-4636-a723-c11207f29087\") " pod="openshift-authentication/oauth-openshift-558db77b4-rwbhx" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.897236 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8797308e-2c6a-45d4-a7e8-dae633de1103-config\") pod \"authentication-operator-69f744f599-x5tn8\" (UID: \"8797308e-2c6a-45d4-a7e8-dae633de1103\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-x5tn8" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.897270 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/864d1c0b-3c85-4472-9d16-c8d5c574e02a-config\") pod \"controller-manager-879f6c89f-scxzk\" (UID: \"864d1c0b-3c85-4472-9d16-c8d5c574e02a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-scxzk" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.897277 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/12b5360d-755a-4cb5-9ef3-0c00550e3913-console-serving-cert\") pod \"console-f9d7485db-9ctf8\" (UID: \"12b5360d-755a-4cb5-9ef3-0c00550e3913\") " pod="openshift-console/console-f9d7485db-9ctf8" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.897343 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/12b5360d-755a-4cb5-9ef3-0c00550e3913-console-oauth-config\") pod \"console-f9d7485db-9ctf8\" (UID: \"12b5360d-755a-4cb5-9ef3-0c00550e3913\") " pod="openshift-console/console-f9d7485db-9ctf8" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.897389 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/93df4c2b-9b0b-476e-a92f-55cb8e3ec8c3-trusted-ca\") pod \"console-operator-58897d9998-wpkbq\" (UID: \"93df4c2b-9b0b-476e-a92f-55cb8e3ec8c3\") " pod="openshift-console-operator/console-operator-58897d9998-wpkbq" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.897410 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/652260eb-911a-4b92-8cea-73c9e8f156c4-config\") pod \"route-controller-manager-6576b87f9c-zvvn6\" (UID: \"652260eb-911a-4b92-8cea-73c9e8f156c4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zvvn6" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.897438 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/911e02cc-6184-4c63-a36c-a51b54d0e7bf-etcd-ca\") pod \"etcd-operator-b45778765-jsnt5\" (UID: \"911e02cc-6184-4c63-a36c-a51b54d0e7bf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-jsnt5" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.897442 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7z8l\" (UniqueName: \"kubernetes.io/projected/05218cdc-d6bb-47ef-aab3-9a5e0dab68b8-kube-api-access-k7z8l\") pod \"multus-admission-controller-857f4d67dd-pjh6b\" (UID: \"05218cdc-d6bb-47ef-aab3-9a5e0dab68b8\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-pjh6b" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.897494 4594 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/93df4c2b-9b0b-476e-a92f-55cb8e3ec8c3-serving-cert\") pod \"console-operator-58897d9998-wpkbq\" (UID: \"93df4c2b-9b0b-476e-a92f-55cb8e3ec8c3\") " pod="openshift-console-operator/console-operator-58897d9998-wpkbq" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.897521 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/fd86a99b-7671-4dd9-88b9-334db1906b6b-available-featuregates\") pod \"openshift-config-operator-7777fb866f-trhms\" (UID: \"fd86a99b-7671-4dd9-88b9-334db1906b6b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-trhms" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.897544 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/911e02cc-6184-4c63-a36c-a51b54d0e7bf-etcd-client\") pod \"etcd-operator-b45778765-jsnt5\" (UID: \"911e02cc-6184-4c63-a36c-a51b54d0e7bf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-jsnt5" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.897565 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09980e91-b2e4-4a0e-bee7-dc101096f804-config\") pod \"machine-api-operator-5694c8668f-f9dqp\" (UID: \"09980e91-b2e4-4a0e-bee7-dc101096f804\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-f9dqp" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.897583 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/388fd604-2764-4184-b305-ce8f8e31ffa1-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2fqlt\" (UID: \"388fd604-2764-4184-b305-ce8f8e31ffa1\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2fqlt" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.897605 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fc2b6738-957d-4636-a723-c11207f29087-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-rwbhx\" (UID: \"fc2b6738-957d-4636-a723-c11207f29087\") " pod="openshift-authentication/oauth-openshift-558db77b4-rwbhx" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.897625 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fc2b6738-957d-4636-a723-c11207f29087-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-rwbhx\" (UID: \"fc2b6738-957d-4636-a723-c11207f29087\") " pod="openshift-authentication/oauth-openshift-558db77b4-rwbhx" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.897645 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/864d1c0b-3c85-4472-9d16-c8d5c574e02a-serving-cert\") pod \"controller-manager-879f6c89f-scxzk\" (UID: \"864d1c0b-3c85-4472-9d16-c8d5c574e02a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-scxzk" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.897665 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/873587ac-5d12-45c4-bfd5-42bc08d29f65-audit-dir\") pod \"apiserver-76f77b778f-jgzzn\" (UID: \"873587ac-5d12-45c4-bfd5-42bc08d29f65\") " pod="openshift-apiserver/apiserver-76f77b778f-jgzzn" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.897684 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/09980e91-b2e4-4a0e-bee7-dc101096f804-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-f9dqp\" (UID: \"09980e91-b2e4-4a0e-bee7-dc101096f804\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-f9dqp" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.897705 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/96330a2a-06f0-4396-bceb-49bf9e3f3407-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-4n7dc\" (UID: \"96330a2a-06f0-4396-bceb-49bf9e3f3407\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4n7dc" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.897728 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/724c0c5a-3d4d-41ef-9759-f9a1b62c99fe-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-w5nnp\" (UID: \"724c0c5a-3d4d-41ef-9759-f9a1b62c99fe\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-w5nnp" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.897753 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/81534f91-740b-487b-9149-e8565ccf9905-audit-dir\") pod \"apiserver-7bbb656c7d-g9dnn\" (UID: \"81534f91-740b-487b-9149-e8565ccf9905\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g9dnn" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.897772 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcrl6\" (UniqueName: \"kubernetes.io/projected/837571cc-8fd2-4102-9997-4520ddf6da08-kube-api-access-tcrl6\") pod \"ingress-operator-5b745b69d9-gzzjt\" (UID: \"837571cc-8fd2-4102-9997-4520ddf6da08\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gzzjt" Nov 29 05:30:19 crc kubenswrapper[4594]: 
I1129 05:30:19.897795 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/81534f91-740b-487b-9149-e8565ccf9905-audit-policies\") pod \"apiserver-7bbb656c7d-g9dnn\" (UID: \"81534f91-740b-487b-9149-e8565ccf9905\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g9dnn" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.897816 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cce54ab2-f931-4499-a0c1-8a2b2674aeb2-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-4qlf7\" (UID: \"cce54ab2-f931-4499-a0c1-8a2b2674aeb2\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4qlf7" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.897842 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/873587ac-5d12-45c4-bfd5-42bc08d29f65-etcd-client\") pod \"apiserver-76f77b778f-jgzzn\" (UID: \"873587ac-5d12-45c4-bfd5-42bc08d29f65\") " pod="openshift-apiserver/apiserver-76f77b778f-jgzzn" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.897865 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4zxg\" (UniqueName: \"kubernetes.io/projected/7af45636-74c3-4a18-8297-ba3fdde46ace-kube-api-access-k4zxg\") pod \"kube-storage-version-migrator-operator-b67b599dd-9n22d\" (UID: \"7af45636-74c3-4a18-8297-ba3fdde46ace\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9n22d" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.897898 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c3e4199d-2fe7-4039-8f59-b8bab2beebd0-default-certificate\") pod 
\"router-default-5444994796-8pvtn\" (UID: \"c3e4199d-2fe7-4039-8f59-b8bab2beebd0\") " pod="openshift-ingress/router-default-5444994796-8pvtn" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.897950 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8xmm\" (UniqueName: \"kubernetes.io/projected/81534f91-740b-487b-9149-e8565ccf9905-kube-api-access-f8xmm\") pod \"apiserver-7bbb656c7d-g9dnn\" (UID: \"81534f91-740b-487b-9149-e8565ccf9905\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g9dnn" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.897976 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fc2b6738-957d-4636-a723-c11207f29087-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-rwbhx\" (UID: \"fc2b6738-957d-4636-a723-c11207f29087\") " pod="openshift-authentication/oauth-openshift-558db77b4-rwbhx" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.897998 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5d58f2a9-3eb4-4bf3-a27c-4df9ddcd5791-certs\") pod \"machine-config-server-8jv2k\" (UID: \"5d58f2a9-3eb4-4bf3-a27c-4df9ddcd5791\") " pod="openshift-machine-config-operator/machine-config-server-8jv2k" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.898040 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/cce54ab2-f931-4499-a0c1-8a2b2674aeb2-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-4qlf7\" (UID: \"cce54ab2-f931-4499-a0c1-8a2b2674aeb2\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4qlf7" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.898059 4594 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/969b3fd9-7b46-4627-bda2-e2fb4a4f5203-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-zqmhf\" (UID: \"969b3fd9-7b46-4627-bda2-e2fb4a4f5203\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zqmhf" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.898061 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/873587ac-5d12-45c4-bfd5-42bc08d29f65-audit-dir\") pod \"apiserver-76f77b778f-jgzzn\" (UID: \"873587ac-5d12-45c4-bfd5-42bc08d29f65\") " pod="openshift-apiserver/apiserver-76f77b778f-jgzzn" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.898081 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6b9af1ca-d5ac-4278-8e23-510ddfc2ea8c-metrics-tls\") pod \"dns-operator-744455d44c-gv8n5\" (UID: \"6b9af1ca-d5ac-4278-8e23-510ddfc2ea8c\") " pod="openshift-dns-operator/dns-operator-744455d44c-gv8n5" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.898126 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/873587ac-5d12-45c4-bfd5-42bc08d29f65-audit\") pod \"apiserver-76f77b778f-jgzzn\" (UID: \"873587ac-5d12-45c4-bfd5-42bc08d29f65\") " pod="openshift-apiserver/apiserver-76f77b778f-jgzzn" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.898144 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/873587ac-5d12-45c4-bfd5-42bc08d29f65-serving-cert\") pod \"apiserver-76f77b778f-jgzzn\" (UID: \"873587ac-5d12-45c4-bfd5-42bc08d29f65\") " pod="openshift-apiserver/apiserver-76f77b778f-jgzzn" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.898180 4594 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fc2b6738-957d-4636-a723-c11207f29087-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-rwbhx\" (UID: \"fc2b6738-957d-4636-a723-c11207f29087\") " pod="openshift-authentication/oauth-openshift-558db77b4-rwbhx" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.898204 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b8a99cf8-64bd-4e83-b77e-c4358154f10e-proxy-tls\") pod \"machine-config-operator-74547568cd-wqr7v\" (UID: \"b8a99cf8-64bd-4e83-b77e-c4358154f10e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wqr7v" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.898281 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m65qz\" (UniqueName: \"kubernetes.io/projected/b5059cab-0e22-479f-9079-b031f405e547-kube-api-access-m65qz\") pod \"collect-profiles-29406570-47pck\" (UID: \"b5059cab-0e22-479f-9079-b031f405e547\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406570-47pck" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.898322 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbfk5\" (UniqueName: \"kubernetes.io/projected/6b9af1ca-d5ac-4278-8e23-510ddfc2ea8c-kube-api-access-qbfk5\") pod \"dns-operator-744455d44c-gv8n5\" (UID: \"6b9af1ca-d5ac-4278-8e23-510ddfc2ea8c\") " pod="openshift-dns-operator/dns-operator-744455d44c-gv8n5" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.898363 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/873587ac-5d12-45c4-bfd5-42bc08d29f65-image-import-ca\") pod \"apiserver-76f77b778f-jgzzn\" (UID: \"873587ac-5d12-45c4-bfd5-42bc08d29f65\") " 
pod="openshift-apiserver/apiserver-76f77b778f-jgzzn" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.898381 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wn82r\" (UniqueName: \"kubernetes.io/projected/fd86a99b-7671-4dd9-88b9-334db1906b6b-kube-api-access-wn82r\") pod \"openshift-config-operator-7777fb866f-trhms\" (UID: \"fd86a99b-7671-4dd9-88b9-334db1906b6b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-trhms" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.898404 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/8b141ba2-87a7-4975-a1ed-054769e567bd-machine-approver-tls\") pod \"machine-approver-56656f9798-8t4sg\" (UID: \"8b141ba2-87a7-4975-a1ed-054769e567bd\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8t4sg" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.898445 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ad8c6194-f489-42f3-afb2-707a97b490ad-proxy-tls\") pod \"machine-config-controller-84d6567774-p9km8\" (UID: \"ad8c6194-f489-42f3-afb2-707a97b490ad\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-p9km8" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.898467 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/724c0c5a-3d4d-41ef-9759-f9a1b62c99fe-config\") pod \"kube-apiserver-operator-766d6c64bb-w5nnp\" (UID: \"724c0c5a-3d4d-41ef-9759-f9a1b62c99fe\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-w5nnp" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.898468 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/8797308e-2c6a-45d4-a7e8-dae633de1103-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-x5tn8\" (UID: \"8797308e-2c6a-45d4-a7e8-dae633de1103\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-x5tn8" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.898487 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4g786\" (UniqueName: \"kubernetes.io/projected/cce54ab2-f931-4499-a0c1-8a2b2674aeb2-kube-api-access-4g786\") pod \"cluster-image-registry-operator-dc59b4c8b-4qlf7\" (UID: \"cce54ab2-f931-4499-a0c1-8a2b2674aeb2\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4qlf7" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.898524 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/81534f91-740b-487b-9149-e8565ccf9905-etcd-client\") pod \"apiserver-7bbb656c7d-g9dnn\" (UID: \"81534f91-740b-487b-9149-e8565ccf9905\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g9dnn" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.898547 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ad8c6194-f489-42f3-afb2-707a97b490ad-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-p9km8\" (UID: \"ad8c6194-f489-42f3-afb2-707a97b490ad\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-p9km8" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.898617 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93df4c2b-9b0b-476e-a92f-55cb8e3ec8c3-config\") pod \"console-operator-58897d9998-wpkbq\" (UID: \"93df4c2b-9b0b-476e-a92f-55cb8e3ec8c3\") " pod="openshift-console-operator/console-operator-58897d9998-wpkbq" Nov 29 05:30:19 crc 
kubenswrapper[4594]: I1129 05:30:19.898643 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7af45636-74c3-4a18-8297-ba3fdde46ace-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-9n22d\" (UID: \"7af45636-74c3-4a18-8297-ba3fdde46ace\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9n22d" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.898687 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fc2b6738-957d-4636-a723-c11207f29087-audit-policies\") pod \"oauth-openshift-558db77b4-rwbhx\" (UID: \"fc2b6738-957d-4636-a723-c11207f29087\") " pod="openshift-authentication/oauth-openshift-558db77b4-rwbhx" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.898709 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/fc2b6738-957d-4636-a723-c11207f29087-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-rwbhx\" (UID: \"fc2b6738-957d-4636-a723-c11207f29087\") " pod="openshift-authentication/oauth-openshift-558db77b4-rwbhx" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.898730 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/637efb26-f3b6-4db6-b0bc-d1ea0dcd85a5-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-khv8v\" (UID: \"637efb26-f3b6-4db6-b0bc-d1ea0dcd85a5\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-khv8v" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.898749 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thlm7\" (UniqueName: 
\"kubernetes.io/projected/93df4c2b-9b0b-476e-a92f-55cb8e3ec8c3-kube-api-access-thlm7\") pod \"console-operator-58897d9998-wpkbq\" (UID: \"93df4c2b-9b0b-476e-a92f-55cb8e3ec8c3\") " pod="openshift-console-operator/console-operator-58897d9998-wpkbq" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.898766 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsbl2\" (UniqueName: \"kubernetes.io/projected/b8a99cf8-64bd-4e83-b77e-c4358154f10e-kube-api-access-jsbl2\") pod \"machine-config-operator-74547568cd-wqr7v\" (UID: \"b8a99cf8-64bd-4e83-b77e-c4358154f10e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wqr7v" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.898788 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/12b5360d-755a-4cb5-9ef3-0c00550e3913-trusted-ca-bundle\") pod \"console-f9d7485db-9ctf8\" (UID: \"12b5360d-755a-4cb5-9ef3-0c00550e3913\") " pod="openshift-console/console-f9d7485db-9ctf8" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.898810 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mvjv\" (UniqueName: \"kubernetes.io/projected/652260eb-911a-4b92-8cea-73c9e8f156c4-kube-api-access-7mvjv\") pod \"route-controller-manager-6576b87f9c-zvvn6\" (UID: \"652260eb-911a-4b92-8cea-73c9e8f156c4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zvvn6" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.898832 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxbv4\" (UniqueName: \"kubernetes.io/projected/864d1c0b-3c85-4472-9d16-c8d5c574e02a-kube-api-access-zxbv4\") pod \"controller-manager-879f6c89f-scxzk\" (UID: \"864d1c0b-3c85-4472-9d16-c8d5c574e02a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-scxzk" 
Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.898854 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8b141ba2-87a7-4975-a1ed-054769e567bd-auth-proxy-config\") pod \"machine-approver-56656f9798-8t4sg\" (UID: \"8b141ba2-87a7-4975-a1ed-054769e567bd\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8t4sg" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.898879 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8797308e-2c6a-45d4-a7e8-dae633de1103-service-ca-bundle\") pod \"authentication-operator-69f744f599-x5tn8\" (UID: \"8797308e-2c6a-45d4-a7e8-dae633de1103\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-x5tn8" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.898901 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fc2b6738-957d-4636-a723-c11207f29087-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-rwbhx\" (UID: \"fc2b6738-957d-4636-a723-c11207f29087\") " pod="openshift-authentication/oauth-openshift-558db77b4-rwbhx" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.898922 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/652260eb-911a-4b92-8cea-73c9e8f156c4-client-ca\") pod \"route-controller-manager-6576b87f9c-zvvn6\" (UID: \"652260eb-911a-4b92-8cea-73c9e8f156c4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zvvn6" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.898941 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: 
\"kubernetes.io/secret/e247f646-534e-4aca-8cd9-edadba75848b-srv-cert\") pod \"olm-operator-6b444d44fb-6mjfr\" (UID: \"e247f646-534e-4aca-8cd9-edadba75848b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6mjfr" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.898959 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/837571cc-8fd2-4102-9997-4520ddf6da08-metrics-tls\") pod \"ingress-operator-5b745b69d9-gzzjt\" (UID: \"837571cc-8fd2-4102-9997-4520ddf6da08\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gzzjt" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.899043 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/05218cdc-d6bb-47ef-aab3-9a5e0dab68b8-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-pjh6b\" (UID: \"05218cdc-d6bb-47ef-aab3-9a5e0dab68b8\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-pjh6b" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.899062 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c3e4199d-2fe7-4039-8f59-b8bab2beebd0-stats-auth\") pod \"router-default-5444994796-8pvtn\" (UID: \"c3e4199d-2fe7-4039-8f59-b8bab2beebd0\") " pod="openshift-ingress/router-default-5444994796-8pvtn" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.899086 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fc2b6738-957d-4636-a723-c11207f29087-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-rwbhx\" (UID: \"fc2b6738-957d-4636-a723-c11207f29087\") " pod="openshift-authentication/oauth-openshift-558db77b4-rwbhx" Nov 29 05:30:19 crc 
kubenswrapper[4594]: I1129 05:30:19.899118 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/12b5360d-755a-4cb5-9ef3-0c00550e3913-console-config\") pod \"console-f9d7485db-9ctf8\" (UID: \"12b5360d-755a-4cb5-9ef3-0c00550e3913\") " pod="openshift-console/console-f9d7485db-9ctf8" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.899139 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fc2b6738-957d-4636-a723-c11207f29087-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-rwbhx\" (UID: \"fc2b6738-957d-4636-a723-c11207f29087\") " pod="openshift-authentication/oauth-openshift-558db77b4-rwbhx" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.899161 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nftmj\" (UniqueName: \"kubernetes.io/projected/637efb26-f3b6-4db6-b0bc-d1ea0dcd85a5-kube-api-access-nftmj\") pod \"cluster-samples-operator-665b6dd947-khv8v\" (UID: \"637efb26-f3b6-4db6-b0bc-d1ea0dcd85a5\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-khv8v" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.899179 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cm7xc\" (UniqueName: \"kubernetes.io/projected/1fea2129-7ad0-45d8-9447-315107ef1c0c-kube-api-access-cm7xc\") pod \"control-plane-machine-set-operator-78cbb6b69f-gspdt\" (UID: \"1fea2129-7ad0-45d8-9447-315107ef1c0c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gspdt" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.899199 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vgv6\" (UniqueName: 
\"kubernetes.io/projected/c3e4199d-2fe7-4039-8f59-b8bab2beebd0-kube-api-access-7vgv6\") pod \"router-default-5444994796-8pvtn\" (UID: \"c3e4199d-2fe7-4039-8f59-b8bab2beebd0\") " pod="openshift-ingress/router-default-5444994796-8pvtn" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.899222 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/12b5360d-755a-4cb5-9ef3-0c00550e3913-oauth-serving-cert\") pod \"console-f9d7485db-9ctf8\" (UID: \"12b5360d-755a-4cb5-9ef3-0c00550e3913\") " pod="openshift-console/console-f9d7485db-9ctf8" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.899240 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/724c0c5a-3d4d-41ef-9759-f9a1b62c99fe-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-w5nnp\" (UID: \"724c0c5a-3d4d-41ef-9759-f9a1b62c99fe\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-w5nnp" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.899279 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k444s\" (UniqueName: \"kubernetes.io/projected/873587ac-5d12-45c4-bfd5-42bc08d29f65-kube-api-access-k444s\") pod \"apiserver-76f77b778f-jgzzn\" (UID: \"873587ac-5d12-45c4-bfd5-42bc08d29f65\") " pod="openshift-apiserver/apiserver-76f77b778f-jgzzn" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.899300 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knb66\" (UniqueName: \"kubernetes.io/projected/e247f646-534e-4aca-8cd9-edadba75848b-kube-api-access-knb66\") pod \"olm-operator-6b444d44fb-6mjfr\" (UID: \"e247f646-534e-4aca-8cd9-edadba75848b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6mjfr" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 
05:30:19.899320 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b141ba2-87a7-4975-a1ed-054769e567bd-config\") pod \"machine-approver-56656f9798-8t4sg\" (UID: \"8b141ba2-87a7-4975-a1ed-054769e567bd\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8t4sg" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.899342 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/873587ac-5d12-45c4-bfd5-42bc08d29f65-trusted-ca-bundle\") pod \"apiserver-76f77b778f-jgzzn\" (UID: \"873587ac-5d12-45c4-bfd5-42bc08d29f65\") " pod="openshift-apiserver/apiserver-76f77b778f-jgzzn" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.899345 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09980e91-b2e4-4a0e-bee7-dc101096f804-config\") pod \"machine-api-operator-5694c8668f-f9dqp\" (UID: \"09980e91-b2e4-4a0e-bee7-dc101096f804\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-f9dqp" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.899362 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/873587ac-5d12-45c4-bfd5-42bc08d29f65-encryption-config\") pod \"apiserver-76f77b778f-jgzzn\" (UID: \"873587ac-5d12-45c4-bfd5-42bc08d29f65\") " pod="openshift-apiserver/apiserver-76f77b778f-jgzzn" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.899390 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/fd86a99b-7671-4dd9-88b9-334db1906b6b-available-featuregates\") pod \"openshift-config-operator-7777fb866f-trhms\" (UID: \"fd86a99b-7671-4dd9-88b9-334db1906b6b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-trhms" Nov 
29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.899425 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/911e02cc-6184-4c63-a36c-a51b54d0e7bf-etcd-service-ca\") pod \"etcd-operator-b45778765-jsnt5\" (UID: \"911e02cc-6184-4c63-a36c-a51b54d0e7bf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-jsnt5" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.899450 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpjr7\" (UniqueName: \"kubernetes.io/projected/ad8c6194-f489-42f3-afb2-707a97b490ad-kube-api-access-vpjr7\") pod \"machine-config-controller-84d6567774-p9km8\" (UID: \"ad8c6194-f489-42f3-afb2-707a97b490ad\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-p9km8" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.899472 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/81534f91-740b-487b-9149-e8565ccf9905-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-g9dnn\" (UID: \"81534f91-740b-487b-9149-e8565ccf9905\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g9dnn" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.899495 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/837571cc-8fd2-4102-9997-4520ddf6da08-trusted-ca\") pod \"ingress-operator-5b745b69d9-gzzjt\" (UID: \"837571cc-8fd2-4102-9997-4520ddf6da08\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gzzjt" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.899516 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b5059cab-0e22-479f-9079-b031f405e547-config-volume\") pod \"collect-profiles-29406570-47pck\" (UID: 
\"b5059cab-0e22-479f-9079-b031f405e547\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406570-47pck" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.899533 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/81534f91-740b-487b-9149-e8565ccf9905-encryption-config\") pod \"apiserver-7bbb656c7d-g9dnn\" (UID: \"81534f91-740b-487b-9149-e8565ccf9905\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g9dnn" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.899552 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/388fd604-2764-4184-b305-ce8f8e31ffa1-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2fqlt\" (UID: \"388fd604-2764-4184-b305-ce8f8e31ffa1\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2fqlt" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.899574 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/81534f91-740b-487b-9149-e8565ccf9905-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-g9dnn\" (UID: \"81534f91-740b-487b-9149-e8565ccf9905\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g9dnn" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.899686 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b5059cab-0e22-479f-9079-b031f405e547-secret-volume\") pod \"collect-profiles-29406570-47pck\" (UID: \"b5059cab-0e22-479f-9079-b031f405e547\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406570-47pck" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.899706 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/12b5360d-755a-4cb5-9ef3-0c00550e3913-service-ca\") pod \"console-f9d7485db-9ctf8\" (UID: \"12b5360d-755a-4cb5-9ef3-0c00550e3913\") " pod="openshift-console/console-f9d7485db-9ctf8" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.899728 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmkkq\" (UniqueName: \"kubernetes.io/projected/09980e91-b2e4-4a0e-bee7-dc101096f804-kube-api-access-dmkkq\") pod \"machine-api-operator-5694c8668f-f9dqp\" (UID: \"09980e91-b2e4-4a0e-bee7-dc101096f804\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-f9dqp" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.899753 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/1fea2129-7ad0-45d8-9447-315107ef1c0c-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-gspdt\" (UID: \"1fea2129-7ad0-45d8-9447-315107ef1c0c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gspdt" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.899775 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fc2b6738-957d-4636-a723-c11207f29087-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-rwbhx\" (UID: \"fc2b6738-957d-4636-a723-c11207f29087\") " pod="openshift-authentication/oauth-openshift-558db77b4-rwbhx" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.899797 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/969b3fd9-7b46-4627-bda2-e2fb4a4f5203-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-zqmhf\" (UID: \"969b3fd9-7b46-4627-bda2-e2fb4a4f5203\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zqmhf" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.899817 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/911e02cc-6184-4c63-a36c-a51b54d0e7bf-config\") pod \"etcd-operator-b45778765-jsnt5\" (UID: \"911e02cc-6184-4c63-a36c-a51b54d0e7bf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-jsnt5" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.899943 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fc2b6738-957d-4636-a723-c11207f29087-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-rwbhx\" (UID: \"fc2b6738-957d-4636-a723-c11207f29087\") " pod="openshift-authentication/oauth-openshift-558db77b4-rwbhx" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.900602 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/911e02cc-6184-4c63-a36c-a51b54d0e7bf-serving-cert\") pod \"etcd-operator-b45778765-jsnt5\" (UID: \"911e02cc-6184-4c63-a36c-a51b54d0e7bf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-jsnt5" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.901659 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/12b5360d-755a-4cb5-9ef3-0c00550e3913-console-oauth-config\") pod \"console-f9d7485db-9ctf8\" (UID: \"12b5360d-755a-4cb5-9ef3-0c00550e3913\") " pod="openshift-console/console-f9d7485db-9ctf8" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.902228 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd86a99b-7671-4dd9-88b9-334db1906b6b-serving-cert\") pod \"openshift-config-operator-7777fb866f-trhms\" (UID: 
\"fd86a99b-7671-4dd9-88b9-334db1906b6b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-trhms" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.903076 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/81534f91-740b-487b-9149-e8565ccf9905-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-g9dnn\" (UID: \"81534f91-740b-487b-9149-e8565ccf9905\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g9dnn" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.903900 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8b141ba2-87a7-4975-a1ed-054769e567bd-auth-proxy-config\") pod \"machine-approver-56656f9798-8t4sg\" (UID: \"8b141ba2-87a7-4975-a1ed-054769e567bd\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8t4sg" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.904033 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/911e02cc-6184-4c63-a36c-a51b54d0e7bf-etcd-service-ca\") pod \"etcd-operator-b45778765-jsnt5\" (UID: \"911e02cc-6184-4c63-a36c-a51b54d0e7bf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-jsnt5" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.904413 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/8b141ba2-87a7-4975-a1ed-054769e567bd-machine-approver-tls\") pod \"machine-approver-56656f9798-8t4sg\" (UID: \"8b141ba2-87a7-4975-a1ed-054769e567bd\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8t4sg" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.904448 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/911e02cc-6184-4c63-a36c-a51b54d0e7bf-etcd-client\") pod \"etcd-operator-b45778765-jsnt5\" (UID: \"911e02cc-6184-4c63-a36c-a51b54d0e7bf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-jsnt5" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.904667 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/911e02cc-6184-4c63-a36c-a51b54d0e7bf-config\") pod \"etcd-operator-b45778765-jsnt5\" (UID: \"911e02cc-6184-4c63-a36c-a51b54d0e7bf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-jsnt5" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.905058 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/12b5360d-755a-4cb5-9ef3-0c00550e3913-service-ca\") pod \"console-f9d7485db-9ctf8\" (UID: \"12b5360d-755a-4cb5-9ef3-0c00550e3913\") " pod="openshift-console/console-f9d7485db-9ctf8" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.905082 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/81534f91-740b-487b-9149-e8565ccf9905-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-g9dnn\" (UID: \"81534f91-740b-487b-9149-e8565ccf9905\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g9dnn" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.905114 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/09980e91-b2e4-4a0e-bee7-dc101096f804-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-f9dqp\" (UID: \"09980e91-b2e4-4a0e-bee7-dc101096f804\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-f9dqp" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.905165 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/12b5360d-755a-4cb5-9ef3-0c00550e3913-trusted-ca-bundle\") pod \"console-f9d7485db-9ctf8\" (UID: \"12b5360d-755a-4cb5-9ef3-0c00550e3913\") " pod="openshift-console/console-f9d7485db-9ctf8" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.905434 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fc2b6738-957d-4636-a723-c11207f29087-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-rwbhx\" (UID: \"fc2b6738-957d-4636-a723-c11207f29087\") " pod="openshift-authentication/oauth-openshift-558db77b4-rwbhx" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.905622 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8797308e-2c6a-45d4-a7e8-dae633de1103-service-ca-bundle\") pod \"authentication-operator-69f744f599-x5tn8\" (UID: \"8797308e-2c6a-45d4-a7e8-dae633de1103\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-x5tn8" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.906392 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8797308e-2c6a-45d4-a7e8-dae633de1103-config\") pod \"authentication-operator-69f744f599-x5tn8\" (UID: \"8797308e-2c6a-45d4-a7e8-dae633de1103\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-x5tn8" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.906640 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/652260eb-911a-4b92-8cea-73c9e8f156c4-client-ca\") pod \"route-controller-manager-6576b87f9c-zvvn6\" (UID: \"652260eb-911a-4b92-8cea-73c9e8f156c4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zvvn6" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.906665 4594 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/873587ac-5d12-45c4-bfd5-42bc08d29f65-encryption-config\") pod \"apiserver-76f77b778f-jgzzn\" (UID: \"873587ac-5d12-45c4-bfd5-42bc08d29f65\") " pod="openshift-apiserver/apiserver-76f77b778f-jgzzn" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.906683 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/81534f91-740b-487b-9149-e8565ccf9905-audit-dir\") pod \"apiserver-7bbb656c7d-g9dnn\" (UID: \"81534f91-740b-487b-9149-e8565ccf9905\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g9dnn" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.906924 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ad8c6194-f489-42f3-afb2-707a97b490ad-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-p9km8\" (UID: \"ad8c6194-f489-42f3-afb2-707a97b490ad\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-p9km8" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.907363 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b141ba2-87a7-4975-a1ed-054769e567bd-config\") pod \"machine-approver-56656f9798-8t4sg\" (UID: \"8b141ba2-87a7-4975-a1ed-054769e567bd\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8t4sg" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.907699 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fc2b6738-957d-4636-a723-c11207f29087-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-rwbhx\" (UID: \"fc2b6738-957d-4636-a723-c11207f29087\") " pod="openshift-authentication/oauth-openshift-558db77b4-rwbhx" Nov 29 05:30:19 crc 
kubenswrapper[4594]: I1129 05:30:19.908280 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fc2b6738-957d-4636-a723-c11207f29087-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-rwbhx\" (UID: \"fc2b6738-957d-4636-a723-c11207f29087\") " pod="openshift-authentication/oauth-openshift-558db77b4-rwbhx" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.908368 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cce54ab2-f931-4499-a0c1-8a2b2674aeb2-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-4qlf7\" (UID: \"cce54ab2-f931-4499-a0c1-8a2b2674aeb2\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4qlf7" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.908441 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/81534f91-740b-487b-9149-e8565ccf9905-audit-policies\") pod \"apiserver-7bbb656c7d-g9dnn\" (UID: \"81534f91-740b-487b-9149-e8565ccf9905\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g9dnn" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.908583 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/873587ac-5d12-45c4-bfd5-42bc08d29f65-audit\") pod \"apiserver-76f77b778f-jgzzn\" (UID: \"873587ac-5d12-45c4-bfd5-42bc08d29f65\") " pod="openshift-apiserver/apiserver-76f77b778f-jgzzn" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.908777 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8797308e-2c6a-45d4-a7e8-dae633de1103-serving-cert\") pod \"authentication-operator-69f744f599-x5tn8\" (UID: \"8797308e-2c6a-45d4-a7e8-dae633de1103\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-x5tn8" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.908650 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/12b5360d-755a-4cb5-9ef3-0c00550e3913-oauth-serving-cert\") pod \"console-f9d7485db-9ctf8\" (UID: \"12b5360d-755a-4cb5-9ef3-0c00550e3913\") " pod="openshift-console/console-f9d7485db-9ctf8" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.908651 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/873587ac-5d12-45c4-bfd5-42bc08d29f65-trusted-ca-bundle\") pod \"apiserver-76f77b778f-jgzzn\" (UID: \"873587ac-5d12-45c4-bfd5-42bc08d29f65\") " pod="openshift-apiserver/apiserver-76f77b778f-jgzzn" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.908934 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/96330a2a-06f0-4396-bceb-49bf9e3f3407-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-4n7dc\" (UID: \"96330a2a-06f0-4396-bceb-49bf9e3f3407\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4n7dc" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.909240 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/81534f91-740b-487b-9149-e8565ccf9905-serving-cert\") pod \"apiserver-7bbb656c7d-g9dnn\" (UID: \"81534f91-740b-487b-9149-e8565ccf9905\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g9dnn" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.909320 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6b9af1ca-d5ac-4278-8e23-510ddfc2ea8c-metrics-tls\") pod \"dns-operator-744455d44c-gv8n5\" (UID: \"6b9af1ca-d5ac-4278-8e23-510ddfc2ea8c\") 
" pod="openshift-dns-operator/dns-operator-744455d44c-gv8n5" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.909545 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/864d1c0b-3c85-4472-9d16-c8d5c574e02a-serving-cert\") pod \"controller-manager-879f6c89f-scxzk\" (UID: \"864d1c0b-3c85-4472-9d16-c8d5c574e02a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-scxzk" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.909796 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/12b5360d-755a-4cb5-9ef3-0c00550e3913-console-config\") pod \"console-f9d7485db-9ctf8\" (UID: \"12b5360d-755a-4cb5-9ef3-0c00550e3913\") " pod="openshift-console/console-f9d7485db-9ctf8" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.910043 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/81534f91-740b-487b-9149-e8565ccf9905-encryption-config\") pod \"apiserver-7bbb656c7d-g9dnn\" (UID: \"81534f91-740b-487b-9149-e8565ccf9905\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g9dnn" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.910244 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fc2b6738-957d-4636-a723-c11207f29087-audit-policies\") pod \"oauth-openshift-558db77b4-rwbhx\" (UID: \"fc2b6738-957d-4636-a723-c11207f29087\") " pod="openshift-authentication/oauth-openshift-558db77b4-rwbhx" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.910309 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fc2b6738-957d-4636-a723-c11207f29087-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-rwbhx\" (UID: 
\"fc2b6738-957d-4636-a723-c11207f29087\") " pod="openshift-authentication/oauth-openshift-558db77b4-rwbhx" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.910531 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.910704 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/652260eb-911a-4b92-8cea-73c9e8f156c4-serving-cert\") pod \"route-controller-manager-6576b87f9c-zvvn6\" (UID: \"652260eb-911a-4b92-8cea-73c9e8f156c4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zvvn6" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.910798 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/cce54ab2-f931-4499-a0c1-8a2b2674aeb2-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-4qlf7\" (UID: \"cce54ab2-f931-4499-a0c1-8a2b2674aeb2\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4qlf7" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.911216 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/81534f91-740b-487b-9149-e8565ccf9905-etcd-client\") pod \"apiserver-7bbb656c7d-g9dnn\" (UID: \"81534f91-740b-487b-9149-e8565ccf9905\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g9dnn" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.911416 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/12b5360d-755a-4cb5-9ef3-0c00550e3913-console-serving-cert\") pod \"console-f9d7485db-9ctf8\" (UID: \"12b5360d-755a-4cb5-9ef3-0c00550e3913\") " pod="openshift-console/console-f9d7485db-9ctf8" Nov 29 05:30:19 
crc kubenswrapper[4594]: I1129 05:30:19.911910 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/637efb26-f3b6-4db6-b0bc-d1ea0dcd85a5-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-khv8v\" (UID: \"637efb26-f3b6-4db6-b0bc-d1ea0dcd85a5\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-khv8v" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.911957 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fc2b6738-957d-4636-a723-c11207f29087-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-rwbhx\" (UID: \"fc2b6738-957d-4636-a723-c11207f29087\") " pod="openshift-authentication/oauth-openshift-558db77b4-rwbhx" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.913072 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/873587ac-5d12-45c4-bfd5-42bc08d29f65-image-import-ca\") pod \"apiserver-76f77b778f-jgzzn\" (UID: \"873587ac-5d12-45c4-bfd5-42bc08d29f65\") " pod="openshift-apiserver/apiserver-76f77b778f-jgzzn" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.913304 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/873587ac-5d12-45c4-bfd5-42bc08d29f65-serving-cert\") pod \"apiserver-76f77b778f-jgzzn\" (UID: \"873587ac-5d12-45c4-bfd5-42bc08d29f65\") " pod="openshift-apiserver/apiserver-76f77b778f-jgzzn" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.913808 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/969b3fd9-7b46-4627-bda2-e2fb4a4f5203-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-zqmhf\" (UID: \"969b3fd9-7b46-4627-bda2-e2fb4a4f5203\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zqmhf" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.914003 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fc2b6738-957d-4636-a723-c11207f29087-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-rwbhx\" (UID: \"fc2b6738-957d-4636-a723-c11207f29087\") " pod="openshift-authentication/oauth-openshift-558db77b4-rwbhx" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.914376 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fc2b6738-957d-4636-a723-c11207f29087-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-rwbhx\" (UID: \"fc2b6738-957d-4636-a723-c11207f29087\") " pod="openshift-authentication/oauth-openshift-558db77b4-rwbhx" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.914599 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fc2b6738-957d-4636-a723-c11207f29087-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-rwbhx\" (UID: \"fc2b6738-957d-4636-a723-c11207f29087\") " pod="openshift-authentication/oauth-openshift-558db77b4-rwbhx" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.916095 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/873587ac-5d12-45c4-bfd5-42bc08d29f65-etcd-client\") pod \"apiserver-76f77b778f-jgzzn\" (UID: \"873587ac-5d12-45c4-bfd5-42bc08d29f65\") " pod="openshift-apiserver/apiserver-76f77b778f-jgzzn" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.917958 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/969b3fd9-7b46-4627-bda2-e2fb4a4f5203-config\") pod \"kube-controller-manager-operator-78b949d7b-zqmhf\" (UID: \"969b3fd9-7b46-4627-bda2-e2fb4a4f5203\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zqmhf" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.918181 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/fc2b6738-957d-4636-a723-c11207f29087-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-rwbhx\" (UID: \"fc2b6738-957d-4636-a723-c11207f29087\") " pod="openshift-authentication/oauth-openshift-558db77b4-rwbhx" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.929391 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.936951 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93df4c2b-9b0b-476e-a92f-55cb8e3ec8c3-config\") pod \"console-operator-58897d9998-wpkbq\" (UID: \"93df4c2b-9b0b-476e-a92f-55cb8e3ec8c3\") " pod="openshift-console-operator/console-operator-58897d9998-wpkbq" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.948904 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.969960 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Nov 29 05:30:19 crc kubenswrapper[4594]: I1129 05:30:19.995085 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:19.999974 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/93df4c2b-9b0b-476e-a92f-55cb8e3ec8c3-trusted-ca\") pod \"console-operator-58897d9998-wpkbq\" (UID: \"93df4c2b-9b0b-476e-a92f-55cb8e3ec8c3\") " pod="openshift-console-operator/console-operator-58897d9998-wpkbq" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.001004 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b5059cab-0e22-479f-9079-b031f405e547-config-volume\") pod \"collect-profiles-29406570-47pck\" (UID: \"b5059cab-0e22-479f-9079-b031f405e547\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406570-47pck" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.001044 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b5059cab-0e22-479f-9079-b031f405e547-secret-volume\") pod \"collect-profiles-29406570-47pck\" (UID: \"b5059cab-0e22-479f-9079-b031f405e547\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406570-47pck" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.001096 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/e247f646-534e-4aca-8cd9-edadba75848b-profile-collector-cert\") pod \"olm-operator-6b444d44fb-6mjfr\" (UID: \"e247f646-534e-4aca-8cd9-edadba75848b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6mjfr" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.001119 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c3e4199d-2fe7-4039-8f59-b8bab2beebd0-metrics-certs\") pod \"router-default-5444994796-8pvtn\" (UID: \"c3e4199d-2fe7-4039-8f59-b8bab2beebd0\") " pod="openshift-ingress/router-default-5444994796-8pvtn" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.001147 4594 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b8a99cf8-64bd-4e83-b77e-c4358154f10e-auth-proxy-config\") pod \"machine-config-operator-74547568cd-wqr7v\" (UID: \"b8a99cf8-64bd-4e83-b77e-c4358154f10e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wqr7v" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.001185 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b8a99cf8-64bd-4e83-b77e-c4358154f10e-images\") pod \"machine-config-operator-74547568cd-wqr7v\" (UID: \"b8a99cf8-64bd-4e83-b77e-c4358154f10e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wqr7v" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.001204 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5d58f2a9-3eb4-4bf3-a27c-4df9ddcd5791-node-bootstrap-token\") pod \"machine-config-server-8jv2k\" (UID: \"5d58f2a9-3eb4-4bf3-a27c-4df9ddcd5791\") " pod="openshift-machine-config-operator/machine-config-server-8jv2k" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.001231 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wlh2\" (UniqueName: \"kubernetes.io/projected/d1bc9264-d190-49c0-9122-7cc123dd88e5-kube-api-access-8wlh2\") pod \"dns-default-g4pv8\" (UID: \"d1bc9264-d190-49c0-9122-7cc123dd88e5\") " pod="openshift-dns/dns-default-g4pv8" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.001265 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d1bc9264-d190-49c0-9122-7cc123dd88e5-metrics-tls\") pod \"dns-default-g4pv8\" (UID: \"d1bc9264-d190-49c0-9122-7cc123dd88e5\") " pod="openshift-dns/dns-default-g4pv8" Nov 29 05:30:20 crc 
kubenswrapper[4594]: I1129 05:30:20.001324 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d1bc9264-d190-49c0-9122-7cc123dd88e5-config-volume\") pod \"dns-default-g4pv8\" (UID: \"d1bc9264-d190-49c0-9122-7cc123dd88e5\") " pod="openshift-dns/dns-default-g4pv8" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.001365 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c3e4199d-2fe7-4039-8f59-b8bab2beebd0-service-ca-bundle\") pod \"router-default-5444994796-8pvtn\" (UID: \"c3e4199d-2fe7-4039-8f59-b8bab2beebd0\") " pod="openshift-ingress/router-default-5444994796-8pvtn" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.001400 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rdsw\" (UniqueName: \"kubernetes.io/projected/5d58f2a9-3eb4-4bf3-a27c-4df9ddcd5791-kube-api-access-7rdsw\") pod \"machine-config-server-8jv2k\" (UID: \"5d58f2a9-3eb4-4bf3-a27c-4df9ddcd5791\") " pod="openshift-machine-config-operator/machine-config-server-8jv2k" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.001528 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7af45636-74c3-4a18-8297-ba3fdde46ace-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-9n22d\" (UID: \"7af45636-74c3-4a18-8297-ba3fdde46ace\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9n22d" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.001590 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7z8l\" (UniqueName: \"kubernetes.io/projected/05218cdc-d6bb-47ef-aab3-9a5e0dab68b8-kube-api-access-k7z8l\") pod \"multus-admission-controller-857f4d67dd-pjh6b\" (UID: 
\"05218cdc-d6bb-47ef-aab3-9a5e0dab68b8\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-pjh6b" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.001650 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/724c0c5a-3d4d-41ef-9759-f9a1b62c99fe-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-w5nnp\" (UID: \"724c0c5a-3d4d-41ef-9759-f9a1b62c99fe\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-w5nnp" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.001903 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b8a99cf8-64bd-4e83-b77e-c4358154f10e-auth-proxy-config\") pod \"machine-config-operator-74547568cd-wqr7v\" (UID: \"b8a99cf8-64bd-4e83-b77e-c4358154f10e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wqr7v" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.002114 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4zxg\" (UniqueName: \"kubernetes.io/projected/7af45636-74c3-4a18-8297-ba3fdde46ace-kube-api-access-k4zxg\") pod \"kube-storage-version-migrator-operator-b67b599dd-9n22d\" (UID: \"7af45636-74c3-4a18-8297-ba3fdde46ace\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9n22d" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.002175 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c3e4199d-2fe7-4039-8f59-b8bab2beebd0-default-certificate\") pod \"router-default-5444994796-8pvtn\" (UID: \"c3e4199d-2fe7-4039-8f59-b8bab2beebd0\") " pod="openshift-ingress/router-default-5444994796-8pvtn" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.002311 4594 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5d58f2a9-3eb4-4bf3-a27c-4df9ddcd5791-certs\") pod \"machine-config-server-8jv2k\" (UID: \"5d58f2a9-3eb4-4bf3-a27c-4df9ddcd5791\") " pod="openshift-machine-config-operator/machine-config-server-8jv2k" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.002345 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b8a99cf8-64bd-4e83-b77e-c4358154f10e-proxy-tls\") pod \"machine-config-operator-74547568cd-wqr7v\" (UID: \"b8a99cf8-64bd-4e83-b77e-c4358154f10e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wqr7v" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.002366 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m65qz\" (UniqueName: \"kubernetes.io/projected/b5059cab-0e22-479f-9079-b031f405e547-kube-api-access-m65qz\") pod \"collect-profiles-29406570-47pck\" (UID: \"b5059cab-0e22-479f-9079-b031f405e547\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406570-47pck" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.002403 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/724c0c5a-3d4d-41ef-9759-f9a1b62c99fe-config\") pod \"kube-apiserver-operator-766d6c64bb-w5nnp\" (UID: \"724c0c5a-3d4d-41ef-9759-f9a1b62c99fe\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-w5nnp" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.002429 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7af45636-74c3-4a18-8297-ba3fdde46ace-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-9n22d\" (UID: \"7af45636-74c3-4a18-8297-ba3fdde46ace\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9n22d" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.002456 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsbl2\" (UniqueName: \"kubernetes.io/projected/b8a99cf8-64bd-4e83-b77e-c4358154f10e-kube-api-access-jsbl2\") pod \"machine-config-operator-74547568cd-wqr7v\" (UID: \"b8a99cf8-64bd-4e83-b77e-c4358154f10e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wqr7v" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.002585 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/e247f646-534e-4aca-8cd9-edadba75848b-srv-cert\") pod \"olm-operator-6b444d44fb-6mjfr\" (UID: \"e247f646-534e-4aca-8cd9-edadba75848b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6mjfr" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.002616 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/05218cdc-d6bb-47ef-aab3-9a5e0dab68b8-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-pjh6b\" (UID: \"05218cdc-d6bb-47ef-aab3-9a5e0dab68b8\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-pjh6b" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.002652 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c3e4199d-2fe7-4039-8f59-b8bab2beebd0-stats-auth\") pod \"router-default-5444994796-8pvtn\" (UID: \"c3e4199d-2fe7-4039-8f59-b8bab2beebd0\") " pod="openshift-ingress/router-default-5444994796-8pvtn" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.002715 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vgv6\" (UniqueName: 
\"kubernetes.io/projected/c3e4199d-2fe7-4039-8f59-b8bab2beebd0-kube-api-access-7vgv6\") pod \"router-default-5444994796-8pvtn\" (UID: \"c3e4199d-2fe7-4039-8f59-b8bab2beebd0\") " pod="openshift-ingress/router-default-5444994796-8pvtn" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.002735 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/724c0c5a-3d4d-41ef-9759-f9a1b62c99fe-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-w5nnp\" (UID: \"724c0c5a-3d4d-41ef-9759-f9a1b62c99fe\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-w5nnp" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.002853 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knb66\" (UniqueName: \"kubernetes.io/projected/e247f646-534e-4aca-8cd9-edadba75848b-kube-api-access-knb66\") pod \"olm-operator-6b444d44fb-6mjfr\" (UID: \"e247f646-534e-4aca-8cd9-edadba75848b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6mjfr" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.009005 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.021173 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/93df4c2b-9b0b-476e-a92f-55cb8e3ec8c3-serving-cert\") pod \"console-operator-58897d9998-wpkbq\" (UID: \"93df4c2b-9b0b-476e-a92f-55cb8e3ec8c3\") " pod="openshift-console-operator/console-operator-58897d9998-wpkbq" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.029414 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.069913 4594 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.089337 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.097413 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/388fd604-2764-4184-b305-ce8f8e31ffa1-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2fqlt\" (UID: \"388fd604-2764-4184-b305-ce8f8e31ffa1\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2fqlt" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.110338 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.129943 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.149539 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.159643 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/388fd604-2764-4184-b305-ce8f8e31ffa1-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2fqlt\" (UID: \"388fd604-2764-4184-b305-ce8f8e31ffa1\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2fqlt" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.169019 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.173013 
4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/837571cc-8fd2-4102-9997-4520ddf6da08-metrics-tls\") pod \"ingress-operator-5b745b69d9-gzzjt\" (UID: \"837571cc-8fd2-4102-9997-4520ddf6da08\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gzzjt" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.189384 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.196271 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/1fea2129-7ad0-45d8-9447-315107ef1c0c-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-gspdt\" (UID: \"1fea2129-7ad0-45d8-9447-315107ef1c0c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gspdt" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.214787 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.224543 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/837571cc-8fd2-4102-9997-4520ddf6da08-trusted-ca\") pod \"ingress-operator-5b745b69d9-gzzjt\" (UID: \"837571cc-8fd2-4102-9997-4520ddf6da08\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gzzjt" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.229979 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.249680 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Nov 29 05:30:20 
crc kubenswrapper[4594]: I1129 05:30:20.269312 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.289330 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.309061 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.320161 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ad8c6194-f489-42f3-afb2-707a97b490ad-proxy-tls\") pod \"machine-config-controller-84d6567774-p9km8\" (UID: \"ad8c6194-f489-42f3-afb2-707a97b490ad\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-p9km8" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.329324 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.349473 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.369864 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.372198 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b8a99cf8-64bd-4e83-b77e-c4358154f10e-images\") pod \"machine-config-operator-74547568cd-wqr7v\" (UID: \"b8a99cf8-64bd-4e83-b77e-c4358154f10e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wqr7v" Nov 29 05:30:20 
crc kubenswrapper[4594]: I1129 05:30:20.388784 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.408893 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.415889 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c3e4199d-2fe7-4039-8f59-b8bab2beebd0-default-certificate\") pod \"router-default-5444994796-8pvtn\" (UID: \"c3e4199d-2fe7-4039-8f59-b8bab2beebd0\") " pod="openshift-ingress/router-default-5444994796-8pvtn" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.429007 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.435169 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c3e4199d-2fe7-4039-8f59-b8bab2beebd0-stats-auth\") pod \"router-default-5444994796-8pvtn\" (UID: \"c3e4199d-2fe7-4039-8f59-b8bab2beebd0\") " pod="openshift-ingress/router-default-5444994796-8pvtn" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.449244 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.454119 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c3e4199d-2fe7-4039-8f59-b8bab2beebd0-metrics-certs\") pod \"router-default-5444994796-8pvtn\" (UID: \"c3e4199d-2fe7-4039-8f59-b8bab2beebd0\") " pod="openshift-ingress/router-default-5444994796-8pvtn" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.469046 4594 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress"/"service-ca-bundle" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.472510 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c3e4199d-2fe7-4039-8f59-b8bab2beebd0-service-ca-bundle\") pod \"router-default-5444994796-8pvtn\" (UID: \"c3e4199d-2fe7-4039-8f59-b8bab2beebd0\") " pod="openshift-ingress/router-default-5444994796-8pvtn" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.489732 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.509356 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.511982 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b5059cab-0e22-479f-9079-b031f405e547-config-volume\") pod \"collect-profiles-29406570-47pck\" (UID: \"b5059cab-0e22-479f-9079-b031f405e547\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406570-47pck" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.529383 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.534009 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b5059cab-0e22-479f-9079-b031f405e547-secret-volume\") pod \"collect-profiles-29406570-47pck\" (UID: \"b5059cab-0e22-479f-9079-b031f405e547\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406570-47pck" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.534433 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/e247f646-534e-4aca-8cd9-edadba75848b-profile-collector-cert\") pod \"olm-operator-6b444d44fb-6mjfr\" (UID: \"e247f646-534e-4aca-8cd9-edadba75848b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6mjfr" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.550651 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.569165 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.575485 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/e247f646-534e-4aca-8cd9-edadba75848b-srv-cert\") pod \"olm-operator-6b444d44fb-6mjfr\" (UID: \"e247f646-534e-4aca-8cd9-edadba75848b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6mjfr" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.590110 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.612826 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.626496 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b8a99cf8-64bd-4e83-b77e-c4358154f10e-proxy-tls\") pod \"machine-config-operator-74547568cd-wqr7v\" (UID: \"b8a99cf8-64bd-4e83-b77e-c4358154f10e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wqr7v" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.629764 4594 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.649278 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.668963 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.689029 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.709774 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.729483 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.749736 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.755813 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/05218cdc-d6bb-47ef-aab3-9a5e0dab68b8-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-pjh6b\" (UID: \"05218cdc-d6bb-47ef-aab3-9a5e0dab68b8\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-pjh6b" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.768156 4594 request.go:700] Waited for 1.013823952s due to client-side throttling, not priority and fairness, request: 
GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-multus/secrets?fieldSelector=metadata.name%3Dmultus-ac-dockercfg-9lkdf&limit=500&resourceVersion=0 Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.769014 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.795928 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.809581 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.829311 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.849876 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.869426 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.889346 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.909731 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.929347 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.949620 4594 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.956298 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7af45636-74c3-4a18-8297-ba3fdde46ace-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-9n22d\" (UID: \"7af45636-74c3-4a18-8297-ba3fdde46ace\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9n22d" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.969499 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.974186 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7af45636-74c3-4a18-8297-ba3fdde46ace-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-9n22d\" (UID: \"7af45636-74c3-4a18-8297-ba3fdde46ace\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9n22d" Nov 29 05:30:20 crc kubenswrapper[4594]: I1129 05:30:20.989428 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Nov 29 05:30:21 crc kubenswrapper[4594]: E1129 05:30:21.001705 4594 configmap.go:193] Couldn't get configMap openshift-dns/dns-default: failed to sync configmap cache: timed out waiting for the condition Nov 29 05:30:21 crc kubenswrapper[4594]: E1129 05:30:21.001722 4594 secret.go:188] Couldn't get secret openshift-machine-config-operator/node-bootstrapper-token: failed to sync secret cache: timed out waiting for the condition Nov 29 05:30:21 crc kubenswrapper[4594]: E1129 05:30:21.001726 4594 secret.go:188] Couldn't get secret openshift-dns/dns-default-metrics-tls: failed to sync secret cache: timed 
out waiting for the condition Nov 29 05:30:21 crc kubenswrapper[4594]: E1129 05:30:21.001780 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d1bc9264-d190-49c0-9122-7cc123dd88e5-config-volume podName:d1bc9264-d190-49c0-9122-7cc123dd88e5 nodeName:}" failed. No retries permitted until 2025-11-29 05:30:21.501758664 +0000 UTC m=+145.742267884 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/d1bc9264-d190-49c0-9122-7cc123dd88e5-config-volume") pod "dns-default-g4pv8" (UID: "d1bc9264-d190-49c0-9122-7cc123dd88e5") : failed to sync configmap cache: timed out waiting for the condition Nov 29 05:30:21 crc kubenswrapper[4594]: E1129 05:30:21.001800 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5d58f2a9-3eb4-4bf3-a27c-4df9ddcd5791-node-bootstrap-token podName:5d58f2a9-3eb4-4bf3-a27c-4df9ddcd5791 nodeName:}" failed. No retries permitted until 2025-11-29 05:30:21.501793018 +0000 UTC m=+145.742302237 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-bootstrap-token" (UniqueName: "kubernetes.io/secret/5d58f2a9-3eb4-4bf3-a27c-4df9ddcd5791-node-bootstrap-token") pod "machine-config-server-8jv2k" (UID: "5d58f2a9-3eb4-4bf3-a27c-4df9ddcd5791") : failed to sync secret cache: timed out waiting for the condition Nov 29 05:30:21 crc kubenswrapper[4594]: E1129 05:30:21.001817 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d1bc9264-d190-49c0-9122-7cc123dd88e5-metrics-tls podName:d1bc9264-d190-49c0-9122-7cc123dd88e5 nodeName:}" failed. No retries permitted until 2025-11-29 05:30:21.50181043 +0000 UTC m=+145.742319650 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d1bc9264-d190-49c0-9122-7cc123dd88e5-metrics-tls") pod "dns-default-g4pv8" (UID: "d1bc9264-d190-49c0-9122-7cc123dd88e5") : failed to sync secret cache: timed out waiting for the condition Nov 29 05:30:21 crc kubenswrapper[4594]: E1129 05:30:21.002830 4594 secret.go:188] Couldn't get secret openshift-kube-apiserver-operator/kube-apiserver-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Nov 29 05:30:21 crc kubenswrapper[4594]: E1129 05:30:21.002879 4594 secret.go:188] Couldn't get secret openshift-machine-config-operator/machine-config-server-tls: failed to sync secret cache: timed out waiting for the condition Nov 29 05:30:21 crc kubenswrapper[4594]: E1129 05:30:21.002893 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/724c0c5a-3d4d-41ef-9759-f9a1b62c99fe-serving-cert podName:724c0c5a-3d4d-41ef-9759-f9a1b62c99fe nodeName:}" failed. No retries permitted until 2025-11-29 05:30:21.50287499 +0000 UTC m=+145.743384220 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/724c0c5a-3d4d-41ef-9759-f9a1b62c99fe-serving-cert") pod "kube-apiserver-operator-766d6c64bb-w5nnp" (UID: "724c0c5a-3d4d-41ef-9759-f9a1b62c99fe") : failed to sync secret cache: timed out waiting for the condition Nov 29 05:30:21 crc kubenswrapper[4594]: E1129 05:30:21.002920 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5d58f2a9-3eb4-4bf3-a27c-4df9ddcd5791-certs podName:5d58f2a9-3eb4-4bf3-a27c-4df9ddcd5791 nodeName:}" failed. No retries permitted until 2025-11-29 05:30:21.502910215 +0000 UTC m=+145.743419436 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "certs" (UniqueName: "kubernetes.io/secret/5d58f2a9-3eb4-4bf3-a27c-4df9ddcd5791-certs") pod "machine-config-server-8jv2k" (UID: "5d58f2a9-3eb4-4bf3-a27c-4df9ddcd5791") : failed to sync secret cache: timed out waiting for the condition Nov 29 05:30:21 crc kubenswrapper[4594]: E1129 05:30:21.003062 4594 configmap.go:193] Couldn't get configMap openshift-kube-apiserver-operator/kube-apiserver-operator-config: failed to sync configmap cache: timed out waiting for the condition Nov 29 05:30:21 crc kubenswrapper[4594]: E1129 05:30:21.003163 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/724c0c5a-3d4d-41ef-9759-f9a1b62c99fe-config podName:724c0c5a-3d4d-41ef-9759-f9a1b62c99fe nodeName:}" failed. No retries permitted until 2025-11-29 05:30:21.503150836 +0000 UTC m=+145.743660056 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/724c0c5a-3d4d-41ef-9759-f9a1b62c99fe-config") pod "kube-apiserver-operator-766d6c64bb-w5nnp" (UID: "724c0c5a-3d4d-41ef-9759-f9a1b62c99fe") : failed to sync configmap cache: timed out waiting for the condition Nov 29 05:30:21 crc kubenswrapper[4594]: I1129 05:30:21.007779 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" Nov 29 05:30:21 crc kubenswrapper[4594]: I1129 05:30:21.008781 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Nov 29 05:30:21 crc kubenswrapper[4594]: I1129 05:30:21.028707 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Nov 29 05:30:21 crc kubenswrapper[4594]: I1129 05:30:21.049734 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Nov 29 05:30:21 crc kubenswrapper[4594]: I1129 05:30:21.069684 4594 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Nov 29 05:30:21 crc kubenswrapper[4594]: I1129 05:30:21.089071 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Nov 29 05:30:21 crc kubenswrapper[4594]: I1129 05:30:21.110141 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Nov 29 05:30:21 crc kubenswrapper[4594]: I1129 05:30:21.129699 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Nov 29 05:30:21 crc kubenswrapper[4594]: I1129 05:30:21.149112 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Nov 29 05:30:21 crc kubenswrapper[4594]: I1129 05:30:21.169501 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Nov 29 05:30:21 crc kubenswrapper[4594]: I1129 05:30:21.189177 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Nov 29 05:30:21 crc kubenswrapper[4594]: I1129 05:30:21.209449 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Nov 29 05:30:21 crc kubenswrapper[4594]: I1129 05:30:21.229804 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Nov 29 05:30:21 crc kubenswrapper[4594]: I1129 05:30:21.249592 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Nov 29 05:30:21 crc 
kubenswrapper[4594]: I1129 05:30:21.269375 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Nov 29 05:30:21 crc kubenswrapper[4594]: I1129 05:30:21.289556 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Nov 29 05:30:21 crc kubenswrapper[4594]: I1129 05:30:21.310503 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Nov 29 05:30:21 crc kubenswrapper[4594]: I1129 05:30:21.329006 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Nov 29 05:30:21 crc kubenswrapper[4594]: I1129 05:30:21.349866 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Nov 29 05:30:21 crc kubenswrapper[4594]: I1129 05:30:21.369548 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Nov 29 05:30:21 crc kubenswrapper[4594]: I1129 05:30:21.389544 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Nov 29 05:30:21 crc kubenswrapper[4594]: I1129 05:30:21.409821 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Nov 29 05:30:21 crc kubenswrapper[4594]: I1129 05:30:21.430186 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Nov 29 05:30:21 crc kubenswrapper[4594]: I1129 05:30:21.449897 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Nov 29 05:30:21 crc kubenswrapper[4594]: I1129 05:30:21.469886 4594 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Nov 29 05:30:21 crc kubenswrapper[4594]: I1129 05:30:21.489690 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Nov 29 05:30:21 crc kubenswrapper[4594]: I1129 05:30:21.509249 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Nov 29 05:30:21 crc kubenswrapper[4594]: I1129 05:30:21.523841 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5d58f2a9-3eb4-4bf3-a27c-4df9ddcd5791-node-bootstrap-token\") pod \"machine-config-server-8jv2k\" (UID: \"5d58f2a9-3eb4-4bf3-a27c-4df9ddcd5791\") " pod="openshift-machine-config-operator/machine-config-server-8jv2k" Nov 29 05:30:21 crc kubenswrapper[4594]: I1129 05:30:21.523891 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d1bc9264-d190-49c0-9122-7cc123dd88e5-metrics-tls\") pod \"dns-default-g4pv8\" (UID: \"d1bc9264-d190-49c0-9122-7cc123dd88e5\") " pod="openshift-dns/dns-default-g4pv8" Nov 29 05:30:21 crc kubenswrapper[4594]: I1129 05:30:21.523924 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d1bc9264-d190-49c0-9122-7cc123dd88e5-config-volume\") pod \"dns-default-g4pv8\" (UID: \"d1bc9264-d190-49c0-9122-7cc123dd88e5\") " pod="openshift-dns/dns-default-g4pv8" Nov 29 05:30:21 crc kubenswrapper[4594]: I1129 05:30:21.523998 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/724c0c5a-3d4d-41ef-9759-f9a1b62c99fe-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-w5nnp\" (UID: \"724c0c5a-3d4d-41ef-9759-f9a1b62c99fe\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-w5nnp" Nov 29 05:30:21 crc kubenswrapper[4594]: I1129 05:30:21.524044 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5d58f2a9-3eb4-4bf3-a27c-4df9ddcd5791-certs\") pod \"machine-config-server-8jv2k\" (UID: \"5d58f2a9-3eb4-4bf3-a27c-4df9ddcd5791\") " pod="openshift-machine-config-operator/machine-config-server-8jv2k" Nov 29 05:30:21 crc kubenswrapper[4594]: I1129 05:30:21.524089 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/724c0c5a-3d4d-41ef-9759-f9a1b62c99fe-config\") pod \"kube-apiserver-operator-766d6c64bb-w5nnp\" (UID: \"724c0c5a-3d4d-41ef-9759-f9a1b62c99fe\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-w5nnp" Nov 29 05:30:21 crc kubenswrapper[4594]: I1129 05:30:21.524769 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/724c0c5a-3d4d-41ef-9759-f9a1b62c99fe-config\") pod \"kube-apiserver-operator-766d6c64bb-w5nnp\" (UID: \"724c0c5a-3d4d-41ef-9759-f9a1b62c99fe\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-w5nnp" Nov 29 05:30:21 crc kubenswrapper[4594]: I1129 05:30:21.528594 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5d58f2a9-3eb4-4bf3-a27c-4df9ddcd5791-certs\") pod \"machine-config-server-8jv2k\" (UID: \"5d58f2a9-3eb4-4bf3-a27c-4df9ddcd5791\") " pod="openshift-machine-config-operator/machine-config-server-8jv2k" Nov 29 05:30:21 crc kubenswrapper[4594]: I1129 05:30:21.528660 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5d58f2a9-3eb4-4bf3-a27c-4df9ddcd5791-node-bootstrap-token\") pod \"machine-config-server-8jv2k\" (UID: 
\"5d58f2a9-3eb4-4bf3-a27c-4df9ddcd5791\") " pod="openshift-machine-config-operator/machine-config-server-8jv2k" Nov 29 05:30:21 crc kubenswrapper[4594]: I1129 05:30:21.528885 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/724c0c5a-3d4d-41ef-9759-f9a1b62c99fe-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-w5nnp\" (UID: \"724c0c5a-3d4d-41ef-9759-f9a1b62c99fe\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-w5nnp" Nov 29 05:30:21 crc kubenswrapper[4594]: I1129 05:30:21.528931 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Nov 29 05:30:21 crc kubenswrapper[4594]: I1129 05:30:21.535535 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d1bc9264-d190-49c0-9122-7cc123dd88e5-config-volume\") pod \"dns-default-g4pv8\" (UID: \"d1bc9264-d190-49c0-9122-7cc123dd88e5\") " pod="openshift-dns/dns-default-g4pv8" Nov 29 05:30:21 crc kubenswrapper[4594]: I1129 05:30:21.549607 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Nov 29 05:30:21 crc kubenswrapper[4594]: I1129 05:30:21.569823 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Nov 29 05:30:21 crc kubenswrapper[4594]: I1129 05:30:21.577216 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d1bc9264-d190-49c0-9122-7cc123dd88e5-metrics-tls\") pod \"dns-default-g4pv8\" (UID: \"d1bc9264-d190-49c0-9122-7cc123dd88e5\") " pod="openshift-dns/dns-default-g4pv8" Nov 29 05:30:21 crc kubenswrapper[4594]: I1129 05:30:21.610118 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Nov 29 05:30:21 crc kubenswrapper[4594]: I1129 05:30:21.629826 4594 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Nov 29 05:30:21 crc kubenswrapper[4594]: I1129 05:30:21.649875 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Nov 29 05:30:21 crc kubenswrapper[4594]: I1129 05:30:21.669132 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Nov 29 05:30:21 crc kubenswrapper[4594]: I1129 05:30:21.689138 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Nov 29 05:30:21 crc kubenswrapper[4594]: I1129 05:30:21.709671 4594 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Nov 29 05:30:21 crc kubenswrapper[4594]: I1129 05:30:21.729095 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Nov 29 05:30:21 crc kubenswrapper[4594]: I1129 05:30:21.761606 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95f8s\" (UniqueName: \"kubernetes.io/projected/fc2b6738-957d-4636-a723-c11207f29087-kube-api-access-95f8s\") pod \"oauth-openshift-558db77b4-rwbhx\" (UID: \"fc2b6738-957d-4636-a723-c11207f29087\") " pod="openshift-authentication/oauth-openshift-558db77b4-rwbhx" Nov 29 05:30:21 crc kubenswrapper[4594]: I1129 05:30:21.768519 4594 request.go:700] Waited for 1.874332495s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-apiserver-operator/serviceaccounts/openshift-apiserver-operator/token Nov 29 05:30:21 crc kubenswrapper[4594]: I1129 05:30:21.785285 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dffjf\" (UniqueName: 
\"kubernetes.io/projected/96330a2a-06f0-4396-bceb-49bf9e3f3407-kube-api-access-dffjf\") pod \"openshift-apiserver-operator-796bbdcf4f-4n7dc\" (UID: \"96330a2a-06f0-4396-bceb-49bf9e3f3407\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4n7dc" Nov 29 05:30:21 crc kubenswrapper[4594]: I1129 05:30:21.801551 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cce54ab2-f931-4499-a0c1-8a2b2674aeb2-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-4qlf7\" (UID: \"cce54ab2-f931-4499-a0c1-8a2b2674aeb2\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4qlf7" Nov 29 05:30:21 crc kubenswrapper[4594]: I1129 05:30:21.820454 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fx8hm\" (UniqueName: \"kubernetes.io/projected/6c685741-a472-49fe-8237-520bea0232ef-kube-api-access-fx8hm\") pod \"downloads-7954f5f757-xg8s7\" (UID: \"6c685741-a472-49fe-8237-520bea0232ef\") " pod="openshift-console/downloads-7954f5f757-xg8s7" Nov 29 05:30:21 crc kubenswrapper[4594]: I1129 05:30:21.840161 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4qxp\" (UniqueName: \"kubernetes.io/projected/8797308e-2c6a-45d4-a7e8-dae633de1103-kube-api-access-z4qxp\") pod \"authentication-operator-69f744f599-x5tn8\" (UID: \"8797308e-2c6a-45d4-a7e8-dae633de1103\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-x5tn8" Nov 29 05:30:21 crc kubenswrapper[4594]: I1129 05:30:21.860377 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/837571cc-8fd2-4102-9997-4520ddf6da08-bound-sa-token\") pod \"ingress-operator-5b745b69d9-gzzjt\" (UID: \"837571cc-8fd2-4102-9997-4520ddf6da08\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gzzjt" Nov 29 05:30:21 crc kubenswrapper[4594]: 
I1129 05:30:21.876586 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-x5tn8" Nov 29 05:30:21 crc kubenswrapper[4594]: I1129 05:30:21.882830 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbpfn\" (UniqueName: \"kubernetes.io/projected/12b5360d-755a-4cb5-9ef3-0c00550e3913-kube-api-access-lbpfn\") pod \"console-f9d7485db-9ctf8\" (UID: \"12b5360d-755a-4cb5-9ef3-0c00550e3913\") " pod="openshift-console/console-f9d7485db-9ctf8" Nov 29 05:30:21 crc kubenswrapper[4594]: I1129 05:30:21.900794 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4n7dc" Nov 29 05:30:21 crc kubenswrapper[4594]: I1129 05:30:21.903812 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dzrs\" (UniqueName: \"kubernetes.io/projected/911e02cc-6184-4c63-a36c-a51b54d0e7bf-kube-api-access-2dzrs\") pod \"etcd-operator-b45778765-jsnt5\" (UID: \"911e02cc-6184-4c63-a36c-a51b54d0e7bf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-jsnt5" Nov 29 05:30:21 crc kubenswrapper[4594]: I1129 05:30:21.918324 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-9ctf8" Nov 29 05:30:21 crc kubenswrapper[4594]: I1129 05:30:21.924783 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57xjr\" (UniqueName: \"kubernetes.io/projected/8b141ba2-87a7-4975-a1ed-054769e567bd-kube-api-access-57xjr\") pod \"machine-approver-56656f9798-8t4sg\" (UID: \"8b141ba2-87a7-4975-a1ed-054769e567bd\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8t4sg" Nov 29 05:30:21 crc kubenswrapper[4594]: I1129 05:30:21.942578 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wn82r\" (UniqueName: \"kubernetes.io/projected/fd86a99b-7671-4dd9-88b9-334db1906b6b-kube-api-access-wn82r\") pod \"openshift-config-operator-7777fb866f-trhms\" (UID: \"fd86a99b-7671-4dd9-88b9-334db1906b6b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-trhms" Nov 29 05:30:21 crc kubenswrapper[4594]: I1129 05:30:21.946204 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-trhms" Nov 29 05:30:21 crc kubenswrapper[4594]: I1129 05:30:21.946297 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-xg8s7" Nov 29 05:30:21 crc kubenswrapper[4594]: I1129 05:30:21.963376 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-jsnt5" Nov 29 05:30:21 crc kubenswrapper[4594]: I1129 05:30:21.968518 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-rwbhx" Nov 29 05:30:21 crc kubenswrapper[4594]: I1129 05:30:21.973283 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8xmm\" (UniqueName: \"kubernetes.io/projected/81534f91-740b-487b-9149-e8565ccf9905-kube-api-access-f8xmm\") pod \"apiserver-7bbb656c7d-g9dnn\" (UID: \"81534f91-740b-487b-9149-e8565ccf9905\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g9dnn" Nov 29 05:30:21 crc kubenswrapper[4594]: I1129 05:30:21.983610 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpjr7\" (UniqueName: \"kubernetes.io/projected/ad8c6194-f489-42f3-afb2-707a97b490ad-kube-api-access-vpjr7\") pod \"machine-config-controller-84d6567774-p9km8\" (UID: \"ad8c6194-f489-42f3-afb2-707a97b490ad\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-p9km8" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.005173 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmkkq\" (UniqueName: \"kubernetes.io/projected/09980e91-b2e4-4a0e-bee7-dc101096f804-kube-api-access-dmkkq\") pod \"machine-api-operator-5694c8668f-f9dqp\" (UID: \"09980e91-b2e4-4a0e-bee7-dc101096f804\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-f9dqp" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.023786 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mvjv\" (UniqueName: \"kubernetes.io/projected/652260eb-911a-4b92-8cea-73c9e8f156c4-kube-api-access-7mvjv\") pod \"route-controller-manager-6576b87f9c-zvvn6\" (UID: \"652260eb-911a-4b92-8cea-73c9e8f156c4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zvvn6" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.030308 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 05:30:22 crc kubenswrapper[4594]: E1129 05:30:22.030530 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 05:32:24.030510517 +0000 UTC m=+268.271019737 (durationBeforeRetry 2m2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.030641 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.030804 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 05:30:22 
crc kubenswrapper[4594]: I1129 05:30:22.032381 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.033501 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.042807 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-p9km8" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.043876 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-x5tn8"] Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.048360 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/969b3fd9-7b46-4627-bda2-e2fb4a4f5203-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-zqmhf\" (UID: \"969b3fd9-7b46-4627-bda2-e2fb4a4f5203\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zqmhf" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.060804 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxbv4\" (UniqueName: \"kubernetes.io/projected/864d1c0b-3c85-4472-9d16-c8d5c574e02a-kube-api-access-zxbv4\") pod 
\"controller-manager-879f6c89f-scxzk\" (UID: \"864d1c0b-3c85-4472-9d16-c8d5c574e02a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-scxzk" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.066211 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4n7dc"] Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.084687 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/388fd604-2764-4184-b305-ce8f8e31ffa1-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2fqlt\" (UID: \"388fd604-2764-4184-b305-ce8f8e31ffa1\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2fqlt" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.103098 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nftmj\" (UniqueName: \"kubernetes.io/projected/637efb26-f3b6-4db6-b0bc-d1ea0dcd85a5-kube-api-access-nftmj\") pod \"cluster-samples-operator-665b6dd947-khv8v\" (UID: \"637efb26-f3b6-4db6-b0bc-d1ea0dcd85a5\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-khv8v" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.103812 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zvvn6" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.105598 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8t4sg" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.120736 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-9ctf8"] Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.125618 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcrl6\" (UniqueName: \"kubernetes.io/projected/837571cc-8fd2-4102-9997-4520ddf6da08-kube-api-access-tcrl6\") pod \"ingress-operator-5b745b69d9-gzzjt\" (UID: \"837571cc-8fd2-4102-9997-4520ddf6da08\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gzzjt" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.133631 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.133774 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.137674 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 05:30:22 
crc kubenswrapper[4594]: I1129 05:30:22.137690 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.145943 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4g786\" (UniqueName: \"kubernetes.io/projected/cce54ab2-f931-4499-a0c1-8a2b2674aeb2-kube-api-access-4g786\") pod \"cluster-image-registry-operator-dc59b4c8b-4qlf7\" (UID: \"cce54ab2-f931-4499-a0c1-8a2b2674aeb2\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4qlf7" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.151482 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-f9dqp" Nov 29 05:30:22 crc kubenswrapper[4594]: W1129 05:30:22.152362 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b141ba2_87a7_4975_a1ed_054769e567bd.slice/crio-bd1dc0841060e1abf4f3d03f2a2944a23b56371847645149499a10695532b2e2 WatchSource:0}: Error finding container bd1dc0841060e1abf4f3d03f2a2944a23b56371847645149499a10695532b2e2: Status 404 returned error can't find the container with id bd1dc0841060e1abf4f3d03f2a2944a23b56371847645149499a10695532b2e2 Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.154424 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-scxzk" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.155885 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-trhms"] Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.162984 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thlm7\" (UniqueName: \"kubernetes.io/projected/93df4c2b-9b0b-476e-a92f-55cb8e3ec8c3-kube-api-access-thlm7\") pod \"console-operator-58897d9998-wpkbq\" (UID: \"93df4c2b-9b0b-476e-a92f-55cb8e3ec8c3\") " pod="openshift-console-operator/console-operator-58897d9998-wpkbq" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.182448 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k444s\" (UniqueName: \"kubernetes.io/projected/873587ac-5d12-45c4-bfd5-42bc08d29f65-kube-api-access-k444s\") pod \"apiserver-76f77b778f-jgzzn\" (UID: \"873587ac-5d12-45c4-bfd5-42bc08d29f65\") " pod="openshift-apiserver/apiserver-76f77b778f-jgzzn" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.206135 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbfk5\" (UniqueName: \"kubernetes.io/projected/6b9af1ca-d5ac-4278-8e23-510ddfc2ea8c-kube-api-access-qbfk5\") pod \"dns-operator-744455d44c-gv8n5\" (UID: \"6b9af1ca-d5ac-4278-8e23-510ddfc2ea8c\") " pod="openshift-dns-operator/dns-operator-744455d44c-gv8n5" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.225068 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cm7xc\" (UniqueName: \"kubernetes.io/projected/1fea2129-7ad0-45d8-9447-315107ef1c0c-kube-api-access-cm7xc\") pod \"control-plane-machine-set-operator-78cbb6b69f-gspdt\" (UID: \"1fea2129-7ad0-45d8-9447-315107ef1c0c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gspdt" Nov 29 05:30:22 crc 
kubenswrapper[4594]: I1129 05:30:22.227989 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g9dnn" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.232695 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-p9km8"] Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.234003 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-khv8v" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.249397 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wlh2\" (UniqueName: \"kubernetes.io/projected/d1bc9264-d190-49c0-9122-7cc123dd88e5-kube-api-access-8wlh2\") pod \"dns-default-g4pv8\" (UID: \"d1bc9264-d190-49c0-9122-7cc123dd88e5\") " pod="openshift-dns/dns-default-g4pv8" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.250493 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-gv8n5" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.257116 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4qlf7" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.265929 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rdsw\" (UniqueName: \"kubernetes.io/projected/5d58f2a9-3eb4-4bf3-a27c-4df9ddcd5791-kube-api-access-7rdsw\") pod \"machine-config-server-8jv2k\" (UID: \"5d58f2a9-3eb4-4bf3-a27c-4df9ddcd5791\") " pod="openshift-machine-config-operator/machine-config-server-8jv2k" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.277241 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zqmhf" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.282241 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7z8l\" (UniqueName: \"kubernetes.io/projected/05218cdc-d6bb-47ef-aab3-9a5e0dab68b8-kube-api-access-k7z8l\") pod \"multus-admission-controller-857f4d67dd-pjh6b\" (UID: \"05218cdc-d6bb-47ef-aab3-9a5e0dab68b8\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-pjh6b" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.289365 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-wpkbq" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.294317 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2fqlt" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.296621 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.302159 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-zvvn6"] Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.303061 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.305922 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4zxg\" (UniqueName: \"kubernetes.io/projected/7af45636-74c3-4a18-8297-ba3fdde46ace-kube-api-access-k4zxg\") pod \"kube-storage-version-migrator-operator-b67b599dd-9n22d\" (UID: \"7af45636-74c3-4a18-8297-ba3fdde46ace\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9n22d" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.309763 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.323933 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m65qz\" (UniqueName: \"kubernetes.io/projected/b5059cab-0e22-479f-9079-b031f405e547-kube-api-access-m65qz\") pod \"collect-profiles-29406570-47pck\" (UID: \"b5059cab-0e22-479f-9079-b031f405e547\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406570-47pck" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.336601 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gzzjt" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.345806 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-f9dqp"] Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.346587 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsbl2\" (UniqueName: \"kubernetes.io/projected/b8a99cf8-64bd-4e83-b77e-c4358154f10e-kube-api-access-jsbl2\") pod \"machine-config-operator-74547568cd-wqr7v\" (UID: \"b8a99cf8-64bd-4e83-b77e-c4358154f10e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wqr7v" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.350140 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gspdt" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.362567 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-scxzk"] Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.364699 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wqr7v" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.367188 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vgv6\" (UniqueName: \"kubernetes.io/projected/c3e4199d-2fe7-4039-8f59-b8bab2beebd0-kube-api-access-7vgv6\") pod \"router-default-5444994796-8pvtn\" (UID: \"c3e4199d-2fe7-4039-8f59-b8bab2beebd0\") " pod="openshift-ingress/router-default-5444994796-8pvtn" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.372524 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29406570-47pck" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.380993 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-xg8s7"] Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.387585 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-pjh6b" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.388392 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/724c0c5a-3d4d-41ef-9759-f9a1b62c99fe-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-w5nnp\" (UID: \"724c0c5a-3d4d-41ef-9759-f9a1b62c99fe\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-w5nnp" Nov 29 05:30:22 crc kubenswrapper[4594]: W1129 05:30:22.398841 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09980e91_b2e4_4a0e_bee7_dc101096f804.slice/crio-883dc7c3eeb91b731b1bd886440dac02bb7c0806e4d51fb1e90163e96eb68b40 WatchSource:0}: Error finding container 883dc7c3eeb91b731b1bd886440dac02bb7c0806e4d51fb1e90163e96eb68b40: Status 404 returned error can't find the container with id 883dc7c3eeb91b731b1bd886440dac02bb7c0806e4d51fb1e90163e96eb68b40 Nov 29 05:30:22 crc kubenswrapper[4594]: W1129 05:30:22.399492 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod652260eb_911a_4b92_8cea_73c9e8f156c4.slice/crio-1fc0fb7ba1ed2dec40f5f72319fc2364bdf8c835465b1fd239fed6cdeb203b21 WatchSource:0}: Error finding container 1fc0fb7ba1ed2dec40f5f72319fc2364bdf8c835465b1fd239fed6cdeb203b21: Status 404 returned error can't find the container with id 1fc0fb7ba1ed2dec40f5f72319fc2364bdf8c835465b1fd239fed6cdeb203b21 Nov 29 
05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.403569 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-jsnt5"] Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.410539 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9n22d" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.413668 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-rwbhx"] Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.416618 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knb66\" (UniqueName: \"kubernetes.io/projected/e247f646-534e-4aca-8cd9-edadba75848b-kube-api-access-knb66\") pod \"olm-operator-6b444d44fb-6mjfr\" (UID: \"e247f646-534e-4aca-8cd9-edadba75848b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6mjfr" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.417377 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-w5nnp" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.437601 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/0cb87a03-d210-426d-a903-56fd9b55852b-tmpfs\") pod \"packageserver-d55dfcdfc-95k6p\" (UID: \"0cb87a03-d210-426d-a903-56fd9b55852b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-95k6p" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.437633 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0945b059-dd2b-4550-8b8a-49ea4e94ffeb-registry-tls\") pod \"image-registry-697d97f7c8-chxbx\" (UID: \"0945b059-dd2b-4550-8b8a-49ea4e94ffeb\") " pod="openshift-image-registry/image-registry-697d97f7c8-chxbx" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.437661 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c584b692-db15-4724-a208-491507a8474e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-k9wmb\" (UID: \"c584b692-db15-4724-a208-491507a8474e\") " pod="openshift-marketplace/marketplace-operator-79b997595-k9wmb" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.437752 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/bf5ed613-ce39-4081-8342-aef24fb3a385-srv-cert\") pod \"catalog-operator-68c6474976-zp2th\" (UID: \"bf5ed613-ce39-4081-8342-aef24fb3a385\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zp2th" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.437797 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-2t4f5\" (UniqueName: \"kubernetes.io/projected/0cb87a03-d210-426d-a903-56fd9b55852b-kube-api-access-2t4f5\") pod \"packageserver-d55dfcdfc-95k6p\" (UID: \"0cb87a03-d210-426d-a903-56fd9b55852b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-95k6p" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.437820 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0cb87a03-d210-426d-a903-56fd9b55852b-apiservice-cert\") pod \"packageserver-d55dfcdfc-95k6p\" (UID: \"0cb87a03-d210-426d-a903-56fd9b55852b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-95k6p" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.437902 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mh2z7\" (UniqueName: \"kubernetes.io/projected/c584b692-db15-4724-a208-491507a8474e-kube-api-access-mh2z7\") pod \"marketplace-operator-79b997595-k9wmb\" (UID: \"c584b692-db15-4724-a208-491507a8474e\") " pod="openshift-marketplace/marketplace-operator-79b997595-k9wmb" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.437938 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0945b059-dd2b-4550-8b8a-49ea4e94ffeb-trusted-ca\") pod \"image-registry-697d97f7c8-chxbx\" (UID: \"0945b059-dd2b-4550-8b8a-49ea4e94ffeb\") " pod="openshift-image-registry/image-registry-697d97f7c8-chxbx" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.437953 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/bf5ed613-ce39-4081-8342-aef24fb3a385-profile-collector-cert\") pod \"catalog-operator-68c6474976-zp2th\" (UID: \"bf5ed613-ce39-4081-8342-aef24fb3a385\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zp2th" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.437986 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p86wv\" (UniqueName: \"kubernetes.io/projected/d68c506f-d0c1-4889-ab10-b886ef7880a7-kube-api-access-p86wv\") pod \"migrator-59844c95c7-g9rtk\" (UID: \"d68c506f-d0c1-4889-ab10-b886ef7880a7\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-g9rtk" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.438021 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzfst\" (UniqueName: \"kubernetes.io/projected/0945b059-dd2b-4550-8b8a-49ea4e94ffeb-kube-api-access-zzfst\") pod \"image-registry-697d97f7c8-chxbx\" (UID: \"0945b059-dd2b-4550-8b8a-49ea4e94ffeb\") " pod="openshift-image-registry/image-registry-697d97f7c8-chxbx" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.438044 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd9d318d-3895-4fed-aa8b-faa43f64ded2-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-wrhzj\" (UID: \"cd9d318d-3895-4fed-aa8b-faa43f64ded2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wrhzj" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.438065 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grpmt\" (UniqueName: \"kubernetes.io/projected/bd9200ef-4b0d-4e13-b06e-8f59d7e366ee-kube-api-access-grpmt\") pod \"service-ca-operator-777779d784-qdj7h\" (UID: \"bd9200ef-4b0d-4e13-b06e-8f59d7e366ee\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-qdj7h" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.438090 4594 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0945b059-dd2b-4550-8b8a-49ea4e94ffeb-registry-certificates\") pod \"image-registry-697d97f7c8-chxbx\" (UID: \"0945b059-dd2b-4550-8b8a-49ea4e94ffeb\") " pod="openshift-image-registry/image-registry-697d97f7c8-chxbx" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.438111 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-chxbx\" (UID: \"0945b059-dd2b-4550-8b8a-49ea4e94ffeb\") " pod="openshift-image-registry/image-registry-697d97f7c8-chxbx" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.438127 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzphh\" (UniqueName: \"kubernetes.io/projected/cd9d318d-3895-4fed-aa8b-faa43f64ded2-kube-api-access-lzphh\") pod \"openshift-controller-manager-operator-756b6f6bc6-wrhzj\" (UID: \"cd9d318d-3895-4fed-aa8b-faa43f64ded2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wrhzj" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.438151 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cd9d318d-3895-4fed-aa8b-faa43f64ded2-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-wrhzj\" (UID: \"cd9d318d-3895-4fed-aa8b-faa43f64ded2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wrhzj" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.438366 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/f9a409f3-6983-4162-9c32-355020bc1c3a-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-nhk7j\" (UID: \"f9a409f3-6983-4162-9c32-355020bc1c3a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nhk7j" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.438388 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxntj\" (UniqueName: \"kubernetes.io/projected/f9a409f3-6983-4162-9c32-355020bc1c3a-kube-api-access-qxntj\") pod \"package-server-manager-789f6589d5-nhk7j\" (UID: \"f9a409f3-6983-4162-9c32-355020bc1c3a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nhk7j" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.438416 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-td7js\" (UniqueName: \"kubernetes.io/projected/bf5ed613-ce39-4081-8342-aef24fb3a385-kube-api-access-td7js\") pod \"catalog-operator-68c6474976-zp2th\" (UID: \"bf5ed613-ce39-4081-8342-aef24fb3a385\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zp2th" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.438449 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0945b059-dd2b-4550-8b8a-49ea4e94ffeb-ca-trust-extracted\") pod \"image-registry-697d97f7c8-chxbx\" (UID: \"0945b059-dd2b-4550-8b8a-49ea4e94ffeb\") " pod="openshift-image-registry/image-registry-697d97f7c8-chxbx" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.438508 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nqmn\" (UniqueName: \"kubernetes.io/projected/9e3c263e-0845-4d3e-865f-e778b9ff3e53-kube-api-access-8nqmn\") pod \"service-ca-9c57cc56f-qkgqz\" (UID: 
\"9e3c263e-0845-4d3e-865f-e778b9ff3e53\") " pod="openshift-service-ca/service-ca-9c57cc56f-qkgqz" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.438530 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c584b692-db15-4724-a208-491507a8474e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-k9wmb\" (UID: \"c584b692-db15-4724-a208-491507a8474e\") " pod="openshift-marketplace/marketplace-operator-79b997595-k9wmb" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.438549 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0945b059-dd2b-4550-8b8a-49ea4e94ffeb-bound-sa-token\") pod \"image-registry-697d97f7c8-chxbx\" (UID: \"0945b059-dd2b-4550-8b8a-49ea4e94ffeb\") " pod="openshift-image-registry/image-registry-697d97f7c8-chxbx" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.438587 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/9e3c263e-0845-4d3e-865f-e778b9ff3e53-signing-cabundle\") pod \"service-ca-9c57cc56f-qkgqz\" (UID: \"9e3c263e-0845-4d3e-865f-e778b9ff3e53\") " pod="openshift-service-ca/service-ca-9c57cc56f-qkgqz" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.438601 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd9200ef-4b0d-4e13-b06e-8f59d7e366ee-serving-cert\") pod \"service-ca-operator-777779d784-qdj7h\" (UID: \"bd9200ef-4b0d-4e13-b06e-8f59d7e366ee\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-qdj7h" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.438642 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/bd9200ef-4b0d-4e13-b06e-8f59d7e366ee-config\") pod \"service-ca-operator-777779d784-qdj7h\" (UID: \"bd9200ef-4b0d-4e13-b06e-8f59d7e366ee\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-qdj7h" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.438667 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/9e3c263e-0845-4d3e-865f-e778b9ff3e53-signing-key\") pod \"service-ca-9c57cc56f-qkgqz\" (UID: \"9e3c263e-0845-4d3e-865f-e778b9ff3e53\") " pod="openshift-service-ca/service-ca-9c57cc56f-qkgqz" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.438708 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0cb87a03-d210-426d-a903-56fd9b55852b-webhook-cert\") pod \"packageserver-d55dfcdfc-95k6p\" (UID: \"0cb87a03-d210-426d-a903-56fd9b55852b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-95k6p" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.438740 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0945b059-dd2b-4550-8b8a-49ea4e94ffeb-installation-pull-secrets\") pod \"image-registry-697d97f7c8-chxbx\" (UID: \"0945b059-dd2b-4550-8b8a-49ea4e94ffeb\") " pod="openshift-image-registry/image-registry-697d97f7c8-chxbx" Nov 29 05:30:22 crc kubenswrapper[4594]: E1129 05:30:22.445590 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 05:30:22.945574947 +0000 UTC m=+147.186084167 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-chxbx" (UID: "0945b059-dd2b-4550-8b8a-49ea4e94ffeb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.450870 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-8jv2k" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.451021 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-g9dnn"] Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.456182 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-g4pv8" Nov 29 05:30:22 crc kubenswrapper[4594]: W1129 05:30:22.468058 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c685741_a472_49fe_8237_520bea0232ef.slice/crio-e2eafa920c003169afb24e67541eb68e59d33eadef7e45e211acf34786a00225 WatchSource:0}: Error finding container e2eafa920c003169afb24e67541eb68e59d33eadef7e45e211acf34786a00225: Status 404 returned error can't find the container with id e2eafa920c003169afb24e67541eb68e59d33eadef7e45e211acf34786a00225 Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.482534 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-jgzzn" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.540125 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.540305 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0945b059-dd2b-4550-8b8a-49ea4e94ffeb-registry-tls\") pod \"image-registry-697d97f7c8-chxbx\" (UID: \"0945b059-dd2b-4550-8b8a-49ea4e94ffeb\") " pod="openshift-image-registry/image-registry-697d97f7c8-chxbx" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.540359 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c584b692-db15-4724-a208-491507a8474e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-k9wmb\" (UID: \"c584b692-db15-4724-a208-491507a8474e\") " pod="openshift-marketplace/marketplace-operator-79b997595-k9wmb" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.540449 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/bf5ed613-ce39-4081-8342-aef24fb3a385-srv-cert\") pod \"catalog-operator-68c6474976-zp2th\" (UID: \"bf5ed613-ce39-4081-8342-aef24fb3a385\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zp2th" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.540508 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2t4f5\" (UniqueName: 
\"kubernetes.io/projected/0cb87a03-d210-426d-a903-56fd9b55852b-kube-api-access-2t4f5\") pod \"packageserver-d55dfcdfc-95k6p\" (UID: \"0cb87a03-d210-426d-a903-56fd9b55852b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-95k6p" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.540548 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0cb87a03-d210-426d-a903-56fd9b55852b-apiservice-cert\") pod \"packageserver-d55dfcdfc-95k6p\" (UID: \"0cb87a03-d210-426d-a903-56fd9b55852b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-95k6p" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.540584 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztcj8\" (UniqueName: \"kubernetes.io/projected/81d6a548-1be4-41a1-8972-ed027e6895aa-kube-api-access-ztcj8\") pod \"csi-hostpathplugin-pmjzq\" (UID: \"81d6a548-1be4-41a1-8972-ed027e6895aa\") " pod="hostpath-provisioner/csi-hostpathplugin-pmjzq" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.540631 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mh2z7\" (UniqueName: \"kubernetes.io/projected/c584b692-db15-4724-a208-491507a8474e-kube-api-access-mh2z7\") pod \"marketplace-operator-79b997595-k9wmb\" (UID: \"c584b692-db15-4724-a208-491507a8474e\") " pod="openshift-marketplace/marketplace-operator-79b997595-k9wmb" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.540679 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0945b059-dd2b-4550-8b8a-49ea4e94ffeb-trusted-ca\") pod \"image-registry-697d97f7c8-chxbx\" (UID: \"0945b059-dd2b-4550-8b8a-49ea4e94ffeb\") " pod="openshift-image-registry/image-registry-697d97f7c8-chxbx" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.540699 4594 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/bf5ed613-ce39-4081-8342-aef24fb3a385-profile-collector-cert\") pod \"catalog-operator-68c6474976-zp2th\" (UID: \"bf5ed613-ce39-4081-8342-aef24fb3a385\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zp2th" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.540768 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/81d6a548-1be4-41a1-8972-ed027e6895aa-socket-dir\") pod \"csi-hostpathplugin-pmjzq\" (UID: \"81d6a548-1be4-41a1-8972-ed027e6895aa\") " pod="hostpath-provisioner/csi-hostpathplugin-pmjzq" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.540784 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/81d6a548-1be4-41a1-8972-ed027e6895aa-registration-dir\") pod \"csi-hostpathplugin-pmjzq\" (UID: \"81d6a548-1be4-41a1-8972-ed027e6895aa\") " pod="hostpath-provisioner/csi-hostpathplugin-pmjzq" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.540826 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p86wv\" (UniqueName: \"kubernetes.io/projected/d68c506f-d0c1-4889-ab10-b886ef7880a7-kube-api-access-p86wv\") pod \"migrator-59844c95c7-g9rtk\" (UID: \"d68c506f-d0c1-4889-ab10-b886ef7880a7\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-g9rtk" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.540861 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzfst\" (UniqueName: \"kubernetes.io/projected/0945b059-dd2b-4550-8b8a-49ea4e94ffeb-kube-api-access-zzfst\") pod \"image-registry-697d97f7c8-chxbx\" (UID: \"0945b059-dd2b-4550-8b8a-49ea4e94ffeb\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-chxbx" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.540897 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/81d6a548-1be4-41a1-8972-ed027e6895aa-plugins-dir\") pod \"csi-hostpathplugin-pmjzq\" (UID: \"81d6a548-1be4-41a1-8972-ed027e6895aa\") " pod="hostpath-provisioner/csi-hostpathplugin-pmjzq" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.540913 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd9d318d-3895-4fed-aa8b-faa43f64ded2-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-wrhzj\" (UID: \"cd9d318d-3895-4fed-aa8b-faa43f64ded2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wrhzj" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.540930 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grpmt\" (UniqueName: \"kubernetes.io/projected/bd9200ef-4b0d-4e13-b06e-8f59d7e366ee-kube-api-access-grpmt\") pod \"service-ca-operator-777779d784-qdj7h\" (UID: \"bd9200ef-4b0d-4e13-b06e-8f59d7e366ee\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-qdj7h" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.540973 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0945b059-dd2b-4550-8b8a-49ea4e94ffeb-registry-certificates\") pod \"image-registry-697d97f7c8-chxbx\" (UID: \"0945b059-dd2b-4550-8b8a-49ea4e94ffeb\") " pod="openshift-image-registry/image-registry-697d97f7c8-chxbx" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.540991 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: 
\"kubernetes.io/host-path/81d6a548-1be4-41a1-8972-ed027e6895aa-csi-data-dir\") pod \"csi-hostpathplugin-pmjzq\" (UID: \"81d6a548-1be4-41a1-8972-ed027e6895aa\") " pod="hostpath-provisioner/csi-hostpathplugin-pmjzq" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.541067 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cd9d318d-3895-4fed-aa8b-faa43f64ded2-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-wrhzj\" (UID: \"cd9d318d-3895-4fed-aa8b-faa43f64ded2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wrhzj" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.541105 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzphh\" (UniqueName: \"kubernetes.io/projected/cd9d318d-3895-4fed-aa8b-faa43f64ded2-kube-api-access-lzphh\") pod \"openshift-controller-manager-operator-756b6f6bc6-wrhzj\" (UID: \"cd9d318d-3895-4fed-aa8b-faa43f64ded2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wrhzj" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.541142 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/f9a409f3-6983-4162-9c32-355020bc1c3a-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-nhk7j\" (UID: \"f9a409f3-6983-4162-9c32-355020bc1c3a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nhk7j" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.541158 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxntj\" (UniqueName: \"kubernetes.io/projected/f9a409f3-6983-4162-9c32-355020bc1c3a-kube-api-access-qxntj\") pod \"package-server-manager-789f6589d5-nhk7j\" (UID: 
\"f9a409f3-6983-4162-9c32-355020bc1c3a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nhk7j" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.541228 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-td7js\" (UniqueName: \"kubernetes.io/projected/bf5ed613-ce39-4081-8342-aef24fb3a385-kube-api-access-td7js\") pod \"catalog-operator-68c6474976-zp2th\" (UID: \"bf5ed613-ce39-4081-8342-aef24fb3a385\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zp2th" Nov 29 05:30:22 crc kubenswrapper[4594]: E1129 05:30:22.542193 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 05:30:23.042177099 +0000 UTC m=+147.282686319 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.543568 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0945b059-dd2b-4550-8b8a-49ea4e94ffeb-ca-trust-extracted\") pod \"image-registry-697d97f7c8-chxbx\" (UID: \"0945b059-dd2b-4550-8b8a-49ea4e94ffeb\") " pod="openshift-image-registry/image-registry-697d97f7c8-chxbx" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.543916 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-8nqmn\" (UniqueName: \"kubernetes.io/projected/9e3c263e-0845-4d3e-865f-e778b9ff3e53-kube-api-access-8nqmn\") pod \"service-ca-9c57cc56f-qkgqz\" (UID: \"9e3c263e-0845-4d3e-865f-e778b9ff3e53\") " pod="openshift-service-ca/service-ca-9c57cc56f-qkgqz" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.544279 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0945b059-dd2b-4550-8b8a-49ea4e94ffeb-trusted-ca\") pod \"image-registry-697d97f7c8-chxbx\" (UID: \"0945b059-dd2b-4550-8b8a-49ea4e94ffeb\") " pod="openshift-image-registry/image-registry-697d97f7c8-chxbx" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.544276 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c584b692-db15-4724-a208-491507a8474e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-k9wmb\" (UID: \"c584b692-db15-4724-a208-491507a8474e\") " pod="openshift-marketplace/marketplace-operator-79b997595-k9wmb" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.544703 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0945b059-dd2b-4550-8b8a-49ea4e94ffeb-bound-sa-token\") pod \"image-registry-697d97f7c8-chxbx\" (UID: \"0945b059-dd2b-4550-8b8a-49ea4e94ffeb\") " pod="openshift-image-registry/image-registry-697d97f7c8-chxbx" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.544964 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/52daf44c-8b94-4d9c-8fcb-eeb3d8fd286f-cert\") pod \"ingress-canary-wq549\" (UID: \"52daf44c-8b94-4d9c-8fcb-eeb3d8fd286f\") " pod="openshift-ingress-canary/ingress-canary-wq549" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.545072 4594 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/81d6a548-1be4-41a1-8972-ed027e6895aa-mountpoint-dir\") pod \"csi-hostpathplugin-pmjzq\" (UID: \"81d6a548-1be4-41a1-8972-ed027e6895aa\") " pod="hostpath-provisioner/csi-hostpathplugin-pmjzq" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.545093 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/9e3c263e-0845-4d3e-865f-e778b9ff3e53-signing-cabundle\") pod \"service-ca-9c57cc56f-qkgqz\" (UID: \"9e3c263e-0845-4d3e-865f-e778b9ff3e53\") " pod="openshift-service-ca/service-ca-9c57cc56f-qkgqz" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.545129 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd9200ef-4b0d-4e13-b06e-8f59d7e366ee-serving-cert\") pod \"service-ca-operator-777779d784-qdj7h\" (UID: \"bd9200ef-4b0d-4e13-b06e-8f59d7e366ee\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-qdj7h" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.545153 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd9200ef-4b0d-4e13-b06e-8f59d7e366ee-config\") pod \"service-ca-operator-777779d784-qdj7h\" (UID: \"bd9200ef-4b0d-4e13-b06e-8f59d7e366ee\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-qdj7h" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.545168 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/9e3c263e-0845-4d3e-865f-e778b9ff3e53-signing-key\") pod \"service-ca-9c57cc56f-qkgqz\" (UID: \"9e3c263e-0845-4d3e-865f-e778b9ff3e53\") " pod="openshift-service-ca/service-ca-9c57cc56f-qkgqz" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.545173 4594 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/f9a409f3-6983-4162-9c32-355020bc1c3a-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-nhk7j\" (UID: \"f9a409f3-6983-4162-9c32-355020bc1c3a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nhk7j" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.545192 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wp7nm\" (UniqueName: \"kubernetes.io/projected/52daf44c-8b94-4d9c-8fcb-eeb3d8fd286f-kube-api-access-wp7nm\") pod \"ingress-canary-wq549\" (UID: \"52daf44c-8b94-4d9c-8fcb-eeb3d8fd286f\") " pod="openshift-ingress-canary/ingress-canary-wq549" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.545869 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0cb87a03-d210-426d-a903-56fd9b55852b-webhook-cert\") pod \"packageserver-d55dfcdfc-95k6p\" (UID: \"0cb87a03-d210-426d-a903-56fd9b55852b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-95k6p" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.545920 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0945b059-dd2b-4550-8b8a-49ea4e94ffeb-installation-pull-secrets\") pod \"image-registry-697d97f7c8-chxbx\" (UID: \"0945b059-dd2b-4550-8b8a-49ea4e94ffeb\") " pod="openshift-image-registry/image-registry-697d97f7c8-chxbx" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.545939 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c584b692-db15-4724-a208-491507a8474e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-k9wmb\" (UID: 
\"c584b692-db15-4724-a208-491507a8474e\") " pod="openshift-marketplace/marketplace-operator-79b997595-k9wmb" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.546000 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/0cb87a03-d210-426d-a903-56fd9b55852b-tmpfs\") pod \"packageserver-d55dfcdfc-95k6p\" (UID: \"0cb87a03-d210-426d-a903-56fd9b55852b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-95k6p" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.546230 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0945b059-dd2b-4550-8b8a-49ea4e94ffeb-registry-certificates\") pod \"image-registry-697d97f7c8-chxbx\" (UID: \"0945b059-dd2b-4550-8b8a-49ea4e94ffeb\") " pod="openshift-image-registry/image-registry-697d97f7c8-chxbx" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.546383 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/0cb87a03-d210-426d-a903-56fd9b55852b-tmpfs\") pod \"packageserver-d55dfcdfc-95k6p\" (UID: \"0cb87a03-d210-426d-a903-56fd9b55852b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-95k6p" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.546461 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0945b059-dd2b-4550-8b8a-49ea4e94ffeb-ca-trust-extracted\") pod \"image-registry-697d97f7c8-chxbx\" (UID: \"0945b059-dd2b-4550-8b8a-49ea4e94ffeb\") " pod="openshift-image-registry/image-registry-697d97f7c8-chxbx" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.546688 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd9d318d-3895-4fed-aa8b-faa43f64ded2-config\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-wrhzj\" (UID: \"cd9d318d-3895-4fed-aa8b-faa43f64ded2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wrhzj" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.546913 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd9200ef-4b0d-4e13-b06e-8f59d7e366ee-config\") pod \"service-ca-operator-777779d784-qdj7h\" (UID: \"bd9200ef-4b0d-4e13-b06e-8f59d7e366ee\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-qdj7h" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.546914 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/9e3c263e-0845-4d3e-865f-e778b9ff3e53-signing-cabundle\") pod \"service-ca-9c57cc56f-qkgqz\" (UID: \"9e3c263e-0845-4d3e-865f-e778b9ff3e53\") " pod="openshift-service-ca/service-ca-9c57cc56f-qkgqz" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.547547 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/bf5ed613-ce39-4081-8342-aef24fb3a385-profile-collector-cert\") pod \"catalog-operator-68c6474976-zp2th\" (UID: \"bf5ed613-ce39-4081-8342-aef24fb3a385\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zp2th" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.547971 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0945b059-dd2b-4550-8b8a-49ea4e94ffeb-registry-tls\") pod \"image-registry-697d97f7c8-chxbx\" (UID: \"0945b059-dd2b-4550-8b8a-49ea4e94ffeb\") " pod="openshift-image-registry/image-registry-697d97f7c8-chxbx" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.549609 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" 
(UniqueName: \"kubernetes.io/secret/0945b059-dd2b-4550-8b8a-49ea4e94ffeb-installation-pull-secrets\") pod \"image-registry-697d97f7c8-chxbx\" (UID: \"0945b059-dd2b-4550-8b8a-49ea4e94ffeb\") " pod="openshift-image-registry/image-registry-697d97f7c8-chxbx" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.550248 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0cb87a03-d210-426d-a903-56fd9b55852b-apiservice-cert\") pod \"packageserver-d55dfcdfc-95k6p\" (UID: \"0cb87a03-d210-426d-a903-56fd9b55852b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-95k6p" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.551642 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c584b692-db15-4724-a208-491507a8474e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-k9wmb\" (UID: \"c584b692-db15-4724-a208-491507a8474e\") " pod="openshift-marketplace/marketplace-operator-79b997595-k9wmb" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.552193 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0cb87a03-d210-426d-a903-56fd9b55852b-webhook-cert\") pod \"packageserver-d55dfcdfc-95k6p\" (UID: \"0cb87a03-d210-426d-a903-56fd9b55852b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-95k6p" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.554140 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/9e3c263e-0845-4d3e-865f-e778b9ff3e53-signing-key\") pod \"service-ca-9c57cc56f-qkgqz\" (UID: \"9e3c263e-0845-4d3e-865f-e778b9ff3e53\") " pod="openshift-service-ca/service-ca-9c57cc56f-qkgqz" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.554867 4594 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd9200ef-4b0d-4e13-b06e-8f59d7e366ee-serving-cert\") pod \"service-ca-operator-777779d784-qdj7h\" (UID: \"bd9200ef-4b0d-4e13-b06e-8f59d7e366ee\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-qdj7h" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.554894 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/bf5ed613-ce39-4081-8342-aef24fb3a385-srv-cert\") pod \"catalog-operator-68c6474976-zp2th\" (UID: \"bf5ed613-ce39-4081-8342-aef24fb3a385\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zp2th" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.555375 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cd9d318d-3895-4fed-aa8b-faa43f64ded2-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-wrhzj\" (UID: \"cd9d318d-3895-4fed-aa8b-faa43f64ded2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wrhzj" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.580182 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2fqlt"] Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.584091 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzphh\" (UniqueName: \"kubernetes.io/projected/cd9d318d-3895-4fed-aa8b-faa43f64ded2-kube-api-access-lzphh\") pod \"openshift-controller-manager-operator-756b6f6bc6-wrhzj\" (UID: \"cd9d318d-3895-4fed-aa8b-faa43f64ded2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wrhzj" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.606599 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2t4f5\" 
(UniqueName: \"kubernetes.io/projected/0cb87a03-d210-426d-a903-56fd9b55852b-kube-api-access-2t4f5\") pod \"packageserver-d55dfcdfc-95k6p\" (UID: \"0cb87a03-d210-426d-a903-56fd9b55852b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-95k6p" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.626206 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxntj\" (UniqueName: \"kubernetes.io/projected/f9a409f3-6983-4162-9c32-355020bc1c3a-kube-api-access-qxntj\") pod \"package-server-manager-789f6589d5-nhk7j\" (UID: \"f9a409f3-6983-4162-9c32-355020bc1c3a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nhk7j" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.647852 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztcj8\" (UniqueName: \"kubernetes.io/projected/81d6a548-1be4-41a1-8972-ed027e6895aa-kube-api-access-ztcj8\") pod \"csi-hostpathplugin-pmjzq\" (UID: \"81d6a548-1be4-41a1-8972-ed027e6895aa\") " pod="hostpath-provisioner/csi-hostpathplugin-pmjzq" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.647898 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/81d6a548-1be4-41a1-8972-ed027e6895aa-socket-dir\") pod \"csi-hostpathplugin-pmjzq\" (UID: \"81d6a548-1be4-41a1-8972-ed027e6895aa\") " pod="hostpath-provisioner/csi-hostpathplugin-pmjzq" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.647924 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/81d6a548-1be4-41a1-8972-ed027e6895aa-registration-dir\") pod \"csi-hostpathplugin-pmjzq\" (UID: \"81d6a548-1be4-41a1-8972-ed027e6895aa\") " pod="hostpath-provisioner/csi-hostpathplugin-pmjzq" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.647950 4594 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/81d6a548-1be4-41a1-8972-ed027e6895aa-plugins-dir\") pod \"csi-hostpathplugin-pmjzq\" (UID: \"81d6a548-1be4-41a1-8972-ed027e6895aa\") " pod="hostpath-provisioner/csi-hostpathplugin-pmjzq" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.647981 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-chxbx\" (UID: \"0945b059-dd2b-4550-8b8a-49ea4e94ffeb\") " pod="openshift-image-registry/image-registry-697d97f7c8-chxbx" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.647999 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/81d6a548-1be4-41a1-8972-ed027e6895aa-csi-data-dir\") pod \"csi-hostpathplugin-pmjzq\" (UID: \"81d6a548-1be4-41a1-8972-ed027e6895aa\") " pod="hostpath-provisioner/csi-hostpathplugin-pmjzq" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.648048 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/52daf44c-8b94-4d9c-8fcb-eeb3d8fd286f-cert\") pod \"ingress-canary-wq549\" (UID: \"52daf44c-8b94-4d9c-8fcb-eeb3d8fd286f\") " pod="openshift-ingress-canary/ingress-canary-wq549" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.648073 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/81d6a548-1be4-41a1-8972-ed027e6895aa-mountpoint-dir\") pod \"csi-hostpathplugin-pmjzq\" (UID: \"81d6a548-1be4-41a1-8972-ed027e6895aa\") " pod="hostpath-provisioner/csi-hostpathplugin-pmjzq" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.648094 4594 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-wp7nm\" (UniqueName: \"kubernetes.io/projected/52daf44c-8b94-4d9c-8fcb-eeb3d8fd286f-kube-api-access-wp7nm\") pod \"ingress-canary-wq549\" (UID: \"52daf44c-8b94-4d9c-8fcb-eeb3d8fd286f\") " pod="openshift-ingress-canary/ingress-canary-wq549" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.648453 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/81d6a548-1be4-41a1-8972-ed027e6895aa-socket-dir\") pod \"csi-hostpathplugin-pmjzq\" (UID: \"81d6a548-1be4-41a1-8972-ed027e6895aa\") " pod="hostpath-provisioner/csi-hostpathplugin-pmjzq" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.648506 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/81d6a548-1be4-41a1-8972-ed027e6895aa-registration-dir\") pod \"csi-hostpathplugin-pmjzq\" (UID: \"81d6a548-1be4-41a1-8972-ed027e6895aa\") " pod="hostpath-provisioner/csi-hostpathplugin-pmjzq" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.648534 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/81d6a548-1be4-41a1-8972-ed027e6895aa-plugins-dir\") pod \"csi-hostpathplugin-pmjzq\" (UID: \"81d6a548-1be4-41a1-8972-ed027e6895aa\") " pod="hostpath-provisioner/csi-hostpathplugin-pmjzq" Nov 29 05:30:22 crc kubenswrapper[4594]: E1129 05:30:22.648771 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 05:30:23.148758574 +0000 UTC m=+147.389267793 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-chxbx" (UID: "0945b059-dd2b-4550-8b8a-49ea4e94ffeb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.649059 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/81d6a548-1be4-41a1-8972-ed027e6895aa-mountpoint-dir\") pod \"csi-hostpathplugin-pmjzq\" (UID: \"81d6a548-1be4-41a1-8972-ed027e6895aa\") " pod="hostpath-provisioner/csi-hostpathplugin-pmjzq" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.649135 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/81d6a548-1be4-41a1-8972-ed027e6895aa-csi-data-dir\") pod \"csi-hostpathplugin-pmjzq\" (UID: \"81d6a548-1be4-41a1-8972-ed027e6895aa\") " pod="hostpath-provisioner/csi-hostpathplugin-pmjzq" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.652129 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grpmt\" (UniqueName: \"kubernetes.io/projected/bd9200ef-4b0d-4e13-b06e-8f59d7e366ee-kube-api-access-grpmt\") pod \"service-ca-operator-777779d784-qdj7h\" (UID: \"bd9200ef-4b0d-4e13-b06e-8f59d7e366ee\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-qdj7h" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.655358 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-8pvtn" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.661317 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6mjfr" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.662150 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-x5tn8" event={"ID":"8797308e-2c6a-45d4-a7e8-dae633de1103","Type":"ContainerStarted","Data":"1da980b9ea1000f2fa42a1749cd72ea36a95d71ae77667be4ae827034b968f5c"} Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.662185 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-x5tn8" event={"ID":"8797308e-2c6a-45d4-a7e8-dae633de1103","Type":"ContainerStarted","Data":"e738dd8737d727acdf789725b3ba98988178ed22572713c842033f5a3d12a031"} Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.663790 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/52daf44c-8b94-4d9c-8fcb-eeb3d8fd286f-cert\") pod \"ingress-canary-wq549\" (UID: \"52daf44c-8b94-4d9c-8fcb-eeb3d8fd286f\") " pod="openshift-ingress-canary/ingress-canary-wq549" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.666667 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-9ctf8" event={"ID":"12b5360d-755a-4cb5-9ef3-0c00550e3913","Type":"ContainerStarted","Data":"7a2b3e3c27d2cbe26cc0c1aa79a95cb164547951b18c33927ca0d9e0b8aad23c"} Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.666698 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-9ctf8" event={"ID":"12b5360d-755a-4cb5-9ef3-0c00550e3913","Type":"ContainerStarted","Data":"774a9707d2d257f0518eff54b02d61e96f981ec1554a9e832dfc528e026992af"} Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.669876 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p86wv\" (UniqueName: 
\"kubernetes.io/projected/d68c506f-d0c1-4889-ab10-b886ef7880a7-kube-api-access-p86wv\") pod \"migrator-59844c95c7-g9rtk\" (UID: \"d68c506f-d0c1-4889-ab10-b886ef7880a7\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-g9rtk" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.671658 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-p9km8" event={"ID":"ad8c6194-f489-42f3-afb2-707a97b490ad","Type":"ContainerStarted","Data":"0e5210e9d692dec5ccb4faa21ceb1617995911789678784b1cb9a9b51bd0ff4a"} Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.671692 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-p9km8" event={"ID":"ad8c6194-f489-42f3-afb2-707a97b490ad","Type":"ContainerStarted","Data":"3b147e246ee6ffaa3240db5c50863ebd36a3d0fd8f81607cc6366bb3beab888c"} Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.675969 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-khv8v"] Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.679836 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-g9rtk" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.684441 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nhk7j" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.689526 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzfst\" (UniqueName: \"kubernetes.io/projected/0945b059-dd2b-4550-8b8a-49ea4e94ffeb-kube-api-access-zzfst\") pod \"image-registry-697d97f7c8-chxbx\" (UID: \"0945b059-dd2b-4550-8b8a-49ea4e94ffeb\") " pod="openshift-image-registry/image-registry-697d97f7c8-chxbx" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.693007 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-gv8n5"] Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.695249 4594 generic.go:334] "Generic (PLEG): container finished" podID="fd86a99b-7671-4dd9-88b9-334db1906b6b" containerID="4fe9d58ed404c8af61eb24e5f6f2c32500b46d41eb6eafe6f6592f7ec634971f" exitCode=0 Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.695359 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-trhms" event={"ID":"fd86a99b-7671-4dd9-88b9-334db1906b6b","Type":"ContainerDied","Data":"4fe9d58ed404c8af61eb24e5f6f2c32500b46d41eb6eafe6f6592f7ec634971f"} Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.695383 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-trhms" event={"ID":"fd86a99b-7671-4dd9-88b9-334db1906b6b","Type":"ContainerStarted","Data":"a7d609135d0a42af0d15e90e13ea158fcc6bf653d146801943833a126c983991"} Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.707994 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-95k6p" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.713190 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nqmn\" (UniqueName: \"kubernetes.io/projected/9e3c263e-0845-4d3e-865f-e778b9ff3e53-kube-api-access-8nqmn\") pod \"service-ca-9c57cc56f-qkgqz\" (UID: \"9e3c263e-0845-4d3e-865f-e778b9ff3e53\") " pod="openshift-service-ca/service-ca-9c57cc56f-qkgqz" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.725604 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-xg8s7" event={"ID":"6c685741-a472-49fe-8237-520bea0232ef","Type":"ContainerStarted","Data":"e2eafa920c003169afb24e67541eb68e59d33eadef7e45e211acf34786a00225"} Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.730309 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wrhzj" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.734316 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-td7js\" (UniqueName: \"kubernetes.io/projected/bf5ed613-ce39-4081-8342-aef24fb3a385-kube-api-access-td7js\") pod \"catalog-operator-68c6474976-zp2th\" (UID: \"bf5ed613-ce39-4081-8342-aef24fb3a385\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zp2th" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.735424 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4n7dc" event={"ID":"96330a2a-06f0-4396-bceb-49bf9e3f3407","Type":"ContainerStarted","Data":"774331e88738fe2d8c6c6b09a2547d478b30e8651032845a78f7f4dce6551c4b"} Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.735461 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4n7dc" event={"ID":"96330a2a-06f0-4396-bceb-49bf9e3f3407","Type":"ContainerStarted","Data":"6f2d651444d5aa9ceab89f494e6dbcb057b937adb267d853f20e42fe69ded0a5"} Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.738624 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-qdj7h" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.742367 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4qlf7"] Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.744126 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-rwbhx" event={"ID":"fc2b6738-957d-4636-a723-c11207f29087","Type":"ContainerStarted","Data":"94c212b315e44e9411e4805f458604802ddc3c4a9efcff5dfdd4e0e28cefca9b"} Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.747852 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-jsnt5" event={"ID":"911e02cc-6184-4c63-a36c-a51b54d0e7bf","Type":"ContainerStarted","Data":"63747cf938c38cd3dd9abfe2c55ca2f1dbdbc66a1f0832cec17bae1d45ef827f"} Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.751173 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mh2z7\" (UniqueName: \"kubernetes.io/projected/c584b692-db15-4724-a208-491507a8474e-kube-api-access-mh2z7\") pod \"marketplace-operator-79b997595-k9wmb\" (UID: \"c584b692-db15-4724-a208-491507a8474e\") " pod="openshift-marketplace/marketplace-operator-79b997595-k9wmb" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.752793 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8t4sg" 
event={"ID":"8b141ba2-87a7-4975-a1ed-054769e567bd","Type":"ContainerStarted","Data":"4cf4a8bc7f38a4131863b2ab66548de60103b1e684b42c8ca7f28ea82d9fd220"} Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.752831 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8t4sg" event={"ID":"8b141ba2-87a7-4975-a1ed-054769e567bd","Type":"ContainerStarted","Data":"bd1dc0841060e1abf4f3d03f2a2944a23b56371847645149499a10695532b2e2"} Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.753326 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 05:30:22 crc kubenswrapper[4594]: E1129 05:30:22.753448 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 05:30:23.253428005 +0000 UTC m=+147.493937226 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.754581 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-chxbx\" (UID: \"0945b059-dd2b-4550-8b8a-49ea4e94ffeb\") " pod="openshift-image-registry/image-registry-697d97f7c8-chxbx" Nov 29 05:30:22 crc kubenswrapper[4594]: E1129 05:30:22.755482 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 05:30:23.255469139 +0000 UTC m=+147.495978360 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-chxbx" (UID: "0945b059-dd2b-4550-8b8a-49ea4e94ffeb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.770458 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-f9dqp" event={"ID":"09980e91-b2e4-4a0e-bee7-dc101096f804","Type":"ContainerStarted","Data":"883dc7c3eeb91b731b1bd886440dac02bb7c0806e4d51fb1e90163e96eb68b40"} Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.770875 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0945b059-dd2b-4550-8b8a-49ea4e94ffeb-bound-sa-token\") pod \"image-registry-697d97f7c8-chxbx\" (UID: \"0945b059-dd2b-4550-8b8a-49ea4e94ffeb\") " pod="openshift-image-registry/image-registry-697d97f7c8-chxbx" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.789576 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zvvn6" event={"ID":"652260eb-911a-4b92-8cea-73c9e8f156c4","Type":"ContainerStarted","Data":"1fc0fb7ba1ed2dec40f5f72319fc2364bdf8c835465b1fd239fed6cdeb203b21"} Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.796979 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-scxzk" event={"ID":"864d1c0b-3c85-4472-9d16-c8d5c574e02a","Type":"ContainerStarted","Data":"4ef591495522298ee55bf70323d52dd483bbe98056d92534ed3a10c74ff2f22d"} Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.797015 4594 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-controller-manager/controller-manager-879f6c89f-scxzk" event={"ID":"864d1c0b-3c85-4472-9d16-c8d5c574e02a","Type":"ContainerStarted","Data":"f7eccd1754457a8922936f3bbc837f9ea4d0400d7db11c4dc6f8ab09b7feedcd"} Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.797332 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-scxzk" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.798788 4594 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-scxzk container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.798815 4594 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-scxzk" podUID="864d1c0b-3c85-4472-9d16-c8d5c574e02a" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.801287 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g9dnn" event={"ID":"81534f91-740b-487b-9149-e8565ccf9905","Type":"ContainerStarted","Data":"2799285ca2d124fe83e7779b4fd87893581da049099e363955376bf05d2b70d0"} Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.813664 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wp7nm\" (UniqueName: \"kubernetes.io/projected/52daf44c-8b94-4d9c-8fcb-eeb3d8fd286f-kube-api-access-wp7nm\") pod \"ingress-canary-wq549\" (UID: \"52daf44c-8b94-4d9c-8fcb-eeb3d8fd286f\") " pod="openshift-ingress-canary/ingress-canary-wq549" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.827083 4594 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-w5nnp"] Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.834225 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztcj8\" (UniqueName: \"kubernetes.io/projected/81d6a548-1be4-41a1-8972-ed027e6895aa-kube-api-access-ztcj8\") pod \"csi-hostpathplugin-pmjzq\" (UID: \"81d6a548-1be4-41a1-8972-ed027e6895aa\") " pod="hostpath-provisioner/csi-hostpathplugin-pmjzq" Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.855992 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 05:30:22 crc kubenswrapper[4594]: E1129 05:30:22.856695 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 05:30:23.356682171 +0000 UTC m=+147.597191391 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 05:30:22 crc kubenswrapper[4594]: W1129 05:30:22.857550 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcce54ab2_f931_4499_a0c1_8a2b2674aeb2.slice/crio-ba3d5c80f8cbe659abe6ecb61111e5c5fa80fdd5ed07fccb4836d9feff4919e6 WatchSource:0}: Error finding container ba3d5c80f8cbe659abe6ecb61111e5c5fa80fdd5ed07fccb4836d9feff4919e6: Status 404 returned error can't find the container with id ba3d5c80f8cbe659abe6ecb61111e5c5fa80fdd5ed07fccb4836d9feff4919e6 Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.859873 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zqmhf"] Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.882435 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-wpkbq"] Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.957369 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-chxbx\" (UID: \"0945b059-dd2b-4550-8b8a-49ea4e94ffeb\") " pod="openshift-image-registry/image-registry-697d97f7c8-chxbx" Nov 29 05:30:22 crc kubenswrapper[4594]: E1129 05:30:22.957856 4594 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 05:30:23.45783573 +0000 UTC m=+147.698344950 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-chxbx" (UID: "0945b059-dd2b-4550-8b8a-49ea4e94ffeb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 05:30:22 crc kubenswrapper[4594]: I1129 05:30:22.995879 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-k9wmb" Nov 29 05:30:23 crc kubenswrapper[4594]: I1129 05:30:23.002229 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-pjh6b"] Nov 29 05:30:23 crc kubenswrapper[4594]: I1129 05:30:23.009226 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-qkgqz" Nov 29 05:30:23 crc kubenswrapper[4594]: I1129 05:30:23.037308 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zp2th" Nov 29 05:30:23 crc kubenswrapper[4594]: I1129 05:30:23.059001 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 05:30:23 crc kubenswrapper[4594]: E1129 05:30:23.059170 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 05:30:23.55913961 +0000 UTC m=+147.799648831 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 05:30:23 crc kubenswrapper[4594]: I1129 05:30:23.059307 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-chxbx\" (UID: \"0945b059-dd2b-4550-8b8a-49ea4e94ffeb\") " pod="openshift-image-registry/image-registry-697d97f7c8-chxbx" Nov 29 05:30:23 crc kubenswrapper[4594]: E1129 05:30:23.059657 4594 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 05:30:23.559643974 +0000 UTC m=+147.800153193 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-chxbx" (UID: "0945b059-dd2b-4550-8b8a-49ea4e94ffeb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 05:30:23 crc kubenswrapper[4594]: I1129 05:30:23.062570 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-wq549" Nov 29 05:30:23 crc kubenswrapper[4594]: I1129 05:30:23.081512 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-pmjzq" Nov 29 05:30:23 crc kubenswrapper[4594]: I1129 05:30:23.105127 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gspdt"] Nov 29 05:30:23 crc kubenswrapper[4594]: I1129 05:30:23.126410 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29406570-47pck"] Nov 29 05:30:23 crc kubenswrapper[4594]: I1129 05:30:23.160318 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 05:30:23 crc kubenswrapper[4594]: E1129 05:30:23.160772 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 05:30:23.660756046 +0000 UTC m=+147.901265265 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 05:30:23 crc kubenswrapper[4594]: I1129 05:30:23.179278 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-jgzzn"] Nov 29 05:30:23 crc kubenswrapper[4594]: I1129 05:30:23.236478 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9n22d"] Nov 29 05:30:23 crc kubenswrapper[4594]: I1129 05:30:23.264624 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-chxbx\" (UID: \"0945b059-dd2b-4550-8b8a-49ea4e94ffeb\") " pod="openshift-image-registry/image-registry-697d97f7c8-chxbx" Nov 29 05:30:23 crc kubenswrapper[4594]: E1129 05:30:23.265101 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 05:30:23.765088245 +0000 UTC m=+148.005597466 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-chxbx" (UID: "0945b059-dd2b-4550-8b8a-49ea4e94ffeb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 05:30:23 crc kubenswrapper[4594]: I1129 05:30:23.277221 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-gzzjt"] Nov 29 05:30:23 crc kubenswrapper[4594]: I1129 05:30:23.367070 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 05:30:23 crc kubenswrapper[4594]: E1129 05:30:23.367219 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 05:30:23.867197622 +0000 UTC m=+148.107706842 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 05:30:23 crc kubenswrapper[4594]: I1129 05:30:23.367326 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-chxbx\" (UID: \"0945b059-dd2b-4550-8b8a-49ea4e94ffeb\") " pod="openshift-image-registry/image-registry-697d97f7c8-chxbx" Nov 29 05:30:23 crc kubenswrapper[4594]: E1129 05:30:23.367636 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 05:30:23.867626594 +0000 UTC m=+148.108135813 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-chxbx" (UID: "0945b059-dd2b-4550-8b8a-49ea4e94ffeb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 05:30:23 crc kubenswrapper[4594]: I1129 05:30:23.400217 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-wqr7v"] Nov 29 05:30:23 crc kubenswrapper[4594]: I1129 05:30:23.440320 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-g4pv8"] Nov 29 05:30:23 crc kubenswrapper[4594]: I1129 05:30:23.468487 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 05:30:23 crc kubenswrapper[4594]: E1129 05:30:23.468671 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 05:30:23.968653806 +0000 UTC m=+148.209163027 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 05:30:23 crc kubenswrapper[4594]: I1129 05:30:23.470521 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-chxbx\" (UID: \"0945b059-dd2b-4550-8b8a-49ea4e94ffeb\") " pod="openshift-image-registry/image-registry-697d97f7c8-chxbx" Nov 29 05:30:23 crc kubenswrapper[4594]: E1129 05:30:23.470884 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 05:30:23.97087061 +0000 UTC m=+148.211379830 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-chxbx" (UID: "0945b059-dd2b-4550-8b8a-49ea4e94ffeb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 05:30:23 crc kubenswrapper[4594]: W1129 05:30:23.491097 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8a99cf8_64bd_4e83_b77e_c4358154f10e.slice/crio-2fdbd1fa4bfc89677619b4395e809537f9ce15405112641a5b4301dbcd21cc7c WatchSource:0}: Error finding container 2fdbd1fa4bfc89677619b4395e809537f9ce15405112641a5b4301dbcd21cc7c: Status 404 returned error can't find the container with id 2fdbd1fa4bfc89677619b4395e809537f9ce15405112641a5b4301dbcd21cc7c Nov 29 05:30:23 crc kubenswrapper[4594]: I1129 05:30:23.500449 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-95k6p"] Nov 29 05:30:23 crc kubenswrapper[4594]: I1129 05:30:23.545661 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-scxzk" podStartSLOduration=126.545642548 podStartE2EDuration="2m6.545642548s" podCreationTimestamp="2025-11-29 05:28:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:30:23.543903778 +0000 UTC m=+147.784412999" watchObservedRunningTime="2025-11-29 05:30:23.545642548 +0000 UTC m=+147.786151759" Nov 29 05:30:23 crc kubenswrapper[4594]: I1129 05:30:23.571219 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 05:30:23 crc kubenswrapper[4594]: E1129 05:30:23.575781 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 05:30:24.075756306 +0000 UTC m=+148.316265526 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 05:30:23 crc kubenswrapper[4594]: I1129 05:30:23.575861 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-chxbx\" (UID: \"0945b059-dd2b-4550-8b8a-49ea4e94ffeb\") " pod="openshift-image-registry/image-registry-697d97f7c8-chxbx" Nov 29 05:30:23 crc kubenswrapper[4594]: E1129 05:30:23.581239 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 05:30:24.081210029 +0000 UTC m=+148.321719249 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-chxbx" (UID: "0945b059-dd2b-4550-8b8a-49ea4e94ffeb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 05:30:23 crc kubenswrapper[4594]: I1129 05:30:23.589859 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6mjfr"] Nov 29 05:30:23 crc kubenswrapper[4594]: I1129 05:30:23.608115 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-qdj7h"] Nov 29 05:30:23 crc kubenswrapper[4594]: I1129 05:30:23.678528 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wrhzj"] Nov 29 05:30:23 crc kubenswrapper[4594]: I1129 05:30:23.685561 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 05:30:23 crc kubenswrapper[4594]: E1129 05:30:23.685870 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 05:30:24.185843864 +0000 UTC m=+148.426353084 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 05:30:23 crc kubenswrapper[4594]: I1129 05:30:23.790376 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-chxbx\" (UID: \"0945b059-dd2b-4550-8b8a-49ea4e94ffeb\") " pod="openshift-image-registry/image-registry-697d97f7c8-chxbx" Nov 29 05:30:23 crc kubenswrapper[4594]: E1129 05:30:23.790797 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 05:30:24.29078345 +0000 UTC m=+148.531292669 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-chxbx" (UID: "0945b059-dd2b-4550-8b8a-49ea4e94ffeb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 05:30:23 crc kubenswrapper[4594]: I1129 05:30:23.795922 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4n7dc" podStartSLOduration=126.795903209 podStartE2EDuration="2m6.795903209s" podCreationTimestamp="2025-11-29 05:28:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:30:23.743709292 +0000 UTC m=+147.984218512" watchObservedRunningTime="2025-11-29 05:30:23.795903209 +0000 UTC m=+148.036412429" Nov 29 05:30:23 crc kubenswrapper[4594]: I1129 05:30:23.891906 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 05:30:23 crc kubenswrapper[4594]: E1129 05:30:23.892734 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 05:30:24.39271876 +0000 UTC m=+148.633227981 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 05:30:23 crc kubenswrapper[4594]: I1129 05:30:23.919850 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gspdt" event={"ID":"1fea2129-7ad0-45d8-9447-315107ef1c0c","Type":"ContainerStarted","Data":"167617cb94f7d6f2568153d13c6080489983201f9848d82269cd60e960d11397"} Nov 29 05:30:23 crc kubenswrapper[4594]: I1129 05:30:23.975397 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-pmjzq"] Nov 29 05:30:23 crc kubenswrapper[4594]: I1129 05:30:23.978618 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-wpkbq" event={"ID":"93df4c2b-9b0b-476e-a92f-55cb8e3ec8c3","Type":"ContainerStarted","Data":"ce020ce0936bb337302ed300f6b9267c24f7235e47c378a5823a5f003807a96d"} Nov 29 05:30:24 crc kubenswrapper[4594]: I1129 05:30:24.022238 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-chxbx\" (UID: \"0945b059-dd2b-4550-8b8a-49ea4e94ffeb\") " pod="openshift-image-registry/image-registry-697d97f7c8-chxbx" Nov 29 05:30:24 crc kubenswrapper[4594]: E1129 05:30:24.024087 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" 
failed. No retries permitted until 2025-11-29 05:30:24.524073109 +0000 UTC m=+148.764582329 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-chxbx" (UID: "0945b059-dd2b-4550-8b8a-49ea4e94ffeb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 05:30:24 crc kubenswrapper[4594]: I1129 05:30:24.030428 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nhk7j"] Nov 29 05:30:24 crc kubenswrapper[4594]: I1129 05:30:24.042891 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zp2th"] Nov 29 05:30:24 crc kubenswrapper[4594]: I1129 05:30:24.052506 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zvvn6" event={"ID":"652260eb-911a-4b92-8cea-73c9e8f156c4","Type":"ContainerStarted","Data":"0e0e3257b5f508dae9810188a0ad1ca6db901a70732dcc72b62da7e20d1eb528"} Nov 29 05:30:24 crc kubenswrapper[4594]: I1129 05:30:24.053579 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zvvn6" Nov 29 05:30:24 crc kubenswrapper[4594]: I1129 05:30:24.079541 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-g9rtk"] Nov 29 05:30:24 crc kubenswrapper[4594]: I1129 05:30:24.129933 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 05:30:24 crc kubenswrapper[4594]: I1129 05:30:24.130692 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6mjfr" event={"ID":"e247f646-534e-4aca-8cd9-edadba75848b","Type":"ContainerStarted","Data":"fd1d6e713d8dd323016c4079cf3afa8dfa16a9953842a5ed9981dcc9d45e043c"} Nov 29 05:30:24 crc kubenswrapper[4594]: E1129 05:30:24.130856 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 05:30:24.630843557 +0000 UTC m=+148.871352777 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 05:30:24 crc kubenswrapper[4594]: I1129 05:30:24.131110 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zvvn6" Nov 29 05:30:24 crc kubenswrapper[4594]: I1129 05:30:24.145769 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-8pvtn" event={"ID":"c3e4199d-2fe7-4039-8f59-b8bab2beebd0","Type":"ContainerStarted","Data":"acb8946ed2d8c53e635c7d8a56532f1ce53b0303e78c782459a64c4127cbbfd3"} Nov 29 05:30:24 crc kubenswrapper[4594]: I1129 05:30:24.146888 4594 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9n22d" event={"ID":"7af45636-74c3-4a18-8297-ba3fdde46ace","Type":"ContainerStarted","Data":"e070612e419578b214eb43c9f5c72146b5b439fc4a7b7c8a04ba6acb8fd5e596"} Nov 29 05:30:24 crc kubenswrapper[4594]: I1129 05:30:24.175433 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-qkgqz"] Nov 29 05:30:24 crc kubenswrapper[4594]: I1129 05:30:24.193505 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wqr7v" event={"ID":"b8a99cf8-64bd-4e83-b77e-c4358154f10e","Type":"ContainerStarted","Data":"2fdbd1fa4bfc89677619b4395e809537f9ce15405112641a5b4301dbcd21cc7c"} Nov 29 05:30:24 crc kubenswrapper[4594]: I1129 05:30:24.202771 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-xg8s7" event={"ID":"6c685741-a472-49fe-8237-520bea0232ef","Type":"ContainerStarted","Data":"23f9e55425f765209a8e7f70e22c1b7bf57184ebd3897cf1fa7b9420fe283ef7"} Nov 29 05:30:24 crc kubenswrapper[4594]: I1129 05:30:24.203392 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-xg8s7" Nov 29 05:30:24 crc kubenswrapper[4594]: I1129 05:30:24.213398 4594 patch_prober.go:28] interesting pod/downloads-7954f5f757-xg8s7 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Nov 29 05:30:24 crc kubenswrapper[4594]: I1129 05:30:24.213444 4594 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-xg8s7" podUID="6c685741-a472-49fe-8237-520bea0232ef" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 
10.217.0.13:8080: connect: connection refused" Nov 29 05:30:24 crc kubenswrapper[4594]: I1129 05:30:24.238894 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-chxbx\" (UID: \"0945b059-dd2b-4550-8b8a-49ea4e94ffeb\") " pod="openshift-image-registry/image-registry-697d97f7c8-chxbx" Nov 29 05:30:24 crc kubenswrapper[4594]: E1129 05:30:24.239228 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 05:30:24.739217112 +0000 UTC m=+148.979726332 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-chxbx" (UID: "0945b059-dd2b-4550-8b8a-49ea4e94ffeb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 05:30:24 crc kubenswrapper[4594]: I1129 05:30:24.244516 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zqmhf" event={"ID":"969b3fd9-7b46-4627-bda2-e2fb4a4f5203","Type":"ContainerStarted","Data":"4ef2302a54f466dd2214d9f0de14edeebaa58d048479694c2162e434b451d2e0"} Nov 29 05:30:24 crc kubenswrapper[4594]: I1129 05:30:24.256025 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-k9wmb"] Nov 29 05:30:24 crc kubenswrapper[4594]: I1129 05:30:24.275390 4594 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-console/console-f9d7485db-9ctf8" podStartSLOduration=127.275375355 podStartE2EDuration="2m7.275375355s" podCreationTimestamp="2025-11-29 05:28:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:30:24.258938945 +0000 UTC m=+148.499448165" watchObservedRunningTime="2025-11-29 05:30:24.275375355 +0000 UTC m=+148.515884576" Nov 29 05:30:24 crc kubenswrapper[4594]: I1129 05:30:24.277351 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-wq549"] Nov 29 05:30:24 crc kubenswrapper[4594]: I1129 05:30:24.289799 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4qlf7" event={"ID":"cce54ab2-f931-4499-a0c1-8a2b2674aeb2","Type":"ContainerStarted","Data":"f0b38da2fcd462817b9a6ab3e28bd24d5047ee22af4bc2c95966ee6359ce5bf1"} Nov 29 05:30:24 crc kubenswrapper[4594]: I1129 05:30:24.289839 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4qlf7" event={"ID":"cce54ab2-f931-4499-a0c1-8a2b2674aeb2","Type":"ContainerStarted","Data":"ba3d5c80f8cbe659abe6ecb61111e5c5fa80fdd5ed07fccb4836d9feff4919e6"} Nov 29 05:30:24 crc kubenswrapper[4594]: I1129 05:30:24.331644 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8t4sg" event={"ID":"8b141ba2-87a7-4975-a1ed-054769e567bd","Type":"ContainerStarted","Data":"c5c712f9014fcca0f0bad5c2cdf8a2b8a434e178c3f121504b3a79b900eeb75c"} Nov 29 05:30:24 crc kubenswrapper[4594]: I1129 05:30:24.340599 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 05:30:24 crc kubenswrapper[4594]: E1129 05:30:24.341441 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 05:30:24.841415594 +0000 UTC m=+149.081924815 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 05:30:24 crc kubenswrapper[4594]: I1129 05:30:24.354523 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"8b0515f71b07c136556445dd7f05041ffa499f1efc5064b08061fe31edbc8b4e"} Nov 29 05:30:24 crc kubenswrapper[4594]: I1129 05:30:24.355185 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 05:30:24 crc kubenswrapper[4594]: I1129 05:30:24.365910 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-95k6p" event={"ID":"0cb87a03-d210-426d-a903-56fd9b55852b","Type":"ContainerStarted","Data":"fd29d3a10fcb5e948c721bb9a01411cfbcb7837c9a16db15601ba3b27cd8b877"} Nov 29 05:30:24 crc kubenswrapper[4594]: I1129 05:30:24.375889 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gzzjt" 
event={"ID":"837571cc-8fd2-4102-9997-4520ddf6da08","Type":"ContainerStarted","Data":"79e6e80f76890aab59e250a7c8a03f9a17d1dc281b8d9b83b9d0ed58fe4488b0"} Nov 29 05:30:24 crc kubenswrapper[4594]: W1129 05:30:24.379705 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e3c263e_0845_4d3e_865f_e778b9ff3e53.slice/crio-33713c831ba8956b23883486c5bf715f63983df245179e6c626acd9cd150f76b WatchSource:0}: Error finding container 33713c831ba8956b23883486c5bf715f63983df245179e6c626acd9cd150f76b: Status 404 returned error can't find the container with id 33713c831ba8956b23883486c5bf715f63983df245179e6c626acd9cd150f76b Nov 29 05:30:24 crc kubenswrapper[4594]: I1129 05:30:24.397786 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-qdj7h" event={"ID":"bd9200ef-4b0d-4e13-b06e-8f59d7e366ee","Type":"ContainerStarted","Data":"8fc3614c7c810f38725302f1f9f8919952938551a104b02080d938e4423c2cc3"} Nov 29 05:30:24 crc kubenswrapper[4594]: I1129 05:30:24.399355 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-x5tn8" podStartSLOduration=127.399336075 podStartE2EDuration="2m7.399336075s" podCreationTimestamp="2025-11-29 05:28:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:30:24.398082381 +0000 UTC m=+148.638591591" watchObservedRunningTime="2025-11-29 05:30:24.399336075 +0000 UTC m=+148.639845294" Nov 29 05:30:24 crc kubenswrapper[4594]: I1129 05:30:24.411175 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-jgzzn" event={"ID":"873587ac-5d12-45c4-bfd5-42bc08d29f65","Type":"ContainerStarted","Data":"0706d51c8d419c700a90733df99b46ebd6ee7228cd9d38b6f8cbb0832a8c6b2d"} Nov 29 05:30:24 crc 
kubenswrapper[4594]: I1129 05:30:24.420382 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-g4pv8" event={"ID":"d1bc9264-d190-49c0-9122-7cc123dd88e5","Type":"ContainerStarted","Data":"2f393d743efcb70e667233469619636ee940c74746d843c7f7871a6b18ea29ee"} Nov 29 05:30:24 crc kubenswrapper[4594]: I1129 05:30:24.427067 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-f9dqp" event={"ID":"09980e91-b2e4-4a0e-bee7-dc101096f804","Type":"ContainerStarted","Data":"bdc88784644f2064f591d4594666df06068b4d2554b811cb99429d52abb6b4b9"} Nov 29 05:30:24 crc kubenswrapper[4594]: I1129 05:30:24.439849 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-rwbhx" event={"ID":"fc2b6738-957d-4636-a723-c11207f29087","Type":"ContainerStarted","Data":"f80c664fe46aa3d1aebd7951e3beebdc8058879b225b3934a4e87e8d61c76bf6"} Nov 29 05:30:24 crc kubenswrapper[4594]: I1129 05:30:24.440149 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-rwbhx" Nov 29 05:30:24 crc kubenswrapper[4594]: I1129 05:30:24.444842 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-chxbx\" (UID: \"0945b059-dd2b-4550-8b8a-49ea4e94ffeb\") " pod="openshift-image-registry/image-registry-697d97f7c8-chxbx" Nov 29 05:30:24 crc kubenswrapper[4594]: E1129 05:30:24.445153 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 05:30:24.945138746 +0000 UTC m=+149.185647967 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-chxbx" (UID: "0945b059-dd2b-4550-8b8a-49ea4e94ffeb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 05:30:24 crc kubenswrapper[4594]: I1129 05:30:24.460861 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-gv8n5" event={"ID":"6b9af1ca-d5ac-4278-8e23-510ddfc2ea8c","Type":"ContainerStarted","Data":"4ec2a785d4ead81b7993d57694598bdde135f1598b6784b3e97770b9f2188d8e"} Nov 29 05:30:24 crc kubenswrapper[4594]: I1129 05:30:24.462747 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"a853ee61e80e952c6276a652feb94dabcd127381d2bc02512274133d43f6ffed"} Nov 29 05:30:24 crc kubenswrapper[4594]: I1129 05:30:24.464390 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-8jv2k" event={"ID":"5d58f2a9-3eb4-4bf3-a27c-4df9ddcd5791","Type":"ContainerStarted","Data":"954343be9e6882f0e97df63ae057d02d79a00edc8fddb205743878e448eac3d1"} Nov 29 05:30:24 crc kubenswrapper[4594]: I1129 05:30:24.464414 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-8jv2k" event={"ID":"5d58f2a9-3eb4-4bf3-a27c-4df9ddcd5791","Type":"ContainerStarted","Data":"770d189b33ee2b7ae196f96574962e48f8cb7bc60ebe9c58b44e59cc62027999"} Nov 29 05:30:24 crc kubenswrapper[4594]: I1129 05:30:24.475287 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-p9km8" event={"ID":"ad8c6194-f489-42f3-afb2-707a97b490ad","Type":"ContainerStarted","Data":"baec5ee7fa63f2579dd0efe248af266f1ecc5f3266b0c7d7202b421f3dc16840"} Nov 29 05:30:24 crc kubenswrapper[4594]: I1129 05:30:24.489955 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wrhzj" event={"ID":"cd9d318d-3895-4fed-aa8b-faa43f64ded2","Type":"ContainerStarted","Data":"5a947e788432fe354ca002c690cc7242899ebf661f253122e95824ca96d5db6d"} Nov 29 05:30:24 crc kubenswrapper[4594]: I1129 05:30:24.512915 4594 generic.go:334] "Generic (PLEG): container finished" podID="81534f91-740b-487b-9149-e8565ccf9905" containerID="d9ac188de56b246344c6cf9e7c45b5dce4ec564be7ae5fd81f84dc46bb5e3111" exitCode=0 Nov 29 05:30:24 crc kubenswrapper[4594]: I1129 05:30:24.513035 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g9dnn" event={"ID":"81534f91-740b-487b-9149-e8565ccf9905","Type":"ContainerDied","Data":"d9ac188de56b246344c6cf9e7c45b5dce4ec564be7ae5fd81f84dc46bb5e3111"} Nov 29 05:30:24 crc kubenswrapper[4594]: I1129 05:30:24.516954 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-khv8v" event={"ID":"637efb26-f3b6-4db6-b0bc-d1ea0dcd85a5","Type":"ContainerStarted","Data":"304a2b97a32930520c529eba9a342ed7ece58c5cc47869d3a1b4260dbe1eb74d"} Nov 29 05:30:24 crc kubenswrapper[4594]: I1129 05:30:24.518597 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29406570-47pck" event={"ID":"b5059cab-0e22-479f-9079-b031f405e547","Type":"ContainerStarted","Data":"130081b2918769b1e6214a194301b433b5d996686cc35a5e69e4e2f004e0084c"} Nov 29 05:30:24 crc kubenswrapper[4594]: I1129 05:30:24.520605 4594 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"1e69f416fd05033a3d1dfef2bc86a4325a77a21187b98fc80f2c07cfab03c2e4"} Nov 29 05:30:24 crc kubenswrapper[4594]: I1129 05:30:24.520677 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"86d7c7e02f8b5fdecce485567bbafd6cd3d5666d0523fa31b065be90f682844b"} Nov 29 05:30:24 crc kubenswrapper[4594]: I1129 05:30:24.539743 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-w5nnp" event={"ID":"724c0c5a-3d4d-41ef-9759-f9a1b62c99fe","Type":"ContainerStarted","Data":"c2dc6748bd2e100361be2e47c5ca2d49e9ba7671397691b9312692e4307eeaf5"} Nov 29 05:30:24 crc kubenswrapper[4594]: I1129 05:30:24.542159 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zvvn6" podStartSLOduration=127.542143922 podStartE2EDuration="2m7.542143922s" podCreationTimestamp="2025-11-29 05:28:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:30:24.540724529 +0000 UTC m=+148.781233749" watchObservedRunningTime="2025-11-29 05:30:24.542143922 +0000 UTC m=+148.782653132" Nov 29 05:30:24 crc kubenswrapper[4594]: I1129 05:30:24.545563 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2fqlt" event={"ID":"388fd604-2764-4184-b305-ce8f8e31ffa1","Type":"ContainerStarted","Data":"2b43d9883f76b388e631b281af14332becbba9934c1882dd32449629bbac81b4"} Nov 29 05:30:24 crc kubenswrapper[4594]: I1129 05:30:24.545607 4594 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2fqlt" event={"ID":"388fd604-2764-4184-b305-ce8f8e31ffa1","Type":"ContainerStarted","Data":"1287d4fc7392e2105392f1508b2fde81f85ed36287ef16e4dbc0d174eb8fd5da"} Nov 29 05:30:24 crc kubenswrapper[4594]: I1129 05:30:24.547393 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 05:30:24 crc kubenswrapper[4594]: E1129 05:30:24.551583 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 05:30:25.051567997 +0000 UTC m=+149.292077217 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 05:30:24 crc kubenswrapper[4594]: I1129 05:30:24.553113 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-chxbx\" (UID: \"0945b059-dd2b-4550-8b8a-49ea4e94ffeb\") " pod="openshift-image-registry/image-registry-697d97f7c8-chxbx" Nov 29 05:30:24 crc kubenswrapper[4594]: E1129 05:30:24.557238 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 05:30:25.057230642 +0000 UTC m=+149.297739861 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-chxbx" (UID: "0945b059-dd2b-4550-8b8a-49ea4e94ffeb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 05:30:24 crc kubenswrapper[4594]: I1129 05:30:24.565107 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-jsnt5" event={"ID":"911e02cc-6184-4c63-a36c-a51b54d0e7bf","Type":"ContainerStarted","Data":"275771afd21989408f7163c808e0cc4bcafac219cc32de2604210abdd903779c"} Nov 29 05:30:24 crc kubenswrapper[4594]: I1129 05:30:24.578330 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-pjh6b" event={"ID":"05218cdc-d6bb-47ef-aab3-9a5e0dab68b8","Type":"ContainerStarted","Data":"7a44d7929ab3b0b43816221ead17ec5b548c39aa6de24bd3cccaec9d5b96fa15"} Nov 29 05:30:24 crc kubenswrapper[4594]: I1129 05:30:24.597911 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-trhms" event={"ID":"fd86a99b-7671-4dd9-88b9-334db1906b6b","Type":"ContainerStarted","Data":"1c92d06eb5461c13b0923dbe131721262363a79f15ac271304157a61f0dfd197"} Nov 29 05:30:24 crc kubenswrapper[4594]: I1129 05:30:24.597948 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-trhms" Nov 29 05:30:24 crc kubenswrapper[4594]: I1129 05:30:24.603763 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-scxzk" Nov 29 05:30:24 crc kubenswrapper[4594]: I1129 05:30:24.661301 4594 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 05:30:24 crc kubenswrapper[4594]: E1129 05:30:24.661691 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 05:30:25.161662237 +0000 UTC m=+149.402171457 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 05:30:24 crc kubenswrapper[4594]: I1129 05:30:24.668567 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-chxbx\" (UID: \"0945b059-dd2b-4550-8b8a-49ea4e94ffeb\") " pod="openshift-image-registry/image-registry-697d97f7c8-chxbx" Nov 29 05:30:24 crc kubenswrapper[4594]: I1129 05:30:24.673629 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-xg8s7" podStartSLOduration=127.673610801 podStartE2EDuration="2m7.673610801s" podCreationTimestamp="2025-11-29 05:28:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:30:24.673026499 +0000 UTC m=+148.913535719" watchObservedRunningTime="2025-11-29 05:30:24.673610801 +0000 UTC m=+148.914120011" Nov 29 05:30:24 crc kubenswrapper[4594]: E1129 05:30:24.676643 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 05:30:25.176622841 +0000 UTC m=+149.417132061 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-chxbx" (UID: "0945b059-dd2b-4550-8b8a-49ea4e94ffeb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 05:30:24 crc kubenswrapper[4594]: I1129 05:30:24.699777 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-rwbhx" Nov 29 05:30:24 crc kubenswrapper[4594]: I1129 05:30:24.702796 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-8jv2k" podStartSLOduration=5.702778288 podStartE2EDuration="5.702778288s" podCreationTimestamp="2025-11-29 05:30:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:30:24.6998528 +0000 UTC m=+148.940362021" watchObservedRunningTime="2025-11-29 05:30:24.702778288 +0000 UTC m=+148.943287508" Nov 29 05:30:24 crc kubenswrapper[4594]: I1129 05:30:24.789533 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-rwbhx" 
podStartSLOduration=127.789511624 podStartE2EDuration="2m7.789511624s" podCreationTimestamp="2025-11-29 05:28:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:30:24.787760561 +0000 UTC m=+149.028269781" watchObservedRunningTime="2025-11-29 05:30:24.789511624 +0000 UTC m=+149.030020845" Nov 29 05:30:24 crc kubenswrapper[4594]: I1129 05:30:24.810316 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 05:30:24 crc kubenswrapper[4594]: E1129 05:30:24.810781 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 05:30:25.310767294 +0000 UTC m=+149.551276514 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 05:30:24 crc kubenswrapper[4594]: I1129 05:30:24.839758 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-p9km8" podStartSLOduration=127.839742653 podStartE2EDuration="2m7.839742653s" podCreationTimestamp="2025-11-29 05:28:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:30:24.837493279 +0000 UTC m=+149.078002499" watchObservedRunningTime="2025-11-29 05:30:24.839742653 +0000 UTC m=+149.080251873" Nov 29 05:30:24 crc kubenswrapper[4594]: I1129 05:30:24.924190 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-chxbx\" (UID: \"0945b059-dd2b-4550-8b8a-49ea4e94ffeb\") " pod="openshift-image-registry/image-registry-697d97f7c8-chxbx" Nov 29 05:30:24 crc kubenswrapper[4594]: E1129 05:30:24.924612 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 05:30:25.424600293 +0000 UTC m=+149.665109513 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-chxbx" (UID: "0945b059-dd2b-4550-8b8a-49ea4e94ffeb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 05:30:24 crc kubenswrapper[4594]: I1129 05:30:24.959350 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4qlf7" podStartSLOduration=127.959333324 podStartE2EDuration="2m7.959333324s" podCreationTimestamp="2025-11-29 05:28:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:30:24.887884687 +0000 UTC m=+149.128393908" watchObservedRunningTime="2025-11-29 05:30:24.959333324 +0000 UTC m=+149.199842544" Nov 29 05:30:25 crc kubenswrapper[4594]: I1129 05:30:24.998358 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-f9dqp" podStartSLOduration=127.998344321 podStartE2EDuration="2m7.998344321s" podCreationTimestamp="2025-11-29 05:28:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:30:24.960968481 +0000 UTC m=+149.201477700" watchObservedRunningTime="2025-11-29 05:30:24.998344321 +0000 UTC m=+149.238853541" Nov 29 05:30:25 crc kubenswrapper[4594]: I1129 05:30:25.004867 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8t4sg" podStartSLOduration=128.004839981 podStartE2EDuration="2m8.004839981s" podCreationTimestamp="2025-11-29 05:28:17 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:30:24.996622372 +0000 UTC m=+149.237131593" watchObservedRunningTime="2025-11-29 05:30:25.004839981 +0000 UTC m=+149.245349192" Nov 29 05:30:25 crc kubenswrapper[4594]: I1129 05:30:25.028632 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 05:30:25 crc kubenswrapper[4594]: E1129 05:30:25.029205 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 05:30:25.529188553 +0000 UTC m=+149.769697773 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 05:30:25 crc kubenswrapper[4594]: I1129 05:30:25.114372 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29406570-47pck" podStartSLOduration=25.114332849 podStartE2EDuration="25.114332849s" podCreationTimestamp="2025-11-29 05:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:30:25.112788752 +0000 UTC m=+149.353297972" watchObservedRunningTime="2025-11-29 05:30:25.114332849 +0000 UTC m=+149.354842069" Nov 29 05:30:25 crc kubenswrapper[4594]: I1129 05:30:25.114476 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-jsnt5" podStartSLOduration=128.114469624 podStartE2EDuration="2m8.114469624s" podCreationTimestamp="2025-11-29 05:28:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:30:25.076020658 +0000 UTC m=+149.316529878" watchObservedRunningTime="2025-11-29 05:30:25.114469624 +0000 UTC m=+149.354978844" Nov 29 05:30:25 crc kubenswrapper[4594]: I1129 05:30:25.142079 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-chxbx\" (UID: 
\"0945b059-dd2b-4550-8b8a-49ea4e94ffeb\") " pod="openshift-image-registry/image-registry-697d97f7c8-chxbx" Nov 29 05:30:25 crc kubenswrapper[4594]: E1129 05:30:25.142397 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 05:30:25.642386674 +0000 UTC m=+149.882895895 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-chxbx" (UID: "0945b059-dd2b-4550-8b8a-49ea4e94ffeb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 05:30:25 crc kubenswrapper[4594]: I1129 05:30:25.157322 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2fqlt" podStartSLOduration=128.15730585 podStartE2EDuration="2m8.15730585s" podCreationTimestamp="2025-11-29 05:28:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:30:25.139161917 +0000 UTC m=+149.379671138" watchObservedRunningTime="2025-11-29 05:30:25.15730585 +0000 UTC m=+149.397815071" Nov 29 05:30:25 crc kubenswrapper[4594]: I1129 05:30:25.183608 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-w5nnp" podStartSLOduration=128.183588206 podStartE2EDuration="2m8.183588206s" podCreationTimestamp="2025-11-29 05:28:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-11-29 05:30:25.17649135 +0000 UTC m=+149.417000570" watchObservedRunningTime="2025-11-29 05:30:25.183588206 +0000 UTC m=+149.424097426" Nov 29 05:30:25 crc kubenswrapper[4594]: I1129 05:30:25.265117 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-trhms" podStartSLOduration=128.265084302 podStartE2EDuration="2m8.265084302s" podCreationTimestamp="2025-11-29 05:28:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:30:25.231197665 +0000 UTC m=+149.471706885" watchObservedRunningTime="2025-11-29 05:30:25.265084302 +0000 UTC m=+149.505593522" Nov 29 05:30:25 crc kubenswrapper[4594]: I1129 05:30:25.272496 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 05:30:25 crc kubenswrapper[4594]: E1129 05:30:25.272962 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 05:30:25.772938203 +0000 UTC m=+150.013447423 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 05:30:25 crc kubenswrapper[4594]: I1129 05:30:25.375009 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-chxbx\" (UID: \"0945b059-dd2b-4550-8b8a-49ea4e94ffeb\") " pod="openshift-image-registry/image-registry-697d97f7c8-chxbx" Nov 29 05:30:25 crc kubenswrapper[4594]: E1129 05:30:25.375420 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 05:30:25.875406249 +0000 UTC m=+150.115915469 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-chxbx" (UID: "0945b059-dd2b-4550-8b8a-49ea4e94ffeb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 05:30:25 crc kubenswrapper[4594]: I1129 05:30:25.478173 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 05:30:25 crc kubenswrapper[4594]: E1129 05:30:25.478869 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 05:30:25.978854788 +0000 UTC m=+150.219364008 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 05:30:25 crc kubenswrapper[4594]: I1129 05:30:25.581585 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-chxbx\" (UID: \"0945b059-dd2b-4550-8b8a-49ea4e94ffeb\") " pod="openshift-image-registry/image-registry-697d97f7c8-chxbx" Nov 29 05:30:25 crc kubenswrapper[4594]: E1129 05:30:25.581941 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 05:30:26.081927415 +0000 UTC m=+150.322436635 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-chxbx" (UID: "0945b059-dd2b-4550-8b8a-49ea4e94ffeb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 05:30:25 crc kubenswrapper[4594]: I1129 05:30:25.633502 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-pjh6b" event={"ID":"05218cdc-d6bb-47ef-aab3-9a5e0dab68b8","Type":"ContainerStarted","Data":"b5ea6a5f0c65ddfe150d6a1247d9c25a90034025ec1cfb9a5d9d17e10c4c39f6"} Nov 29 05:30:25 crc kubenswrapper[4594]: I1129 05:30:25.653567 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9n22d" event={"ID":"7af45636-74c3-4a18-8297-ba3fdde46ace","Type":"ContainerStarted","Data":"7920d744760e551f49211be15dcd8a222febf4cb1a86986599187d502b00fddd"} Nov 29 05:30:25 crc kubenswrapper[4594]: I1129 05:30:25.684015 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9n22d" podStartSLOduration=128.683994842 podStartE2EDuration="2m8.683994842s" podCreationTimestamp="2025-11-29 05:28:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:30:25.682707146 +0000 UTC m=+149.923216355" watchObservedRunningTime="2025-11-29 05:30:25.683994842 +0000 UTC m=+149.924504062" Nov 29 05:30:25 crc kubenswrapper[4594]: I1129 05:30:25.684998 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-w5nnp" event={"ID":"724c0c5a-3d4d-41ef-9759-f9a1b62c99fe","Type":"ContainerStarted","Data":"5050f2884ffdab1455221df931b1a94305640ce88a4de6414d8e0517c9924cd3"} Nov 29 05:30:25 crc kubenswrapper[4594]: I1129 05:30:25.686213 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 05:30:25 crc kubenswrapper[4594]: E1129 05:30:25.686393 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 05:30:26.186374139 +0000 UTC m=+150.426883359 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 05:30:25 crc kubenswrapper[4594]: I1129 05:30:25.686512 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-chxbx\" (UID: \"0945b059-dd2b-4550-8b8a-49ea4e94ffeb\") " pod="openshift-image-registry/image-registry-697d97f7c8-chxbx" Nov 29 05:30:25 crc kubenswrapper[4594]: E1129 05:30:25.687034 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 05:30:26.187006351 +0000 UTC m=+150.427515571 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-chxbx" (UID: "0945b059-dd2b-4550-8b8a-49ea4e94ffeb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 05:30:25 crc kubenswrapper[4594]: I1129 05:30:25.693213 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-pmjzq" event={"ID":"81d6a548-1be4-41a1-8972-ed027e6895aa","Type":"ContainerStarted","Data":"86ebfc12c44b1268ea0e121b6959c07b12771f1cafd7d1918400eb0ee91cae2f"} Nov 29 05:30:25 crc kubenswrapper[4594]: I1129 05:30:25.704211 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nhk7j" event={"ID":"f9a409f3-6983-4162-9c32-355020bc1c3a","Type":"ContainerStarted","Data":"545a00763f7e7188036a236152b8b5cf20bba19ec57846b65f64641d2352b33a"} Nov 29 05:30:25 crc kubenswrapper[4594]: I1129 05:30:25.704244 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nhk7j" event={"ID":"f9a409f3-6983-4162-9c32-355020bc1c3a","Type":"ContainerStarted","Data":"e7e3a91edd881ca89dc13bd8351a09291daa72e8a6661af50f5494777ad90bb0"} Nov 29 05:30:25 crc kubenswrapper[4594]: I1129 05:30:25.716510 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wqr7v" event={"ID":"b8a99cf8-64bd-4e83-b77e-c4358154f10e","Type":"ContainerStarted","Data":"c6059d117f6c2cc5fa6c8c2f845ff860602e1faa183a2730dec7f2d2d11147ca"} Nov 29 05:30:25 crc kubenswrapper[4594]: I1129 05:30:25.753830 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6mjfr" event={"ID":"e247f646-534e-4aca-8cd9-edadba75848b","Type":"ContainerStarted","Data":"0a320b3efa3dfcd5f1ddbeb621b24685559505ab609c0cc6fe2a8ec5dfb6fb89"} Nov 29 05:30:25 crc kubenswrapper[4594]: I1129 05:30:25.756983 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6mjfr" Nov 29 05:30:25 crc kubenswrapper[4594]: I1129 05:30:25.762247 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-8pvtn" event={"ID":"c3e4199d-2fe7-4039-8f59-b8bab2beebd0","Type":"ContainerStarted","Data":"5d36acc4ff29b186c477cba3277e68b63d3c541cb093d61f98e8f4cf0513c5c2"} Nov 29 05:30:25 crc kubenswrapper[4594]: I1129 05:30:25.764318 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-g4pv8" event={"ID":"d1bc9264-d190-49c0-9122-7cc123dd88e5","Type":"ContainerStarted","Data":"94ac456bc64eb74dd0d3f19cac2dc669675a5d5294322a5f5c552fae4588027d"} Nov 29 05:30:25 crc kubenswrapper[4594]: I1129 05:30:25.782164 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6mjfr" Nov 29 05:30:25 crc kubenswrapper[4594]: I1129 05:30:25.790664 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 05:30:25 crc kubenswrapper[4594]: E1129 05:30:25.792166 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-11-29 05:30:26.292149737 +0000 UTC m=+150.532658957 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 05:30:25 crc kubenswrapper[4594]: I1129 05:30:25.803847 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-khv8v" event={"ID":"637efb26-f3b6-4db6-b0bc-d1ea0dcd85a5","Type":"ContainerStarted","Data":"1154bc3811e310c6cfca1739f9b6260a2a7bfb6eb2d2b5d848a950df1a77f919"} Nov 29 05:30:25 crc kubenswrapper[4594]: I1129 05:30:25.852944 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6mjfr" podStartSLOduration=128.852929433 podStartE2EDuration="2m8.852929433s" podCreationTimestamp="2025-11-29 05:28:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:30:25.852470576 +0000 UTC m=+150.092979795" watchObservedRunningTime="2025-11-29 05:30:25.852929433 +0000 UTC m=+150.093438653" Nov 29 05:30:25 crc kubenswrapper[4594]: I1129 05:30:25.854883 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"d9210fa1bf0e881bb14c044dee4abe1d32686c935a0c3c7173d8074a71195687"} Nov 29 05:30:25 crc kubenswrapper[4594]: I1129 05:30:25.865175 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-qdj7h" event={"ID":"bd9200ef-4b0d-4e13-b06e-8f59d7e366ee","Type":"ContainerStarted","Data":"d7d2528da128aeaf2c6945e6aaf07b9616d65ef0b2b0ff918b7bb0c22f7bb346"} Nov 29 05:30:25 crc kubenswrapper[4594]: I1129 05:30:25.886726 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-wq549" event={"ID":"52daf44c-8b94-4d9c-8fcb-eeb3d8fd286f","Type":"ContainerStarted","Data":"9732fa1307cf198c0692bae833d556eb4a0e39dea541c4adb4437fbcb2189670"} Nov 29 05:30:25 crc kubenswrapper[4594]: I1129 05:30:25.892460 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-chxbx\" (UID: \"0945b059-dd2b-4550-8b8a-49ea4e94ffeb\") " pod="openshift-image-registry/image-registry-697d97f7c8-chxbx" Nov 29 05:30:25 crc kubenswrapper[4594]: E1129 05:30:25.895450 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 05:30:26.39543463 +0000 UTC m=+150.635943841 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-chxbx" (UID: "0945b059-dd2b-4550-8b8a-49ea4e94ffeb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 05:30:25 crc kubenswrapper[4594]: I1129 05:30:25.907736 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"f9e44ca4dd4bdb2135454703c94a32b4b45bbe95e51baaab58a49d073f195785"} Nov 29 05:30:25 crc kubenswrapper[4594]: I1129 05:30:25.924652 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gspdt" event={"ID":"1fea2129-7ad0-45d8-9447-315107ef1c0c","Type":"ContainerStarted","Data":"d52f7b68f77981fc9bd8bde42cf81cf80cc916f8df661bc432b64210f8571b6d"} Nov 29 05:30:25 crc kubenswrapper[4594]: I1129 05:30:25.955443 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-wpkbq" event={"ID":"93df4c2b-9b0b-476e-a92f-55cb8e3ec8c3","Type":"ContainerStarted","Data":"d6f306adebf8e3411ffaeba0331185562e8dc97bc8a64100e05298d876180b67"} Nov 29 05:30:25 crc kubenswrapper[4594]: I1129 05:30:25.956172 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-wpkbq" Nov 29 05:30:25 crc kubenswrapper[4594]: I1129 05:30:25.983897 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gzzjt" 
event={"ID":"837571cc-8fd2-4102-9997-4520ddf6da08","Type":"ContainerStarted","Data":"38ecd6c114534cf827d8ec905058caf143f3c2385c9043cf72d7a820a8bff9bb"} Nov 29 05:30:25 crc kubenswrapper[4594]: I1129 05:30:25.993085 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 05:30:25 crc kubenswrapper[4594]: E1129 05:30:25.993955 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 05:30:26.493941574 +0000 UTC m=+150.734450794 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 05:30:25 crc kubenswrapper[4594]: I1129 05:30:25.999529 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-qkgqz" event={"ID":"9e3c263e-0845-4d3e-865f-e778b9ff3e53","Type":"ContainerStarted","Data":"33713c831ba8956b23883486c5bf715f63983df245179e6c626acd9cd150f76b"} Nov 29 05:30:26 crc kubenswrapper[4594]: I1129 05:30:26.019415 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wrhzj" 
event={"ID":"cd9d318d-3895-4fed-aa8b-faa43f64ded2","Type":"ContainerStarted","Data":"d1ffa6a456ebf4f0b4a2c3aefd2f0083af48900750a4ef817fe2e41e1d7194e0"} Nov 29 05:30:26 crc kubenswrapper[4594]: I1129 05:30:26.046650 4594 generic.go:334] "Generic (PLEG): container finished" podID="873587ac-5d12-45c4-bfd5-42bc08d29f65" containerID="319d8f5a20819a0a2874f812e8784a79b8dfc06ec6cf361337f804b396d3e201" exitCode=0 Nov 29 05:30:26 crc kubenswrapper[4594]: I1129 05:30:26.046705 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-jgzzn" event={"ID":"873587ac-5d12-45c4-bfd5-42bc08d29f65","Type":"ContainerDied","Data":"319d8f5a20819a0a2874f812e8784a79b8dfc06ec6cf361337f804b396d3e201"} Nov 29 05:30:26 crc kubenswrapper[4594]: I1129 05:30:26.056773 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-95k6p" event={"ID":"0cb87a03-d210-426d-a903-56fd9b55852b","Type":"ContainerStarted","Data":"de9ac7d67a2c33bc584a36eafc95f1acb4ae906d05e99cc483e8382026742bbc"} Nov 29 05:30:26 crc kubenswrapper[4594]: I1129 05:30:26.057342 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-95k6p" Nov 29 05:30:26 crc kubenswrapper[4594]: I1129 05:30:26.063498 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-f9dqp" event={"ID":"09980e91-b2e4-4a0e-bee7-dc101096f804","Type":"ContainerStarted","Data":"7241e011f724849aca5d7bb08f8cb0dd5507c4f69a672a8ac02ed9dfa0a2c911"} Nov 29 05:30:26 crc kubenswrapper[4594]: I1129 05:30:26.070193 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-k9wmb" event={"ID":"c584b692-db15-4724-a208-491507a8474e","Type":"ContainerStarted","Data":"2b6929827669e8b121dca40b372f59d2cf862f4027a8d5f8fc0cdc449575c3d6"} Nov 29 05:30:26 crc kubenswrapper[4594]: I1129 
05:30:26.070225 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-k9wmb" event={"ID":"c584b692-db15-4724-a208-491507a8474e","Type":"ContainerStarted","Data":"40f3a3bfc4f4b842c3f55bc1198befae401b30ba53f91ac872864779204899be"} Nov 29 05:30:26 crc kubenswrapper[4594]: I1129 05:30:26.070832 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-k9wmb" Nov 29 05:30:26 crc kubenswrapper[4594]: I1129 05:30:26.072414 4594 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-k9wmb container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.39:8080/healthz\": dial tcp 10.217.0.39:8080: connect: connection refused" start-of-body= Nov 29 05:30:26 crc kubenswrapper[4594]: I1129 05:30:26.072452 4594 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-k9wmb" podUID="c584b692-db15-4724-a208-491507a8474e" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.39:8080/healthz\": dial tcp 10.217.0.39:8080: connect: connection refused" Nov 29 05:30:26 crc kubenswrapper[4594]: I1129 05:30:26.079795 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-g9rtk" event={"ID":"d68c506f-d0c1-4889-ab10-b886ef7880a7","Type":"ContainerStarted","Data":"f138a258b6ffe8827c13391e81b65e47aa123443ff8402d9ab29e9896668490d"} Nov 29 05:30:26 crc kubenswrapper[4594]: I1129 05:30:26.097966 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-chxbx\" (UID: \"0945b059-dd2b-4550-8b8a-49ea4e94ffeb\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-chxbx" Nov 29 05:30:26 crc kubenswrapper[4594]: E1129 05:30:26.100898 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 05:30:26.600882811 +0000 UTC m=+150.841392031 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-chxbx" (UID: "0945b059-dd2b-4550-8b8a-49ea4e94ffeb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 05:30:26 crc kubenswrapper[4594]: I1129 05:30:26.104044 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zqmhf" event={"ID":"969b3fd9-7b46-4627-bda2-e2fb4a4f5203","Type":"ContainerStarted","Data":"e114ece263a27ea9f02093e761963d68719826bbad6d3e793604dc19beb75239"} Nov 29 05:30:26 crc kubenswrapper[4594]: I1129 05:30:26.119520 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-8pvtn" podStartSLOduration=129.119503085 podStartE2EDuration="2m9.119503085s" podCreationTimestamp="2025-11-29 05:28:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:30:26.064679663 +0000 UTC m=+150.305188872" watchObservedRunningTime="2025-11-29 05:30:26.119503085 +0000 UTC m=+150.360012305" Nov 29 05:30:26 crc kubenswrapper[4594]: I1129 05:30:26.120420 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gzzjt" podStartSLOduration=129.120411092 podStartE2EDuration="2m9.120411092s" podCreationTimestamp="2025-11-29 05:28:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:30:26.120403899 +0000 UTC m=+150.360913118" watchObservedRunningTime="2025-11-29 05:30:26.120411092 +0000 UTC m=+150.360920312" Nov 29 05:30:26 crc kubenswrapper[4594]: I1129 05:30:26.127333 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29406570-47pck" event={"ID":"b5059cab-0e22-479f-9079-b031f405e547","Type":"ContainerStarted","Data":"8eae648ce95bfb31bd79da0a697d004da9431becd62d6f6dddac172b6a954eab"} Nov 29 05:30:26 crc kubenswrapper[4594]: I1129 05:30:26.145142 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zp2th" event={"ID":"bf5ed613-ce39-4081-8342-aef24fb3a385","Type":"ContainerStarted","Data":"e0f59cfca8a4ce63cfc84e4d8a1f91e76b79fd1094cfe33db2d9d07d42393cc0"} Nov 29 05:30:26 crc kubenswrapper[4594]: I1129 05:30:26.145836 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zp2th" Nov 29 05:30:26 crc kubenswrapper[4594]: I1129 05:30:26.148189 4594 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-zp2th container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.29:8443/healthz\": dial tcp 10.217.0.29:8443: connect: connection refused" start-of-body= Nov 29 05:30:26 crc kubenswrapper[4594]: I1129 05:30:26.148241 4594 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zp2th" podUID="bf5ed613-ce39-4081-8342-aef24fb3a385" containerName="catalog-operator" 
probeResult="failure" output="Get \"https://10.217.0.29:8443/healthz\": dial tcp 10.217.0.29:8443: connect: connection refused" Nov 29 05:30:26 crc kubenswrapper[4594]: I1129 05:30:26.151858 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-gv8n5" event={"ID":"6b9af1ca-d5ac-4278-8e23-510ddfc2ea8c","Type":"ContainerStarted","Data":"56d89c715d872a7cfef935d28fc1c87371679e6bc4ccdb288a873d129794df90"} Nov 29 05:30:26 crc kubenswrapper[4594]: I1129 05:30:26.153871 4594 patch_prober.go:28] interesting pod/downloads-7954f5f757-xg8s7 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Nov 29 05:30:26 crc kubenswrapper[4594]: I1129 05:30:26.153902 4594 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-xg8s7" podUID="6c685741-a472-49fe-8237-520bea0232ef" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Nov 29 05:30:26 crc kubenswrapper[4594]: I1129 05:30:26.173089 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-trhms" Nov 29 05:30:26 crc kubenswrapper[4594]: I1129 05:30:26.174990 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-95k6p" podStartSLOduration=129.174960822 podStartE2EDuration="2m9.174960822s" podCreationTimestamp="2025-11-29 05:28:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:30:26.160756983 +0000 UTC m=+150.401266204" watchObservedRunningTime="2025-11-29 05:30:26.174960822 +0000 UTC m=+150.415470042" Nov 29 05:30:26 crc 
kubenswrapper[4594]: I1129 05:30:26.189891 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-qdj7h" podStartSLOduration=129.189867455 podStartE2EDuration="2m9.189867455s" podCreationTimestamp="2025-11-29 05:28:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:30:26.183788854 +0000 UTC m=+150.424298073" watchObservedRunningTime="2025-11-29 05:30:26.189867455 +0000 UTC m=+150.430376675" Nov 29 05:30:26 crc kubenswrapper[4594]: I1129 05:30:26.201314 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 05:30:26 crc kubenswrapper[4594]: E1129 05:30:26.210671 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 05:30:26.710642606 +0000 UTC m=+150.951151826 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 05:30:26 crc kubenswrapper[4594]: I1129 05:30:26.234526 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-chxbx\" (UID: \"0945b059-dd2b-4550-8b8a-49ea4e94ffeb\") " pod="openshift-image-registry/image-registry-697d97f7c8-chxbx" Nov 29 05:30:26 crc kubenswrapper[4594]: E1129 05:30:26.239726 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 05:30:26.739706051 +0000 UTC m=+150.980215270 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-chxbx" (UID: "0945b059-dd2b-4550-8b8a-49ea4e94ffeb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 05:30:26 crc kubenswrapper[4594]: I1129 05:30:26.267033 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-qkgqz" podStartSLOduration=129.267016978 podStartE2EDuration="2m9.267016978s" podCreationTimestamp="2025-11-29 05:28:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:30:26.228056285 +0000 UTC m=+150.468565505" watchObservedRunningTime="2025-11-29 05:30:26.267016978 +0000 UTC m=+150.507526197" Nov 29 05:30:26 crc kubenswrapper[4594]: I1129 05:30:26.283389 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-wq549" podStartSLOduration=7.283374002 podStartE2EDuration="7.283374002s" podCreationTimestamp="2025-11-29 05:30:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:30:26.264537143 +0000 UTC m=+150.505046362" watchObservedRunningTime="2025-11-29 05:30:26.283374002 +0000 UTC m=+150.523883222" Nov 29 05:30:26 crc kubenswrapper[4594]: I1129 05:30:26.296930 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-wpkbq" Nov 29 05:30:26 crc kubenswrapper[4594]: I1129 05:30:26.346307 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 05:30:26 crc kubenswrapper[4594]: E1129 05:30:26.347002 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 05:30:26.846985068 +0000 UTC m=+151.087494289 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 05:30:26 crc kubenswrapper[4594]: I1129 05:30:26.365731 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wrhzj" podStartSLOduration=129.365715958 podStartE2EDuration="2m9.365715958s" podCreationTimestamp="2025-11-29 05:28:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:30:26.320094567 +0000 UTC m=+150.560603787" watchObservedRunningTime="2025-11-29 05:30:26.365715958 +0000 UTC m=+150.606225179" Nov 29 05:30:26 crc kubenswrapper[4594]: I1129 05:30:26.419631 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gspdt" podStartSLOduration=129.419616657 
podStartE2EDuration="2m9.419616657s" podCreationTimestamp="2025-11-29 05:28:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:30:26.418942116 +0000 UTC m=+150.659451336" watchObservedRunningTime="2025-11-29 05:30:26.419616657 +0000 UTC m=+150.660125877" Nov 29 05:30:26 crc kubenswrapper[4594]: I1129 05:30:26.420623 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-wpkbq" podStartSLOduration=129.420616986 podStartE2EDuration="2m9.420616986s" podCreationTimestamp="2025-11-29 05:28:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:30:26.39997306 +0000 UTC m=+150.640482280" watchObservedRunningTime="2025-11-29 05:30:26.420616986 +0000 UTC m=+150.661126207" Nov 29 05:30:26 crc kubenswrapper[4594]: I1129 05:30:26.448033 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-chxbx\" (UID: \"0945b059-dd2b-4550-8b8a-49ea4e94ffeb\") " pod="openshift-image-registry/image-registry-697d97f7c8-chxbx" Nov 29 05:30:26 crc kubenswrapper[4594]: E1129 05:30:26.448315 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 05:30:26.948303937 +0000 UTC m=+151.188813157 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-chxbx" (UID: "0945b059-dd2b-4550-8b8a-49ea4e94ffeb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 05:30:26 crc kubenswrapper[4594]: I1129 05:30:26.459796 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-k9wmb" podStartSLOduration=129.459782532 podStartE2EDuration="2m9.459782532s" podCreationTimestamp="2025-11-29 05:28:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:30:26.458587959 +0000 UTC m=+150.699097179" watchObservedRunningTime="2025-11-29 05:30:26.459782532 +0000 UTC m=+150.700291752" Nov 29 05:30:26 crc kubenswrapper[4594]: I1129 05:30:26.484226 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zqmhf" podStartSLOduration=129.484213146 podStartE2EDuration="2m9.484213146s" podCreationTimestamp="2025-11-29 05:28:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:30:26.4831823 +0000 UTC m=+150.723691519" watchObservedRunningTime="2025-11-29 05:30:26.484213146 +0000 UTC m=+150.724722365" Nov 29 05:30:26 crc kubenswrapper[4594]: I1129 05:30:26.553566 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 05:30:26 crc kubenswrapper[4594]: E1129 05:30:26.553943 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 05:30:27.053930267 +0000 UTC m=+151.294439487 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 05:30:26 crc kubenswrapper[4594]: I1129 05:30:26.586606 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-gv8n5" podStartSLOduration=129.586592637 podStartE2EDuration="2m9.586592637s" podCreationTimestamp="2025-11-29 05:28:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:30:26.584308257 +0000 UTC m=+150.824817477" watchObservedRunningTime="2025-11-29 05:30:26.586592637 +0000 UTC m=+150.827101857" Nov 29 05:30:26 crc kubenswrapper[4594]: I1129 05:30:26.661242 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-chxbx\" (UID: \"0945b059-dd2b-4550-8b8a-49ea4e94ffeb\") " pod="openshift-image-registry/image-registry-697d97f7c8-chxbx" Nov 29 05:30:26 crc 
kubenswrapper[4594]: E1129 05:30:26.661569 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 05:30:27.16155976 +0000 UTC m=+151.402068981 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-chxbx" (UID: "0945b059-dd2b-4550-8b8a-49ea4e94ffeb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 05:30:26 crc kubenswrapper[4594]: I1129 05:30:26.663224 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-g9rtk" podStartSLOduration=129.663212539 podStartE2EDuration="2m9.663212539s" podCreationTimestamp="2025-11-29 05:28:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:30:26.614369165 +0000 UTC m=+150.854878385" watchObservedRunningTime="2025-11-29 05:30:26.663212539 +0000 UTC m=+150.903721759" Nov 29 05:30:26 crc kubenswrapper[4594]: I1129 05:30:26.667268 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-8pvtn" Nov 29 05:30:26 crc kubenswrapper[4594]: I1129 05:30:26.675949 4594 patch_prober.go:28] interesting pod/router-default-5444994796-8pvtn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 29 05:30:26 crc kubenswrapper[4594]: [-]has-synced failed: reason withheld Nov 29 05:30:26 
crc kubenswrapper[4594]: [+]process-running ok Nov 29 05:30:26 crc kubenswrapper[4594]: healthz check failed Nov 29 05:30:26 crc kubenswrapper[4594]: I1129 05:30:26.675982 4594 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8pvtn" podUID="c3e4199d-2fe7-4039-8f59-b8bab2beebd0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 29 05:30:26 crc kubenswrapper[4594]: I1129 05:30:26.693618 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zp2th" podStartSLOduration=129.693601811 podStartE2EDuration="2m9.693601811s" podCreationTimestamp="2025-11-29 05:28:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:30:26.664418754 +0000 UTC m=+150.904927974" watchObservedRunningTime="2025-11-29 05:30:26.693601811 +0000 UTC m=+150.934111031" Nov 29 05:30:26 crc kubenswrapper[4594]: I1129 05:30:26.762079 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 05:30:26 crc kubenswrapper[4594]: E1129 05:30:26.762430 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 05:30:27.262416154 +0000 UTC m=+151.502925374 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 05:30:26 crc kubenswrapper[4594]: I1129 05:30:26.815592 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-95k6p" Nov 29 05:30:26 crc kubenswrapper[4594]: I1129 05:30:26.864887 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-chxbx\" (UID: \"0945b059-dd2b-4550-8b8a-49ea4e94ffeb\") " pod="openshift-image-registry/image-registry-697d97f7c8-chxbx" Nov 29 05:30:26 crc kubenswrapper[4594]: E1129 05:30:26.865197 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 05:30:27.365184933 +0000 UTC m=+151.605694153 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-chxbx" (UID: "0945b059-dd2b-4550-8b8a-49ea4e94ffeb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 05:30:26 crc kubenswrapper[4594]: I1129 05:30:26.966345 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 05:30:26 crc kubenswrapper[4594]: E1129 05:30:26.966482 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 05:30:27.466466211 +0000 UTC m=+151.706975431 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 05:30:26 crc kubenswrapper[4594]: I1129 05:30:26.966881 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-chxbx\" (UID: \"0945b059-dd2b-4550-8b8a-49ea4e94ffeb\") " pod="openshift-image-registry/image-registry-697d97f7c8-chxbx" Nov 29 05:30:26 crc kubenswrapper[4594]: E1129 05:30:26.967296 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 05:30:27.467271475 +0000 UTC m=+151.707780695 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-chxbx" (UID: "0945b059-dd2b-4550-8b8a-49ea4e94ffeb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 05:30:27 crc kubenswrapper[4594]: I1129 05:30:27.067931 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 05:30:27 crc kubenswrapper[4594]: E1129 05:30:27.068185 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 05:30:27.568162855 +0000 UTC m=+151.808672075 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 05:30:27 crc kubenswrapper[4594]: I1129 05:30:27.068263 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-chxbx\" (UID: \"0945b059-dd2b-4550-8b8a-49ea4e94ffeb\") " pod="openshift-image-registry/image-registry-697d97f7c8-chxbx" Nov 29 05:30:27 crc kubenswrapper[4594]: E1129 05:30:27.068605 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 05:30:27.568594562 +0000 UTC m=+151.809103782 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-chxbx" (UID: "0945b059-dd2b-4550-8b8a-49ea4e94ffeb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 05:30:27 crc kubenswrapper[4594]: I1129 05:30:27.164909 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-khv8v" event={"ID":"637efb26-f3b6-4db6-b0bc-d1ea0dcd85a5","Type":"ContainerStarted","Data":"091e50cd27fb52a43df5b61fa5909b6630232fe7656211493855818a0baaf544"} Nov 29 05:30:27 crc kubenswrapper[4594]: I1129 05:30:27.166962 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-g9rtk" event={"ID":"d68c506f-d0c1-4889-ab10-b886ef7880a7","Type":"ContainerStarted","Data":"a56f8e103e2fa00088a9825f0bc931a61f500a8f785b65cb9b361bc9b2fe66cb"} Nov 29 05:30:27 crc kubenswrapper[4594]: I1129 05:30:27.166993 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-g9rtk" event={"ID":"d68c506f-d0c1-4889-ab10-b886ef7880a7","Type":"ContainerStarted","Data":"2ee10038d3cf6074299be86050239ac9f5e9e54c78f23a5dfac3f6b764234e05"} Nov 29 05:30:27 crc kubenswrapper[4594]: I1129 05:30:27.169070 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-qkgqz" event={"ID":"9e3c263e-0845-4d3e-865f-e778b9ff3e53","Type":"ContainerStarted","Data":"bbb98a121b15754358f162d7b3d2bac590f6ffe1ea305b4c068fd77c73f6359c"} Nov 29 05:30:27 crc kubenswrapper[4594]: I1129 05:30:27.169490 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 05:30:27 crc kubenswrapper[4594]: E1129 05:30:27.169655 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 05:30:27.669638416 +0000 UTC m=+151.910147636 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 05:30:27 crc kubenswrapper[4594]: I1129 05:30:27.170347 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-chxbx\" (UID: \"0945b059-dd2b-4550-8b8a-49ea4e94ffeb\") " pod="openshift-image-registry/image-registry-697d97f7c8-chxbx" Nov 29 05:30:27 crc kubenswrapper[4594]: E1129 05:30:27.170411 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 05:30:27.670396853 +0000 UTC m=+151.910906074 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-chxbx" (UID: "0945b059-dd2b-4550-8b8a-49ea4e94ffeb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 05:30:27 crc kubenswrapper[4594]: I1129 05:30:27.192608 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g9dnn" event={"ID":"81534f91-740b-487b-9149-e8565ccf9905","Type":"ContainerStarted","Data":"c52c3da5ba833a9bca5cd6253a67a39e885eed1a1246748c024f7a776c010629"} Nov 29 05:30:27 crc kubenswrapper[4594]: I1129 05:30:27.202374 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-gv8n5" event={"ID":"6b9af1ca-d5ac-4278-8e23-510ddfc2ea8c","Type":"ContainerStarted","Data":"d7b958effc6bad7d99ee9e6e9af520631467d0a8320283fed753723473473c43"} Nov 29 05:30:27 crc kubenswrapper[4594]: I1129 05:30:27.210849 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wqr7v" event={"ID":"b8a99cf8-64bd-4e83-b77e-c4358154f10e","Type":"ContainerStarted","Data":"ca7068d481160bf244d8361a10914d18d59a3bbcc8137b1d9a7759cd6bd23e94"} Nov 29 05:30:27 crc kubenswrapper[4594]: I1129 05:30:27.215849 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g9dnn" podStartSLOduration=130.215840083 podStartE2EDuration="2m10.215840083s" podCreationTimestamp="2025-11-29 05:28:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:30:27.215245632 +0000 UTC m=+151.455754853" 
watchObservedRunningTime="2025-11-29 05:30:27.215840083 +0000 UTC m=+151.456349303" Nov 29 05:30:27 crc kubenswrapper[4594]: I1129 05:30:27.215985 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-khv8v" podStartSLOduration=130.215981928 podStartE2EDuration="2m10.215981928s" podCreationTimestamp="2025-11-29 05:28:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:30:27.19214224 +0000 UTC m=+151.432651459" watchObservedRunningTime="2025-11-29 05:30:27.215981928 +0000 UTC m=+151.456491149" Nov 29 05:30:27 crc kubenswrapper[4594]: I1129 05:30:27.224976 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gzzjt" event={"ID":"837571cc-8fd2-4102-9997-4520ddf6da08","Type":"ContainerStarted","Data":"8c063b07f761cfc79dab314e0915976657553c20593499b28bec3f90c2f32d5a"} Nov 29 05:30:27 crc kubenswrapper[4594]: I1129 05:30:27.226602 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-wq549" event={"ID":"52daf44c-8b94-4d9c-8fcb-eeb3d8fd286f","Type":"ContainerStarted","Data":"b6212d7c9a650a45d6c6e30570b522be070a430eed337a5777e041d51b5d1d6a"} Nov 29 05:30:27 crc kubenswrapper[4594]: I1129 05:30:27.229285 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g9dnn" Nov 29 05:30:27 crc kubenswrapper[4594]: I1129 05:30:27.230359 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g9dnn" Nov 29 05:30:27 crc kubenswrapper[4594]: I1129 05:30:27.230524 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zp2th" 
event={"ID":"bf5ed613-ce39-4081-8342-aef24fb3a385","Type":"ContainerStarted","Data":"f4751601193e6591d2cc244a4fa52fd87cefa51794ab71fd761990927d40b111"} Nov 29 05:30:27 crc kubenswrapper[4594]: I1129 05:30:27.232853 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nhk7j" event={"ID":"f9a409f3-6983-4162-9c32-355020bc1c3a","Type":"ContainerStarted","Data":"f8cd40782b969c36ff4d9b44b672d061d732ed040719d65dc40d935982546b2a"} Nov 29 05:30:27 crc kubenswrapper[4594]: I1129 05:30:27.233188 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nhk7j" Nov 29 05:30:27 crc kubenswrapper[4594]: I1129 05:30:27.235045 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wqr7v" podStartSLOduration=130.235038868 podStartE2EDuration="2m10.235038868s" podCreationTimestamp="2025-11-29 05:28:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:30:27.234091648 +0000 UTC m=+151.474600867" watchObservedRunningTime="2025-11-29 05:30:27.235038868 +0000 UTC m=+151.475548088" Nov 29 05:30:27 crc kubenswrapper[4594]: I1129 05:30:27.235868 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zp2th" Nov 29 05:30:27 crc kubenswrapper[4594]: I1129 05:30:27.236690 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g9dnn" Nov 29 05:30:27 crc kubenswrapper[4594]: I1129 05:30:27.238213 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-pjh6b" 
event={"ID":"05218cdc-d6bb-47ef-aab3-9a5e0dab68b8","Type":"ContainerStarted","Data":"0a80c17a3805db379422b0e11c5b622e3a11b06975ca9236af26371404297b95"} Nov 29 05:30:27 crc kubenswrapper[4594]: I1129 05:30:27.239468 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-pmjzq" event={"ID":"81d6a548-1be4-41a1-8972-ed027e6895aa","Type":"ContainerStarted","Data":"b0c3fd3dca516ee02a4705c489de1722c22590ed51740aee970eb68defcc7a8c"} Nov 29 05:30:27 crc kubenswrapper[4594]: I1129 05:30:27.239499 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-pmjzq" event={"ID":"81d6a548-1be4-41a1-8972-ed027e6895aa","Type":"ContainerStarted","Data":"5a7c497a6af035d0fec34185103af137409b43ac596b9580d006f0860a3d87e7"} Nov 29 05:30:27 crc kubenswrapper[4594]: I1129 05:30:27.252058 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-jgzzn" event={"ID":"873587ac-5d12-45c4-bfd5-42bc08d29f65","Type":"ContainerStarted","Data":"1d4ef97b716155fb74b91f6b146b97c09769e9e78b7723937800c1c6926b0f7f"} Nov 29 05:30:27 crc kubenswrapper[4594]: I1129 05:30:27.252082 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-jgzzn" event={"ID":"873587ac-5d12-45c4-bfd5-42bc08d29f65","Type":"ContainerStarted","Data":"a74a020116a4cdae92611f3ccfa0d5ab87e65e474bd6fcf618481189f38bbc91"} Nov 29 05:30:27 crc kubenswrapper[4594]: I1129 05:30:27.254223 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-g4pv8" event={"ID":"d1bc9264-d190-49c0-9122-7cc123dd88e5","Type":"ContainerStarted","Data":"951973f5fcce85bf9b4ec46cd139fd412256f228f7509fdabb0d840445107a85"} Nov 29 05:30:27 crc kubenswrapper[4594]: I1129 05:30:27.256941 4594 patch_prober.go:28] interesting pod/downloads-7954f5f757-xg8s7 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get 
\"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Nov 29 05:30:27 crc kubenswrapper[4594]: I1129 05:30:27.257008 4594 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-xg8s7" podUID="6c685741-a472-49fe-8237-520bea0232ef" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Nov 29 05:30:27 crc kubenswrapper[4594]: I1129 05:30:27.257739 4594 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-k9wmb container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.39:8080/healthz\": dial tcp 10.217.0.39:8080: connect: connection refused" start-of-body= Nov 29 05:30:27 crc kubenswrapper[4594]: I1129 05:30:27.257837 4594 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-k9wmb" podUID="c584b692-db15-4724-a208-491507a8474e" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.39:8080/healthz\": dial tcp 10.217.0.39:8080: connect: connection refused" Nov 29 05:30:27 crc kubenswrapper[4594]: I1129 05:30:27.266622 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nhk7j" podStartSLOduration=130.266595532 podStartE2EDuration="2m10.266595532s" podCreationTimestamp="2025-11-29 05:28:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:30:27.260846135 +0000 UTC m=+151.501355355" watchObservedRunningTime="2025-11-29 05:30:27.266595532 +0000 UTC m=+151.507104753" Nov 29 05:30:27 crc kubenswrapper[4594]: I1129 05:30:27.271957 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 05:30:27 crc kubenswrapper[4594]: E1129 05:30:27.272187 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 05:30:27.772154322 +0000 UTC m=+152.012663542 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 05:30:27 crc kubenswrapper[4594]: I1129 05:30:27.272357 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-chxbx\" (UID: \"0945b059-dd2b-4550-8b8a-49ea4e94ffeb\") " pod="openshift-image-registry/image-registry-697d97f7c8-chxbx" Nov 29 05:30:27 crc kubenswrapper[4594]: E1129 05:30:27.272968 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 05:30:27.7729562 +0000 UTC m=+152.013465421 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-chxbx" (UID: "0945b059-dd2b-4550-8b8a-49ea4e94ffeb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 05:30:27 crc kubenswrapper[4594]: I1129 05:30:27.326185 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-pjh6b" podStartSLOduration=130.326171347 podStartE2EDuration="2m10.326171347s" podCreationTimestamp="2025-11-29 05:28:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:30:27.298029145 +0000 UTC m=+151.538538356" watchObservedRunningTime="2025-11-29 05:30:27.326171347 +0000 UTC m=+151.566680567" Nov 29 05:30:27 crc kubenswrapper[4594]: I1129 05:30:27.355101 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-jgzzn" podStartSLOduration=130.355084619 podStartE2EDuration="2m10.355084619s" podCreationTimestamp="2025-11-29 05:28:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:30:27.327397468 +0000 UTC m=+151.567906689" watchObservedRunningTime="2025-11-29 05:30:27.355084619 +0000 UTC m=+151.595593840" Nov 29 05:30:27 crc kubenswrapper[4594]: I1129 05:30:27.373359 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 05:30:27 crc kubenswrapper[4594]: E1129 05:30:27.373519 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 05:30:27.873503367 +0000 UTC m=+152.114012587 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 05:30:27 crc kubenswrapper[4594]: I1129 05:30:27.374057 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-chxbx\" (UID: \"0945b059-dd2b-4550-8b8a-49ea4e94ffeb\") " pod="openshift-image-registry/image-registry-697d97f7c8-chxbx" Nov 29 05:30:27 crc kubenswrapper[4594]: E1129 05:30:27.374640 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 05:30:27.874620935 +0000 UTC m=+152.115130155 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-chxbx" (UID: "0945b059-dd2b-4550-8b8a-49ea4e94ffeb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 05:30:27 crc kubenswrapper[4594]: I1129 05:30:27.389497 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-g4pv8" podStartSLOduration=8.389479949 podStartE2EDuration="8.389479949s" podCreationTimestamp="2025-11-29 05:30:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:30:27.37873776 +0000 UTC m=+151.619246980" watchObservedRunningTime="2025-11-29 05:30:27.389479949 +0000 UTC m=+151.629989170" Nov 29 05:30:27 crc kubenswrapper[4594]: I1129 05:30:27.478497 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 05:30:27 crc kubenswrapper[4594]: E1129 05:30:27.478902 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 05:30:27.978873708 +0000 UTC m=+152.219382928 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 05:30:27 crc kubenswrapper[4594]: I1129 05:30:27.483405 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-jgzzn" Nov 29 05:30:27 crc kubenswrapper[4594]: I1129 05:30:27.483656 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-jgzzn" Nov 29 05:30:27 crc kubenswrapper[4594]: I1129 05:30:27.484558 4594 patch_prober.go:28] interesting pod/apiserver-76f77b778f-jgzzn container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="Get \"https://10.217.0.11:8443/livez\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Nov 29 05:30:27 crc kubenswrapper[4594]: I1129 05:30:27.484588 4594 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-jgzzn" podUID="873587ac-5d12-45c4-bfd5-42bc08d29f65" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.11:8443/livez\": dial tcp 10.217.0.11:8443: connect: connection refused" Nov 29 05:30:27 crc kubenswrapper[4594]: I1129 05:30:27.519841 4594 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Nov 29 05:30:27 crc kubenswrapper[4594]: I1129 05:30:27.579574 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-chxbx\" (UID: \"0945b059-dd2b-4550-8b8a-49ea4e94ffeb\") " pod="openshift-image-registry/image-registry-697d97f7c8-chxbx" Nov 29 05:30:27 crc kubenswrapper[4594]: E1129 05:30:27.579841 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 05:30:28.079830039 +0000 UTC m=+152.320339258 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-chxbx" (UID: "0945b059-dd2b-4550-8b8a-49ea4e94ffeb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 05:30:27 crc kubenswrapper[4594]: I1129 05:30:27.661342 4594 patch_prober.go:28] interesting pod/router-default-5444994796-8pvtn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 29 05:30:27 crc kubenswrapper[4594]: [-]has-synced failed: reason withheld Nov 29 05:30:27 crc kubenswrapper[4594]: [+]process-running ok Nov 29 05:30:27 crc kubenswrapper[4594]: healthz check failed Nov 29 05:30:27 crc kubenswrapper[4594]: I1129 05:30:27.661783 4594 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8pvtn" podUID="c3e4199d-2fe7-4039-8f59-b8bab2beebd0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 29 05:30:27 crc kubenswrapper[4594]: I1129 05:30:27.680733 4594 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 05:30:27 crc kubenswrapper[4594]: E1129 05:30:27.681081 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 05:30:28.181064539 +0000 UTC m=+152.421573759 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 05:30:27 crc kubenswrapper[4594]: I1129 05:30:27.782168 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-chxbx\" (UID: \"0945b059-dd2b-4550-8b8a-49ea4e94ffeb\") " pod="openshift-image-registry/image-registry-697d97f7c8-chxbx" Nov 29 05:30:27 crc kubenswrapper[4594]: E1129 05:30:27.782539 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 05:30:28.282526725 +0000 UTC m=+152.523035946 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-chxbx" (UID: "0945b059-dd2b-4550-8b8a-49ea4e94ffeb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 05:30:27 crc kubenswrapper[4594]: I1129 05:30:27.883201 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 05:30:27 crc kubenswrapper[4594]: E1129 05:30:27.883357 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 05:30:28.383333818 +0000 UTC m=+152.623843038 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 05:30:27 crc kubenswrapper[4594]: I1129 05:30:27.883669 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-chxbx\" (UID: \"0945b059-dd2b-4550-8b8a-49ea4e94ffeb\") " pod="openshift-image-registry/image-registry-697d97f7c8-chxbx" Nov 29 05:30:27 crc kubenswrapper[4594]: E1129 05:30:27.883988 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 05:30:28.383979995 +0000 UTC m=+152.624489216 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-chxbx" (UID: "0945b059-dd2b-4550-8b8a-49ea4e94ffeb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 05:30:27 crc kubenswrapper[4594]: I1129 05:30:27.985074 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 05:30:27 crc kubenswrapper[4594]: E1129 05:30:27.985311 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 05:30:28.485281801 +0000 UTC m=+152.725791011 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 05:30:27 crc kubenswrapper[4594]: I1129 05:30:27.985465 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-chxbx\" (UID: \"0945b059-dd2b-4550-8b8a-49ea4e94ffeb\") " pod="openshift-image-registry/image-registry-697d97f7c8-chxbx" Nov 29 05:30:27 crc kubenswrapper[4594]: E1129 05:30:27.985749 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 05:30:28.48573544 +0000 UTC m=+152.726244660 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-chxbx" (UID: "0945b059-dd2b-4550-8b8a-49ea4e94ffeb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 05:30:28 crc kubenswrapper[4594]: I1129 05:30:28.086287 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 05:30:28 crc kubenswrapper[4594]: E1129 05:30:28.086497 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 05:30:28.586459476 +0000 UTC m=+152.826968696 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 05:30:28 crc kubenswrapper[4594]: I1129 05:30:28.188903 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-chxbx\" (UID: \"0945b059-dd2b-4550-8b8a-49ea4e94ffeb\") " pod="openshift-image-registry/image-registry-697d97f7c8-chxbx" Nov 29 05:30:28 crc kubenswrapper[4594]: E1129 05:30:28.189522 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 05:30:28.689500062 +0000 UTC m=+152.930009282 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-chxbx" (UID: "0945b059-dd2b-4550-8b8a-49ea4e94ffeb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 05:30:28 crc kubenswrapper[4594]: I1129 05:30:28.218336 4594 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-11-29T05:30:27.519859916Z","Handler":null,"Name":""} Nov 29 05:30:28 crc kubenswrapper[4594]: I1129 05:30:28.220474 4594 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Nov 29 05:30:28 crc kubenswrapper[4594]: I1129 05:30:28.220512 4594 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Nov 29 05:30:28 crc kubenswrapper[4594]: I1129 05:30:28.260010 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-pmjzq" event={"ID":"81d6a548-1be4-41a1-8972-ed027e6895aa","Type":"ContainerStarted","Data":"92fc0832410f1ab8412afcf63f189950176fb2bed82603a8069fbf14e802114c"} Nov 29 05:30:28 crc kubenswrapper[4594]: I1129 05:30:28.260061 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-pmjzq" event={"ID":"81d6a548-1be4-41a1-8972-ed027e6895aa","Type":"ContainerStarted","Data":"4678f732e8353e4d8a78a89dda7cb1c0f53e3993b69a98d8909c2105ece8464c"} Nov 29 05:30:28 crc kubenswrapper[4594]: I1129 05:30:28.263174 4594 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-dns/dns-default-g4pv8" Nov 29 05:30:28 crc kubenswrapper[4594]: I1129 05:30:28.264322 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-k9wmb" Nov 29 05:30:28 crc kubenswrapper[4594]: I1129 05:30:28.264936 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g9dnn" Nov 29 05:30:28 crc kubenswrapper[4594]: I1129 05:30:28.290338 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 05:30:28 crc kubenswrapper[4594]: I1129 05:30:28.294088 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Nov 29 05:30:28 crc kubenswrapper[4594]: I1129 05:30:28.296011 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-pmjzq" podStartSLOduration=9.295991188 podStartE2EDuration="9.295991188s" podCreationTimestamp="2025-11-29 05:30:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:30:28.282372824 +0000 UTC m=+152.522882044" watchObservedRunningTime="2025-11-29 05:30:28.295991188 +0000 UTC m=+152.536500408" Nov 29 05:30:28 crc kubenswrapper[4594]: I1129 05:30:28.394456 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-chxbx\" (UID: \"0945b059-dd2b-4550-8b8a-49ea4e94ffeb\") " pod="openshift-image-registry/image-registry-697d97f7c8-chxbx" Nov 29 05:30:28 crc kubenswrapper[4594]: I1129 05:30:28.398814 4594 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Nov 29 05:30:28 crc kubenswrapper[4594]: I1129 05:30:28.398852 4594 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-chxbx\" (UID: \"0945b059-dd2b-4550-8b8a-49ea4e94ffeb\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-chxbx" Nov 29 05:30:28 crc kubenswrapper[4594]: I1129 05:30:28.427414 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-chxbx\" (UID: \"0945b059-dd2b-4550-8b8a-49ea4e94ffeb\") " pod="openshift-image-registry/image-registry-697d97f7c8-chxbx" Nov 29 05:30:28 crc kubenswrapper[4594]: I1129 05:30:28.444177 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-chxbx" Nov 29 05:30:28 crc kubenswrapper[4594]: I1129 05:30:28.591233 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-pzf9l"] Nov 29 05:30:28 crc kubenswrapper[4594]: I1129 05:30:28.593769 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pzf9l" Nov 29 05:30:28 crc kubenswrapper[4594]: I1129 05:30:28.598870 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Nov 29 05:30:28 crc kubenswrapper[4594]: I1129 05:30:28.605538 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pzf9l"] Nov 29 05:30:28 crc kubenswrapper[4594]: I1129 05:30:28.662694 4594 patch_prober.go:28] interesting pod/router-default-5444994796-8pvtn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 29 05:30:28 crc kubenswrapper[4594]: [-]has-synced failed: reason withheld Nov 29 05:30:28 crc kubenswrapper[4594]: [+]process-running ok Nov 29 05:30:28 crc kubenswrapper[4594]: healthz check failed Nov 29 05:30:28 crc kubenswrapper[4594]: I1129 05:30:28.662733 4594 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8pvtn" podUID="c3e4199d-2fe7-4039-8f59-b8bab2beebd0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 29 05:30:28 crc kubenswrapper[4594]: I1129 05:30:28.704536 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ec39acd-e54b-4bf5-99e7-dae22ecfceda-utilities\") pod \"certified-operators-pzf9l\" (UID: \"7ec39acd-e54b-4bf5-99e7-dae22ecfceda\") " pod="openshift-marketplace/certified-operators-pzf9l" Nov 29 05:30:28 crc kubenswrapper[4594]: I1129 05:30:28.704687 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ec39acd-e54b-4bf5-99e7-dae22ecfceda-catalog-content\") pod \"certified-operators-pzf9l\" (UID: 
\"7ec39acd-e54b-4bf5-99e7-dae22ecfceda\") " pod="openshift-marketplace/certified-operators-pzf9l" Nov 29 05:30:28 crc kubenswrapper[4594]: I1129 05:30:28.704736 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgdcd\" (UniqueName: \"kubernetes.io/projected/7ec39acd-e54b-4bf5-99e7-dae22ecfceda-kube-api-access-bgdcd\") pod \"certified-operators-pzf9l\" (UID: \"7ec39acd-e54b-4bf5-99e7-dae22ecfceda\") " pod="openshift-marketplace/certified-operators-pzf9l" Nov 29 05:30:28 crc kubenswrapper[4594]: I1129 05:30:28.726492 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-chxbx"] Nov 29 05:30:28 crc kubenswrapper[4594]: I1129 05:30:28.779518 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zrcl4"] Nov 29 05:30:28 crc kubenswrapper[4594]: I1129 05:30:28.780439 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zrcl4" Nov 29 05:30:28 crc kubenswrapper[4594]: I1129 05:30:28.782603 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Nov 29 05:30:28 crc kubenswrapper[4594]: I1129 05:30:28.791549 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zrcl4"] Nov 29 05:30:28 crc kubenswrapper[4594]: I1129 05:30:28.806224 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ec39acd-e54b-4bf5-99e7-dae22ecfceda-utilities\") pod \"certified-operators-pzf9l\" (UID: \"7ec39acd-e54b-4bf5-99e7-dae22ecfceda\") " pod="openshift-marketplace/certified-operators-pzf9l" Nov 29 05:30:28 crc kubenswrapper[4594]: I1129 05:30:28.806384 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/7ec39acd-e54b-4bf5-99e7-dae22ecfceda-catalog-content\") pod \"certified-operators-pzf9l\" (UID: \"7ec39acd-e54b-4bf5-99e7-dae22ecfceda\") " pod="openshift-marketplace/certified-operators-pzf9l" Nov 29 05:30:28 crc kubenswrapper[4594]: I1129 05:30:28.806509 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgdcd\" (UniqueName: \"kubernetes.io/projected/7ec39acd-e54b-4bf5-99e7-dae22ecfceda-kube-api-access-bgdcd\") pod \"certified-operators-pzf9l\" (UID: \"7ec39acd-e54b-4bf5-99e7-dae22ecfceda\") " pod="openshift-marketplace/certified-operators-pzf9l" Nov 29 05:30:28 crc kubenswrapper[4594]: I1129 05:30:28.806677 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ec39acd-e54b-4bf5-99e7-dae22ecfceda-utilities\") pod \"certified-operators-pzf9l\" (UID: \"7ec39acd-e54b-4bf5-99e7-dae22ecfceda\") " pod="openshift-marketplace/certified-operators-pzf9l" Nov 29 05:30:28 crc kubenswrapper[4594]: I1129 05:30:28.806847 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ec39acd-e54b-4bf5-99e7-dae22ecfceda-catalog-content\") pod \"certified-operators-pzf9l\" (UID: \"7ec39acd-e54b-4bf5-99e7-dae22ecfceda\") " pod="openshift-marketplace/certified-operators-pzf9l" Nov 29 05:30:28 crc kubenswrapper[4594]: I1129 05:30:28.823859 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgdcd\" (UniqueName: \"kubernetes.io/projected/7ec39acd-e54b-4bf5-99e7-dae22ecfceda-kube-api-access-bgdcd\") pod \"certified-operators-pzf9l\" (UID: \"7ec39acd-e54b-4bf5-99e7-dae22ecfceda\") " pod="openshift-marketplace/certified-operators-pzf9l" Nov 29 05:30:28 crc kubenswrapper[4594]: I1129 05:30:28.907128 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/37a1effb-361a-4fcf-b9f1-f32692c38587-catalog-content\") pod \"community-operators-zrcl4\" (UID: \"37a1effb-361a-4fcf-b9f1-f32692c38587\") " pod="openshift-marketplace/community-operators-zrcl4" Nov 29 05:30:28 crc kubenswrapper[4594]: I1129 05:30:28.907210 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37a1effb-361a-4fcf-b9f1-f32692c38587-utilities\") pod \"community-operators-zrcl4\" (UID: \"37a1effb-361a-4fcf-b9f1-f32692c38587\") " pod="openshift-marketplace/community-operators-zrcl4" Nov 29 05:30:28 crc kubenswrapper[4594]: I1129 05:30:28.907288 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxd2s\" (UniqueName: \"kubernetes.io/projected/37a1effb-361a-4fcf-b9f1-f32692c38587-kube-api-access-sxd2s\") pod \"community-operators-zrcl4\" (UID: \"37a1effb-361a-4fcf-b9f1-f32692c38587\") " pod="openshift-marketplace/community-operators-zrcl4" Nov 29 05:30:28 crc kubenswrapper[4594]: I1129 05:30:28.921451 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pzf9l" Nov 29 05:30:29 crc kubenswrapper[4594]: I1129 05:30:29.002991 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-z7jx7"] Nov 29 05:30:29 crc kubenswrapper[4594]: I1129 05:30:29.008608 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37a1effb-361a-4fcf-b9f1-f32692c38587-utilities\") pod \"community-operators-zrcl4\" (UID: \"37a1effb-361a-4fcf-b9f1-f32692c38587\") " pod="openshift-marketplace/community-operators-zrcl4" Nov 29 05:30:29 crc kubenswrapper[4594]: I1129 05:30:29.008673 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxd2s\" (UniqueName: \"kubernetes.io/projected/37a1effb-361a-4fcf-b9f1-f32692c38587-kube-api-access-sxd2s\") pod \"community-operators-zrcl4\" (UID: \"37a1effb-361a-4fcf-b9f1-f32692c38587\") " pod="openshift-marketplace/community-operators-zrcl4" Nov 29 05:30:29 crc kubenswrapper[4594]: I1129 05:30:29.008723 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37a1effb-361a-4fcf-b9f1-f32692c38587-catalog-content\") pod \"community-operators-zrcl4\" (UID: \"37a1effb-361a-4fcf-b9f1-f32692c38587\") " pod="openshift-marketplace/community-operators-zrcl4" Nov 29 05:30:29 crc kubenswrapper[4594]: I1129 05:30:29.008928 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-z7jx7"] Nov 29 05:30:29 crc kubenswrapper[4594]: I1129 05:30:29.009074 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-z7jx7" Nov 29 05:30:29 crc kubenswrapper[4594]: I1129 05:30:29.009207 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37a1effb-361a-4fcf-b9f1-f32692c38587-catalog-content\") pod \"community-operators-zrcl4\" (UID: \"37a1effb-361a-4fcf-b9f1-f32692c38587\") " pod="openshift-marketplace/community-operators-zrcl4" Nov 29 05:30:29 crc kubenswrapper[4594]: I1129 05:30:29.009344 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37a1effb-361a-4fcf-b9f1-f32692c38587-utilities\") pod \"community-operators-zrcl4\" (UID: \"37a1effb-361a-4fcf-b9f1-f32692c38587\") " pod="openshift-marketplace/community-operators-zrcl4" Nov 29 05:30:29 crc kubenswrapper[4594]: I1129 05:30:29.032107 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxd2s\" (UniqueName: \"kubernetes.io/projected/37a1effb-361a-4fcf-b9f1-f32692c38587-kube-api-access-sxd2s\") pod \"community-operators-zrcl4\" (UID: \"37a1effb-361a-4fcf-b9f1-f32692c38587\") " pod="openshift-marketplace/community-operators-zrcl4" Nov 29 05:30:29 crc kubenswrapper[4594]: I1129 05:30:29.107431 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zrcl4" Nov 29 05:30:29 crc kubenswrapper[4594]: I1129 05:30:29.109990 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b771218-fa00-44fe-8cb7-35abccdf9e10-utilities\") pod \"certified-operators-z7jx7\" (UID: \"2b771218-fa00-44fe-8cb7-35abccdf9e10\") " pod="openshift-marketplace/certified-operators-z7jx7" Nov 29 05:30:29 crc kubenswrapper[4594]: I1129 05:30:29.110057 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b771218-fa00-44fe-8cb7-35abccdf9e10-catalog-content\") pod \"certified-operators-z7jx7\" (UID: \"2b771218-fa00-44fe-8cb7-35abccdf9e10\") " pod="openshift-marketplace/certified-operators-z7jx7" Nov 29 05:30:29 crc kubenswrapper[4594]: I1129 05:30:29.110159 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4qx7\" (UniqueName: \"kubernetes.io/projected/2b771218-fa00-44fe-8cb7-35abccdf9e10-kube-api-access-c4qx7\") pod \"certified-operators-z7jx7\" (UID: \"2b771218-fa00-44fe-8cb7-35abccdf9e10\") " pod="openshift-marketplace/certified-operators-z7jx7" Nov 29 05:30:29 crc kubenswrapper[4594]: I1129 05:30:29.117114 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pzf9l"] Nov 29 05:30:29 crc kubenswrapper[4594]: I1129 05:30:29.178155 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6pg58"] Nov 29 05:30:29 crc kubenswrapper[4594]: I1129 05:30:29.179403 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6pg58" Nov 29 05:30:29 crc kubenswrapper[4594]: I1129 05:30:29.188323 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6pg58"] Nov 29 05:30:29 crc kubenswrapper[4594]: I1129 05:30:29.214684 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4qx7\" (UniqueName: \"kubernetes.io/projected/2b771218-fa00-44fe-8cb7-35abccdf9e10-kube-api-access-c4qx7\") pod \"certified-operators-z7jx7\" (UID: \"2b771218-fa00-44fe-8cb7-35abccdf9e10\") " pod="openshift-marketplace/certified-operators-z7jx7" Nov 29 05:30:29 crc kubenswrapper[4594]: I1129 05:30:29.214756 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b771218-fa00-44fe-8cb7-35abccdf9e10-utilities\") pod \"certified-operators-z7jx7\" (UID: \"2b771218-fa00-44fe-8cb7-35abccdf9e10\") " pod="openshift-marketplace/certified-operators-z7jx7" Nov 29 05:30:29 crc kubenswrapper[4594]: I1129 05:30:29.214794 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b771218-fa00-44fe-8cb7-35abccdf9e10-catalog-content\") pod \"certified-operators-z7jx7\" (UID: \"2b771218-fa00-44fe-8cb7-35abccdf9e10\") " pod="openshift-marketplace/certified-operators-z7jx7" Nov 29 05:30:29 crc kubenswrapper[4594]: I1129 05:30:29.215291 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b771218-fa00-44fe-8cb7-35abccdf9e10-utilities\") pod \"certified-operators-z7jx7\" (UID: \"2b771218-fa00-44fe-8cb7-35abccdf9e10\") " pod="openshift-marketplace/certified-operators-z7jx7" Nov 29 05:30:29 crc kubenswrapper[4594]: I1129 05:30:29.215312 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/2b771218-fa00-44fe-8cb7-35abccdf9e10-catalog-content\") pod \"certified-operators-z7jx7\" (UID: \"2b771218-fa00-44fe-8cb7-35abccdf9e10\") " pod="openshift-marketplace/certified-operators-z7jx7" Nov 29 05:30:29 crc kubenswrapper[4594]: I1129 05:30:29.233927 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4qx7\" (UniqueName: \"kubernetes.io/projected/2b771218-fa00-44fe-8cb7-35abccdf9e10-kube-api-access-c4qx7\") pod \"certified-operators-z7jx7\" (UID: \"2b771218-fa00-44fe-8cb7-35abccdf9e10\") " pod="openshift-marketplace/certified-operators-z7jx7" Nov 29 05:30:29 crc kubenswrapper[4594]: I1129 05:30:29.259663 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Nov 29 05:30:29 crc kubenswrapper[4594]: I1129 05:30:29.260900 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 29 05:30:29 crc kubenswrapper[4594]: I1129 05:30:29.264818 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Nov 29 05:30:29 crc kubenswrapper[4594]: I1129 05:30:29.264992 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Nov 29 05:30:29 crc kubenswrapper[4594]: I1129 05:30:29.265241 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Nov 29 05:30:29 crc kubenswrapper[4594]: I1129 05:30:29.273048 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-chxbx" event={"ID":"0945b059-dd2b-4550-8b8a-49ea4e94ffeb","Type":"ContainerStarted","Data":"a35fea000c510be2f158dc805929a5e48550046a16d8fec2a100dc5eed9250ca"} Nov 29 05:30:29 crc kubenswrapper[4594]: I1129 05:30:29.273105 4594 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-chxbx" event={"ID":"0945b059-dd2b-4550-8b8a-49ea4e94ffeb","Type":"ContainerStarted","Data":"0909f865542aa26ef26c7070d0c942b66efa64e42e0f14ac0e800c78fdb025a0"} Nov 29 05:30:29 crc kubenswrapper[4594]: I1129 05:30:29.273386 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-chxbx" Nov 29 05:30:29 crc kubenswrapper[4594]: I1129 05:30:29.276948 4594 generic.go:334] "Generic (PLEG): container finished" podID="7ec39acd-e54b-4bf5-99e7-dae22ecfceda" containerID="7f7e5489c6ab32c0b9d48a930698798fd30d71cc5150bd422ecf68fbabf67b87" exitCode=0 Nov 29 05:30:29 crc kubenswrapper[4594]: I1129 05:30:29.277027 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pzf9l" event={"ID":"7ec39acd-e54b-4bf5-99e7-dae22ecfceda","Type":"ContainerDied","Data":"7f7e5489c6ab32c0b9d48a930698798fd30d71cc5150bd422ecf68fbabf67b87"} Nov 29 05:30:29 crc kubenswrapper[4594]: I1129 05:30:29.277046 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pzf9l" event={"ID":"7ec39acd-e54b-4bf5-99e7-dae22ecfceda","Type":"ContainerStarted","Data":"29eb2eb197fa56bd424a32c7f5f969c581c17e59531904cf7486b0efbddbc921"} Nov 29 05:30:29 crc kubenswrapper[4594]: I1129 05:30:29.279614 4594 generic.go:334] "Generic (PLEG): container finished" podID="b5059cab-0e22-479f-9079-b031f405e547" containerID="8eae648ce95bfb31bd79da0a697d004da9431becd62d6f6dddac172b6a954eab" exitCode=0 Nov 29 05:30:29 crc kubenswrapper[4594]: I1129 05:30:29.280312 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29406570-47pck" event={"ID":"b5059cab-0e22-479f-9079-b031f405e547","Type":"ContainerDied","Data":"8eae648ce95bfb31bd79da0a697d004da9431becd62d6f6dddac172b6a954eab"} Nov 29 05:30:29 crc kubenswrapper[4594]: I1129 05:30:29.280878 4594 
provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 29 05:30:29 crc kubenswrapper[4594]: I1129 05:30:29.282074 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zrcl4"] Nov 29 05:30:29 crc kubenswrapper[4594]: W1129 05:30:29.314876 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a1effb_361a_4fcf_b9f1_f32692c38587.slice/crio-a5cd9c879b773898be6410551e66b352f57f2b1e96f06ae2185d9bd6a7aacbd9 WatchSource:0}: Error finding container a5cd9c879b773898be6410551e66b352f57f2b1e96f06ae2185d9bd6a7aacbd9: Status 404 returned error can't find the container with id a5cd9c879b773898be6410551e66b352f57f2b1e96f06ae2185d9bd6a7aacbd9 Nov 29 05:30:29 crc kubenswrapper[4594]: I1129 05:30:29.315463 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96d7568a-7c09-407d-b2f4-fa4110cb7ecd-catalog-content\") pod \"community-operators-6pg58\" (UID: \"96d7568a-7c09-407d-b2f4-fa4110cb7ecd\") " pod="openshift-marketplace/community-operators-6pg58" Nov 29 05:30:29 crc kubenswrapper[4594]: I1129 05:30:29.315495 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xh5jl\" (UniqueName: \"kubernetes.io/projected/96d7568a-7c09-407d-b2f4-fa4110cb7ecd-kube-api-access-xh5jl\") pod \"community-operators-6pg58\" (UID: \"96d7568a-7c09-407d-b2f4-fa4110cb7ecd\") " pod="openshift-marketplace/community-operators-6pg58" Nov 29 05:30:29 crc kubenswrapper[4594]: I1129 05:30:29.315527 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96d7568a-7c09-407d-b2f4-fa4110cb7ecd-utilities\") pod \"community-operators-6pg58\" (UID: \"96d7568a-7c09-407d-b2f4-fa4110cb7ecd\") 
" pod="openshift-marketplace/community-operators-6pg58" Nov 29 05:30:29 crc kubenswrapper[4594]: I1129 05:30:29.325567 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-chxbx" podStartSLOduration=132.325554078 podStartE2EDuration="2m12.325554078s" podCreationTimestamp="2025-11-29 05:28:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:30:29.296789062 +0000 UTC m=+153.537298282" watchObservedRunningTime="2025-11-29 05:30:29.325554078 +0000 UTC m=+153.566063298" Nov 29 05:30:29 crc kubenswrapper[4594]: I1129 05:30:29.339382 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-z7jx7" Nov 29 05:30:29 crc kubenswrapper[4594]: I1129 05:30:29.416923 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96d7568a-7c09-407d-b2f4-fa4110cb7ecd-catalog-content\") pod \"community-operators-6pg58\" (UID: \"96d7568a-7c09-407d-b2f4-fa4110cb7ecd\") " pod="openshift-marketplace/community-operators-6pg58" Nov 29 05:30:29 crc kubenswrapper[4594]: I1129 05:30:29.417009 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xh5jl\" (UniqueName: \"kubernetes.io/projected/96d7568a-7c09-407d-b2f4-fa4110cb7ecd-kube-api-access-xh5jl\") pod \"community-operators-6pg58\" (UID: \"96d7568a-7c09-407d-b2f4-fa4110cb7ecd\") " pod="openshift-marketplace/community-operators-6pg58" Nov 29 05:30:29 crc kubenswrapper[4594]: I1129 05:30:29.417069 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d838adaa-2be0-45a8-9252-c76cfbadb810-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"d838adaa-2be0-45a8-9252-c76cfbadb810\") " 
pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 29 05:30:29 crc kubenswrapper[4594]: I1129 05:30:29.417177 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96d7568a-7c09-407d-b2f4-fa4110cb7ecd-utilities\") pod \"community-operators-6pg58\" (UID: \"96d7568a-7c09-407d-b2f4-fa4110cb7ecd\") " pod="openshift-marketplace/community-operators-6pg58" Nov 29 05:30:29 crc kubenswrapper[4594]: I1129 05:30:29.417294 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d838adaa-2be0-45a8-9252-c76cfbadb810-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"d838adaa-2be0-45a8-9252-c76cfbadb810\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 29 05:30:29 crc kubenswrapper[4594]: I1129 05:30:29.418664 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96d7568a-7c09-407d-b2f4-fa4110cb7ecd-catalog-content\") pod \"community-operators-6pg58\" (UID: \"96d7568a-7c09-407d-b2f4-fa4110cb7ecd\") " pod="openshift-marketplace/community-operators-6pg58" Nov 29 05:30:29 crc kubenswrapper[4594]: I1129 05:30:29.422908 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96d7568a-7c09-407d-b2f4-fa4110cb7ecd-utilities\") pod \"community-operators-6pg58\" (UID: \"96d7568a-7c09-407d-b2f4-fa4110cb7ecd\") " pod="openshift-marketplace/community-operators-6pg58" Nov 29 05:30:29 crc kubenswrapper[4594]: I1129 05:30:29.445850 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xh5jl\" (UniqueName: \"kubernetes.io/projected/96d7568a-7c09-407d-b2f4-fa4110cb7ecd-kube-api-access-xh5jl\") pod \"community-operators-6pg58\" (UID: \"96d7568a-7c09-407d-b2f4-fa4110cb7ecd\") " 
pod="openshift-marketplace/community-operators-6pg58" Nov 29 05:30:29 crc kubenswrapper[4594]: I1129 05:30:29.518160 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d838adaa-2be0-45a8-9252-c76cfbadb810-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"d838adaa-2be0-45a8-9252-c76cfbadb810\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 29 05:30:29 crc kubenswrapper[4594]: I1129 05:30:29.518505 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d838adaa-2be0-45a8-9252-c76cfbadb810-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"d838adaa-2be0-45a8-9252-c76cfbadb810\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 29 05:30:29 crc kubenswrapper[4594]: I1129 05:30:29.518581 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d838adaa-2be0-45a8-9252-c76cfbadb810-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"d838adaa-2be0-45a8-9252-c76cfbadb810\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 29 05:30:29 crc kubenswrapper[4594]: I1129 05:30:29.519915 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6pg58" Nov 29 05:30:29 crc kubenswrapper[4594]: I1129 05:30:29.536457 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d838adaa-2be0-45a8-9252-c76cfbadb810-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"d838adaa-2be0-45a8-9252-c76cfbadb810\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 29 05:30:29 crc kubenswrapper[4594]: I1129 05:30:29.539827 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-z7jx7"] Nov 29 05:30:29 crc kubenswrapper[4594]: W1129 05:30:29.556298 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b771218_fa00_44fe_8cb7_35abccdf9e10.slice/crio-6aa146866aeb08c049a246bf684ff995fc493dab31b016b1d4176fdae83417cb WatchSource:0}: Error finding container 6aa146866aeb08c049a246bf684ff995fc493dab31b016b1d4176fdae83417cb: Status 404 returned error can't find the container with id 6aa146866aeb08c049a246bf684ff995fc493dab31b016b1d4176fdae83417cb Nov 29 05:30:29 crc kubenswrapper[4594]: I1129 05:30:29.623278 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 29 05:30:29 crc kubenswrapper[4594]: I1129 05:30:29.658938 4594 patch_prober.go:28] interesting pod/router-default-5444994796-8pvtn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 29 05:30:29 crc kubenswrapper[4594]: [-]has-synced failed: reason withheld Nov 29 05:30:29 crc kubenswrapper[4594]: [+]process-running ok Nov 29 05:30:29 crc kubenswrapper[4594]: healthz check failed Nov 29 05:30:29 crc kubenswrapper[4594]: I1129 05:30:29.659858 4594 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8pvtn" podUID="c3e4199d-2fe7-4039-8f59-b8bab2beebd0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 29 05:30:29 crc kubenswrapper[4594]: I1129 05:30:29.669840 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6pg58"] Nov 29 05:30:29 crc kubenswrapper[4594]: I1129 05:30:29.775729 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Nov 29 05:30:29 crc kubenswrapper[4594]: W1129 05:30:29.780993 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podd838adaa_2be0_45a8_9252_c76cfbadb810.slice/crio-c1b765ead6d9a914e91992622f46e0440f9fe1891e597cf6bbd8ade7a05f58fb WatchSource:0}: Error finding container c1b765ead6d9a914e91992622f46e0440f9fe1891e597cf6bbd8ade7a05f58fb: Status 404 returned error can't find the container with id c1b765ead6d9a914e91992622f46e0440f9fe1891e597cf6bbd8ade7a05f58fb Nov 29 05:30:30 crc kubenswrapper[4594]: I1129 05:30:30.092179 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" 
path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Nov 29 05:30:30 crc kubenswrapper[4594]: I1129 05:30:30.289195 4594 generic.go:334] "Generic (PLEG): container finished" podID="2b771218-fa00-44fe-8cb7-35abccdf9e10" containerID="63a4e1f743175da576fafce9720a2bc6958f045124c0d7b78f9d7ec8a028724e" exitCode=0 Nov 29 05:30:30 crc kubenswrapper[4594]: I1129 05:30:30.289554 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z7jx7" event={"ID":"2b771218-fa00-44fe-8cb7-35abccdf9e10","Type":"ContainerDied","Data":"63a4e1f743175da576fafce9720a2bc6958f045124c0d7b78f9d7ec8a028724e"} Nov 29 05:30:30 crc kubenswrapper[4594]: I1129 05:30:30.289616 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z7jx7" event={"ID":"2b771218-fa00-44fe-8cb7-35abccdf9e10","Type":"ContainerStarted","Data":"6aa146866aeb08c049a246bf684ff995fc493dab31b016b1d4176fdae83417cb"} Nov 29 05:30:30 crc kubenswrapper[4594]: I1129 05:30:30.295621 4594 generic.go:334] "Generic (PLEG): container finished" podID="37a1effb-361a-4fcf-b9f1-f32692c38587" containerID="79e628f49a0f5d60d235fa9b9578e796c849a5ed0e7ba33928b7ae59e4bc3def" exitCode=0 Nov 29 05:30:30 crc kubenswrapper[4594]: I1129 05:30:30.295692 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zrcl4" event={"ID":"37a1effb-361a-4fcf-b9f1-f32692c38587","Type":"ContainerDied","Data":"79e628f49a0f5d60d235fa9b9578e796c849a5ed0e7ba33928b7ae59e4bc3def"} Nov 29 05:30:30 crc kubenswrapper[4594]: I1129 05:30:30.295723 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zrcl4" event={"ID":"37a1effb-361a-4fcf-b9f1-f32692c38587","Type":"ContainerStarted","Data":"a5cd9c879b773898be6410551e66b352f57f2b1e96f06ae2185d9bd6a7aacbd9"} Nov 29 05:30:30 crc kubenswrapper[4594]: I1129 05:30:30.297391 4594 generic.go:334] "Generic (PLEG): container finished" 
podID="96d7568a-7c09-407d-b2f4-fa4110cb7ecd" containerID="5bdb8e021b028922ce5790dbbd6484aea7ecee4b8332deddde7eaf0f7873f260" exitCode=0 Nov 29 05:30:30 crc kubenswrapper[4594]: I1129 05:30:30.297575 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6pg58" event={"ID":"96d7568a-7c09-407d-b2f4-fa4110cb7ecd","Type":"ContainerDied","Data":"5bdb8e021b028922ce5790dbbd6484aea7ecee4b8332deddde7eaf0f7873f260"} Nov 29 05:30:30 crc kubenswrapper[4594]: I1129 05:30:30.297637 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6pg58" event={"ID":"96d7568a-7c09-407d-b2f4-fa4110cb7ecd","Type":"ContainerStarted","Data":"78b038af0488da2f9ffdc40111cfac61bfc7d771b702f5ed4ec58ea1fee12304"} Nov 29 05:30:30 crc kubenswrapper[4594]: I1129 05:30:30.301671 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"d838adaa-2be0-45a8-9252-c76cfbadb810","Type":"ContainerStarted","Data":"dd49bed3bef5bcf8f58a3014d202306183994fb6127ba858c147e058555b4d62"} Nov 29 05:30:30 crc kubenswrapper[4594]: I1129 05:30:30.301717 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"d838adaa-2be0-45a8-9252-c76cfbadb810","Type":"ContainerStarted","Data":"c1b765ead6d9a914e91992622f46e0440f9fe1891e597cf6bbd8ade7a05f58fb"} Nov 29 05:30:30 crc kubenswrapper[4594]: I1129 05:30:30.345279 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=1.345248281 podStartE2EDuration="1.345248281s" podCreationTimestamp="2025-11-29 05:30:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:30:30.34333737 +0000 UTC m=+154.583846590" watchObservedRunningTime="2025-11-29 05:30:30.345248281 
+0000 UTC m=+154.585757501" Nov 29 05:30:30 crc kubenswrapper[4594]: I1129 05:30:30.504924 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29406570-47pck" Nov 29 05:30:30 crc kubenswrapper[4594]: I1129 05:30:30.634119 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m65qz\" (UniqueName: \"kubernetes.io/projected/b5059cab-0e22-479f-9079-b031f405e547-kube-api-access-m65qz\") pod \"b5059cab-0e22-479f-9079-b031f405e547\" (UID: \"b5059cab-0e22-479f-9079-b031f405e547\") " Nov 29 05:30:30 crc kubenswrapper[4594]: I1129 05:30:30.634531 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b5059cab-0e22-479f-9079-b031f405e547-secret-volume\") pod \"b5059cab-0e22-479f-9079-b031f405e547\" (UID: \"b5059cab-0e22-479f-9079-b031f405e547\") " Nov 29 05:30:30 crc kubenswrapper[4594]: I1129 05:30:30.634565 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b5059cab-0e22-479f-9079-b031f405e547-config-volume\") pod \"b5059cab-0e22-479f-9079-b031f405e547\" (UID: \"b5059cab-0e22-479f-9079-b031f405e547\") " Nov 29 05:30:30 crc kubenswrapper[4594]: I1129 05:30:30.635139 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5059cab-0e22-479f-9079-b031f405e547-config-volume" (OuterVolumeSpecName: "config-volume") pod "b5059cab-0e22-479f-9079-b031f405e547" (UID: "b5059cab-0e22-479f-9079-b031f405e547"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:30:30 crc kubenswrapper[4594]: I1129 05:30:30.640798 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5059cab-0e22-479f-9079-b031f405e547-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b5059cab-0e22-479f-9079-b031f405e547" (UID: "b5059cab-0e22-479f-9079-b031f405e547"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:30:30 crc kubenswrapper[4594]: I1129 05:30:30.640858 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5059cab-0e22-479f-9079-b031f405e547-kube-api-access-m65qz" (OuterVolumeSpecName: "kube-api-access-m65qz") pod "b5059cab-0e22-479f-9079-b031f405e547" (UID: "b5059cab-0e22-479f-9079-b031f405e547"). InnerVolumeSpecName "kube-api-access-m65qz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:30:30 crc kubenswrapper[4594]: I1129 05:30:30.658980 4594 patch_prober.go:28] interesting pod/router-default-5444994796-8pvtn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 29 05:30:30 crc kubenswrapper[4594]: [-]has-synced failed: reason withheld Nov 29 05:30:30 crc kubenswrapper[4594]: [+]process-running ok Nov 29 05:30:30 crc kubenswrapper[4594]: healthz check failed Nov 29 05:30:30 crc kubenswrapper[4594]: I1129 05:30:30.659597 4594 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8pvtn" podUID="c3e4199d-2fe7-4039-8f59-b8bab2beebd0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 29 05:30:30 crc kubenswrapper[4594]: I1129 05:30:30.736354 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m65qz\" (UniqueName: 
\"kubernetes.io/projected/b5059cab-0e22-479f-9079-b031f405e547-kube-api-access-m65qz\") on node \"crc\" DevicePath \"\"" Nov 29 05:30:30 crc kubenswrapper[4594]: I1129 05:30:30.736383 4594 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b5059cab-0e22-479f-9079-b031f405e547-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 29 05:30:30 crc kubenswrapper[4594]: I1129 05:30:30.736392 4594 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b5059cab-0e22-479f-9079-b031f405e547-config-volume\") on node \"crc\" DevicePath \"\"" Nov 29 05:30:30 crc kubenswrapper[4594]: I1129 05:30:30.779999 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bvj7k"] Nov 29 05:30:30 crc kubenswrapper[4594]: E1129 05:30:30.780228 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5059cab-0e22-479f-9079-b031f405e547" containerName="collect-profiles" Nov 29 05:30:30 crc kubenswrapper[4594]: I1129 05:30:30.780246 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5059cab-0e22-479f-9079-b031f405e547" containerName="collect-profiles" Nov 29 05:30:30 crc kubenswrapper[4594]: I1129 05:30:30.780392 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5059cab-0e22-479f-9079-b031f405e547" containerName="collect-profiles" Nov 29 05:30:30 crc kubenswrapper[4594]: I1129 05:30:30.781815 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bvj7k" Nov 29 05:30:30 crc kubenswrapper[4594]: I1129 05:30:30.783460 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Nov 29 05:30:30 crc kubenswrapper[4594]: I1129 05:30:30.784722 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bvj7k"] Nov 29 05:30:30 crc kubenswrapper[4594]: I1129 05:30:30.938973 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75b1173d-e8e6-45a7-bd9a-4a3cfafbb8a6-utilities\") pod \"redhat-marketplace-bvj7k\" (UID: \"75b1173d-e8e6-45a7-bd9a-4a3cfafbb8a6\") " pod="openshift-marketplace/redhat-marketplace-bvj7k" Nov 29 05:30:30 crc kubenswrapper[4594]: I1129 05:30:30.939040 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75b1173d-e8e6-45a7-bd9a-4a3cfafbb8a6-catalog-content\") pod \"redhat-marketplace-bvj7k\" (UID: \"75b1173d-e8e6-45a7-bd9a-4a3cfafbb8a6\") " pod="openshift-marketplace/redhat-marketplace-bvj7k" Nov 29 05:30:30 crc kubenswrapper[4594]: I1129 05:30:30.939080 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9f8w8\" (UniqueName: \"kubernetes.io/projected/75b1173d-e8e6-45a7-bd9a-4a3cfafbb8a6-kube-api-access-9f8w8\") pod \"redhat-marketplace-bvj7k\" (UID: \"75b1173d-e8e6-45a7-bd9a-4a3cfafbb8a6\") " pod="openshift-marketplace/redhat-marketplace-bvj7k" Nov 29 05:30:31 crc kubenswrapper[4594]: I1129 05:30:31.040139 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75b1173d-e8e6-45a7-bd9a-4a3cfafbb8a6-utilities\") pod \"redhat-marketplace-bvj7k\" (UID: 
\"75b1173d-e8e6-45a7-bd9a-4a3cfafbb8a6\") " pod="openshift-marketplace/redhat-marketplace-bvj7k" Nov 29 05:30:31 crc kubenswrapper[4594]: I1129 05:30:31.040181 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75b1173d-e8e6-45a7-bd9a-4a3cfafbb8a6-catalog-content\") pod \"redhat-marketplace-bvj7k\" (UID: \"75b1173d-e8e6-45a7-bd9a-4a3cfafbb8a6\") " pod="openshift-marketplace/redhat-marketplace-bvj7k" Nov 29 05:30:31 crc kubenswrapper[4594]: I1129 05:30:31.040211 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9f8w8\" (UniqueName: \"kubernetes.io/projected/75b1173d-e8e6-45a7-bd9a-4a3cfafbb8a6-kube-api-access-9f8w8\") pod \"redhat-marketplace-bvj7k\" (UID: \"75b1173d-e8e6-45a7-bd9a-4a3cfafbb8a6\") " pod="openshift-marketplace/redhat-marketplace-bvj7k" Nov 29 05:30:31 crc kubenswrapper[4594]: I1129 05:30:31.040873 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75b1173d-e8e6-45a7-bd9a-4a3cfafbb8a6-catalog-content\") pod \"redhat-marketplace-bvj7k\" (UID: \"75b1173d-e8e6-45a7-bd9a-4a3cfafbb8a6\") " pod="openshift-marketplace/redhat-marketplace-bvj7k" Nov 29 05:30:31 crc kubenswrapper[4594]: I1129 05:30:31.040990 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75b1173d-e8e6-45a7-bd9a-4a3cfafbb8a6-utilities\") pod \"redhat-marketplace-bvj7k\" (UID: \"75b1173d-e8e6-45a7-bd9a-4a3cfafbb8a6\") " pod="openshift-marketplace/redhat-marketplace-bvj7k" Nov 29 05:30:31 crc kubenswrapper[4594]: I1129 05:30:31.055418 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9f8w8\" (UniqueName: \"kubernetes.io/projected/75b1173d-e8e6-45a7-bd9a-4a3cfafbb8a6-kube-api-access-9f8w8\") pod \"redhat-marketplace-bvj7k\" (UID: 
\"75b1173d-e8e6-45a7-bd9a-4a3cfafbb8a6\") " pod="openshift-marketplace/redhat-marketplace-bvj7k" Nov 29 05:30:31 crc kubenswrapper[4594]: I1129 05:30:31.105548 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bvj7k" Nov 29 05:30:31 crc kubenswrapper[4594]: I1129 05:30:31.179386 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mk4pz"] Nov 29 05:30:31 crc kubenswrapper[4594]: I1129 05:30:31.180732 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mk4pz" Nov 29 05:30:31 crc kubenswrapper[4594]: I1129 05:30:31.193311 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mk4pz"] Nov 29 05:30:31 crc kubenswrapper[4594]: I1129 05:30:31.316487 4594 generic.go:334] "Generic (PLEG): container finished" podID="d838adaa-2be0-45a8-9252-c76cfbadb810" containerID="dd49bed3bef5bcf8f58a3014d202306183994fb6127ba858c147e058555b4d62" exitCode=0 Nov 29 05:30:31 crc kubenswrapper[4594]: I1129 05:30:31.316723 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"d838adaa-2be0-45a8-9252-c76cfbadb810","Type":"ContainerDied","Data":"dd49bed3bef5bcf8f58a3014d202306183994fb6127ba858c147e058555b4d62"} Nov 29 05:30:31 crc kubenswrapper[4594]: I1129 05:30:31.327679 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bvj7k"] Nov 29 05:30:31 crc kubenswrapper[4594]: I1129 05:30:31.328369 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29406570-47pck" event={"ID":"b5059cab-0e22-479f-9079-b031f405e547","Type":"ContainerDied","Data":"130081b2918769b1e6214a194301b433b5d996686cc35a5e69e4e2f004e0084c"} Nov 29 05:30:31 crc kubenswrapper[4594]: I1129 05:30:31.328400 4594 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="130081b2918769b1e6214a194301b433b5d996686cc35a5e69e4e2f004e0084c" Nov 29 05:30:31 crc kubenswrapper[4594]: I1129 05:30:31.328522 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29406570-47pck" Nov 29 05:30:31 crc kubenswrapper[4594]: I1129 05:30:31.343491 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/233cee4b-1c92-42c9-b42e-b2200d4fd842-catalog-content\") pod \"redhat-marketplace-mk4pz\" (UID: \"233cee4b-1c92-42c9-b42e-b2200d4fd842\") " pod="openshift-marketplace/redhat-marketplace-mk4pz" Nov 29 05:30:31 crc kubenswrapper[4594]: I1129 05:30:31.343599 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/233cee4b-1c92-42c9-b42e-b2200d4fd842-utilities\") pod \"redhat-marketplace-mk4pz\" (UID: \"233cee4b-1c92-42c9-b42e-b2200d4fd842\") " pod="openshift-marketplace/redhat-marketplace-mk4pz" Nov 29 05:30:31 crc kubenswrapper[4594]: I1129 05:30:31.343627 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r47t5\" (UniqueName: \"kubernetes.io/projected/233cee4b-1c92-42c9-b42e-b2200d4fd842-kube-api-access-r47t5\") pod \"redhat-marketplace-mk4pz\" (UID: \"233cee4b-1c92-42c9-b42e-b2200d4fd842\") " pod="openshift-marketplace/redhat-marketplace-mk4pz" Nov 29 05:30:31 crc kubenswrapper[4594]: I1129 05:30:31.444329 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/233cee4b-1c92-42c9-b42e-b2200d4fd842-utilities\") pod \"redhat-marketplace-mk4pz\" (UID: \"233cee4b-1c92-42c9-b42e-b2200d4fd842\") " pod="openshift-marketplace/redhat-marketplace-mk4pz" Nov 
29 05:30:31 crc kubenswrapper[4594]: I1129 05:30:31.444364 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r47t5\" (UniqueName: \"kubernetes.io/projected/233cee4b-1c92-42c9-b42e-b2200d4fd842-kube-api-access-r47t5\") pod \"redhat-marketplace-mk4pz\" (UID: \"233cee4b-1c92-42c9-b42e-b2200d4fd842\") " pod="openshift-marketplace/redhat-marketplace-mk4pz" Nov 29 05:30:31 crc kubenswrapper[4594]: I1129 05:30:31.444418 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/233cee4b-1c92-42c9-b42e-b2200d4fd842-catalog-content\") pod \"redhat-marketplace-mk4pz\" (UID: \"233cee4b-1c92-42c9-b42e-b2200d4fd842\") " pod="openshift-marketplace/redhat-marketplace-mk4pz" Nov 29 05:30:31 crc kubenswrapper[4594]: I1129 05:30:31.445111 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/233cee4b-1c92-42c9-b42e-b2200d4fd842-utilities\") pod \"redhat-marketplace-mk4pz\" (UID: \"233cee4b-1c92-42c9-b42e-b2200d4fd842\") " pod="openshift-marketplace/redhat-marketplace-mk4pz" Nov 29 05:30:31 crc kubenswrapper[4594]: I1129 05:30:31.445159 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/233cee4b-1c92-42c9-b42e-b2200d4fd842-catalog-content\") pod \"redhat-marketplace-mk4pz\" (UID: \"233cee4b-1c92-42c9-b42e-b2200d4fd842\") " pod="openshift-marketplace/redhat-marketplace-mk4pz" Nov 29 05:30:31 crc kubenswrapper[4594]: I1129 05:30:31.465069 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r47t5\" (UniqueName: \"kubernetes.io/projected/233cee4b-1c92-42c9-b42e-b2200d4fd842-kube-api-access-r47t5\") pod \"redhat-marketplace-mk4pz\" (UID: \"233cee4b-1c92-42c9-b42e-b2200d4fd842\") " pod="openshift-marketplace/redhat-marketplace-mk4pz" Nov 29 05:30:31 crc kubenswrapper[4594]: 
I1129 05:30:31.504156 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mk4pz" Nov 29 05:30:31 crc kubenswrapper[4594]: I1129 05:30:31.659139 4594 patch_prober.go:28] interesting pod/router-default-5444994796-8pvtn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 29 05:30:31 crc kubenswrapper[4594]: [-]has-synced failed: reason withheld Nov 29 05:30:31 crc kubenswrapper[4594]: [+]process-running ok Nov 29 05:30:31 crc kubenswrapper[4594]: healthz check failed Nov 29 05:30:31 crc kubenswrapper[4594]: I1129 05:30:31.659214 4594 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8pvtn" podUID="c3e4199d-2fe7-4039-8f59-b8bab2beebd0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 29 05:30:31 crc kubenswrapper[4594]: I1129 05:30:31.709462 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mk4pz"] Nov 29 05:30:31 crc kubenswrapper[4594]: I1129 05:30:31.785688 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8ptq2"] Nov 29 05:30:31 crc kubenswrapper[4594]: I1129 05:30:31.787116 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8ptq2" Nov 29 05:30:31 crc kubenswrapper[4594]: I1129 05:30:31.787767 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8ptq2"] Nov 29 05:30:31 crc kubenswrapper[4594]: I1129 05:30:31.788930 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Nov 29 05:30:31 crc kubenswrapper[4594]: I1129 05:30:31.918917 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-9ctf8" Nov 29 05:30:31 crc kubenswrapper[4594]: I1129 05:30:31.918962 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-9ctf8" Nov 29 05:30:31 crc kubenswrapper[4594]: I1129 05:30:31.920333 4594 patch_prober.go:28] interesting pod/console-f9d7485db-9ctf8 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.10:8443/health\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Nov 29 05:30:31 crc kubenswrapper[4594]: I1129 05:30:31.920378 4594 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-9ctf8" podUID="12b5360d-755a-4cb5-9ef3-0c00550e3913" containerName="console" probeResult="failure" output="Get \"https://10.217.0.10:8443/health\": dial tcp 10.217.0.10:8443: connect: connection refused" Nov 29 05:30:31 crc kubenswrapper[4594]: I1129 05:30:31.951094 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9b3dc9b-0c66-4af2-977d-9264c559a827-utilities\") pod \"redhat-operators-8ptq2\" (UID: \"c9b3dc9b-0c66-4af2-977d-9264c559a827\") " pod="openshift-marketplace/redhat-operators-8ptq2" Nov 29 05:30:31 crc kubenswrapper[4594]: I1129 05:30:31.951132 4594 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpcbv\" (UniqueName: \"kubernetes.io/projected/c9b3dc9b-0c66-4af2-977d-9264c559a827-kube-api-access-zpcbv\") pod \"redhat-operators-8ptq2\" (UID: \"c9b3dc9b-0c66-4af2-977d-9264c559a827\") " pod="openshift-marketplace/redhat-operators-8ptq2" Nov 29 05:30:31 crc kubenswrapper[4594]: I1129 05:30:31.951153 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9b3dc9b-0c66-4af2-977d-9264c559a827-catalog-content\") pod \"redhat-operators-8ptq2\" (UID: \"c9b3dc9b-0c66-4af2-977d-9264c559a827\") " pod="openshift-marketplace/redhat-operators-8ptq2" Nov 29 05:30:31 crc kubenswrapper[4594]: I1129 05:30:31.952106 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-xg8s7" Nov 29 05:30:32 crc kubenswrapper[4594]: I1129 05:30:32.052171 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9b3dc9b-0c66-4af2-977d-9264c559a827-utilities\") pod \"redhat-operators-8ptq2\" (UID: \"c9b3dc9b-0c66-4af2-977d-9264c559a827\") " pod="openshift-marketplace/redhat-operators-8ptq2" Nov 29 05:30:32 crc kubenswrapper[4594]: I1129 05:30:32.052221 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpcbv\" (UniqueName: \"kubernetes.io/projected/c9b3dc9b-0c66-4af2-977d-9264c559a827-kube-api-access-zpcbv\") pod \"redhat-operators-8ptq2\" (UID: \"c9b3dc9b-0c66-4af2-977d-9264c559a827\") " pod="openshift-marketplace/redhat-operators-8ptq2" Nov 29 05:30:32 crc kubenswrapper[4594]: I1129 05:30:32.052265 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9b3dc9b-0c66-4af2-977d-9264c559a827-catalog-content\") pod \"redhat-operators-8ptq2\" 
(UID: \"c9b3dc9b-0c66-4af2-977d-9264c559a827\") " pod="openshift-marketplace/redhat-operators-8ptq2" Nov 29 05:30:32 crc kubenswrapper[4594]: I1129 05:30:32.056689 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9b3dc9b-0c66-4af2-977d-9264c559a827-utilities\") pod \"redhat-operators-8ptq2\" (UID: \"c9b3dc9b-0c66-4af2-977d-9264c559a827\") " pod="openshift-marketplace/redhat-operators-8ptq2" Nov 29 05:30:32 crc kubenswrapper[4594]: I1129 05:30:32.056808 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9b3dc9b-0c66-4af2-977d-9264c559a827-catalog-content\") pod \"redhat-operators-8ptq2\" (UID: \"c9b3dc9b-0c66-4af2-977d-9264c559a827\") " pod="openshift-marketplace/redhat-operators-8ptq2" Nov 29 05:30:32 crc kubenswrapper[4594]: I1129 05:30:32.074237 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpcbv\" (UniqueName: \"kubernetes.io/projected/c9b3dc9b-0c66-4af2-977d-9264c559a827-kube-api-access-zpcbv\") pod \"redhat-operators-8ptq2\" (UID: \"c9b3dc9b-0c66-4af2-977d-9264c559a827\") " pod="openshift-marketplace/redhat-operators-8ptq2" Nov 29 05:30:32 crc kubenswrapper[4594]: I1129 05:30:32.104180 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8ptq2" Nov 29 05:30:32 crc kubenswrapper[4594]: I1129 05:30:32.178609 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6nd5v"] Nov 29 05:30:32 crc kubenswrapper[4594]: I1129 05:30:32.179666 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6nd5v" Nov 29 05:30:32 crc kubenswrapper[4594]: I1129 05:30:32.190411 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6nd5v"] Nov 29 05:30:32 crc kubenswrapper[4594]: I1129 05:30:32.334821 4594 generic.go:334] "Generic (PLEG): container finished" podID="75b1173d-e8e6-45a7-bd9a-4a3cfafbb8a6" containerID="2347658a2fcdc86c1e5d50acdfc002387af5984a58e5e1e05fbae56f821115a4" exitCode=0 Nov 29 05:30:32 crc kubenswrapper[4594]: I1129 05:30:32.334937 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bvj7k" event={"ID":"75b1173d-e8e6-45a7-bd9a-4a3cfafbb8a6","Type":"ContainerDied","Data":"2347658a2fcdc86c1e5d50acdfc002387af5984a58e5e1e05fbae56f821115a4"} Nov 29 05:30:32 crc kubenswrapper[4594]: I1129 05:30:32.334989 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bvj7k" event={"ID":"75b1173d-e8e6-45a7-bd9a-4a3cfafbb8a6","Type":"ContainerStarted","Data":"f41599cf42e45e350de8a3ac21b0d15531379a32cd5b3b374ff013f5a98a9785"} Nov 29 05:30:32 crc kubenswrapper[4594]: I1129 05:30:32.363312 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkt9t\" (UniqueName: \"kubernetes.io/projected/9e710a40-5535-4984-b73f-de35c0a9e18f-kube-api-access-wkt9t\") pod \"redhat-operators-6nd5v\" (UID: \"9e710a40-5535-4984-b73f-de35c0a9e18f\") " pod="openshift-marketplace/redhat-operators-6nd5v" Nov 29 05:30:32 crc kubenswrapper[4594]: I1129 05:30:32.363353 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e710a40-5535-4984-b73f-de35c0a9e18f-catalog-content\") pod \"redhat-operators-6nd5v\" (UID: \"9e710a40-5535-4984-b73f-de35c0a9e18f\") " pod="openshift-marketplace/redhat-operators-6nd5v" Nov 29 05:30:32 crc 
kubenswrapper[4594]: I1129 05:30:32.363398 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e710a40-5535-4984-b73f-de35c0a9e18f-utilities\") pod \"redhat-operators-6nd5v\" (UID: \"9e710a40-5535-4984-b73f-de35c0a9e18f\") " pod="openshift-marketplace/redhat-operators-6nd5v" Nov 29 05:30:32 crc kubenswrapper[4594]: I1129 05:30:32.464442 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e710a40-5535-4984-b73f-de35c0a9e18f-catalog-content\") pod \"redhat-operators-6nd5v\" (UID: \"9e710a40-5535-4984-b73f-de35c0a9e18f\") " pod="openshift-marketplace/redhat-operators-6nd5v" Nov 29 05:30:32 crc kubenswrapper[4594]: I1129 05:30:32.464533 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e710a40-5535-4984-b73f-de35c0a9e18f-utilities\") pod \"redhat-operators-6nd5v\" (UID: \"9e710a40-5535-4984-b73f-de35c0a9e18f\") " pod="openshift-marketplace/redhat-operators-6nd5v" Nov 29 05:30:32 crc kubenswrapper[4594]: I1129 05:30:32.464638 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkt9t\" (UniqueName: \"kubernetes.io/projected/9e710a40-5535-4984-b73f-de35c0a9e18f-kube-api-access-wkt9t\") pod \"redhat-operators-6nd5v\" (UID: \"9e710a40-5535-4984-b73f-de35c0a9e18f\") " pod="openshift-marketplace/redhat-operators-6nd5v" Nov 29 05:30:32 crc kubenswrapper[4594]: I1129 05:30:32.465453 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e710a40-5535-4984-b73f-de35c0a9e18f-catalog-content\") pod \"redhat-operators-6nd5v\" (UID: \"9e710a40-5535-4984-b73f-de35c0a9e18f\") " pod="openshift-marketplace/redhat-operators-6nd5v" Nov 29 05:30:32 crc kubenswrapper[4594]: I1129 05:30:32.466336 
4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e710a40-5535-4984-b73f-de35c0a9e18f-utilities\") pod \"redhat-operators-6nd5v\" (UID: \"9e710a40-5535-4984-b73f-de35c0a9e18f\") " pod="openshift-marketplace/redhat-operators-6nd5v" Nov 29 05:30:32 crc kubenswrapper[4594]: I1129 05:30:32.487327 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkt9t\" (UniqueName: \"kubernetes.io/projected/9e710a40-5535-4984-b73f-de35c0a9e18f-kube-api-access-wkt9t\") pod \"redhat-operators-6nd5v\" (UID: \"9e710a40-5535-4984-b73f-de35c0a9e18f\") " pod="openshift-marketplace/redhat-operators-6nd5v" Nov 29 05:30:32 crc kubenswrapper[4594]: I1129 05:30:32.493868 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-jgzzn" Nov 29 05:30:32 crc kubenswrapper[4594]: I1129 05:30:32.499354 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-jgzzn" Nov 29 05:30:32 crc kubenswrapper[4594]: I1129 05:30:32.501664 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6nd5v" Nov 29 05:30:32 crc kubenswrapper[4594]: I1129 05:30:32.655789 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-8pvtn" Nov 29 05:30:32 crc kubenswrapper[4594]: I1129 05:30:32.658592 4594 patch_prober.go:28] interesting pod/router-default-5444994796-8pvtn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 29 05:30:32 crc kubenswrapper[4594]: [-]has-synced failed: reason withheld Nov 29 05:30:32 crc kubenswrapper[4594]: [+]process-running ok Nov 29 05:30:32 crc kubenswrapper[4594]: healthz check failed Nov 29 05:30:32 crc kubenswrapper[4594]: I1129 05:30:32.658643 4594 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8pvtn" podUID="c3e4199d-2fe7-4039-8f59-b8bab2beebd0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 29 05:30:33 crc kubenswrapper[4594]: I1129 05:30:33.230880 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Nov 29 05:30:33 crc kubenswrapper[4594]: I1129 05:30:33.231861 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 29 05:30:33 crc kubenswrapper[4594]: I1129 05:30:33.233871 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Nov 29 05:30:33 crc kubenswrapper[4594]: I1129 05:30:33.234300 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Nov 29 05:30:33 crc kubenswrapper[4594]: I1129 05:30:33.236085 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Nov 29 05:30:33 crc kubenswrapper[4594]: I1129 05:30:33.379912 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b051a5dc-424b-403c-9f3a-d44096f2a847-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"b051a5dc-424b-403c-9f3a-d44096f2a847\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 29 05:30:33 crc kubenswrapper[4594]: I1129 05:30:33.379984 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b051a5dc-424b-403c-9f3a-d44096f2a847-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"b051a5dc-424b-403c-9f3a-d44096f2a847\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 29 05:30:33 crc kubenswrapper[4594]: I1129 05:30:33.481762 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b051a5dc-424b-403c-9f3a-d44096f2a847-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"b051a5dc-424b-403c-9f3a-d44096f2a847\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 29 05:30:33 crc kubenswrapper[4594]: I1129 05:30:33.481826 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/b051a5dc-424b-403c-9f3a-d44096f2a847-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"b051a5dc-424b-403c-9f3a-d44096f2a847\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 29 05:30:33 crc kubenswrapper[4594]: I1129 05:30:33.482167 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b051a5dc-424b-403c-9f3a-d44096f2a847-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"b051a5dc-424b-403c-9f3a-d44096f2a847\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 29 05:30:33 crc kubenswrapper[4594]: I1129 05:30:33.497394 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b051a5dc-424b-403c-9f3a-d44096f2a847-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"b051a5dc-424b-403c-9f3a-d44096f2a847\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 29 05:30:33 crc kubenswrapper[4594]: I1129 05:30:33.559665 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 29 05:30:33 crc kubenswrapper[4594]: I1129 05:30:33.658584 4594 patch_prober.go:28] interesting pod/router-default-5444994796-8pvtn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 29 05:30:33 crc kubenswrapper[4594]: [-]has-synced failed: reason withheld Nov 29 05:30:33 crc kubenswrapper[4594]: [+]process-running ok Nov 29 05:30:33 crc kubenswrapper[4594]: healthz check failed Nov 29 05:30:33 crc kubenswrapper[4594]: I1129 05:30:33.658666 4594 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8pvtn" podUID="c3e4199d-2fe7-4039-8f59-b8bab2beebd0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 29 05:30:34 crc kubenswrapper[4594]: I1129 05:30:34.658793 4594 patch_prober.go:28] interesting pod/router-default-5444994796-8pvtn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 29 05:30:34 crc kubenswrapper[4594]: [-]has-synced failed: reason withheld Nov 29 05:30:34 crc kubenswrapper[4594]: [+]process-running ok Nov 29 05:30:34 crc kubenswrapper[4594]: healthz check failed Nov 29 05:30:34 crc kubenswrapper[4594]: I1129 05:30:34.658970 4594 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8pvtn" podUID="c3e4199d-2fe7-4039-8f59-b8bab2beebd0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 29 05:30:34 crc kubenswrapper[4594]: W1129 05:30:34.685178 4594 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod233cee4b_1c92_42c9_b42e_b2200d4fd842.slice/crio-58c3153596332298144bb2d0cb853cafa95e531db88a405e58f4868a4ef09fce WatchSource:0}: Error finding container 58c3153596332298144bb2d0cb853cafa95e531db88a405e58f4868a4ef09fce: Status 404 returned error can't find the container with id 58c3153596332298144bb2d0cb853cafa95e531db88a405e58f4868a4ef09fce Nov 29 05:30:34 crc kubenswrapper[4594]: I1129 05:30:34.716501 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 29 05:30:34 crc kubenswrapper[4594]: I1129 05:30:34.902118 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d838adaa-2be0-45a8-9252-c76cfbadb810-kube-api-access\") pod \"d838adaa-2be0-45a8-9252-c76cfbadb810\" (UID: \"d838adaa-2be0-45a8-9252-c76cfbadb810\") " Nov 29 05:30:34 crc kubenswrapper[4594]: I1129 05:30:34.902352 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d838adaa-2be0-45a8-9252-c76cfbadb810-kubelet-dir\") pod \"d838adaa-2be0-45a8-9252-c76cfbadb810\" (UID: \"d838adaa-2be0-45a8-9252-c76cfbadb810\") " Nov 29 05:30:34 crc kubenswrapper[4594]: I1129 05:30:34.902464 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d838adaa-2be0-45a8-9252-c76cfbadb810-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "d838adaa-2be0-45a8-9252-c76cfbadb810" (UID: "d838adaa-2be0-45a8-9252-c76cfbadb810"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 05:30:34 crc kubenswrapper[4594]: I1129 05:30:34.902720 4594 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d838adaa-2be0-45a8-9252-c76cfbadb810-kubelet-dir\") on node \"crc\" DevicePath \"\"" Nov 29 05:30:34 crc kubenswrapper[4594]: I1129 05:30:34.906144 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d838adaa-2be0-45a8-9252-c76cfbadb810-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "d838adaa-2be0-45a8-9252-c76cfbadb810" (UID: "d838adaa-2be0-45a8-9252-c76cfbadb810"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:30:35 crc kubenswrapper[4594]: I1129 05:30:35.004233 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d838adaa-2be0-45a8-9252-c76cfbadb810-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 29 05:30:35 crc kubenswrapper[4594]: I1129 05:30:35.353466 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"d838adaa-2be0-45a8-9252-c76cfbadb810","Type":"ContainerDied","Data":"c1b765ead6d9a914e91992622f46e0440f9fe1891e597cf6bbd8ade7a05f58fb"} Nov 29 05:30:35 crc kubenswrapper[4594]: I1129 05:30:35.353771 4594 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c1b765ead6d9a914e91992622f46e0440f9fe1891e597cf6bbd8ade7a05f58fb" Nov 29 05:30:35 crc kubenswrapper[4594]: I1129 05:30:35.353550 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 29 05:30:35 crc kubenswrapper[4594]: I1129 05:30:35.357124 4594 generic.go:334] "Generic (PLEG): container finished" podID="233cee4b-1c92-42c9-b42e-b2200d4fd842" containerID="af2f6fae5a8052ad621536764c6bbd415d988c68c73b68bdeb09d6c761609ec5" exitCode=0 Nov 29 05:30:35 crc kubenswrapper[4594]: I1129 05:30:35.357181 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mk4pz" event={"ID":"233cee4b-1c92-42c9-b42e-b2200d4fd842","Type":"ContainerDied","Data":"af2f6fae5a8052ad621536764c6bbd415d988c68c73b68bdeb09d6c761609ec5"} Nov 29 05:30:35 crc kubenswrapper[4594]: I1129 05:30:35.357219 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mk4pz" event={"ID":"233cee4b-1c92-42c9-b42e-b2200d4fd842","Type":"ContainerStarted","Data":"58c3153596332298144bb2d0cb853cafa95e531db88a405e58f4868a4ef09fce"} Nov 29 05:30:35 crc kubenswrapper[4594]: I1129 05:30:35.436324 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6nd5v"] Nov 29 05:30:35 crc kubenswrapper[4594]: I1129 05:30:35.438600 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8ptq2"] Nov 29 05:30:35 crc kubenswrapper[4594]: W1129 05:30:35.446110 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e710a40_5535_4984_b73f_de35c0a9e18f.slice/crio-5d57dfdf9b494d10ea64f22d6fc3608abce30a438e4117cebfb00b48e75043bf WatchSource:0}: Error finding container 5d57dfdf9b494d10ea64f22d6fc3608abce30a438e4117cebfb00b48e75043bf: Status 404 returned error can't find the container with id 5d57dfdf9b494d10ea64f22d6fc3608abce30a438e4117cebfb00b48e75043bf Nov 29 05:30:35 crc kubenswrapper[4594]: W1129 05:30:35.449030 4594 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9b3dc9b_0c66_4af2_977d_9264c559a827.slice/crio-d70eb50c02646b1b7c842056abc579499fb93a5b4dbfb8e33451f70a5896db86 WatchSource:0}: Error finding container d70eb50c02646b1b7c842056abc579499fb93a5b4dbfb8e33451f70a5896db86: Status 404 returned error can't find the container with id d70eb50c02646b1b7c842056abc579499fb93a5b4dbfb8e33451f70a5896db86 Nov 29 05:30:35 crc kubenswrapper[4594]: I1129 05:30:35.493628 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Nov 29 05:30:35 crc kubenswrapper[4594]: I1129 05:30:35.658909 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-8pvtn" Nov 29 05:30:35 crc kubenswrapper[4594]: I1129 05:30:35.661590 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-8pvtn" Nov 29 05:30:36 crc kubenswrapper[4594]: I1129 05:30:36.364868 4594 generic.go:334] "Generic (PLEG): container finished" podID="b051a5dc-424b-403c-9f3a-d44096f2a847" containerID="fb47c37ad4b680c875cb2a8678294d13f59efc84f9280e809016f8677ca24e05" exitCode=0 Nov 29 05:30:36 crc kubenswrapper[4594]: I1129 05:30:36.364951 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"b051a5dc-424b-403c-9f3a-d44096f2a847","Type":"ContainerDied","Data":"fb47c37ad4b680c875cb2a8678294d13f59efc84f9280e809016f8677ca24e05"} Nov 29 05:30:36 crc kubenswrapper[4594]: I1129 05:30:36.365347 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"b051a5dc-424b-403c-9f3a-d44096f2a847","Type":"ContainerStarted","Data":"569788add4e3703ba827f21601dc42921aa535c4a3bb44fe6b36304c8dd869de"} Nov 29 05:30:36 crc kubenswrapper[4594]: I1129 05:30:36.368569 4594 generic.go:334] "Generic (PLEG): container finished" 
podID="c9b3dc9b-0c66-4af2-977d-9264c559a827" containerID="05205bd34a1bb15a74e40a53e8ed80350765840b945e6338da88a923da116863" exitCode=0 Nov 29 05:30:36 crc kubenswrapper[4594]: I1129 05:30:36.368644 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8ptq2" event={"ID":"c9b3dc9b-0c66-4af2-977d-9264c559a827","Type":"ContainerDied","Data":"05205bd34a1bb15a74e40a53e8ed80350765840b945e6338da88a923da116863"} Nov 29 05:30:36 crc kubenswrapper[4594]: I1129 05:30:36.368671 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8ptq2" event={"ID":"c9b3dc9b-0c66-4af2-977d-9264c559a827","Type":"ContainerStarted","Data":"d70eb50c02646b1b7c842056abc579499fb93a5b4dbfb8e33451f70a5896db86"} Nov 29 05:30:36 crc kubenswrapper[4594]: I1129 05:30:36.370843 4594 generic.go:334] "Generic (PLEG): container finished" podID="9e710a40-5535-4984-b73f-de35c0a9e18f" containerID="e073b5ce14c0575045a19ac89d3495bb86003bbf36fa0ef3ee23bce5db690b30" exitCode=0 Nov 29 05:30:36 crc kubenswrapper[4594]: I1129 05:30:36.370902 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6nd5v" event={"ID":"9e710a40-5535-4984-b73f-de35c0a9e18f","Type":"ContainerDied","Data":"e073b5ce14c0575045a19ac89d3495bb86003bbf36fa0ef3ee23bce5db690b30"} Nov 29 05:30:36 crc kubenswrapper[4594]: I1129 05:30:36.370951 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6nd5v" event={"ID":"9e710a40-5535-4984-b73f-de35c0a9e18f","Type":"ContainerStarted","Data":"5d57dfdf9b494d10ea64f22d6fc3608abce30a438e4117cebfb00b48e75043bf"} Nov 29 05:30:37 crc kubenswrapper[4594]: I1129 05:30:37.460918 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-g4pv8" Nov 29 05:30:38 crc kubenswrapper[4594]: I1129 05:30:38.975409 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/217088b9-a48b-40c7-8d83-f9ff0eb24908-metrics-certs\") pod \"network-metrics-daemon-lzr56\" (UID: \"217088b9-a48b-40c7-8d83-f9ff0eb24908\") " pod="openshift-multus/network-metrics-daemon-lzr56" Nov 29 05:30:38 crc kubenswrapper[4594]: I1129 05:30:38.981881 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/217088b9-a48b-40c7-8d83-f9ff0eb24908-metrics-certs\") pod \"network-metrics-daemon-lzr56\" (UID: \"217088b9-a48b-40c7-8d83-f9ff0eb24908\") " pod="openshift-multus/network-metrics-daemon-lzr56" Nov 29 05:30:39 crc kubenswrapper[4594]: I1129 05:30:39.114227 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lzr56" Nov 29 05:30:39 crc kubenswrapper[4594]: I1129 05:30:39.920145 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 29 05:30:40 crc kubenswrapper[4594]: I1129 05:30:40.088153 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b051a5dc-424b-403c-9f3a-d44096f2a847-kubelet-dir\") pod \"b051a5dc-424b-403c-9f3a-d44096f2a847\" (UID: \"b051a5dc-424b-403c-9f3a-d44096f2a847\") " Nov 29 05:30:40 crc kubenswrapper[4594]: I1129 05:30:40.088207 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b051a5dc-424b-403c-9f3a-d44096f2a847-kube-api-access\") pod \"b051a5dc-424b-403c-9f3a-d44096f2a847\" (UID: \"b051a5dc-424b-403c-9f3a-d44096f2a847\") " Nov 29 05:30:40 crc kubenswrapper[4594]: I1129 05:30:40.088202 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b051a5dc-424b-403c-9f3a-d44096f2a847-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod 
"b051a5dc-424b-403c-9f3a-d44096f2a847" (UID: "b051a5dc-424b-403c-9f3a-d44096f2a847"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 05:30:40 crc kubenswrapper[4594]: I1129 05:30:40.089337 4594 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b051a5dc-424b-403c-9f3a-d44096f2a847-kubelet-dir\") on node \"crc\" DevicePath \"\"" Nov 29 05:30:40 crc kubenswrapper[4594]: I1129 05:30:40.093271 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b051a5dc-424b-403c-9f3a-d44096f2a847-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "b051a5dc-424b-403c-9f3a-d44096f2a847" (UID: "b051a5dc-424b-403c-9f3a-d44096f2a847"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:30:40 crc kubenswrapper[4594]: I1129 05:30:40.190455 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b051a5dc-424b-403c-9f3a-d44096f2a847-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 29 05:30:40 crc kubenswrapper[4594]: I1129 05:30:40.405121 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"b051a5dc-424b-403c-9f3a-d44096f2a847","Type":"ContainerDied","Data":"569788add4e3703ba827f21601dc42921aa535c4a3bb44fe6b36304c8dd869de"} Nov 29 05:30:40 crc kubenswrapper[4594]: I1129 05:30:40.405174 4594 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="569788add4e3703ba827f21601dc42921aa535c4a3bb44fe6b36304c8dd869de" Nov 29 05:30:40 crc kubenswrapper[4594]: I1129 05:30:40.405240 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 29 05:30:41 crc kubenswrapper[4594]: I1129 05:30:41.923406 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-9ctf8" Nov 29 05:30:41 crc kubenswrapper[4594]: I1129 05:30:41.929594 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-9ctf8" Nov 29 05:30:44 crc kubenswrapper[4594]: I1129 05:30:44.216801 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-lzr56"] Nov 29 05:30:45 crc kubenswrapper[4594]: I1129 05:30:45.800290 4594 patch_prober.go:28] interesting pod/machine-config-daemon-ggz4n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 05:30:45 crc kubenswrapper[4594]: I1129 05:30:45.800857 4594 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 05:30:46 crc kubenswrapper[4594]: I1129 05:30:46.450419 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-lzr56" event={"ID":"217088b9-a48b-40c7-8d83-f9ff0eb24908","Type":"ContainerStarted","Data":"18e611d506c001a0c8367bf56242bd794baca3be6a450f1d5dc8f584e1190e78"} Nov 29 05:30:46 crc kubenswrapper[4594]: I1129 05:30:46.457739 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pzf9l" 
event={"ID":"7ec39acd-e54b-4bf5-99e7-dae22ecfceda","Type":"ContainerStarted","Data":"138846ff53c4a760ac11ede014a43aa729c0cdfbbf0e6c2a7b82687322faabb6"} Nov 29 05:30:47 crc kubenswrapper[4594]: I1129 05:30:47.482893 4594 generic.go:334] "Generic (PLEG): container finished" podID="c9b3dc9b-0c66-4af2-977d-9264c559a827" containerID="1e5b7774d662e70c16f62aff22b498aa3731e80ff00ed0ee8b12169402632ee3" exitCode=0 Nov 29 05:30:47 crc kubenswrapper[4594]: I1129 05:30:47.483029 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8ptq2" event={"ID":"c9b3dc9b-0c66-4af2-977d-9264c559a827","Type":"ContainerDied","Data":"1e5b7774d662e70c16f62aff22b498aa3731e80ff00ed0ee8b12169402632ee3"} Nov 29 05:30:47 crc kubenswrapper[4594]: I1129 05:30:47.486949 4594 generic.go:334] "Generic (PLEG): container finished" podID="2b771218-fa00-44fe-8cb7-35abccdf9e10" containerID="74ddff4dde2cb0e6a58265f42af817d17fbb04d36850a8613ae0a834fea109a2" exitCode=0 Nov 29 05:30:47 crc kubenswrapper[4594]: I1129 05:30:47.487079 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z7jx7" event={"ID":"2b771218-fa00-44fe-8cb7-35abccdf9e10","Type":"ContainerDied","Data":"74ddff4dde2cb0e6a58265f42af817d17fbb04d36850a8613ae0a834fea109a2"} Nov 29 05:30:47 crc kubenswrapper[4594]: I1129 05:30:47.489998 4594 generic.go:334] "Generic (PLEG): container finished" podID="96d7568a-7c09-407d-b2f4-fa4110cb7ecd" containerID="d0bbff5e2ada16ac0d58bec8f050a4ef950b06315681689d7b3800e9390321cd" exitCode=0 Nov 29 05:30:47 crc kubenswrapper[4594]: I1129 05:30:47.490068 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6pg58" event={"ID":"96d7568a-7c09-407d-b2f4-fa4110cb7ecd","Type":"ContainerDied","Data":"d0bbff5e2ada16ac0d58bec8f050a4ef950b06315681689d7b3800e9390321cd"} Nov 29 05:30:47 crc kubenswrapper[4594]: I1129 05:30:47.493559 4594 generic.go:334] "Generic (PLEG): container 
finished" podID="233cee4b-1c92-42c9-b42e-b2200d4fd842" containerID="7528a65a533439f4cef885f0d301442b64596bed2d8cc7df05efb71f83462174" exitCode=0 Nov 29 05:30:47 crc kubenswrapper[4594]: I1129 05:30:47.493764 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mk4pz" event={"ID":"233cee4b-1c92-42c9-b42e-b2200d4fd842","Type":"ContainerDied","Data":"7528a65a533439f4cef885f0d301442b64596bed2d8cc7df05efb71f83462174"} Nov 29 05:30:47 crc kubenswrapper[4594]: I1129 05:30:47.496945 4594 generic.go:334] "Generic (PLEG): container finished" podID="9e710a40-5535-4984-b73f-de35c0a9e18f" containerID="597d455a1efd492ffe7588a018a8794a7cd8ede8ab85a24f6f5d7aeba2c75768" exitCode=0 Nov 29 05:30:47 crc kubenswrapper[4594]: I1129 05:30:47.497012 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6nd5v" event={"ID":"9e710a40-5535-4984-b73f-de35c0a9e18f","Type":"ContainerDied","Data":"597d455a1efd492ffe7588a018a8794a7cd8ede8ab85a24f6f5d7aeba2c75768"} Nov 29 05:30:47 crc kubenswrapper[4594]: I1129 05:30:47.503298 4594 generic.go:334] "Generic (PLEG): container finished" podID="37a1effb-361a-4fcf-b9f1-f32692c38587" containerID="c10ed0b147f47379703440c01774a3a4469bc39c26beebb69e7aba16110c3cb4" exitCode=0 Nov 29 05:30:47 crc kubenswrapper[4594]: I1129 05:30:47.503355 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zrcl4" event={"ID":"37a1effb-361a-4fcf-b9f1-f32692c38587","Type":"ContainerDied","Data":"c10ed0b147f47379703440c01774a3a4469bc39c26beebb69e7aba16110c3cb4"} Nov 29 05:30:47 crc kubenswrapper[4594]: I1129 05:30:47.506766 4594 generic.go:334] "Generic (PLEG): container finished" podID="75b1173d-e8e6-45a7-bd9a-4a3cfafbb8a6" containerID="4dfd4b94ec8f9116309b01b2ee85d8764efb71cfeedddaeeb4719278d427737d" exitCode=0 Nov 29 05:30:47 crc kubenswrapper[4594]: I1129 05:30:47.506805 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-bvj7k" event={"ID":"75b1173d-e8e6-45a7-bd9a-4a3cfafbb8a6","Type":"ContainerDied","Data":"4dfd4b94ec8f9116309b01b2ee85d8764efb71cfeedddaeeb4719278d427737d"} Nov 29 05:30:47 crc kubenswrapper[4594]: I1129 05:30:47.510052 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-lzr56" event={"ID":"217088b9-a48b-40c7-8d83-f9ff0eb24908","Type":"ContainerStarted","Data":"a126be462fde76dd39d750d0e616026bdaed1b911eb1c5547aeb5edeb1a5e031"} Nov 29 05:30:47 crc kubenswrapper[4594]: I1129 05:30:47.510155 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-lzr56" event={"ID":"217088b9-a48b-40c7-8d83-f9ff0eb24908","Type":"ContainerStarted","Data":"3d8d6b50662e0d35f729f12508350d5c0ba2ed865d8c8d1ec9d3301b8d2998eb"} Nov 29 05:30:47 crc kubenswrapper[4594]: I1129 05:30:47.516889 4594 generic.go:334] "Generic (PLEG): container finished" podID="7ec39acd-e54b-4bf5-99e7-dae22ecfceda" containerID="138846ff53c4a760ac11ede014a43aa729c0cdfbbf0e6c2a7b82687322faabb6" exitCode=0 Nov 29 05:30:47 crc kubenswrapper[4594]: I1129 05:30:47.516956 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pzf9l" event={"ID":"7ec39acd-e54b-4bf5-99e7-dae22ecfceda","Type":"ContainerDied","Data":"138846ff53c4a760ac11ede014a43aa729c0cdfbbf0e6c2a7b82687322faabb6"} Nov 29 05:30:47 crc kubenswrapper[4594]: I1129 05:30:47.602914 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-lzr56" podStartSLOduration=150.602890767 podStartE2EDuration="2m30.602890767s" podCreationTimestamp="2025-11-29 05:28:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:30:47.600317819 +0000 UTC m=+171.840827038" watchObservedRunningTime="2025-11-29 05:30:47.602890767 +0000 UTC 
m=+171.843399987" Nov 29 05:30:48 crc kubenswrapper[4594]: I1129 05:30:48.450526 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-chxbx" Nov 29 05:30:48 crc kubenswrapper[4594]: I1129 05:30:48.548368 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8ptq2" event={"ID":"c9b3dc9b-0c66-4af2-977d-9264c559a827","Type":"ContainerStarted","Data":"df05ca65761e9cfdcf18fb9d190e6f8ec60d6b2759f454a6afa502e17bc13401"} Nov 29 05:30:48 crc kubenswrapper[4594]: I1129 05:30:48.555706 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bvj7k" event={"ID":"75b1173d-e8e6-45a7-bd9a-4a3cfafbb8a6","Type":"ContainerStarted","Data":"54f11601c350c13255016018ed5cd9993d5912e66a359c2b76a489b2003e3302"} Nov 29 05:30:48 crc kubenswrapper[4594]: I1129 05:30:48.567790 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8ptq2" podStartSLOduration=5.60836319 podStartE2EDuration="17.567574929s" podCreationTimestamp="2025-11-29 05:30:31 +0000 UTC" firstStartedPulling="2025-11-29 05:30:36.371569953 +0000 UTC m=+160.612079163" lastFinishedPulling="2025-11-29 05:30:48.330781682 +0000 UTC m=+172.571290902" observedRunningTime="2025-11-29 05:30:48.563315969 +0000 UTC m=+172.803825189" watchObservedRunningTime="2025-11-29 05:30:48.567574929 +0000 UTC m=+172.808084150" Nov 29 05:30:48 crc kubenswrapper[4594]: I1129 05:30:48.580017 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bvj7k" podStartSLOduration=4.920317972 podStartE2EDuration="18.580001597s" podCreationTimestamp="2025-11-29 05:30:30 +0000 UTC" firstStartedPulling="2025-11-29 05:30:34.680627939 +0000 UTC m=+158.921137159" lastFinishedPulling="2025-11-29 05:30:48.340311564 +0000 UTC m=+172.580820784" observedRunningTime="2025-11-29 
05:30:48.57927079 +0000 UTC m=+172.819780010" watchObservedRunningTime="2025-11-29 05:30:48.580001597 +0000 UTC m=+172.820510817" Nov 29 05:30:49 crc kubenswrapper[4594]: I1129 05:30:49.562109 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pzf9l" event={"ID":"7ec39acd-e54b-4bf5-99e7-dae22ecfceda","Type":"ContainerStarted","Data":"6b3c986dad39b8ebccd65bd98dba64461672fe866c1e13346ea50c44df6f2b37"} Nov 29 05:30:49 crc kubenswrapper[4594]: I1129 05:30:49.565030 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6nd5v" event={"ID":"9e710a40-5535-4984-b73f-de35c0a9e18f","Type":"ContainerStarted","Data":"bde8b8776ad161b0a34b79e15f6b54def726ef1891ee2c0c646cdef7f7789782"} Nov 29 05:30:49 crc kubenswrapper[4594]: I1129 05:30:49.567429 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z7jx7" event={"ID":"2b771218-fa00-44fe-8cb7-35abccdf9e10","Type":"ContainerStarted","Data":"1c3cde7c5290029e40130ecc5c6650c4a2375c737758e663face5f273eacab03"} Nov 29 05:30:49 crc kubenswrapper[4594]: I1129 05:30:49.569586 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zrcl4" event={"ID":"37a1effb-361a-4fcf-b9f1-f32692c38587","Type":"ContainerStarted","Data":"d5ca149bdf94cdf62e2abc13d460e8ab82aa261f1b0d93f88e420dcc7d5c093a"} Nov 29 05:30:49 crc kubenswrapper[4594]: I1129 05:30:49.571609 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6pg58" event={"ID":"96d7568a-7c09-407d-b2f4-fa4110cb7ecd","Type":"ContainerStarted","Data":"3ceab488a208421282057590799deccaf8c0f3d086fe687f5a2de25ab680238e"} Nov 29 05:30:49 crc kubenswrapper[4594]: I1129 05:30:49.574981 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mk4pz" 
event={"ID":"233cee4b-1c92-42c9-b42e-b2200d4fd842","Type":"ContainerStarted","Data":"41c59c691b965436523bda68f988151439f6237429806c430a078e82587bc338"} Nov 29 05:30:49 crc kubenswrapper[4594]: I1129 05:30:49.581837 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-pzf9l" podStartSLOduration=2.453741488 podStartE2EDuration="21.581827682s" podCreationTimestamp="2025-11-29 05:30:28 +0000 UTC" firstStartedPulling="2025-11-29 05:30:29.280613077 +0000 UTC m=+153.521122286" lastFinishedPulling="2025-11-29 05:30:48.408699259 +0000 UTC m=+172.649208480" observedRunningTime="2025-11-29 05:30:49.57853644 +0000 UTC m=+173.819045660" watchObservedRunningTime="2025-11-29 05:30:49.581827682 +0000 UTC m=+173.822336902" Nov 29 05:30:49 crc kubenswrapper[4594]: I1129 05:30:49.597227 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6nd5v" podStartSLOduration=5.564998379 podStartE2EDuration="17.597204684s" podCreationTimestamp="2025-11-29 05:30:32 +0000 UTC" firstStartedPulling="2025-11-29 05:30:36.372971744 +0000 UTC m=+160.613480963" lastFinishedPulling="2025-11-29 05:30:48.405178049 +0000 UTC m=+172.645687268" observedRunningTime="2025-11-29 05:30:49.595245241 +0000 UTC m=+173.835754461" watchObservedRunningTime="2025-11-29 05:30:49.597204684 +0000 UTC m=+173.837713894" Nov 29 05:30:49 crc kubenswrapper[4594]: I1129 05:30:49.623133 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zrcl4" podStartSLOduration=3.506806416 podStartE2EDuration="21.623121796s" podCreationTimestamp="2025-11-29 05:30:28 +0000 UTC" firstStartedPulling="2025-11-29 05:30:30.297548443 +0000 UTC m=+154.538057663" lastFinishedPulling="2025-11-29 05:30:48.413863823 +0000 UTC m=+172.654373043" observedRunningTime="2025-11-29 05:30:49.609965865 +0000 UTC m=+173.850475086" watchObservedRunningTime="2025-11-29 
05:30:49.623121796 +0000 UTC m=+173.863631016" Nov 29 05:30:49 crc kubenswrapper[4594]: I1129 05:30:49.626213 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mk4pz" podStartSLOduration=5.629930112 podStartE2EDuration="18.626207714s" podCreationTimestamp="2025-11-29 05:30:31 +0000 UTC" firstStartedPulling="2025-11-29 05:30:35.361204446 +0000 UTC m=+159.601713666" lastFinishedPulling="2025-11-29 05:30:48.357482047 +0000 UTC m=+172.597991268" observedRunningTime="2025-11-29 05:30:49.62266918 +0000 UTC m=+173.863178400" watchObservedRunningTime="2025-11-29 05:30:49.626207714 +0000 UTC m=+173.866716924" Nov 29 05:30:49 crc kubenswrapper[4594]: I1129 05:30:49.641932 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6pg58" podStartSLOduration=2.425510794 podStartE2EDuration="20.641926765s" podCreationTimestamp="2025-11-29 05:30:29 +0000 UTC" firstStartedPulling="2025-11-29 05:30:30.301288384 +0000 UTC m=+154.541797594" lastFinishedPulling="2025-11-29 05:30:48.517704345 +0000 UTC m=+172.758213565" observedRunningTime="2025-11-29 05:30:49.640922428 +0000 UTC m=+173.881431648" watchObservedRunningTime="2025-11-29 05:30:49.641926765 +0000 UTC m=+173.882435985" Nov 29 05:30:49 crc kubenswrapper[4594]: I1129 05:30:49.653589 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-z7jx7" podStartSLOduration=3.464605249 podStartE2EDuration="21.653584796s" podCreationTimestamp="2025-11-29 05:30:28 +0000 UTC" firstStartedPulling="2025-11-29 05:30:30.291990435 +0000 UTC m=+154.532499656" lastFinishedPulling="2025-11-29 05:30:48.480969983 +0000 UTC m=+172.721479203" observedRunningTime="2025-11-29 05:30:49.652615534 +0000 UTC m=+173.893124754" watchObservedRunningTime="2025-11-29 05:30:49.653584796 +0000 UTC m=+173.894094016" Nov 29 05:30:51 crc kubenswrapper[4594]: I1129 
05:30:51.106870 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bvj7k" Nov 29 05:30:51 crc kubenswrapper[4594]: I1129 05:30:51.106927 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bvj7k" Nov 29 05:30:51 crc kubenswrapper[4594]: I1129 05:30:51.181965 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bvj7k" Nov 29 05:30:51 crc kubenswrapper[4594]: I1129 05:30:51.505135 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mk4pz" Nov 29 05:30:51 crc kubenswrapper[4594]: I1129 05:30:51.505681 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mk4pz" Nov 29 05:30:51 crc kubenswrapper[4594]: I1129 05:30:51.543422 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mk4pz" Nov 29 05:30:52 crc kubenswrapper[4594]: I1129 05:30:52.105321 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8ptq2" Nov 29 05:30:52 crc kubenswrapper[4594]: I1129 05:30:52.105642 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8ptq2" Nov 29 05:30:52 crc kubenswrapper[4594]: I1129 05:30:52.503042 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6nd5v" Nov 29 05:30:52 crc kubenswrapper[4594]: I1129 05:30:52.503366 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6nd5v" Nov 29 05:30:53 crc kubenswrapper[4594]: I1129 05:30:53.143445 4594 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-8ptq2" podUID="c9b3dc9b-0c66-4af2-977d-9264c559a827" containerName="registry-server" probeResult="failure" output=< Nov 29 05:30:53 crc kubenswrapper[4594]: timeout: failed to connect service ":50051" within 1s Nov 29 05:30:53 crc kubenswrapper[4594]: > Nov 29 05:30:53 crc kubenswrapper[4594]: I1129 05:30:53.531977 4594 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-6nd5v" podUID="9e710a40-5535-4984-b73f-de35c0a9e18f" containerName="registry-server" probeResult="failure" output=< Nov 29 05:30:53 crc kubenswrapper[4594]: timeout: failed to connect service ":50051" within 1s Nov 29 05:30:53 crc kubenswrapper[4594]: > Nov 29 05:30:58 crc kubenswrapper[4594]: I1129 05:30:58.922100 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-pzf9l" Nov 29 05:30:58 crc kubenswrapper[4594]: I1129 05:30:58.923479 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-pzf9l" Nov 29 05:30:58 crc kubenswrapper[4594]: I1129 05:30:58.958644 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-pzf9l" Nov 29 05:30:59 crc kubenswrapper[4594]: I1129 05:30:59.108791 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-zrcl4" Nov 29 05:30:59 crc kubenswrapper[4594]: I1129 05:30:59.108920 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zrcl4" Nov 29 05:30:59 crc kubenswrapper[4594]: I1129 05:30:59.143741 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-zrcl4" Nov 29 05:30:59 crc kubenswrapper[4594]: I1129 05:30:59.340779 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/certified-operators-z7jx7" Nov 29 05:30:59 crc kubenswrapper[4594]: I1129 05:30:59.341313 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-z7jx7" Nov 29 05:30:59 crc kubenswrapper[4594]: I1129 05:30:59.375760 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-z7jx7" Nov 29 05:30:59 crc kubenswrapper[4594]: I1129 05:30:59.520400 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6pg58" Nov 29 05:30:59 crc kubenswrapper[4594]: I1129 05:30:59.520467 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6pg58" Nov 29 05:30:59 crc kubenswrapper[4594]: I1129 05:30:59.554222 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6pg58" Nov 29 05:30:59 crc kubenswrapper[4594]: I1129 05:30:59.676550 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6pg58" Nov 29 05:30:59 crc kubenswrapper[4594]: I1129 05:30:59.680791 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zrcl4" Nov 29 05:30:59 crc kubenswrapper[4594]: I1129 05:30:59.683016 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-pzf9l" Nov 29 05:30:59 crc kubenswrapper[4594]: I1129 05:30:59.687498 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-z7jx7" Nov 29 05:31:00 crc kubenswrapper[4594]: I1129 05:31:00.636846 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-rwbhx"] Nov 29 05:31:00 crc kubenswrapper[4594]: 
I1129 05:31:00.783563 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6pg58"] Nov 29 05:31:01 crc kubenswrapper[4594]: I1129 05:31:01.143365 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bvj7k" Nov 29 05:31:01 crc kubenswrapper[4594]: I1129 05:31:01.539370 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mk4pz" Nov 29 05:31:01 crc kubenswrapper[4594]: I1129 05:31:01.655282 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-6pg58" podUID="96d7568a-7c09-407d-b2f4-fa4110cb7ecd" containerName="registry-server" containerID="cri-o://3ceab488a208421282057590799deccaf8c0f3d086fe687f5a2de25ab680238e" gracePeriod=2 Nov 29 05:31:01 crc kubenswrapper[4594]: I1129 05:31:01.795871 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-z7jx7"] Nov 29 05:31:02 crc kubenswrapper[4594]: I1129 05:31:02.004635 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6pg58" Nov 29 05:31:02 crc kubenswrapper[4594]: I1129 05:31:02.137527 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8ptq2" Nov 29 05:31:02 crc kubenswrapper[4594]: I1129 05:31:02.169085 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8ptq2" Nov 29 05:31:02 crc kubenswrapper[4594]: I1129 05:31:02.178242 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96d7568a-7c09-407d-b2f4-fa4110cb7ecd-utilities\") pod \"96d7568a-7c09-407d-b2f4-fa4110cb7ecd\" (UID: \"96d7568a-7c09-407d-b2f4-fa4110cb7ecd\") " Nov 29 05:31:02 crc kubenswrapper[4594]: I1129 05:31:02.178329 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xh5jl\" (UniqueName: \"kubernetes.io/projected/96d7568a-7c09-407d-b2f4-fa4110cb7ecd-kube-api-access-xh5jl\") pod \"96d7568a-7c09-407d-b2f4-fa4110cb7ecd\" (UID: \"96d7568a-7c09-407d-b2f4-fa4110cb7ecd\") " Nov 29 05:31:02 crc kubenswrapper[4594]: I1129 05:31:02.178370 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96d7568a-7c09-407d-b2f4-fa4110cb7ecd-catalog-content\") pod \"96d7568a-7c09-407d-b2f4-fa4110cb7ecd\" (UID: \"96d7568a-7c09-407d-b2f4-fa4110cb7ecd\") " Nov 29 05:31:02 crc kubenswrapper[4594]: I1129 05:31:02.179417 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96d7568a-7c09-407d-b2f4-fa4110cb7ecd-utilities" (OuterVolumeSpecName: "utilities") pod "96d7568a-7c09-407d-b2f4-fa4110cb7ecd" (UID: "96d7568a-7c09-407d-b2f4-fa4110cb7ecd"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 05:31:02 crc kubenswrapper[4594]: I1129 05:31:02.190354 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96d7568a-7c09-407d-b2f4-fa4110cb7ecd-kube-api-access-xh5jl" (OuterVolumeSpecName: "kube-api-access-xh5jl") pod "96d7568a-7c09-407d-b2f4-fa4110cb7ecd" (UID: "96d7568a-7c09-407d-b2f4-fa4110cb7ecd"). InnerVolumeSpecName "kube-api-access-xh5jl". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:31:02 crc kubenswrapper[4594]: I1129 05:31:02.220695 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96d7568a-7c09-407d-b2f4-fa4110cb7ecd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "96d7568a-7c09-407d-b2f4-fa4110cb7ecd" (UID: "96d7568a-7c09-407d-b2f4-fa4110cb7ecd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 05:31:02 crc kubenswrapper[4594]: I1129 05:31:02.279973 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xh5jl\" (UniqueName: \"kubernetes.io/projected/96d7568a-7c09-407d-b2f4-fa4110cb7ecd-kube-api-access-xh5jl\") on node \"crc\" DevicePath \"\"" Nov 29 05:31:02 crc kubenswrapper[4594]: I1129 05:31:02.280156 4594 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96d7568a-7c09-407d-b2f4-fa4110cb7ecd-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 05:31:02 crc kubenswrapper[4594]: I1129 05:31:02.280383 4594 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96d7568a-7c09-407d-b2f4-fa4110cb7ecd-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 05:31:02 crc kubenswrapper[4594]: I1129 05:31:02.313271 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 05:31:02 crc 
kubenswrapper[4594]: I1129 05:31:02.535318 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6nd5v" Nov 29 05:31:02 crc kubenswrapper[4594]: I1129 05:31:02.567319 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6nd5v" Nov 29 05:31:02 crc kubenswrapper[4594]: I1129 05:31:02.661975 4594 generic.go:334] "Generic (PLEG): container finished" podID="96d7568a-7c09-407d-b2f4-fa4110cb7ecd" containerID="3ceab488a208421282057590799deccaf8c0f3d086fe687f5a2de25ab680238e" exitCode=0 Nov 29 05:31:02 crc kubenswrapper[4594]: I1129 05:31:02.662030 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6pg58" event={"ID":"96d7568a-7c09-407d-b2f4-fa4110cb7ecd","Type":"ContainerDied","Data":"3ceab488a208421282057590799deccaf8c0f3d086fe687f5a2de25ab680238e"} Nov 29 05:31:02 crc kubenswrapper[4594]: I1129 05:31:02.662061 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6pg58" Nov 29 05:31:02 crc kubenswrapper[4594]: I1129 05:31:02.662087 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6pg58" event={"ID":"96d7568a-7c09-407d-b2f4-fa4110cb7ecd","Type":"ContainerDied","Data":"78b038af0488da2f9ffdc40111cfac61bfc7d771b702f5ed4ec58ea1fee12304"} Nov 29 05:31:02 crc kubenswrapper[4594]: I1129 05:31:02.662111 4594 scope.go:117] "RemoveContainer" containerID="3ceab488a208421282057590799deccaf8c0f3d086fe687f5a2de25ab680238e" Nov 29 05:31:02 crc kubenswrapper[4594]: I1129 05:31:02.662344 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-z7jx7" podUID="2b771218-fa00-44fe-8cb7-35abccdf9e10" containerName="registry-server" containerID="cri-o://1c3cde7c5290029e40130ecc5c6650c4a2375c737758e663face5f273eacab03" gracePeriod=2 Nov 29 05:31:02 crc kubenswrapper[4594]: I1129 05:31:02.676948 4594 scope.go:117] "RemoveContainer" containerID="d0bbff5e2ada16ac0d58bec8f050a4ef950b06315681689d7b3800e9390321cd" Nov 29 05:31:02 crc kubenswrapper[4594]: I1129 05:31:02.689563 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6pg58"] Nov 29 05:31:02 crc kubenswrapper[4594]: I1129 05:31:02.691502 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nhk7j" Nov 29 05:31:02 crc kubenswrapper[4594]: I1129 05:31:02.692071 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-6pg58"] Nov 29 05:31:02 crc kubenswrapper[4594]: I1129 05:31:02.711589 4594 scope.go:117] "RemoveContainer" containerID="5bdb8e021b028922ce5790dbbd6484aea7ecee4b8332deddde7eaf0f7873f260" Nov 29 05:31:02 crc kubenswrapper[4594]: I1129 05:31:02.734543 4594 scope.go:117] "RemoveContainer" 
containerID="3ceab488a208421282057590799deccaf8c0f3d086fe687f5a2de25ab680238e" Nov 29 05:31:02 crc kubenswrapper[4594]: E1129 05:31:02.734949 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ceab488a208421282057590799deccaf8c0f3d086fe687f5a2de25ab680238e\": container with ID starting with 3ceab488a208421282057590799deccaf8c0f3d086fe687f5a2de25ab680238e not found: ID does not exist" containerID="3ceab488a208421282057590799deccaf8c0f3d086fe687f5a2de25ab680238e" Nov 29 05:31:02 crc kubenswrapper[4594]: I1129 05:31:02.734984 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ceab488a208421282057590799deccaf8c0f3d086fe687f5a2de25ab680238e"} err="failed to get container status \"3ceab488a208421282057590799deccaf8c0f3d086fe687f5a2de25ab680238e\": rpc error: code = NotFound desc = could not find container \"3ceab488a208421282057590799deccaf8c0f3d086fe687f5a2de25ab680238e\": container with ID starting with 3ceab488a208421282057590799deccaf8c0f3d086fe687f5a2de25ab680238e not found: ID does not exist" Nov 29 05:31:02 crc kubenswrapper[4594]: I1129 05:31:02.735018 4594 scope.go:117] "RemoveContainer" containerID="d0bbff5e2ada16ac0d58bec8f050a4ef950b06315681689d7b3800e9390321cd" Nov 29 05:31:02 crc kubenswrapper[4594]: E1129 05:31:02.735246 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0bbff5e2ada16ac0d58bec8f050a4ef950b06315681689d7b3800e9390321cd\": container with ID starting with d0bbff5e2ada16ac0d58bec8f050a4ef950b06315681689d7b3800e9390321cd not found: ID does not exist" containerID="d0bbff5e2ada16ac0d58bec8f050a4ef950b06315681689d7b3800e9390321cd" Nov 29 05:31:02 crc kubenswrapper[4594]: I1129 05:31:02.735283 4594 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d0bbff5e2ada16ac0d58bec8f050a4ef950b06315681689d7b3800e9390321cd"} err="failed to get container status \"d0bbff5e2ada16ac0d58bec8f050a4ef950b06315681689d7b3800e9390321cd\": rpc error: code = NotFound desc = could not find container \"d0bbff5e2ada16ac0d58bec8f050a4ef950b06315681689d7b3800e9390321cd\": container with ID starting with d0bbff5e2ada16ac0d58bec8f050a4ef950b06315681689d7b3800e9390321cd not found: ID does not exist" Nov 29 05:31:02 crc kubenswrapper[4594]: I1129 05:31:02.735296 4594 scope.go:117] "RemoveContainer" containerID="5bdb8e021b028922ce5790dbbd6484aea7ecee4b8332deddde7eaf0f7873f260" Nov 29 05:31:02 crc kubenswrapper[4594]: E1129 05:31:02.735487 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5bdb8e021b028922ce5790dbbd6484aea7ecee4b8332deddde7eaf0f7873f260\": container with ID starting with 5bdb8e021b028922ce5790dbbd6484aea7ecee4b8332deddde7eaf0f7873f260 not found: ID does not exist" containerID="5bdb8e021b028922ce5790dbbd6484aea7ecee4b8332deddde7eaf0f7873f260" Nov 29 05:31:02 crc kubenswrapper[4594]: I1129 05:31:02.735551 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bdb8e021b028922ce5790dbbd6484aea7ecee4b8332deddde7eaf0f7873f260"} err="failed to get container status \"5bdb8e021b028922ce5790dbbd6484aea7ecee4b8332deddde7eaf0f7873f260\": rpc error: code = NotFound desc = could not find container \"5bdb8e021b028922ce5790dbbd6484aea7ecee4b8332deddde7eaf0f7873f260\": container with ID starting with 5bdb8e021b028922ce5790dbbd6484aea7ecee4b8332deddde7eaf0f7873f260 not found: ID does not exist" Nov 29 05:31:02 crc kubenswrapper[4594]: I1129 05:31:02.995841 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-z7jx7" Nov 29 05:31:03 crc kubenswrapper[4594]: I1129 05:31:03.192654 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b771218-fa00-44fe-8cb7-35abccdf9e10-catalog-content\") pod \"2b771218-fa00-44fe-8cb7-35abccdf9e10\" (UID: \"2b771218-fa00-44fe-8cb7-35abccdf9e10\") " Nov 29 05:31:03 crc kubenswrapper[4594]: I1129 05:31:03.192694 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c4qx7\" (UniqueName: \"kubernetes.io/projected/2b771218-fa00-44fe-8cb7-35abccdf9e10-kube-api-access-c4qx7\") pod \"2b771218-fa00-44fe-8cb7-35abccdf9e10\" (UID: \"2b771218-fa00-44fe-8cb7-35abccdf9e10\") " Nov 29 05:31:03 crc kubenswrapper[4594]: I1129 05:31:03.192744 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b771218-fa00-44fe-8cb7-35abccdf9e10-utilities\") pod \"2b771218-fa00-44fe-8cb7-35abccdf9e10\" (UID: \"2b771218-fa00-44fe-8cb7-35abccdf9e10\") " Nov 29 05:31:03 crc kubenswrapper[4594]: I1129 05:31:03.193379 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b771218-fa00-44fe-8cb7-35abccdf9e10-utilities" (OuterVolumeSpecName: "utilities") pod "2b771218-fa00-44fe-8cb7-35abccdf9e10" (UID: "2b771218-fa00-44fe-8cb7-35abccdf9e10"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 05:31:03 crc kubenswrapper[4594]: I1129 05:31:03.197497 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b771218-fa00-44fe-8cb7-35abccdf9e10-kube-api-access-c4qx7" (OuterVolumeSpecName: "kube-api-access-c4qx7") pod "2b771218-fa00-44fe-8cb7-35abccdf9e10" (UID: "2b771218-fa00-44fe-8cb7-35abccdf9e10"). InnerVolumeSpecName "kube-api-access-c4qx7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:31:03 crc kubenswrapper[4594]: I1129 05:31:03.230654 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b771218-fa00-44fe-8cb7-35abccdf9e10-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2b771218-fa00-44fe-8cb7-35abccdf9e10" (UID: "2b771218-fa00-44fe-8cb7-35abccdf9e10"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 05:31:03 crc kubenswrapper[4594]: I1129 05:31:03.294773 4594 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b771218-fa00-44fe-8cb7-35abccdf9e10-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 05:31:03 crc kubenswrapper[4594]: I1129 05:31:03.294824 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c4qx7\" (UniqueName: \"kubernetes.io/projected/2b771218-fa00-44fe-8cb7-35abccdf9e10-kube-api-access-c4qx7\") on node \"crc\" DevicePath \"\"" Nov 29 05:31:03 crc kubenswrapper[4594]: I1129 05:31:03.294842 4594 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b771218-fa00-44fe-8cb7-35abccdf9e10-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 05:31:03 crc kubenswrapper[4594]: I1129 05:31:03.671232 4594 generic.go:334] "Generic (PLEG): container finished" podID="2b771218-fa00-44fe-8cb7-35abccdf9e10" containerID="1c3cde7c5290029e40130ecc5c6650c4a2375c737758e663face5f273eacab03" exitCode=0 Nov 29 05:31:03 crc kubenswrapper[4594]: I1129 05:31:03.671310 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z7jx7" event={"ID":"2b771218-fa00-44fe-8cb7-35abccdf9e10","Type":"ContainerDied","Data":"1c3cde7c5290029e40130ecc5c6650c4a2375c737758e663face5f273eacab03"} Nov 29 05:31:03 crc kubenswrapper[4594]: I1129 05:31:03.671394 4594 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/certified-operators-z7jx7" Nov 29 05:31:03 crc kubenswrapper[4594]: I1129 05:31:03.671427 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z7jx7" event={"ID":"2b771218-fa00-44fe-8cb7-35abccdf9e10","Type":"ContainerDied","Data":"6aa146866aeb08c049a246bf684ff995fc493dab31b016b1d4176fdae83417cb"} Nov 29 05:31:03 crc kubenswrapper[4594]: I1129 05:31:03.671467 4594 scope.go:117] "RemoveContainer" containerID="1c3cde7c5290029e40130ecc5c6650c4a2375c737758e663face5f273eacab03" Nov 29 05:31:03 crc kubenswrapper[4594]: I1129 05:31:03.692839 4594 scope.go:117] "RemoveContainer" containerID="74ddff4dde2cb0e6a58265f42af817d17fbb04d36850a8613ae0a834fea109a2" Nov 29 05:31:03 crc kubenswrapper[4594]: I1129 05:31:03.696350 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-z7jx7"] Nov 29 05:31:03 crc kubenswrapper[4594]: I1129 05:31:03.698179 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-z7jx7"] Nov 29 05:31:03 crc kubenswrapper[4594]: I1129 05:31:03.721518 4594 scope.go:117] "RemoveContainer" containerID="63a4e1f743175da576fafce9720a2bc6958f045124c0d7b78f9d7ec8a028724e" Nov 29 05:31:03 crc kubenswrapper[4594]: I1129 05:31:03.732491 4594 scope.go:117] "RemoveContainer" containerID="1c3cde7c5290029e40130ecc5c6650c4a2375c737758e663face5f273eacab03" Nov 29 05:31:03 crc kubenswrapper[4594]: E1129 05:31:03.732810 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c3cde7c5290029e40130ecc5c6650c4a2375c737758e663face5f273eacab03\": container with ID starting with 1c3cde7c5290029e40130ecc5c6650c4a2375c737758e663face5f273eacab03 not found: ID does not exist" containerID="1c3cde7c5290029e40130ecc5c6650c4a2375c737758e663face5f273eacab03" Nov 29 05:31:03 crc kubenswrapper[4594]: I1129 05:31:03.732851 
4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c3cde7c5290029e40130ecc5c6650c4a2375c737758e663face5f273eacab03"} err="failed to get container status \"1c3cde7c5290029e40130ecc5c6650c4a2375c737758e663face5f273eacab03\": rpc error: code = NotFound desc = could not find container \"1c3cde7c5290029e40130ecc5c6650c4a2375c737758e663face5f273eacab03\": container with ID starting with 1c3cde7c5290029e40130ecc5c6650c4a2375c737758e663face5f273eacab03 not found: ID does not exist" Nov 29 05:31:03 crc kubenswrapper[4594]: I1129 05:31:03.732873 4594 scope.go:117] "RemoveContainer" containerID="74ddff4dde2cb0e6a58265f42af817d17fbb04d36850a8613ae0a834fea109a2" Nov 29 05:31:03 crc kubenswrapper[4594]: E1129 05:31:03.733191 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74ddff4dde2cb0e6a58265f42af817d17fbb04d36850a8613ae0a834fea109a2\": container with ID starting with 74ddff4dde2cb0e6a58265f42af817d17fbb04d36850a8613ae0a834fea109a2 not found: ID does not exist" containerID="74ddff4dde2cb0e6a58265f42af817d17fbb04d36850a8613ae0a834fea109a2" Nov 29 05:31:03 crc kubenswrapper[4594]: I1129 05:31:03.733223 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74ddff4dde2cb0e6a58265f42af817d17fbb04d36850a8613ae0a834fea109a2"} err="failed to get container status \"74ddff4dde2cb0e6a58265f42af817d17fbb04d36850a8613ae0a834fea109a2\": rpc error: code = NotFound desc = could not find container \"74ddff4dde2cb0e6a58265f42af817d17fbb04d36850a8613ae0a834fea109a2\": container with ID starting with 74ddff4dde2cb0e6a58265f42af817d17fbb04d36850a8613ae0a834fea109a2 not found: ID does not exist" Nov 29 05:31:03 crc kubenswrapper[4594]: I1129 05:31:03.733251 4594 scope.go:117] "RemoveContainer" containerID="63a4e1f743175da576fafce9720a2bc6958f045124c0d7b78f9d7ec8a028724e" Nov 29 05:31:03 crc kubenswrapper[4594]: E1129 
05:31:03.733598 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63a4e1f743175da576fafce9720a2bc6958f045124c0d7b78f9d7ec8a028724e\": container with ID starting with 63a4e1f743175da576fafce9720a2bc6958f045124c0d7b78f9d7ec8a028724e not found: ID does not exist" containerID="63a4e1f743175da576fafce9720a2bc6958f045124c0d7b78f9d7ec8a028724e" Nov 29 05:31:03 crc kubenswrapper[4594]: I1129 05:31:03.733643 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63a4e1f743175da576fafce9720a2bc6958f045124c0d7b78f9d7ec8a028724e"} err="failed to get container status \"63a4e1f743175da576fafce9720a2bc6958f045124c0d7b78f9d7ec8a028724e\": rpc error: code = NotFound desc = could not find container \"63a4e1f743175da576fafce9720a2bc6958f045124c0d7b78f9d7ec8a028724e\": container with ID starting with 63a4e1f743175da576fafce9720a2bc6958f045124c0d7b78f9d7ec8a028724e not found: ID does not exist" Nov 29 05:31:04 crc kubenswrapper[4594]: I1129 05:31:04.091147 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b771218-fa00-44fe-8cb7-35abccdf9e10" path="/var/lib/kubelet/pods/2b771218-fa00-44fe-8cb7-35abccdf9e10/volumes" Nov 29 05:31:04 crc kubenswrapper[4594]: I1129 05:31:04.091817 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96d7568a-7c09-407d-b2f4-fa4110cb7ecd" path="/var/lib/kubelet/pods/96d7568a-7c09-407d-b2f4-fa4110cb7ecd/volumes" Nov 29 05:31:04 crc kubenswrapper[4594]: I1129 05:31:04.185492 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mk4pz"] Nov 29 05:31:04 crc kubenswrapper[4594]: I1129 05:31:04.185726 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mk4pz" podUID="233cee4b-1c92-42c9-b42e-b2200d4fd842" containerName="registry-server" 
containerID="cri-o://41c59c691b965436523bda68f988151439f6237429806c430a078e82587bc338" gracePeriod=2 Nov 29 05:31:04 crc kubenswrapper[4594]: I1129 05:31:04.537477 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mk4pz" Nov 29 05:31:04 crc kubenswrapper[4594]: I1129 05:31:04.683894 4594 generic.go:334] "Generic (PLEG): container finished" podID="233cee4b-1c92-42c9-b42e-b2200d4fd842" containerID="41c59c691b965436523bda68f988151439f6237429806c430a078e82587bc338" exitCode=0 Nov 29 05:31:04 crc kubenswrapper[4594]: I1129 05:31:04.683954 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mk4pz" event={"ID":"233cee4b-1c92-42c9-b42e-b2200d4fd842","Type":"ContainerDied","Data":"41c59c691b965436523bda68f988151439f6237429806c430a078e82587bc338"} Nov 29 05:31:04 crc kubenswrapper[4594]: I1129 05:31:04.683988 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mk4pz" event={"ID":"233cee4b-1c92-42c9-b42e-b2200d4fd842","Type":"ContainerDied","Data":"58c3153596332298144bb2d0cb853cafa95e531db88a405e58f4868a4ef09fce"} Nov 29 05:31:04 crc kubenswrapper[4594]: I1129 05:31:04.684025 4594 scope.go:117] "RemoveContainer" containerID="41c59c691b965436523bda68f988151439f6237429806c430a078e82587bc338" Nov 29 05:31:04 crc kubenswrapper[4594]: I1129 05:31:04.684036 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mk4pz" Nov 29 05:31:04 crc kubenswrapper[4594]: I1129 05:31:04.706221 4594 scope.go:117] "RemoveContainer" containerID="7528a65a533439f4cef885f0d301442b64596bed2d8cc7df05efb71f83462174" Nov 29 05:31:04 crc kubenswrapper[4594]: I1129 05:31:04.712461 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/233cee4b-1c92-42c9-b42e-b2200d4fd842-utilities\") pod \"233cee4b-1c92-42c9-b42e-b2200d4fd842\" (UID: \"233cee4b-1c92-42c9-b42e-b2200d4fd842\") " Nov 29 05:31:04 crc kubenswrapper[4594]: I1129 05:31:04.712598 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r47t5\" (UniqueName: \"kubernetes.io/projected/233cee4b-1c92-42c9-b42e-b2200d4fd842-kube-api-access-r47t5\") pod \"233cee4b-1c92-42c9-b42e-b2200d4fd842\" (UID: \"233cee4b-1c92-42c9-b42e-b2200d4fd842\") " Nov 29 05:31:04 crc kubenswrapper[4594]: I1129 05:31:04.712763 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/233cee4b-1c92-42c9-b42e-b2200d4fd842-catalog-content\") pod \"233cee4b-1c92-42c9-b42e-b2200d4fd842\" (UID: \"233cee4b-1c92-42c9-b42e-b2200d4fd842\") " Nov 29 05:31:04 crc kubenswrapper[4594]: I1129 05:31:04.713357 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/233cee4b-1c92-42c9-b42e-b2200d4fd842-utilities" (OuterVolumeSpecName: "utilities") pod "233cee4b-1c92-42c9-b42e-b2200d4fd842" (UID: "233cee4b-1c92-42c9-b42e-b2200d4fd842"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 05:31:04 crc kubenswrapper[4594]: I1129 05:31:04.716446 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/233cee4b-1c92-42c9-b42e-b2200d4fd842-kube-api-access-r47t5" (OuterVolumeSpecName: "kube-api-access-r47t5") pod "233cee4b-1c92-42c9-b42e-b2200d4fd842" (UID: "233cee4b-1c92-42c9-b42e-b2200d4fd842"). InnerVolumeSpecName "kube-api-access-r47t5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:31:04 crc kubenswrapper[4594]: I1129 05:31:04.718851 4594 scope.go:117] "RemoveContainer" containerID="af2f6fae5a8052ad621536764c6bbd415d988c68c73b68bdeb09d6c761609ec5" Nov 29 05:31:04 crc kubenswrapper[4594]: I1129 05:31:04.730529 4594 scope.go:117] "RemoveContainer" containerID="41c59c691b965436523bda68f988151439f6237429806c430a078e82587bc338" Nov 29 05:31:04 crc kubenswrapper[4594]: E1129 05:31:04.730965 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41c59c691b965436523bda68f988151439f6237429806c430a078e82587bc338\": container with ID starting with 41c59c691b965436523bda68f988151439f6237429806c430a078e82587bc338 not found: ID does not exist" containerID="41c59c691b965436523bda68f988151439f6237429806c430a078e82587bc338" Nov 29 05:31:04 crc kubenswrapper[4594]: I1129 05:31:04.731001 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41c59c691b965436523bda68f988151439f6237429806c430a078e82587bc338"} err="failed to get container status \"41c59c691b965436523bda68f988151439f6237429806c430a078e82587bc338\": rpc error: code = NotFound desc = could not find container \"41c59c691b965436523bda68f988151439f6237429806c430a078e82587bc338\": container with ID starting with 41c59c691b965436523bda68f988151439f6237429806c430a078e82587bc338 not found: ID does not exist" Nov 29 05:31:04 crc kubenswrapper[4594]: I1129 05:31:04.731023 
4594 scope.go:117] "RemoveContainer" containerID="7528a65a533439f4cef885f0d301442b64596bed2d8cc7df05efb71f83462174" Nov 29 05:31:04 crc kubenswrapper[4594]: E1129 05:31:04.731410 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7528a65a533439f4cef885f0d301442b64596bed2d8cc7df05efb71f83462174\": container with ID starting with 7528a65a533439f4cef885f0d301442b64596bed2d8cc7df05efb71f83462174 not found: ID does not exist" containerID="7528a65a533439f4cef885f0d301442b64596bed2d8cc7df05efb71f83462174" Nov 29 05:31:04 crc kubenswrapper[4594]: I1129 05:31:04.731462 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7528a65a533439f4cef885f0d301442b64596bed2d8cc7df05efb71f83462174"} err="failed to get container status \"7528a65a533439f4cef885f0d301442b64596bed2d8cc7df05efb71f83462174\": rpc error: code = NotFound desc = could not find container \"7528a65a533439f4cef885f0d301442b64596bed2d8cc7df05efb71f83462174\": container with ID starting with 7528a65a533439f4cef885f0d301442b64596bed2d8cc7df05efb71f83462174 not found: ID does not exist" Nov 29 05:31:04 crc kubenswrapper[4594]: I1129 05:31:04.731495 4594 scope.go:117] "RemoveContainer" containerID="af2f6fae5a8052ad621536764c6bbd415d988c68c73b68bdeb09d6c761609ec5" Nov 29 05:31:04 crc kubenswrapper[4594]: E1129 05:31:04.731747 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af2f6fae5a8052ad621536764c6bbd415d988c68c73b68bdeb09d6c761609ec5\": container with ID starting with af2f6fae5a8052ad621536764c6bbd415d988c68c73b68bdeb09d6c761609ec5 not found: ID does not exist" containerID="af2f6fae5a8052ad621536764c6bbd415d988c68c73b68bdeb09d6c761609ec5" Nov 29 05:31:04 crc kubenswrapper[4594]: I1129 05:31:04.731756 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/233cee4b-1c92-42c9-b42e-b2200d4fd842-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "233cee4b-1c92-42c9-b42e-b2200d4fd842" (UID: "233cee4b-1c92-42c9-b42e-b2200d4fd842"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 05:31:04 crc kubenswrapper[4594]: I1129 05:31:04.731775 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af2f6fae5a8052ad621536764c6bbd415d988c68c73b68bdeb09d6c761609ec5"} err="failed to get container status \"af2f6fae5a8052ad621536764c6bbd415d988c68c73b68bdeb09d6c761609ec5\": rpc error: code = NotFound desc = could not find container \"af2f6fae5a8052ad621536764c6bbd415d988c68c73b68bdeb09d6c761609ec5\": container with ID starting with af2f6fae5a8052ad621536764c6bbd415d988c68c73b68bdeb09d6c761609ec5 not found: ID does not exist" Nov 29 05:31:04 crc kubenswrapper[4594]: I1129 05:31:04.815058 4594 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/233cee4b-1c92-42c9-b42e-b2200d4fd842-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 05:31:04 crc kubenswrapper[4594]: I1129 05:31:04.815095 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r47t5\" (UniqueName: \"kubernetes.io/projected/233cee4b-1c92-42c9-b42e-b2200d4fd842-kube-api-access-r47t5\") on node \"crc\" DevicePath \"\"" Nov 29 05:31:04 crc kubenswrapper[4594]: I1129 05:31:04.815109 4594 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/233cee4b-1c92-42c9-b42e-b2200d4fd842-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 05:31:05 crc kubenswrapper[4594]: I1129 05:31:05.014075 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mk4pz"] Nov 29 05:31:05 crc kubenswrapper[4594]: I1129 05:31:05.016159 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-mk4pz"] Nov 29 05:31:06 crc kubenswrapper[4594]: I1129 05:31:06.089274 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="233cee4b-1c92-42c9-b42e-b2200d4fd842" path="/var/lib/kubelet/pods/233cee4b-1c92-42c9-b42e-b2200d4fd842/volumes" Nov 29 05:31:06 crc kubenswrapper[4594]: I1129 05:31:06.587154 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6nd5v"] Nov 29 05:31:06 crc kubenswrapper[4594]: I1129 05:31:06.587452 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6nd5v" podUID="9e710a40-5535-4984-b73f-de35c0a9e18f" containerName="registry-server" containerID="cri-o://bde8b8776ad161b0a34b79e15f6b54def726ef1891ee2c0c646cdef7f7789782" gracePeriod=2 Nov 29 05:31:06 crc kubenswrapper[4594]: I1129 05:31:06.952026 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6nd5v" Nov 29 05:31:07 crc kubenswrapper[4594]: I1129 05:31:07.139401 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e710a40-5535-4984-b73f-de35c0a9e18f-utilities\") pod \"9e710a40-5535-4984-b73f-de35c0a9e18f\" (UID: \"9e710a40-5535-4984-b73f-de35c0a9e18f\") " Nov 29 05:31:07 crc kubenswrapper[4594]: I1129 05:31:07.139509 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e710a40-5535-4984-b73f-de35c0a9e18f-catalog-content\") pod \"9e710a40-5535-4984-b73f-de35c0a9e18f\" (UID: \"9e710a40-5535-4984-b73f-de35c0a9e18f\") " Nov 29 05:31:07 crc kubenswrapper[4594]: I1129 05:31:07.139557 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wkt9t\" (UniqueName: 
\"kubernetes.io/projected/9e710a40-5535-4984-b73f-de35c0a9e18f-kube-api-access-wkt9t\") pod \"9e710a40-5535-4984-b73f-de35c0a9e18f\" (UID: \"9e710a40-5535-4984-b73f-de35c0a9e18f\") " Nov 29 05:31:07 crc kubenswrapper[4594]: I1129 05:31:07.143369 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e710a40-5535-4984-b73f-de35c0a9e18f-utilities" (OuterVolumeSpecName: "utilities") pod "9e710a40-5535-4984-b73f-de35c0a9e18f" (UID: "9e710a40-5535-4984-b73f-de35c0a9e18f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 05:31:07 crc kubenswrapper[4594]: I1129 05:31:07.148704 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e710a40-5535-4984-b73f-de35c0a9e18f-kube-api-access-wkt9t" (OuterVolumeSpecName: "kube-api-access-wkt9t") pod "9e710a40-5535-4984-b73f-de35c0a9e18f" (UID: "9e710a40-5535-4984-b73f-de35c0a9e18f"). InnerVolumeSpecName "kube-api-access-wkt9t". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:31:07 crc kubenswrapper[4594]: I1129 05:31:07.222785 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e710a40-5535-4984-b73f-de35c0a9e18f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9e710a40-5535-4984-b73f-de35c0a9e18f" (UID: "9e710a40-5535-4984-b73f-de35c0a9e18f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 05:31:07 crc kubenswrapper[4594]: I1129 05:31:07.241090 4594 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e710a40-5535-4984-b73f-de35c0a9e18f-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 05:31:07 crc kubenswrapper[4594]: I1129 05:31:07.241119 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wkt9t\" (UniqueName: \"kubernetes.io/projected/9e710a40-5535-4984-b73f-de35c0a9e18f-kube-api-access-wkt9t\") on node \"crc\" DevicePath \"\"" Nov 29 05:31:07 crc kubenswrapper[4594]: I1129 05:31:07.241134 4594 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e710a40-5535-4984-b73f-de35c0a9e18f-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 05:31:07 crc kubenswrapper[4594]: I1129 05:31:07.707175 4594 generic.go:334] "Generic (PLEG): container finished" podID="9e710a40-5535-4984-b73f-de35c0a9e18f" containerID="bde8b8776ad161b0a34b79e15f6b54def726ef1891ee2c0c646cdef7f7789782" exitCode=0 Nov 29 05:31:07 crc kubenswrapper[4594]: I1129 05:31:07.707236 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6nd5v" Nov 29 05:31:07 crc kubenswrapper[4594]: I1129 05:31:07.707239 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6nd5v" event={"ID":"9e710a40-5535-4984-b73f-de35c0a9e18f","Type":"ContainerDied","Data":"bde8b8776ad161b0a34b79e15f6b54def726ef1891ee2c0c646cdef7f7789782"} Nov 29 05:31:07 crc kubenswrapper[4594]: I1129 05:31:07.707325 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6nd5v" event={"ID":"9e710a40-5535-4984-b73f-de35c0a9e18f","Type":"ContainerDied","Data":"5d57dfdf9b494d10ea64f22d6fc3608abce30a438e4117cebfb00b48e75043bf"} Nov 29 05:31:07 crc kubenswrapper[4594]: I1129 05:31:07.707362 4594 scope.go:117] "RemoveContainer" containerID="bde8b8776ad161b0a34b79e15f6b54def726ef1891ee2c0c646cdef7f7789782" Nov 29 05:31:07 crc kubenswrapper[4594]: I1129 05:31:07.734846 4594 scope.go:117] "RemoveContainer" containerID="597d455a1efd492ffe7588a018a8794a7cd8ede8ab85a24f6f5d7aeba2c75768" Nov 29 05:31:07 crc kubenswrapper[4594]: I1129 05:31:07.745598 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6nd5v"] Nov 29 05:31:07 crc kubenswrapper[4594]: I1129 05:31:07.748354 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6nd5v"] Nov 29 05:31:07 crc kubenswrapper[4594]: I1129 05:31:07.771395 4594 scope.go:117] "RemoveContainer" containerID="e073b5ce14c0575045a19ac89d3495bb86003bbf36fa0ef3ee23bce5db690b30" Nov 29 05:31:07 crc kubenswrapper[4594]: I1129 05:31:07.781915 4594 scope.go:117] "RemoveContainer" containerID="bde8b8776ad161b0a34b79e15f6b54def726ef1891ee2c0c646cdef7f7789782" Nov 29 05:31:07 crc kubenswrapper[4594]: E1129 05:31:07.782263 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"bde8b8776ad161b0a34b79e15f6b54def726ef1891ee2c0c646cdef7f7789782\": container with ID starting with bde8b8776ad161b0a34b79e15f6b54def726ef1891ee2c0c646cdef7f7789782 not found: ID does not exist" containerID="bde8b8776ad161b0a34b79e15f6b54def726ef1891ee2c0c646cdef7f7789782" Nov 29 05:31:07 crc kubenswrapper[4594]: I1129 05:31:07.782301 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bde8b8776ad161b0a34b79e15f6b54def726ef1891ee2c0c646cdef7f7789782"} err="failed to get container status \"bde8b8776ad161b0a34b79e15f6b54def726ef1891ee2c0c646cdef7f7789782\": rpc error: code = NotFound desc = could not find container \"bde8b8776ad161b0a34b79e15f6b54def726ef1891ee2c0c646cdef7f7789782\": container with ID starting with bde8b8776ad161b0a34b79e15f6b54def726ef1891ee2c0c646cdef7f7789782 not found: ID does not exist" Nov 29 05:31:07 crc kubenswrapper[4594]: I1129 05:31:07.782325 4594 scope.go:117] "RemoveContainer" containerID="597d455a1efd492ffe7588a018a8794a7cd8ede8ab85a24f6f5d7aeba2c75768" Nov 29 05:31:07 crc kubenswrapper[4594]: E1129 05:31:07.782621 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"597d455a1efd492ffe7588a018a8794a7cd8ede8ab85a24f6f5d7aeba2c75768\": container with ID starting with 597d455a1efd492ffe7588a018a8794a7cd8ede8ab85a24f6f5d7aeba2c75768 not found: ID does not exist" containerID="597d455a1efd492ffe7588a018a8794a7cd8ede8ab85a24f6f5d7aeba2c75768" Nov 29 05:31:07 crc kubenswrapper[4594]: I1129 05:31:07.782653 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"597d455a1efd492ffe7588a018a8794a7cd8ede8ab85a24f6f5d7aeba2c75768"} err="failed to get container status \"597d455a1efd492ffe7588a018a8794a7cd8ede8ab85a24f6f5d7aeba2c75768\": rpc error: code = NotFound desc = could not find container \"597d455a1efd492ffe7588a018a8794a7cd8ede8ab85a24f6f5d7aeba2c75768\": container with ID 
starting with 597d455a1efd492ffe7588a018a8794a7cd8ede8ab85a24f6f5d7aeba2c75768 not found: ID does not exist" Nov 29 05:31:07 crc kubenswrapper[4594]: I1129 05:31:07.782675 4594 scope.go:117] "RemoveContainer" containerID="e073b5ce14c0575045a19ac89d3495bb86003bbf36fa0ef3ee23bce5db690b30" Nov 29 05:31:07 crc kubenswrapper[4594]: E1129 05:31:07.782968 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e073b5ce14c0575045a19ac89d3495bb86003bbf36fa0ef3ee23bce5db690b30\": container with ID starting with e073b5ce14c0575045a19ac89d3495bb86003bbf36fa0ef3ee23bce5db690b30 not found: ID does not exist" containerID="e073b5ce14c0575045a19ac89d3495bb86003bbf36fa0ef3ee23bce5db690b30" Nov 29 05:31:07 crc kubenswrapper[4594]: I1129 05:31:07.783004 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e073b5ce14c0575045a19ac89d3495bb86003bbf36fa0ef3ee23bce5db690b30"} err="failed to get container status \"e073b5ce14c0575045a19ac89d3495bb86003bbf36fa0ef3ee23bce5db690b30\": rpc error: code = NotFound desc = could not find container \"e073b5ce14c0575045a19ac89d3495bb86003bbf36fa0ef3ee23bce5db690b30\": container with ID starting with e073b5ce14c0575045a19ac89d3495bb86003bbf36fa0ef3ee23bce5db690b30 not found: ID does not exist" Nov 29 05:31:08 crc kubenswrapper[4594]: I1129 05:31:08.090557 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e710a40-5535-4984-b73f-de35c0a9e18f" path="/var/lib/kubelet/pods/9e710a40-5535-4984-b73f-de35c0a9e18f/volumes" Nov 29 05:31:08 crc kubenswrapper[4594]: I1129 05:31:08.825701 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Nov 29 05:31:08 crc kubenswrapper[4594]: E1129 05:31:08.825988 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96d7568a-7c09-407d-b2f4-fa4110cb7ecd" containerName="extract-content" Nov 29 05:31:08 crc 
kubenswrapper[4594]: I1129 05:31:08.826001 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="96d7568a-7c09-407d-b2f4-fa4110cb7ecd" containerName="extract-content" Nov 29 05:31:08 crc kubenswrapper[4594]: E1129 05:31:08.826011 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="233cee4b-1c92-42c9-b42e-b2200d4fd842" containerName="registry-server" Nov 29 05:31:08 crc kubenswrapper[4594]: I1129 05:31:08.826017 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="233cee4b-1c92-42c9-b42e-b2200d4fd842" containerName="registry-server" Nov 29 05:31:08 crc kubenswrapper[4594]: E1129 05:31:08.826029 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="233cee4b-1c92-42c9-b42e-b2200d4fd842" containerName="extract-content" Nov 29 05:31:08 crc kubenswrapper[4594]: I1129 05:31:08.826040 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="233cee4b-1c92-42c9-b42e-b2200d4fd842" containerName="extract-content" Nov 29 05:31:08 crc kubenswrapper[4594]: E1129 05:31:08.826052 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96d7568a-7c09-407d-b2f4-fa4110cb7ecd" containerName="extract-utilities" Nov 29 05:31:08 crc kubenswrapper[4594]: I1129 05:31:08.826059 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="96d7568a-7c09-407d-b2f4-fa4110cb7ecd" containerName="extract-utilities" Nov 29 05:31:08 crc kubenswrapper[4594]: E1129 05:31:08.826068 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e710a40-5535-4984-b73f-de35c0a9e18f" containerName="extract-utilities" Nov 29 05:31:08 crc kubenswrapper[4594]: I1129 05:31:08.826075 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e710a40-5535-4984-b73f-de35c0a9e18f" containerName="extract-utilities" Nov 29 05:31:08 crc kubenswrapper[4594]: E1129 05:31:08.826086 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="233cee4b-1c92-42c9-b42e-b2200d4fd842" containerName="extract-utilities" Nov 29 05:31:08 crc 
kubenswrapper[4594]: I1129 05:31:08.826093 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="233cee4b-1c92-42c9-b42e-b2200d4fd842" containerName="extract-utilities" Nov 29 05:31:08 crc kubenswrapper[4594]: E1129 05:31:08.826102 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b051a5dc-424b-403c-9f3a-d44096f2a847" containerName="pruner" Nov 29 05:31:08 crc kubenswrapper[4594]: I1129 05:31:08.826108 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="b051a5dc-424b-403c-9f3a-d44096f2a847" containerName="pruner" Nov 29 05:31:08 crc kubenswrapper[4594]: E1129 05:31:08.826117 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b771218-fa00-44fe-8cb7-35abccdf9e10" containerName="extract-utilities" Nov 29 05:31:08 crc kubenswrapper[4594]: I1129 05:31:08.826124 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b771218-fa00-44fe-8cb7-35abccdf9e10" containerName="extract-utilities" Nov 29 05:31:08 crc kubenswrapper[4594]: E1129 05:31:08.826153 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d838adaa-2be0-45a8-9252-c76cfbadb810" containerName="pruner" Nov 29 05:31:08 crc kubenswrapper[4594]: I1129 05:31:08.826159 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="d838adaa-2be0-45a8-9252-c76cfbadb810" containerName="pruner" Nov 29 05:31:08 crc kubenswrapper[4594]: E1129 05:31:08.826168 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e710a40-5535-4984-b73f-de35c0a9e18f" containerName="registry-server" Nov 29 05:31:08 crc kubenswrapper[4594]: I1129 05:31:08.826175 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e710a40-5535-4984-b73f-de35c0a9e18f" containerName="registry-server" Nov 29 05:31:08 crc kubenswrapper[4594]: E1129 05:31:08.826186 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96d7568a-7c09-407d-b2f4-fa4110cb7ecd" containerName="registry-server" Nov 29 05:31:08 crc kubenswrapper[4594]: I1129 05:31:08.826192 
4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="96d7568a-7c09-407d-b2f4-fa4110cb7ecd" containerName="registry-server" Nov 29 05:31:08 crc kubenswrapper[4594]: E1129 05:31:08.826199 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b771218-fa00-44fe-8cb7-35abccdf9e10" containerName="extract-content" Nov 29 05:31:08 crc kubenswrapper[4594]: I1129 05:31:08.826206 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b771218-fa00-44fe-8cb7-35abccdf9e10" containerName="extract-content" Nov 29 05:31:08 crc kubenswrapper[4594]: E1129 05:31:08.826235 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b771218-fa00-44fe-8cb7-35abccdf9e10" containerName="registry-server" Nov 29 05:31:08 crc kubenswrapper[4594]: I1129 05:31:08.826243 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b771218-fa00-44fe-8cb7-35abccdf9e10" containerName="registry-server" Nov 29 05:31:08 crc kubenswrapper[4594]: E1129 05:31:08.826280 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e710a40-5535-4984-b73f-de35c0a9e18f" containerName="extract-content" Nov 29 05:31:08 crc kubenswrapper[4594]: I1129 05:31:08.826286 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e710a40-5535-4984-b73f-de35c0a9e18f" containerName="extract-content" Nov 29 05:31:08 crc kubenswrapper[4594]: I1129 05:31:08.826492 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="d838adaa-2be0-45a8-9252-c76cfbadb810" containerName="pruner" Nov 29 05:31:08 crc kubenswrapper[4594]: I1129 05:31:08.826504 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b771218-fa00-44fe-8cb7-35abccdf9e10" containerName="registry-server" Nov 29 05:31:08 crc kubenswrapper[4594]: I1129 05:31:08.826516 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e710a40-5535-4984-b73f-de35c0a9e18f" containerName="registry-server" Nov 29 05:31:08 crc kubenswrapper[4594]: I1129 05:31:08.826543 4594 
memory_manager.go:354] "RemoveStaleState removing state" podUID="b051a5dc-424b-403c-9f3a-d44096f2a847" containerName="pruner" Nov 29 05:31:08 crc kubenswrapper[4594]: I1129 05:31:08.826555 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="233cee4b-1c92-42c9-b42e-b2200d4fd842" containerName="registry-server" Nov 29 05:31:08 crc kubenswrapper[4594]: I1129 05:31:08.826564 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="96d7568a-7c09-407d-b2f4-fa4110cb7ecd" containerName="registry-server" Nov 29 05:31:08 crc kubenswrapper[4594]: I1129 05:31:08.827217 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 29 05:31:08 crc kubenswrapper[4594]: I1129 05:31:08.835706 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Nov 29 05:31:08 crc kubenswrapper[4594]: I1129 05:31:08.835939 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Nov 29 05:31:08 crc kubenswrapper[4594]: I1129 05:31:08.841830 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Nov 29 05:31:08 crc kubenswrapper[4594]: I1129 05:31:08.958925 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/620a008f-7536-403d-940b-f1fa29e6611c-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"620a008f-7536-403d-940b-f1fa29e6611c\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 29 05:31:08 crc kubenswrapper[4594]: I1129 05:31:08.959078 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/620a008f-7536-403d-940b-f1fa29e6611c-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: 
\"620a008f-7536-403d-940b-f1fa29e6611c\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 29 05:31:09 crc kubenswrapper[4594]: I1129 05:31:09.059726 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/620a008f-7536-403d-940b-f1fa29e6611c-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"620a008f-7536-403d-940b-f1fa29e6611c\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 29 05:31:09 crc kubenswrapper[4594]: I1129 05:31:09.059816 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/620a008f-7536-403d-940b-f1fa29e6611c-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"620a008f-7536-403d-940b-f1fa29e6611c\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 29 05:31:09 crc kubenswrapper[4594]: I1129 05:31:09.059883 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/620a008f-7536-403d-940b-f1fa29e6611c-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"620a008f-7536-403d-940b-f1fa29e6611c\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 29 05:31:09 crc kubenswrapper[4594]: I1129 05:31:09.076661 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/620a008f-7536-403d-940b-f1fa29e6611c-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"620a008f-7536-403d-940b-f1fa29e6611c\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 29 05:31:09 crc kubenswrapper[4594]: I1129 05:31:09.153858 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 29 05:31:09 crc kubenswrapper[4594]: I1129 05:31:09.568270 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Nov 29 05:31:09 crc kubenswrapper[4594]: I1129 05:31:09.724892 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"620a008f-7536-403d-940b-f1fa29e6611c","Type":"ContainerStarted","Data":"a6613cc11c46f148c510f55683058b83e4a42f872fb7460db0dc05d078f522b1"} Nov 29 05:31:10 crc kubenswrapper[4594]: I1129 05:31:10.732603 4594 generic.go:334] "Generic (PLEG): container finished" podID="620a008f-7536-403d-940b-f1fa29e6611c" containerID="33cdf948f2b86a73c2df9fd2711f52517cb57172191e581cba0c4cfca823d633" exitCode=0 Nov 29 05:31:10 crc kubenswrapper[4594]: I1129 05:31:10.732680 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"620a008f-7536-403d-940b-f1fa29e6611c","Type":"ContainerDied","Data":"33cdf948f2b86a73c2df9fd2711f52517cb57172191e581cba0c4cfca823d633"} Nov 29 05:31:11 crc kubenswrapper[4594]: I1129 05:31:11.926649 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 29 05:31:12 crc kubenswrapper[4594]: I1129 05:31:12.099600 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/620a008f-7536-403d-940b-f1fa29e6611c-kube-api-access\") pod \"620a008f-7536-403d-940b-f1fa29e6611c\" (UID: \"620a008f-7536-403d-940b-f1fa29e6611c\") " Nov 29 05:31:12 crc kubenswrapper[4594]: I1129 05:31:12.099817 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/620a008f-7536-403d-940b-f1fa29e6611c-kubelet-dir\") pod \"620a008f-7536-403d-940b-f1fa29e6611c\" (UID: \"620a008f-7536-403d-940b-f1fa29e6611c\") " Nov 29 05:31:12 crc kubenswrapper[4594]: I1129 05:31:12.100235 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/620a008f-7536-403d-940b-f1fa29e6611c-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "620a008f-7536-403d-940b-f1fa29e6611c" (UID: "620a008f-7536-403d-940b-f1fa29e6611c"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 05:31:12 crc kubenswrapper[4594]: I1129 05:31:12.106121 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/620a008f-7536-403d-940b-f1fa29e6611c-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "620a008f-7536-403d-940b-f1fa29e6611c" (UID: "620a008f-7536-403d-940b-f1fa29e6611c"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:31:12 crc kubenswrapper[4594]: I1129 05:31:12.202071 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/620a008f-7536-403d-940b-f1fa29e6611c-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 29 05:31:12 crc kubenswrapper[4594]: I1129 05:31:12.202110 4594 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/620a008f-7536-403d-940b-f1fa29e6611c-kubelet-dir\") on node \"crc\" DevicePath \"\"" Nov 29 05:31:12 crc kubenswrapper[4594]: I1129 05:31:12.745743 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"620a008f-7536-403d-940b-f1fa29e6611c","Type":"ContainerDied","Data":"a6613cc11c46f148c510f55683058b83e4a42f872fb7460db0dc05d078f522b1"} Nov 29 05:31:12 crc kubenswrapper[4594]: I1129 05:31:12.745780 4594 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a6613cc11c46f148c510f55683058b83e4a42f872fb7460db0dc05d078f522b1" Nov 29 05:31:12 crc kubenswrapper[4594]: I1129 05:31:12.745792 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 29 05:31:15 crc kubenswrapper[4594]: I1129 05:31:15.800490 4594 patch_prober.go:28] interesting pod/machine-config-daemon-ggz4n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 05:31:15 crc kubenswrapper[4594]: I1129 05:31:15.800546 4594 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 05:31:16 crc kubenswrapper[4594]: I1129 05:31:16.419845 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Nov 29 05:31:16 crc kubenswrapper[4594]: E1129 05:31:16.420434 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="620a008f-7536-403d-940b-f1fa29e6611c" containerName="pruner" Nov 29 05:31:16 crc kubenswrapper[4594]: I1129 05:31:16.420450 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="620a008f-7536-403d-940b-f1fa29e6611c" containerName="pruner" Nov 29 05:31:16 crc kubenswrapper[4594]: I1129 05:31:16.420608 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="620a008f-7536-403d-940b-f1fa29e6611c" containerName="pruner" Nov 29 05:31:16 crc kubenswrapper[4594]: I1129 05:31:16.421075 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Nov 29 05:31:16 crc kubenswrapper[4594]: I1129 05:31:16.423452 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Nov 29 05:31:16 crc kubenswrapper[4594]: I1129 05:31:16.423746 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Nov 29 05:31:16 crc kubenswrapper[4594]: I1129 05:31:16.428102 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Nov 29 05:31:16 crc kubenswrapper[4594]: I1129 05:31:16.551331 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/031260c4-ed19-4221-b30f-d03a4abb131f-var-lock\") pod \"installer-9-crc\" (UID: \"031260c4-ed19-4221-b30f-d03a4abb131f\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 29 05:31:16 crc kubenswrapper[4594]: I1129 05:31:16.551402 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/031260c4-ed19-4221-b30f-d03a4abb131f-kubelet-dir\") pod \"installer-9-crc\" (UID: \"031260c4-ed19-4221-b30f-d03a4abb131f\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 29 05:31:16 crc kubenswrapper[4594]: I1129 05:31:16.551454 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/031260c4-ed19-4221-b30f-d03a4abb131f-kube-api-access\") pod \"installer-9-crc\" (UID: \"031260c4-ed19-4221-b30f-d03a4abb131f\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 29 05:31:16 crc kubenswrapper[4594]: I1129 05:31:16.652695 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/031260c4-ed19-4221-b30f-d03a4abb131f-var-lock\") pod \"installer-9-crc\" (UID: \"031260c4-ed19-4221-b30f-d03a4abb131f\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 29 05:31:16 crc kubenswrapper[4594]: I1129 05:31:16.652758 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/031260c4-ed19-4221-b30f-d03a4abb131f-kubelet-dir\") pod \"installer-9-crc\" (UID: \"031260c4-ed19-4221-b30f-d03a4abb131f\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 29 05:31:16 crc kubenswrapper[4594]: I1129 05:31:16.652799 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/031260c4-ed19-4221-b30f-d03a4abb131f-kube-api-access\") pod \"installer-9-crc\" (UID: \"031260c4-ed19-4221-b30f-d03a4abb131f\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 29 05:31:16 crc kubenswrapper[4594]: I1129 05:31:16.652837 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/031260c4-ed19-4221-b30f-d03a4abb131f-var-lock\") pod \"installer-9-crc\" (UID: \"031260c4-ed19-4221-b30f-d03a4abb131f\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 29 05:31:16 crc kubenswrapper[4594]: I1129 05:31:16.652906 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/031260c4-ed19-4221-b30f-d03a4abb131f-kubelet-dir\") pod \"installer-9-crc\" (UID: \"031260c4-ed19-4221-b30f-d03a4abb131f\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 29 05:31:16 crc kubenswrapper[4594]: I1129 05:31:16.669700 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/031260c4-ed19-4221-b30f-d03a4abb131f-kube-api-access\") pod \"installer-9-crc\" (UID: \"031260c4-ed19-4221-b30f-d03a4abb131f\") " 
pod="openshift-kube-apiserver/installer-9-crc" Nov 29 05:31:16 crc kubenswrapper[4594]: I1129 05:31:16.740240 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Nov 29 05:31:17 crc kubenswrapper[4594]: I1129 05:31:17.076312 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Nov 29 05:31:17 crc kubenswrapper[4594]: I1129 05:31:17.771501 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"031260c4-ed19-4221-b30f-d03a4abb131f","Type":"ContainerStarted","Data":"98422c59ed93ed10ba2e043b866c620d8e1b8180094575d0f33089b2beecd1ca"} Nov 29 05:31:17 crc kubenswrapper[4594]: I1129 05:31:17.771755 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"031260c4-ed19-4221-b30f-d03a4abb131f","Type":"ContainerStarted","Data":"21ef458b66b8cfdecb6b7e6be47d1b64816cba96af88aad276b644f9e5b94a95"} Nov 29 05:31:17 crc kubenswrapper[4594]: I1129 05:31:17.782509 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=1.782487873 podStartE2EDuration="1.782487873s" podCreationTimestamp="2025-11-29 05:31:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:31:17.781446948 +0000 UTC m=+202.021956167" watchObservedRunningTime="2025-11-29 05:31:17.782487873 +0000 UTC m=+202.022997093" Nov 29 05:31:25 crc kubenswrapper[4594]: I1129 05:31:25.656893 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-rwbhx" podUID="fc2b6738-957d-4636-a723-c11207f29087" containerName="oauth-openshift" containerID="cri-o://f80c664fe46aa3d1aebd7951e3beebdc8058879b225b3934a4e87e8d61c76bf6" gracePeriod=15 Nov 29 05:31:25 
crc kubenswrapper[4594]: I1129 05:31:25.814432 4594 generic.go:334] "Generic (PLEG): container finished" podID="fc2b6738-957d-4636-a723-c11207f29087" containerID="f80c664fe46aa3d1aebd7951e3beebdc8058879b225b3934a4e87e8d61c76bf6" exitCode=0 Nov 29 05:31:25 crc kubenswrapper[4594]: I1129 05:31:25.814510 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-rwbhx" event={"ID":"fc2b6738-957d-4636-a723-c11207f29087","Type":"ContainerDied","Data":"f80c664fe46aa3d1aebd7951e3beebdc8058879b225b3934a4e87e8d61c76bf6"} Nov 29 05:31:25 crc kubenswrapper[4594]: I1129 05:31:25.946702 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-rwbhx" Nov 29 05:31:26 crc kubenswrapper[4594]: I1129 05:31:26.048114 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fc2b6738-957d-4636-a723-c11207f29087-v4-0-config-user-template-login\") pod \"fc2b6738-957d-4636-a723-c11207f29087\" (UID: \"fc2b6738-957d-4636-a723-c11207f29087\") " Nov 29 05:31:26 crc kubenswrapper[4594]: I1129 05:31:26.048451 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-95f8s\" (UniqueName: \"kubernetes.io/projected/fc2b6738-957d-4636-a723-c11207f29087-kube-api-access-95f8s\") pod \"fc2b6738-957d-4636-a723-c11207f29087\" (UID: \"fc2b6738-957d-4636-a723-c11207f29087\") " Nov 29 05:31:26 crc kubenswrapper[4594]: I1129 05:31:26.048479 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fc2b6738-957d-4636-a723-c11207f29087-audit-policies\") pod \"fc2b6738-957d-4636-a723-c11207f29087\" (UID: \"fc2b6738-957d-4636-a723-c11207f29087\") " Nov 29 05:31:26 crc kubenswrapper[4594]: I1129 05:31:26.048499 4594 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc2b6738-957d-4636-a723-c11207f29087-v4-0-config-system-trusted-ca-bundle\") pod \"fc2b6738-957d-4636-a723-c11207f29087\" (UID: \"fc2b6738-957d-4636-a723-c11207f29087\") " Nov 29 05:31:26 crc kubenswrapper[4594]: I1129 05:31:26.048517 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fc2b6738-957d-4636-a723-c11207f29087-v4-0-config-user-template-provider-selection\") pod \"fc2b6738-957d-4636-a723-c11207f29087\" (UID: \"fc2b6738-957d-4636-a723-c11207f29087\") " Nov 29 05:31:26 crc kubenswrapper[4594]: I1129 05:31:26.048536 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fc2b6738-957d-4636-a723-c11207f29087-v4-0-config-system-cliconfig\") pod \"fc2b6738-957d-4636-a723-c11207f29087\" (UID: \"fc2b6738-957d-4636-a723-c11207f29087\") " Nov 29 05:31:26 crc kubenswrapper[4594]: I1129 05:31:26.048564 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fc2b6738-957d-4636-a723-c11207f29087-v4-0-config-user-template-error\") pod \"fc2b6738-957d-4636-a723-c11207f29087\" (UID: \"fc2b6738-957d-4636-a723-c11207f29087\") " Nov 29 05:31:26 crc kubenswrapper[4594]: I1129 05:31:26.048590 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fc2b6738-957d-4636-a723-c11207f29087-audit-dir\") pod \"fc2b6738-957d-4636-a723-c11207f29087\" (UID: \"fc2b6738-957d-4636-a723-c11207f29087\") " Nov 29 05:31:26 crc kubenswrapper[4594]: I1129 05:31:26.048619 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/fc2b6738-957d-4636-a723-c11207f29087-v4-0-config-user-idp-0-file-data\") pod \"fc2b6738-957d-4636-a723-c11207f29087\" (UID: \"fc2b6738-957d-4636-a723-c11207f29087\") " Nov 29 05:31:26 crc kubenswrapper[4594]: I1129 05:31:26.048639 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fc2b6738-957d-4636-a723-c11207f29087-v4-0-config-system-session\") pod \"fc2b6738-957d-4636-a723-c11207f29087\" (UID: \"fc2b6738-957d-4636-a723-c11207f29087\") " Nov 29 05:31:26 crc kubenswrapper[4594]: I1129 05:31:26.048660 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fc2b6738-957d-4636-a723-c11207f29087-v4-0-config-system-serving-cert\") pod \"fc2b6738-957d-4636-a723-c11207f29087\" (UID: \"fc2b6738-957d-4636-a723-c11207f29087\") " Nov 29 05:31:26 crc kubenswrapper[4594]: I1129 05:31:26.048676 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fc2b6738-957d-4636-a723-c11207f29087-v4-0-config-system-router-certs\") pod \"fc2b6738-957d-4636-a723-c11207f29087\" (UID: \"fc2b6738-957d-4636-a723-c11207f29087\") " Nov 29 05:31:26 crc kubenswrapper[4594]: I1129 05:31:26.048707 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fc2b6738-957d-4636-a723-c11207f29087-v4-0-config-system-ocp-branding-template\") pod \"fc2b6738-957d-4636-a723-c11207f29087\" (UID: \"fc2b6738-957d-4636-a723-c11207f29087\") " Nov 29 05:31:26 crc kubenswrapper[4594]: I1129 05:31:26.048732 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/fc2b6738-957d-4636-a723-c11207f29087-v4-0-config-system-service-ca\") pod \"fc2b6738-957d-4636-a723-c11207f29087\" (UID: \"fc2b6738-957d-4636-a723-c11207f29087\") " Nov 29 05:31:26 crc kubenswrapper[4594]: I1129 05:31:26.048993 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fc2b6738-957d-4636-a723-c11207f29087-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "fc2b6738-957d-4636-a723-c11207f29087" (UID: "fc2b6738-957d-4636-a723-c11207f29087"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 05:31:26 crc kubenswrapper[4594]: I1129 05:31:26.049307 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc2b6738-957d-4636-a723-c11207f29087-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "fc2b6738-957d-4636-a723-c11207f29087" (UID: "fc2b6738-957d-4636-a723-c11207f29087"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:31:26 crc kubenswrapper[4594]: I1129 05:31:26.049325 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc2b6738-957d-4636-a723-c11207f29087-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "fc2b6738-957d-4636-a723-c11207f29087" (UID: "fc2b6738-957d-4636-a723-c11207f29087"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:31:26 crc kubenswrapper[4594]: I1129 05:31:26.049338 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc2b6738-957d-4636-a723-c11207f29087-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "fc2b6738-957d-4636-a723-c11207f29087" (UID: "fc2b6738-957d-4636-a723-c11207f29087"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:31:26 crc kubenswrapper[4594]: I1129 05:31:26.049875 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc2b6738-957d-4636-a723-c11207f29087-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "fc2b6738-957d-4636-a723-c11207f29087" (UID: "fc2b6738-957d-4636-a723-c11207f29087"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:31:26 crc kubenswrapper[4594]: I1129 05:31:26.054297 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc2b6738-957d-4636-a723-c11207f29087-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "fc2b6738-957d-4636-a723-c11207f29087" (UID: "fc2b6738-957d-4636-a723-c11207f29087"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:31:26 crc kubenswrapper[4594]: I1129 05:31:26.054788 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc2b6738-957d-4636-a723-c11207f29087-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "fc2b6738-957d-4636-a723-c11207f29087" (UID: "fc2b6738-957d-4636-a723-c11207f29087"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:31:26 crc kubenswrapper[4594]: I1129 05:31:26.054867 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc2b6738-957d-4636-a723-c11207f29087-kube-api-access-95f8s" (OuterVolumeSpecName: "kube-api-access-95f8s") pod "fc2b6738-957d-4636-a723-c11207f29087" (UID: "fc2b6738-957d-4636-a723-c11207f29087"). InnerVolumeSpecName "kube-api-access-95f8s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:31:26 crc kubenswrapper[4594]: I1129 05:31:26.055363 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc2b6738-957d-4636-a723-c11207f29087-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "fc2b6738-957d-4636-a723-c11207f29087" (UID: "fc2b6738-957d-4636-a723-c11207f29087"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:31:26 crc kubenswrapper[4594]: I1129 05:31:26.055538 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc2b6738-957d-4636-a723-c11207f29087-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "fc2b6738-957d-4636-a723-c11207f29087" (UID: "fc2b6738-957d-4636-a723-c11207f29087"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:31:26 crc kubenswrapper[4594]: I1129 05:31:26.055857 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc2b6738-957d-4636-a723-c11207f29087-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "fc2b6738-957d-4636-a723-c11207f29087" (UID: "fc2b6738-957d-4636-a723-c11207f29087"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:31:26 crc kubenswrapper[4594]: I1129 05:31:26.055982 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc2b6738-957d-4636-a723-c11207f29087-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "fc2b6738-957d-4636-a723-c11207f29087" (UID: "fc2b6738-957d-4636-a723-c11207f29087"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:31:26 crc kubenswrapper[4594]: I1129 05:31:26.056113 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc2b6738-957d-4636-a723-c11207f29087-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "fc2b6738-957d-4636-a723-c11207f29087" (UID: "fc2b6738-957d-4636-a723-c11207f29087"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:31:26 crc kubenswrapper[4594]: I1129 05:31:26.057591 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc2b6738-957d-4636-a723-c11207f29087-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "fc2b6738-957d-4636-a723-c11207f29087" (UID: "fc2b6738-957d-4636-a723-c11207f29087"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:31:26 crc kubenswrapper[4594]: I1129 05:31:26.150762 4594 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fc2b6738-957d-4636-a723-c11207f29087-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Nov 29 05:31:26 crc kubenswrapper[4594]: I1129 05:31:26.150788 4594 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fc2b6738-957d-4636-a723-c11207f29087-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 29 05:31:26 crc kubenswrapper[4594]: I1129 05:31:26.150800 4594 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fc2b6738-957d-4636-a723-c11207f29087-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Nov 29 05:31:26 crc kubenswrapper[4594]: I1129 
05:31:26.150810 4594 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fc2b6738-957d-4636-a723-c11207f29087-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Nov 29 05:31:26 crc kubenswrapper[4594]: I1129 05:31:26.150819 4594 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fc2b6738-957d-4636-a723-c11207f29087-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Nov 29 05:31:26 crc kubenswrapper[4594]: I1129 05:31:26.150850 4594 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fc2b6738-957d-4636-a723-c11207f29087-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Nov 29 05:31:26 crc kubenswrapper[4594]: I1129 05:31:26.150859 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-95f8s\" (UniqueName: \"kubernetes.io/projected/fc2b6738-957d-4636-a723-c11207f29087-kube-api-access-95f8s\") on node \"crc\" DevicePath \"\"" Nov 29 05:31:26 crc kubenswrapper[4594]: I1129 05:31:26.150869 4594 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fc2b6738-957d-4636-a723-c11207f29087-audit-policies\") on node \"crc\" DevicePath \"\"" Nov 29 05:31:26 crc kubenswrapper[4594]: I1129 05:31:26.150879 4594 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc2b6738-957d-4636-a723-c11207f29087-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 05:31:26 crc kubenswrapper[4594]: I1129 05:31:26.150887 4594 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/fc2b6738-957d-4636-a723-c11207f29087-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Nov 29 05:31:26 crc kubenswrapper[4594]: I1129 05:31:26.150898 4594 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fc2b6738-957d-4636-a723-c11207f29087-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Nov 29 05:31:26 crc kubenswrapper[4594]: I1129 05:31:26.150907 4594 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fc2b6738-957d-4636-a723-c11207f29087-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Nov 29 05:31:26 crc kubenswrapper[4594]: I1129 05:31:26.150916 4594 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fc2b6738-957d-4636-a723-c11207f29087-audit-dir\") on node \"crc\" DevicePath \"\"" Nov 29 05:31:26 crc kubenswrapper[4594]: I1129 05:31:26.150926 4594 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/fc2b6738-957d-4636-a723-c11207f29087-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Nov 29 05:31:26 crc kubenswrapper[4594]: I1129 05:31:26.568358 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-5c6cd8dcff-nbfd6"] Nov 29 05:31:26 crc kubenswrapper[4594]: E1129 05:31:26.568594 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc2b6738-957d-4636-a723-c11207f29087" containerName="oauth-openshift" Nov 29 05:31:26 crc kubenswrapper[4594]: I1129 05:31:26.568611 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc2b6738-957d-4636-a723-c11207f29087" containerName="oauth-openshift" Nov 29 05:31:26 crc kubenswrapper[4594]: I1129 05:31:26.568715 4594 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="fc2b6738-957d-4636-a723-c11207f29087" containerName="oauth-openshift" Nov 29 05:31:26 crc kubenswrapper[4594]: I1129 05:31:26.569059 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-5c6cd8dcff-nbfd6" Nov 29 05:31:26 crc kubenswrapper[4594]: I1129 05:31:26.577536 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5c6cd8dcff-nbfd6"] Nov 29 05:31:26 crc kubenswrapper[4594]: I1129 05:31:26.655639 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/968b501f-7444-4799-bed6-4fe5d59aacb9-v4-0-config-system-router-certs\") pod \"oauth-openshift-5c6cd8dcff-nbfd6\" (UID: \"968b501f-7444-4799-bed6-4fe5d59aacb9\") " pod="openshift-authentication/oauth-openshift-5c6cd8dcff-nbfd6" Nov 29 05:31:26 crc kubenswrapper[4594]: I1129 05:31:26.655697 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/968b501f-7444-4799-bed6-4fe5d59aacb9-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5c6cd8dcff-nbfd6\" (UID: \"968b501f-7444-4799-bed6-4fe5d59aacb9\") " pod="openshift-authentication/oauth-openshift-5c6cd8dcff-nbfd6" Nov 29 05:31:26 crc kubenswrapper[4594]: I1129 05:31:26.655729 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/968b501f-7444-4799-bed6-4fe5d59aacb9-v4-0-config-user-template-login\") pod \"oauth-openshift-5c6cd8dcff-nbfd6\" (UID: \"968b501f-7444-4799-bed6-4fe5d59aacb9\") " pod="openshift-authentication/oauth-openshift-5c6cd8dcff-nbfd6" Nov 29 05:31:26 crc kubenswrapper[4594]: I1129 05:31:26.656021 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/968b501f-7444-4799-bed6-4fe5d59aacb9-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5c6cd8dcff-nbfd6\" (UID: \"968b501f-7444-4799-bed6-4fe5d59aacb9\") " pod="openshift-authentication/oauth-openshift-5c6cd8dcff-nbfd6" Nov 29 05:31:26 crc kubenswrapper[4594]: I1129 05:31:26.656103 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/968b501f-7444-4799-bed6-4fe5d59aacb9-audit-policies\") pod \"oauth-openshift-5c6cd8dcff-nbfd6\" (UID: \"968b501f-7444-4799-bed6-4fe5d59aacb9\") " pod="openshift-authentication/oauth-openshift-5c6cd8dcff-nbfd6" Nov 29 05:31:26 crc kubenswrapper[4594]: I1129 05:31:26.656188 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/968b501f-7444-4799-bed6-4fe5d59aacb9-v4-0-config-system-session\") pod \"oauth-openshift-5c6cd8dcff-nbfd6\" (UID: \"968b501f-7444-4799-bed6-4fe5d59aacb9\") " pod="openshift-authentication/oauth-openshift-5c6cd8dcff-nbfd6" Nov 29 05:31:26 crc kubenswrapper[4594]: I1129 05:31:26.656223 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/968b501f-7444-4799-bed6-4fe5d59aacb9-v4-0-config-user-template-error\") pod \"oauth-openshift-5c6cd8dcff-nbfd6\" (UID: \"968b501f-7444-4799-bed6-4fe5d59aacb9\") " pod="openshift-authentication/oauth-openshift-5c6cd8dcff-nbfd6" Nov 29 05:31:26 crc kubenswrapper[4594]: I1129 05:31:26.656304 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/968b501f-7444-4799-bed6-4fe5d59aacb9-v4-0-config-system-cliconfig\") pod 
\"oauth-openshift-5c6cd8dcff-nbfd6\" (UID: \"968b501f-7444-4799-bed6-4fe5d59aacb9\") " pod="openshift-authentication/oauth-openshift-5c6cd8dcff-nbfd6" Nov 29 05:31:26 crc kubenswrapper[4594]: I1129 05:31:26.656330 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t478p\" (UniqueName: \"kubernetes.io/projected/968b501f-7444-4799-bed6-4fe5d59aacb9-kube-api-access-t478p\") pod \"oauth-openshift-5c6cd8dcff-nbfd6\" (UID: \"968b501f-7444-4799-bed6-4fe5d59aacb9\") " pod="openshift-authentication/oauth-openshift-5c6cd8dcff-nbfd6" Nov 29 05:31:26 crc kubenswrapper[4594]: I1129 05:31:26.656380 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/968b501f-7444-4799-bed6-4fe5d59aacb9-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5c6cd8dcff-nbfd6\" (UID: \"968b501f-7444-4799-bed6-4fe5d59aacb9\") " pod="openshift-authentication/oauth-openshift-5c6cd8dcff-nbfd6" Nov 29 05:31:26 crc kubenswrapper[4594]: I1129 05:31:26.656413 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/968b501f-7444-4799-bed6-4fe5d59aacb9-audit-dir\") pod \"oauth-openshift-5c6cd8dcff-nbfd6\" (UID: \"968b501f-7444-4799-bed6-4fe5d59aacb9\") " pod="openshift-authentication/oauth-openshift-5c6cd8dcff-nbfd6" Nov 29 05:31:26 crc kubenswrapper[4594]: I1129 05:31:26.656502 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/968b501f-7444-4799-bed6-4fe5d59aacb9-v4-0-config-system-service-ca\") pod \"oauth-openshift-5c6cd8dcff-nbfd6\" (UID: \"968b501f-7444-4799-bed6-4fe5d59aacb9\") " pod="openshift-authentication/oauth-openshift-5c6cd8dcff-nbfd6" Nov 29 05:31:26 crc kubenswrapper[4594]: 
I1129 05:31:26.656611 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/968b501f-7444-4799-bed6-4fe5d59aacb9-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5c6cd8dcff-nbfd6\" (UID: \"968b501f-7444-4799-bed6-4fe5d59aacb9\") " pod="openshift-authentication/oauth-openshift-5c6cd8dcff-nbfd6" Nov 29 05:31:26 crc kubenswrapper[4594]: I1129 05:31:26.656648 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/968b501f-7444-4799-bed6-4fe5d59aacb9-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5c6cd8dcff-nbfd6\" (UID: \"968b501f-7444-4799-bed6-4fe5d59aacb9\") " pod="openshift-authentication/oauth-openshift-5c6cd8dcff-nbfd6" Nov 29 05:31:26 crc kubenswrapper[4594]: I1129 05:31:26.757858 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/968b501f-7444-4799-bed6-4fe5d59aacb9-v4-0-config-system-session\") pod \"oauth-openshift-5c6cd8dcff-nbfd6\" (UID: \"968b501f-7444-4799-bed6-4fe5d59aacb9\") " pod="openshift-authentication/oauth-openshift-5c6cd8dcff-nbfd6" Nov 29 05:31:26 crc kubenswrapper[4594]: I1129 05:31:26.757903 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/968b501f-7444-4799-bed6-4fe5d59aacb9-v4-0-config-user-template-error\") pod \"oauth-openshift-5c6cd8dcff-nbfd6\" (UID: \"968b501f-7444-4799-bed6-4fe5d59aacb9\") " pod="openshift-authentication/oauth-openshift-5c6cd8dcff-nbfd6" Nov 29 05:31:26 crc kubenswrapper[4594]: I1129 05:31:26.757927 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/968b501f-7444-4799-bed6-4fe5d59aacb9-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5c6cd8dcff-nbfd6\" (UID: \"968b501f-7444-4799-bed6-4fe5d59aacb9\") " pod="openshift-authentication/oauth-openshift-5c6cd8dcff-nbfd6" Nov 29 05:31:26 crc kubenswrapper[4594]: I1129 05:31:26.757945 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t478p\" (UniqueName: \"kubernetes.io/projected/968b501f-7444-4799-bed6-4fe5d59aacb9-kube-api-access-t478p\") pod \"oauth-openshift-5c6cd8dcff-nbfd6\" (UID: \"968b501f-7444-4799-bed6-4fe5d59aacb9\") " pod="openshift-authentication/oauth-openshift-5c6cd8dcff-nbfd6" Nov 29 05:31:26 crc kubenswrapper[4594]: I1129 05:31:26.757967 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/968b501f-7444-4799-bed6-4fe5d59aacb9-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5c6cd8dcff-nbfd6\" (UID: \"968b501f-7444-4799-bed6-4fe5d59aacb9\") " pod="openshift-authentication/oauth-openshift-5c6cd8dcff-nbfd6" Nov 29 05:31:26 crc kubenswrapper[4594]: I1129 05:31:26.757998 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/968b501f-7444-4799-bed6-4fe5d59aacb9-audit-dir\") pod \"oauth-openshift-5c6cd8dcff-nbfd6\" (UID: \"968b501f-7444-4799-bed6-4fe5d59aacb9\") " pod="openshift-authentication/oauth-openshift-5c6cd8dcff-nbfd6" Nov 29 05:31:26 crc kubenswrapper[4594]: I1129 05:31:26.758018 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/968b501f-7444-4799-bed6-4fe5d59aacb9-v4-0-config-system-service-ca\") pod \"oauth-openshift-5c6cd8dcff-nbfd6\" (UID: \"968b501f-7444-4799-bed6-4fe5d59aacb9\") " pod="openshift-authentication/oauth-openshift-5c6cd8dcff-nbfd6" Nov 29 05:31:26 
crc kubenswrapper[4594]: I1129 05:31:26.758053 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/968b501f-7444-4799-bed6-4fe5d59aacb9-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5c6cd8dcff-nbfd6\" (UID: \"968b501f-7444-4799-bed6-4fe5d59aacb9\") " pod="openshift-authentication/oauth-openshift-5c6cd8dcff-nbfd6" Nov 29 05:31:26 crc kubenswrapper[4594]: I1129 05:31:26.758070 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/968b501f-7444-4799-bed6-4fe5d59aacb9-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5c6cd8dcff-nbfd6\" (UID: \"968b501f-7444-4799-bed6-4fe5d59aacb9\") " pod="openshift-authentication/oauth-openshift-5c6cd8dcff-nbfd6" Nov 29 05:31:26 crc kubenswrapper[4594]: I1129 05:31:26.758089 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/968b501f-7444-4799-bed6-4fe5d59aacb9-v4-0-config-system-router-certs\") pod \"oauth-openshift-5c6cd8dcff-nbfd6\" (UID: \"968b501f-7444-4799-bed6-4fe5d59aacb9\") " pod="openshift-authentication/oauth-openshift-5c6cd8dcff-nbfd6" Nov 29 05:31:26 crc kubenswrapper[4594]: I1129 05:31:26.758112 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/968b501f-7444-4799-bed6-4fe5d59aacb9-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5c6cd8dcff-nbfd6\" (UID: \"968b501f-7444-4799-bed6-4fe5d59aacb9\") " pod="openshift-authentication/oauth-openshift-5c6cd8dcff-nbfd6" Nov 29 05:31:26 crc kubenswrapper[4594]: I1129 05:31:26.758135 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/968b501f-7444-4799-bed6-4fe5d59aacb9-v4-0-config-user-template-login\") pod \"oauth-openshift-5c6cd8dcff-nbfd6\" (UID: \"968b501f-7444-4799-bed6-4fe5d59aacb9\") " pod="openshift-authentication/oauth-openshift-5c6cd8dcff-nbfd6" Nov 29 05:31:26 crc kubenswrapper[4594]: I1129 05:31:26.758159 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/968b501f-7444-4799-bed6-4fe5d59aacb9-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5c6cd8dcff-nbfd6\" (UID: \"968b501f-7444-4799-bed6-4fe5d59aacb9\") " pod="openshift-authentication/oauth-openshift-5c6cd8dcff-nbfd6" Nov 29 05:31:26 crc kubenswrapper[4594]: I1129 05:31:26.758176 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/968b501f-7444-4799-bed6-4fe5d59aacb9-audit-policies\") pod \"oauth-openshift-5c6cd8dcff-nbfd6\" (UID: \"968b501f-7444-4799-bed6-4fe5d59aacb9\") " pod="openshift-authentication/oauth-openshift-5c6cd8dcff-nbfd6" Nov 29 05:31:26 crc kubenswrapper[4594]: I1129 05:31:26.758934 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/968b501f-7444-4799-bed6-4fe5d59aacb9-audit-policies\") pod \"oauth-openshift-5c6cd8dcff-nbfd6\" (UID: \"968b501f-7444-4799-bed6-4fe5d59aacb9\") " pod="openshift-authentication/oauth-openshift-5c6cd8dcff-nbfd6" Nov 29 05:31:26 crc kubenswrapper[4594]: I1129 05:31:26.758935 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/968b501f-7444-4799-bed6-4fe5d59aacb9-v4-0-config-system-service-ca\") pod \"oauth-openshift-5c6cd8dcff-nbfd6\" (UID: \"968b501f-7444-4799-bed6-4fe5d59aacb9\") " pod="openshift-authentication/oauth-openshift-5c6cd8dcff-nbfd6" Nov 29 05:31:26 crc kubenswrapper[4594]: 
I1129 05:31:26.759248 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/968b501f-7444-4799-bed6-4fe5d59aacb9-audit-dir\") pod \"oauth-openshift-5c6cd8dcff-nbfd6\" (UID: \"968b501f-7444-4799-bed6-4fe5d59aacb9\") " pod="openshift-authentication/oauth-openshift-5c6cd8dcff-nbfd6" Nov 29 05:31:26 crc kubenswrapper[4594]: I1129 05:31:26.759775 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/968b501f-7444-4799-bed6-4fe5d59aacb9-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5c6cd8dcff-nbfd6\" (UID: \"968b501f-7444-4799-bed6-4fe5d59aacb9\") " pod="openshift-authentication/oauth-openshift-5c6cd8dcff-nbfd6" Nov 29 05:31:26 crc kubenswrapper[4594]: I1129 05:31:26.759864 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/968b501f-7444-4799-bed6-4fe5d59aacb9-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5c6cd8dcff-nbfd6\" (UID: \"968b501f-7444-4799-bed6-4fe5d59aacb9\") " pod="openshift-authentication/oauth-openshift-5c6cd8dcff-nbfd6" Nov 29 05:31:26 crc kubenswrapper[4594]: I1129 05:31:26.761701 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/968b501f-7444-4799-bed6-4fe5d59aacb9-v4-0-config-system-router-certs\") pod \"oauth-openshift-5c6cd8dcff-nbfd6\" (UID: \"968b501f-7444-4799-bed6-4fe5d59aacb9\") " pod="openshift-authentication/oauth-openshift-5c6cd8dcff-nbfd6" Nov 29 05:31:26 crc kubenswrapper[4594]: I1129 05:31:26.762125 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/968b501f-7444-4799-bed6-4fe5d59aacb9-v4-0-config-system-session\") pod \"oauth-openshift-5c6cd8dcff-nbfd6\" (UID: 
\"968b501f-7444-4799-bed6-4fe5d59aacb9\") " pod="openshift-authentication/oauth-openshift-5c6cd8dcff-nbfd6" Nov 29 05:31:26 crc kubenswrapper[4594]: I1129 05:31:26.762396 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/968b501f-7444-4799-bed6-4fe5d59aacb9-v4-0-config-user-template-login\") pod \"oauth-openshift-5c6cd8dcff-nbfd6\" (UID: \"968b501f-7444-4799-bed6-4fe5d59aacb9\") " pod="openshift-authentication/oauth-openshift-5c6cd8dcff-nbfd6" Nov 29 05:31:26 crc kubenswrapper[4594]: I1129 05:31:26.762730 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/968b501f-7444-4799-bed6-4fe5d59aacb9-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5c6cd8dcff-nbfd6\" (UID: \"968b501f-7444-4799-bed6-4fe5d59aacb9\") " pod="openshift-authentication/oauth-openshift-5c6cd8dcff-nbfd6" Nov 29 05:31:26 crc kubenswrapper[4594]: I1129 05:31:26.762720 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/968b501f-7444-4799-bed6-4fe5d59aacb9-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5c6cd8dcff-nbfd6\" (UID: \"968b501f-7444-4799-bed6-4fe5d59aacb9\") " pod="openshift-authentication/oauth-openshift-5c6cd8dcff-nbfd6" Nov 29 05:31:26 crc kubenswrapper[4594]: I1129 05:31:26.763106 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/968b501f-7444-4799-bed6-4fe5d59aacb9-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5c6cd8dcff-nbfd6\" (UID: \"968b501f-7444-4799-bed6-4fe5d59aacb9\") " pod="openshift-authentication/oauth-openshift-5c6cd8dcff-nbfd6" Nov 29 05:31:26 crc kubenswrapper[4594]: I1129 05:31:26.763390 4594 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/968b501f-7444-4799-bed6-4fe5d59aacb9-v4-0-config-user-template-error\") pod \"oauth-openshift-5c6cd8dcff-nbfd6\" (UID: \"968b501f-7444-4799-bed6-4fe5d59aacb9\") " pod="openshift-authentication/oauth-openshift-5c6cd8dcff-nbfd6" Nov 29 05:31:26 crc kubenswrapper[4594]: I1129 05:31:26.764127 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/968b501f-7444-4799-bed6-4fe5d59aacb9-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5c6cd8dcff-nbfd6\" (UID: \"968b501f-7444-4799-bed6-4fe5d59aacb9\") " pod="openshift-authentication/oauth-openshift-5c6cd8dcff-nbfd6" Nov 29 05:31:26 crc kubenswrapper[4594]: I1129 05:31:26.772150 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t478p\" (UniqueName: \"kubernetes.io/projected/968b501f-7444-4799-bed6-4fe5d59aacb9-kube-api-access-t478p\") pod \"oauth-openshift-5c6cd8dcff-nbfd6\" (UID: \"968b501f-7444-4799-bed6-4fe5d59aacb9\") " pod="openshift-authentication/oauth-openshift-5c6cd8dcff-nbfd6" Nov 29 05:31:26 crc kubenswrapper[4594]: I1129 05:31:26.823571 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-rwbhx" event={"ID":"fc2b6738-957d-4636-a723-c11207f29087","Type":"ContainerDied","Data":"94c212b315e44e9411e4805f458604802ddc3c4a9efcff5dfdd4e0e28cefca9b"} Nov 29 05:31:26 crc kubenswrapper[4594]: I1129 05:31:26.823640 4594 scope.go:117] "RemoveContainer" containerID="f80c664fe46aa3d1aebd7951e3beebdc8058879b225b3934a4e87e8d61c76bf6" Nov 29 05:31:26 crc kubenswrapper[4594]: I1129 05:31:26.823657 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-rwbhx" Nov 29 05:31:26 crc kubenswrapper[4594]: I1129 05:31:26.840768 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-rwbhx"] Nov 29 05:31:26 crc kubenswrapper[4594]: I1129 05:31:26.843032 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-rwbhx"] Nov 29 05:31:26 crc kubenswrapper[4594]: I1129 05:31:26.880906 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-5c6cd8dcff-nbfd6" Nov 29 05:31:27 crc kubenswrapper[4594]: I1129 05:31:27.225683 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5c6cd8dcff-nbfd6"] Nov 29 05:31:27 crc kubenswrapper[4594]: I1129 05:31:27.833388 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5c6cd8dcff-nbfd6" event={"ID":"968b501f-7444-4799-bed6-4fe5d59aacb9","Type":"ContainerStarted","Data":"e33a2510b948984e2b874eeb8fa733a84f05998ae7e4525bbc96b8a0bfddbab7"} Nov 29 05:31:27 crc kubenswrapper[4594]: I1129 05:31:27.833605 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5c6cd8dcff-nbfd6" event={"ID":"968b501f-7444-4799-bed6-4fe5d59aacb9","Type":"ContainerStarted","Data":"6a8b68a3e0a130e4ba7cb99b40cfdae4c5c335f2bfea4f80a5d1126cb787da16"} Nov 29 05:31:27 crc kubenswrapper[4594]: I1129 05:31:27.834462 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-5c6cd8dcff-nbfd6" Nov 29 05:31:27 crc kubenswrapper[4594]: I1129 05:31:27.839801 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-5c6cd8dcff-nbfd6" Nov 29 05:31:27 crc kubenswrapper[4594]: I1129 05:31:27.854396 4594 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-5c6cd8dcff-nbfd6" podStartSLOduration=27.854371435 podStartE2EDuration="27.854371435s" podCreationTimestamp="2025-11-29 05:31:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:31:27.852642678 +0000 UTC m=+212.093151898" watchObservedRunningTime="2025-11-29 05:31:27.854371435 +0000 UTC m=+212.094880655" Nov 29 05:31:28 crc kubenswrapper[4594]: I1129 05:31:28.090752 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc2b6738-957d-4636-a723-c11207f29087" path="/var/lib/kubelet/pods/fc2b6738-957d-4636-a723-c11207f29087/volumes" Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.047451 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pzf9l"] Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.048315 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-pzf9l" podUID="7ec39acd-e54b-4bf5-99e7-dae22ecfceda" containerName="registry-server" containerID="cri-o://6b3c986dad39b8ebccd65bd98dba64461672fe866c1e13346ea50c44df6f2b37" gracePeriod=30 Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.054351 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zrcl4"] Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.054562 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-zrcl4" podUID="37a1effb-361a-4fcf-b9f1-f32692c38587" containerName="registry-server" containerID="cri-o://d5ca149bdf94cdf62e2abc13d460e8ab82aa261f1b0d93f88e420dcc7d5c093a" gracePeriod=30 Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.061769 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/marketplace-operator-79b997595-k9wmb"] Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.061929 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-k9wmb" podUID="c584b692-db15-4724-a208-491507a8474e" containerName="marketplace-operator" containerID="cri-o://2b6929827669e8b121dca40b372f59d2cf862f4027a8d5f8fc0cdc449575c3d6" gracePeriod=30 Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.067893 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bvj7k"] Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.068093 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bvj7k" podUID="75b1173d-e8e6-45a7-bd9a-4a3cfafbb8a6" containerName="registry-server" containerID="cri-o://54f11601c350c13255016018ed5cd9993d5912e66a359c2b76a489b2003e3302" gracePeriod=30 Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.074545 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-clxws"] Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.075106 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-clxws" Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.078531 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8ptq2"] Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.078819 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-8ptq2" podUID="c9b3dc9b-0c66-4af2-977d-9264c559a827" containerName="registry-server" containerID="cri-o://df05ca65761e9cfdcf18fb9d190e6f8ec60d6b2759f454a6afa502e17bc13401" gracePeriod=30 Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.098080 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-clxws"] Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.212174 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/8b4bf1cb-440a-4e74-82c4-c122e9985bf3-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-clxws\" (UID: \"8b4bf1cb-440a-4e74-82c4-c122e9985bf3\") " pod="openshift-marketplace/marketplace-operator-79b997595-clxws" Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.212517 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5v4bh\" (UniqueName: \"kubernetes.io/projected/8b4bf1cb-440a-4e74-82c4-c122e9985bf3-kube-api-access-5v4bh\") pod \"marketplace-operator-79b997595-clxws\" (UID: \"8b4bf1cb-440a-4e74-82c4-c122e9985bf3\") " pod="openshift-marketplace/marketplace-operator-79b997595-clxws" Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.212560 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/8b4bf1cb-440a-4e74-82c4-c122e9985bf3-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-clxws\" (UID: \"8b4bf1cb-440a-4e74-82c4-c122e9985bf3\") " pod="openshift-marketplace/marketplace-operator-79b997595-clxws" Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.314731 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/8b4bf1cb-440a-4e74-82c4-c122e9985bf3-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-clxws\" (UID: \"8b4bf1cb-440a-4e74-82c4-c122e9985bf3\") " pod="openshift-marketplace/marketplace-operator-79b997595-clxws" Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.314829 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5v4bh\" (UniqueName: \"kubernetes.io/projected/8b4bf1cb-440a-4e74-82c4-c122e9985bf3-kube-api-access-5v4bh\") pod \"marketplace-operator-79b997595-clxws\" (UID: \"8b4bf1cb-440a-4e74-82c4-c122e9985bf3\") " pod="openshift-marketplace/marketplace-operator-79b997595-clxws" Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.314905 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8b4bf1cb-440a-4e74-82c4-c122e9985bf3-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-clxws\" (UID: \"8b4bf1cb-440a-4e74-82c4-c122e9985bf3\") " pod="openshift-marketplace/marketplace-operator-79b997595-clxws" Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.316728 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8b4bf1cb-440a-4e74-82c4-c122e9985bf3-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-clxws\" (UID: \"8b4bf1cb-440a-4e74-82c4-c122e9985bf3\") " pod="openshift-marketplace/marketplace-operator-79b997595-clxws" Nov 29 05:31:40 crc 
kubenswrapper[4594]: I1129 05:31:40.323406 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/8b4bf1cb-440a-4e74-82c4-c122e9985bf3-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-clxws\" (UID: \"8b4bf1cb-440a-4e74-82c4-c122e9985bf3\") " pod="openshift-marketplace/marketplace-operator-79b997595-clxws" Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.330311 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5v4bh\" (UniqueName: \"kubernetes.io/projected/8b4bf1cb-440a-4e74-82c4-c122e9985bf3-kube-api-access-5v4bh\") pod \"marketplace-operator-79b997595-clxws\" (UID: \"8b4bf1cb-440a-4e74-82c4-c122e9985bf3\") " pod="openshift-marketplace/marketplace-operator-79b997595-clxws" Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.396476 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-clxws" Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.442408 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pzf9l" Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.447400 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8ptq2" Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.453981 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zrcl4" Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.462709 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bvj7k" Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.463342 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-k9wmb" Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.620511 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zpcbv\" (UniqueName: \"kubernetes.io/projected/c9b3dc9b-0c66-4af2-977d-9264c559a827-kube-api-access-zpcbv\") pod \"c9b3dc9b-0c66-4af2-977d-9264c559a827\" (UID: \"c9b3dc9b-0c66-4af2-977d-9264c559a827\") " Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.620784 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bgdcd\" (UniqueName: \"kubernetes.io/projected/7ec39acd-e54b-4bf5-99e7-dae22ecfceda-kube-api-access-bgdcd\") pod \"7ec39acd-e54b-4bf5-99e7-dae22ecfceda\" (UID: \"7ec39acd-e54b-4bf5-99e7-dae22ecfceda\") " Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.620815 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75b1173d-e8e6-45a7-bd9a-4a3cfafbb8a6-catalog-content\") pod \"75b1173d-e8e6-45a7-bd9a-4a3cfafbb8a6\" (UID: \"75b1173d-e8e6-45a7-bd9a-4a3cfafbb8a6\") " Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.620843 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37a1effb-361a-4fcf-b9f1-f32692c38587-catalog-content\") pod \"37a1effb-361a-4fcf-b9f1-f32692c38587\" (UID: \"37a1effb-361a-4fcf-b9f1-f32692c38587\") " Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.620861 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxd2s\" (UniqueName: \"kubernetes.io/projected/37a1effb-361a-4fcf-b9f1-f32692c38587-kube-api-access-sxd2s\") pod \"37a1effb-361a-4fcf-b9f1-f32692c38587\" (UID: \"37a1effb-361a-4fcf-b9f1-f32692c38587\") " Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.620885 4594 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ec39acd-e54b-4bf5-99e7-dae22ecfceda-catalog-content\") pod \"7ec39acd-e54b-4bf5-99e7-dae22ecfceda\" (UID: \"7ec39acd-e54b-4bf5-99e7-dae22ecfceda\") " Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.620917 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75b1173d-e8e6-45a7-bd9a-4a3cfafbb8a6-utilities\") pod \"75b1173d-e8e6-45a7-bd9a-4a3cfafbb8a6\" (UID: \"75b1173d-e8e6-45a7-bd9a-4a3cfafbb8a6\") " Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.620945 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c584b692-db15-4724-a208-491507a8474e-marketplace-trusted-ca\") pod \"c584b692-db15-4724-a208-491507a8474e\" (UID: \"c584b692-db15-4724-a208-491507a8474e\") " Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.620972 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9b3dc9b-0c66-4af2-977d-9264c559a827-utilities\") pod \"c9b3dc9b-0c66-4af2-977d-9264c559a827\" (UID: \"c9b3dc9b-0c66-4af2-977d-9264c559a827\") " Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.621011 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9f8w8\" (UniqueName: \"kubernetes.io/projected/75b1173d-e8e6-45a7-bd9a-4a3cfafbb8a6-kube-api-access-9f8w8\") pod \"75b1173d-e8e6-45a7-bd9a-4a3cfafbb8a6\" (UID: \"75b1173d-e8e6-45a7-bd9a-4a3cfafbb8a6\") " Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.621033 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mh2z7\" (UniqueName: \"kubernetes.io/projected/c584b692-db15-4724-a208-491507a8474e-kube-api-access-mh2z7\") pod 
\"c584b692-db15-4724-a208-491507a8474e\" (UID: \"c584b692-db15-4724-a208-491507a8474e\") " Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.621096 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ec39acd-e54b-4bf5-99e7-dae22ecfceda-utilities\") pod \"7ec39acd-e54b-4bf5-99e7-dae22ecfceda\" (UID: \"7ec39acd-e54b-4bf5-99e7-dae22ecfceda\") " Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.621141 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9b3dc9b-0c66-4af2-977d-9264c559a827-catalog-content\") pod \"c9b3dc9b-0c66-4af2-977d-9264c559a827\" (UID: \"c9b3dc9b-0c66-4af2-977d-9264c559a827\") " Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.621165 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c584b692-db15-4724-a208-491507a8474e-marketplace-operator-metrics\") pod \"c584b692-db15-4724-a208-491507a8474e\" (UID: \"c584b692-db15-4724-a208-491507a8474e\") " Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.621183 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37a1effb-361a-4fcf-b9f1-f32692c38587-utilities\") pod \"37a1effb-361a-4fcf-b9f1-f32692c38587\" (UID: \"37a1effb-361a-4fcf-b9f1-f32692c38587\") " Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.622129 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c584b692-db15-4724-a208-491507a8474e-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "c584b692-db15-4724-a208-491507a8474e" (UID: "c584b692-db15-4724-a208-491507a8474e"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.622319 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37a1effb-361a-4fcf-b9f1-f32692c38587-utilities" (OuterVolumeSpecName: "utilities") pod "37a1effb-361a-4fcf-b9f1-f32692c38587" (UID: "37a1effb-361a-4fcf-b9f1-f32692c38587"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.622533 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ec39acd-e54b-4bf5-99e7-dae22ecfceda-utilities" (OuterVolumeSpecName: "utilities") pod "7ec39acd-e54b-4bf5-99e7-dae22ecfceda" (UID: "7ec39acd-e54b-4bf5-99e7-dae22ecfceda"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.622658 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9b3dc9b-0c66-4af2-977d-9264c559a827-utilities" (OuterVolumeSpecName: "utilities") pod "c9b3dc9b-0c66-4af2-977d-9264c559a827" (UID: "c9b3dc9b-0c66-4af2-977d-9264c559a827"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.622697 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75b1173d-e8e6-45a7-bd9a-4a3cfafbb8a6-utilities" (OuterVolumeSpecName: "utilities") pod "75b1173d-e8e6-45a7-bd9a-4a3cfafbb8a6" (UID: "75b1173d-e8e6-45a7-bd9a-4a3cfafbb8a6"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.625952 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ec39acd-e54b-4bf5-99e7-dae22ecfceda-kube-api-access-bgdcd" (OuterVolumeSpecName: "kube-api-access-bgdcd") pod "7ec39acd-e54b-4bf5-99e7-dae22ecfceda" (UID: "7ec39acd-e54b-4bf5-99e7-dae22ecfceda"). InnerVolumeSpecName "kube-api-access-bgdcd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.626083 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c584b692-db15-4724-a208-491507a8474e-kube-api-access-mh2z7" (OuterVolumeSpecName: "kube-api-access-mh2z7") pod "c584b692-db15-4724-a208-491507a8474e" (UID: "c584b692-db15-4724-a208-491507a8474e"). InnerVolumeSpecName "kube-api-access-mh2z7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.626199 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75b1173d-e8e6-45a7-bd9a-4a3cfafbb8a6-kube-api-access-9f8w8" (OuterVolumeSpecName: "kube-api-access-9f8w8") pod "75b1173d-e8e6-45a7-bd9a-4a3cfafbb8a6" (UID: "75b1173d-e8e6-45a7-bd9a-4a3cfafbb8a6"). InnerVolumeSpecName "kube-api-access-9f8w8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.626462 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37a1effb-361a-4fcf-b9f1-f32692c38587-kube-api-access-sxd2s" (OuterVolumeSpecName: "kube-api-access-sxd2s") pod "37a1effb-361a-4fcf-b9f1-f32692c38587" (UID: "37a1effb-361a-4fcf-b9f1-f32692c38587"). InnerVolumeSpecName "kube-api-access-sxd2s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.626477 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c584b692-db15-4724-a208-491507a8474e-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "c584b692-db15-4724-a208-491507a8474e" (UID: "c584b692-db15-4724-a208-491507a8474e"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.626676 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9b3dc9b-0c66-4af2-977d-9264c559a827-kube-api-access-zpcbv" (OuterVolumeSpecName: "kube-api-access-zpcbv") pod "c9b3dc9b-0c66-4af2-977d-9264c559a827" (UID: "c9b3dc9b-0c66-4af2-977d-9264c559a827"). InnerVolumeSpecName "kube-api-access-zpcbv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.643113 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75b1173d-e8e6-45a7-bd9a-4a3cfafbb8a6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "75b1173d-e8e6-45a7-bd9a-4a3cfafbb8a6" (UID: "75b1173d-e8e6-45a7-bd9a-4a3cfafbb8a6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.678488 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37a1effb-361a-4fcf-b9f1-f32692c38587-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "37a1effb-361a-4fcf-b9f1-f32692c38587" (UID: "37a1effb-361a-4fcf-b9f1-f32692c38587"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.679716 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ec39acd-e54b-4bf5-99e7-dae22ecfceda-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7ec39acd-e54b-4bf5-99e7-dae22ecfceda" (UID: "7ec39acd-e54b-4bf5-99e7-dae22ecfceda"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.715841 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9b3dc9b-0c66-4af2-977d-9264c559a827-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c9b3dc9b-0c66-4af2-977d-9264c559a827" (UID: "c9b3dc9b-0c66-4af2-977d-9264c559a827"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.722637 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mh2z7\" (UniqueName: \"kubernetes.io/projected/c584b692-db15-4724-a208-491507a8474e-kube-api-access-mh2z7\") on node \"crc\" DevicePath \"\"" Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.722724 4594 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ec39acd-e54b-4bf5-99e7-dae22ecfceda-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.722796 4594 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9b3dc9b-0c66-4af2-977d-9264c559a827-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.722856 4594 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/c584b692-db15-4724-a208-491507a8474e-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.722915 4594 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37a1effb-361a-4fcf-b9f1-f32692c38587-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.722966 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zpcbv\" (UniqueName: \"kubernetes.io/projected/c9b3dc9b-0c66-4af2-977d-9264c559a827-kube-api-access-zpcbv\") on node \"crc\" DevicePath \"\"" Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.723023 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bgdcd\" (UniqueName: \"kubernetes.io/projected/7ec39acd-e54b-4bf5-99e7-dae22ecfceda-kube-api-access-bgdcd\") on node \"crc\" DevicePath \"\"" Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.723074 4594 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75b1173d-e8e6-45a7-bd9a-4a3cfafbb8a6-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.723131 4594 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37a1effb-361a-4fcf-b9f1-f32692c38587-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.723185 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxd2s\" (UniqueName: \"kubernetes.io/projected/37a1effb-361a-4fcf-b9f1-f32692c38587-kube-api-access-sxd2s\") on node \"crc\" DevicePath \"\"" Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.723248 4594 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/7ec39acd-e54b-4bf5-99e7-dae22ecfceda-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.723323 4594 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75b1173d-e8e6-45a7-bd9a-4a3cfafbb8a6-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.723372 4594 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c584b692-db15-4724-a208-491507a8474e-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.723418 4594 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9b3dc9b-0c66-4af2-977d-9264c559a827-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.723475 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9f8w8\" (UniqueName: \"kubernetes.io/projected/75b1173d-e8e6-45a7-bd9a-4a3cfafbb8a6-kube-api-access-9f8w8\") on node \"crc\" DevicePath \"\"" Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.813527 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-clxws"] Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.908739 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-clxws" event={"ID":"8b4bf1cb-440a-4e74-82c4-c122e9985bf3","Type":"ContainerStarted","Data":"973fddd6b3fb37b56ce9a519dbe54f192535053ad47795e9beb88e3f9ec416e7"} Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.911146 4594 generic.go:334] "Generic (PLEG): container finished" podID="c584b692-db15-4724-a208-491507a8474e" containerID="2b6929827669e8b121dca40b372f59d2cf862f4027a8d5f8fc0cdc449575c3d6" exitCode=0 Nov 29 05:31:40 crc 
kubenswrapper[4594]: I1129 05:31:40.911276 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-k9wmb" event={"ID":"c584b692-db15-4724-a208-491507a8474e","Type":"ContainerDied","Data":"2b6929827669e8b121dca40b372f59d2cf862f4027a8d5f8fc0cdc449575c3d6"} Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.911439 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-k9wmb" event={"ID":"c584b692-db15-4724-a208-491507a8474e","Type":"ContainerDied","Data":"40f3a3bfc4f4b842c3f55bc1198befae401b30ba53f91ac872864779204899be"} Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.911471 4594 scope.go:117] "RemoveContainer" containerID="2b6929827669e8b121dca40b372f59d2cf862f4027a8d5f8fc0cdc449575c3d6" Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.911318 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-k9wmb" Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.916458 4594 generic.go:334] "Generic (PLEG): container finished" podID="37a1effb-361a-4fcf-b9f1-f32692c38587" containerID="d5ca149bdf94cdf62e2abc13d460e8ab82aa261f1b0d93f88e420dcc7d5c093a" exitCode=0 Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.916517 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zrcl4" event={"ID":"37a1effb-361a-4fcf-b9f1-f32692c38587","Type":"ContainerDied","Data":"d5ca149bdf94cdf62e2abc13d460e8ab82aa261f1b0d93f88e420dcc7d5c093a"} Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.916545 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zrcl4" event={"ID":"37a1effb-361a-4fcf-b9f1-f32692c38587","Type":"ContainerDied","Data":"a5cd9c879b773898be6410551e66b352f57f2b1e96f06ae2185d9bd6a7aacbd9"} Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.916631 4594 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zrcl4" Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.920312 4594 generic.go:334] "Generic (PLEG): container finished" podID="75b1173d-e8e6-45a7-bd9a-4a3cfafbb8a6" containerID="54f11601c350c13255016018ed5cd9993d5912e66a359c2b76a489b2003e3302" exitCode=0 Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.920408 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bvj7k" event={"ID":"75b1173d-e8e6-45a7-bd9a-4a3cfafbb8a6","Type":"ContainerDied","Data":"54f11601c350c13255016018ed5cd9993d5912e66a359c2b76a489b2003e3302"} Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.920434 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bvj7k" Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.920446 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bvj7k" event={"ID":"75b1173d-e8e6-45a7-bd9a-4a3cfafbb8a6","Type":"ContainerDied","Data":"f41599cf42e45e350de8a3ac21b0d15531379a32cd5b3b374ff013f5a98a9785"} Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.925523 4594 generic.go:334] "Generic (PLEG): container finished" podID="c9b3dc9b-0c66-4af2-977d-9264c559a827" containerID="df05ca65761e9cfdcf18fb9d190e6f8ec60d6b2759f454a6afa502e17bc13401" exitCode=0 Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.925560 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8ptq2" event={"ID":"c9b3dc9b-0c66-4af2-977d-9264c559a827","Type":"ContainerDied","Data":"df05ca65761e9cfdcf18fb9d190e6f8ec60d6b2759f454a6afa502e17bc13401"} Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.925598 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8ptq2" 
event={"ID":"c9b3dc9b-0c66-4af2-977d-9264c559a827","Type":"ContainerDied","Data":"d70eb50c02646b1b7c842056abc579499fb93a5b4dbfb8e33451f70a5896db86"} Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.925817 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8ptq2" Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.928472 4594 generic.go:334] "Generic (PLEG): container finished" podID="7ec39acd-e54b-4bf5-99e7-dae22ecfceda" containerID="6b3c986dad39b8ebccd65bd98dba64461672fe866c1e13346ea50c44df6f2b37" exitCode=0 Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.928514 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pzf9l" event={"ID":"7ec39acd-e54b-4bf5-99e7-dae22ecfceda","Type":"ContainerDied","Data":"6b3c986dad39b8ebccd65bd98dba64461672fe866c1e13346ea50c44df6f2b37"} Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.928559 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pzf9l" event={"ID":"7ec39acd-e54b-4bf5-99e7-dae22ecfceda","Type":"ContainerDied","Data":"29eb2eb197fa56bd424a32c7f5f969c581c17e59531904cf7486b0efbddbc921"} Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.928591 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pzf9l" Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.934280 4594 scope.go:117] "RemoveContainer" containerID="2b6929827669e8b121dca40b372f59d2cf862f4027a8d5f8fc0cdc449575c3d6" Nov 29 05:31:40 crc kubenswrapper[4594]: E1129 05:31:40.934687 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b6929827669e8b121dca40b372f59d2cf862f4027a8d5f8fc0cdc449575c3d6\": container with ID starting with 2b6929827669e8b121dca40b372f59d2cf862f4027a8d5f8fc0cdc449575c3d6 not found: ID does not exist" containerID="2b6929827669e8b121dca40b372f59d2cf862f4027a8d5f8fc0cdc449575c3d6" Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.934724 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b6929827669e8b121dca40b372f59d2cf862f4027a8d5f8fc0cdc449575c3d6"} err="failed to get container status \"2b6929827669e8b121dca40b372f59d2cf862f4027a8d5f8fc0cdc449575c3d6\": rpc error: code = NotFound desc = could not find container \"2b6929827669e8b121dca40b372f59d2cf862f4027a8d5f8fc0cdc449575c3d6\": container with ID starting with 2b6929827669e8b121dca40b372f59d2cf862f4027a8d5f8fc0cdc449575c3d6 not found: ID does not exist" Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.934748 4594 scope.go:117] "RemoveContainer" containerID="d5ca149bdf94cdf62e2abc13d460e8ab82aa261f1b0d93f88e420dcc7d5c093a" Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.940155 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-k9wmb"] Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.944434 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-k9wmb"] Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.956035 4594 scope.go:117] "RemoveContainer" 
containerID="c10ed0b147f47379703440c01774a3a4469bc39c26beebb69e7aba16110c3cb4" Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.956580 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zrcl4"] Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.958503 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-zrcl4"] Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.967483 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bvj7k"] Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.976814 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bvj7k"] Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.985771 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pzf9l"] Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.989606 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-pzf9l"] Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.993015 4594 scope.go:117] "RemoveContainer" containerID="79e628f49a0f5d60d235fa9b9578e796c849a5ed0e7ba33928b7ae59e4bc3def" Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.998296 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8ptq2"] Nov 29 05:31:40 crc kubenswrapper[4594]: I1129 05:31:40.999423 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-8ptq2"] Nov 29 05:31:41 crc kubenswrapper[4594]: I1129 05:31:41.011851 4594 scope.go:117] "RemoveContainer" containerID="d5ca149bdf94cdf62e2abc13d460e8ab82aa261f1b0d93f88e420dcc7d5c093a" Nov 29 05:31:41 crc kubenswrapper[4594]: E1129 05:31:41.012246 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"d5ca149bdf94cdf62e2abc13d460e8ab82aa261f1b0d93f88e420dcc7d5c093a\": container with ID starting with d5ca149bdf94cdf62e2abc13d460e8ab82aa261f1b0d93f88e420dcc7d5c093a not found: ID does not exist" containerID="d5ca149bdf94cdf62e2abc13d460e8ab82aa261f1b0d93f88e420dcc7d5c093a" Nov 29 05:31:41 crc kubenswrapper[4594]: I1129 05:31:41.012312 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5ca149bdf94cdf62e2abc13d460e8ab82aa261f1b0d93f88e420dcc7d5c093a"} err="failed to get container status \"d5ca149bdf94cdf62e2abc13d460e8ab82aa261f1b0d93f88e420dcc7d5c093a\": rpc error: code = NotFound desc = could not find container \"d5ca149bdf94cdf62e2abc13d460e8ab82aa261f1b0d93f88e420dcc7d5c093a\": container with ID starting with d5ca149bdf94cdf62e2abc13d460e8ab82aa261f1b0d93f88e420dcc7d5c093a not found: ID does not exist" Nov 29 05:31:41 crc kubenswrapper[4594]: I1129 05:31:41.012340 4594 scope.go:117] "RemoveContainer" containerID="c10ed0b147f47379703440c01774a3a4469bc39c26beebb69e7aba16110c3cb4" Nov 29 05:31:41 crc kubenswrapper[4594]: E1129 05:31:41.012673 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c10ed0b147f47379703440c01774a3a4469bc39c26beebb69e7aba16110c3cb4\": container with ID starting with c10ed0b147f47379703440c01774a3a4469bc39c26beebb69e7aba16110c3cb4 not found: ID does not exist" containerID="c10ed0b147f47379703440c01774a3a4469bc39c26beebb69e7aba16110c3cb4" Nov 29 05:31:41 crc kubenswrapper[4594]: I1129 05:31:41.012704 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c10ed0b147f47379703440c01774a3a4469bc39c26beebb69e7aba16110c3cb4"} err="failed to get container status \"c10ed0b147f47379703440c01774a3a4469bc39c26beebb69e7aba16110c3cb4\": rpc error: code = NotFound desc = could not find container \"c10ed0b147f47379703440c01774a3a4469bc39c26beebb69e7aba16110c3cb4\": container 
with ID starting with c10ed0b147f47379703440c01774a3a4469bc39c26beebb69e7aba16110c3cb4 not found: ID does not exist" Nov 29 05:31:41 crc kubenswrapper[4594]: I1129 05:31:41.012729 4594 scope.go:117] "RemoveContainer" containerID="79e628f49a0f5d60d235fa9b9578e796c849a5ed0e7ba33928b7ae59e4bc3def" Nov 29 05:31:41 crc kubenswrapper[4594]: E1129 05:31:41.013066 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79e628f49a0f5d60d235fa9b9578e796c849a5ed0e7ba33928b7ae59e4bc3def\": container with ID starting with 79e628f49a0f5d60d235fa9b9578e796c849a5ed0e7ba33928b7ae59e4bc3def not found: ID does not exist" containerID="79e628f49a0f5d60d235fa9b9578e796c849a5ed0e7ba33928b7ae59e4bc3def" Nov 29 05:31:41 crc kubenswrapper[4594]: I1129 05:31:41.013096 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79e628f49a0f5d60d235fa9b9578e796c849a5ed0e7ba33928b7ae59e4bc3def"} err="failed to get container status \"79e628f49a0f5d60d235fa9b9578e796c849a5ed0e7ba33928b7ae59e4bc3def\": rpc error: code = NotFound desc = could not find container \"79e628f49a0f5d60d235fa9b9578e796c849a5ed0e7ba33928b7ae59e4bc3def\": container with ID starting with 79e628f49a0f5d60d235fa9b9578e796c849a5ed0e7ba33928b7ae59e4bc3def not found: ID does not exist" Nov 29 05:31:41 crc kubenswrapper[4594]: I1129 05:31:41.013113 4594 scope.go:117] "RemoveContainer" containerID="54f11601c350c13255016018ed5cd9993d5912e66a359c2b76a489b2003e3302" Nov 29 05:31:41 crc kubenswrapper[4594]: I1129 05:31:41.024819 4594 scope.go:117] "RemoveContainer" containerID="4dfd4b94ec8f9116309b01b2ee85d8764efb71cfeedddaeeb4719278d427737d" Nov 29 05:31:41 crc kubenswrapper[4594]: I1129 05:31:41.038300 4594 scope.go:117] "RemoveContainer" containerID="2347658a2fcdc86c1e5d50acdfc002387af5984a58e5e1e05fbae56f821115a4" Nov 29 05:31:41 crc kubenswrapper[4594]: I1129 05:31:41.054017 4594 scope.go:117] "RemoveContainer" 
containerID="54f11601c350c13255016018ed5cd9993d5912e66a359c2b76a489b2003e3302" Nov 29 05:31:41 crc kubenswrapper[4594]: E1129 05:31:41.054457 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54f11601c350c13255016018ed5cd9993d5912e66a359c2b76a489b2003e3302\": container with ID starting with 54f11601c350c13255016018ed5cd9993d5912e66a359c2b76a489b2003e3302 not found: ID does not exist" containerID="54f11601c350c13255016018ed5cd9993d5912e66a359c2b76a489b2003e3302" Nov 29 05:31:41 crc kubenswrapper[4594]: I1129 05:31:41.054477 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54f11601c350c13255016018ed5cd9993d5912e66a359c2b76a489b2003e3302"} err="failed to get container status \"54f11601c350c13255016018ed5cd9993d5912e66a359c2b76a489b2003e3302\": rpc error: code = NotFound desc = could not find container \"54f11601c350c13255016018ed5cd9993d5912e66a359c2b76a489b2003e3302\": container with ID starting with 54f11601c350c13255016018ed5cd9993d5912e66a359c2b76a489b2003e3302 not found: ID does not exist" Nov 29 05:31:41 crc kubenswrapper[4594]: I1129 05:31:41.054492 4594 scope.go:117] "RemoveContainer" containerID="4dfd4b94ec8f9116309b01b2ee85d8764efb71cfeedddaeeb4719278d427737d" Nov 29 05:31:41 crc kubenswrapper[4594]: E1129 05:31:41.054799 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4dfd4b94ec8f9116309b01b2ee85d8764efb71cfeedddaeeb4719278d427737d\": container with ID starting with 4dfd4b94ec8f9116309b01b2ee85d8764efb71cfeedddaeeb4719278d427737d not found: ID does not exist" containerID="4dfd4b94ec8f9116309b01b2ee85d8764efb71cfeedddaeeb4719278d427737d" Nov 29 05:31:41 crc kubenswrapper[4594]: I1129 05:31:41.054831 4594 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4dfd4b94ec8f9116309b01b2ee85d8764efb71cfeedddaeeb4719278d427737d"} err="failed to get container status \"4dfd4b94ec8f9116309b01b2ee85d8764efb71cfeedddaeeb4719278d427737d\": rpc error: code = NotFound desc = could not find container \"4dfd4b94ec8f9116309b01b2ee85d8764efb71cfeedddaeeb4719278d427737d\": container with ID starting with 4dfd4b94ec8f9116309b01b2ee85d8764efb71cfeedddaeeb4719278d427737d not found: ID does not exist" Nov 29 05:31:41 crc kubenswrapper[4594]: I1129 05:31:41.054859 4594 scope.go:117] "RemoveContainer" containerID="2347658a2fcdc86c1e5d50acdfc002387af5984a58e5e1e05fbae56f821115a4" Nov 29 05:31:41 crc kubenswrapper[4594]: E1129 05:31:41.055143 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2347658a2fcdc86c1e5d50acdfc002387af5984a58e5e1e05fbae56f821115a4\": container with ID starting with 2347658a2fcdc86c1e5d50acdfc002387af5984a58e5e1e05fbae56f821115a4 not found: ID does not exist" containerID="2347658a2fcdc86c1e5d50acdfc002387af5984a58e5e1e05fbae56f821115a4" Nov 29 05:31:41 crc kubenswrapper[4594]: I1129 05:31:41.055168 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2347658a2fcdc86c1e5d50acdfc002387af5984a58e5e1e05fbae56f821115a4"} err="failed to get container status \"2347658a2fcdc86c1e5d50acdfc002387af5984a58e5e1e05fbae56f821115a4\": rpc error: code = NotFound desc = could not find container \"2347658a2fcdc86c1e5d50acdfc002387af5984a58e5e1e05fbae56f821115a4\": container with ID starting with 2347658a2fcdc86c1e5d50acdfc002387af5984a58e5e1e05fbae56f821115a4 not found: ID does not exist" Nov 29 05:31:41 crc kubenswrapper[4594]: I1129 05:31:41.055186 4594 scope.go:117] "RemoveContainer" containerID="df05ca65761e9cfdcf18fb9d190e6f8ec60d6b2759f454a6afa502e17bc13401" Nov 29 05:31:41 crc kubenswrapper[4594]: I1129 05:31:41.066195 4594 scope.go:117] "RemoveContainer" 
containerID="1e5b7774d662e70c16f62aff22b498aa3731e80ff00ed0ee8b12169402632ee3" Nov 29 05:31:41 crc kubenswrapper[4594]: I1129 05:31:41.079029 4594 scope.go:117] "RemoveContainer" containerID="05205bd34a1bb15a74e40a53e8ed80350765840b945e6338da88a923da116863" Nov 29 05:31:41 crc kubenswrapper[4594]: I1129 05:31:41.095363 4594 scope.go:117] "RemoveContainer" containerID="df05ca65761e9cfdcf18fb9d190e6f8ec60d6b2759f454a6afa502e17bc13401" Nov 29 05:31:41 crc kubenswrapper[4594]: E1129 05:31:41.095780 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df05ca65761e9cfdcf18fb9d190e6f8ec60d6b2759f454a6afa502e17bc13401\": container with ID starting with df05ca65761e9cfdcf18fb9d190e6f8ec60d6b2759f454a6afa502e17bc13401 not found: ID does not exist" containerID="df05ca65761e9cfdcf18fb9d190e6f8ec60d6b2759f454a6afa502e17bc13401" Nov 29 05:31:41 crc kubenswrapper[4594]: I1129 05:31:41.095816 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df05ca65761e9cfdcf18fb9d190e6f8ec60d6b2759f454a6afa502e17bc13401"} err="failed to get container status \"df05ca65761e9cfdcf18fb9d190e6f8ec60d6b2759f454a6afa502e17bc13401\": rpc error: code = NotFound desc = could not find container \"df05ca65761e9cfdcf18fb9d190e6f8ec60d6b2759f454a6afa502e17bc13401\": container with ID starting with df05ca65761e9cfdcf18fb9d190e6f8ec60d6b2759f454a6afa502e17bc13401 not found: ID does not exist" Nov 29 05:31:41 crc kubenswrapper[4594]: I1129 05:31:41.095835 4594 scope.go:117] "RemoveContainer" containerID="1e5b7774d662e70c16f62aff22b498aa3731e80ff00ed0ee8b12169402632ee3" Nov 29 05:31:41 crc kubenswrapper[4594]: E1129 05:31:41.096117 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e5b7774d662e70c16f62aff22b498aa3731e80ff00ed0ee8b12169402632ee3\": container with ID starting with 
1e5b7774d662e70c16f62aff22b498aa3731e80ff00ed0ee8b12169402632ee3 not found: ID does not exist" containerID="1e5b7774d662e70c16f62aff22b498aa3731e80ff00ed0ee8b12169402632ee3" Nov 29 05:31:41 crc kubenswrapper[4594]: I1129 05:31:41.096152 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e5b7774d662e70c16f62aff22b498aa3731e80ff00ed0ee8b12169402632ee3"} err="failed to get container status \"1e5b7774d662e70c16f62aff22b498aa3731e80ff00ed0ee8b12169402632ee3\": rpc error: code = NotFound desc = could not find container \"1e5b7774d662e70c16f62aff22b498aa3731e80ff00ed0ee8b12169402632ee3\": container with ID starting with 1e5b7774d662e70c16f62aff22b498aa3731e80ff00ed0ee8b12169402632ee3 not found: ID does not exist" Nov 29 05:31:41 crc kubenswrapper[4594]: I1129 05:31:41.096171 4594 scope.go:117] "RemoveContainer" containerID="05205bd34a1bb15a74e40a53e8ed80350765840b945e6338da88a923da116863" Nov 29 05:31:41 crc kubenswrapper[4594]: E1129 05:31:41.096775 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05205bd34a1bb15a74e40a53e8ed80350765840b945e6338da88a923da116863\": container with ID starting with 05205bd34a1bb15a74e40a53e8ed80350765840b945e6338da88a923da116863 not found: ID does not exist" containerID="05205bd34a1bb15a74e40a53e8ed80350765840b945e6338da88a923da116863" Nov 29 05:31:41 crc kubenswrapper[4594]: I1129 05:31:41.096800 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05205bd34a1bb15a74e40a53e8ed80350765840b945e6338da88a923da116863"} err="failed to get container status \"05205bd34a1bb15a74e40a53e8ed80350765840b945e6338da88a923da116863\": rpc error: code = NotFound desc = could not find container \"05205bd34a1bb15a74e40a53e8ed80350765840b945e6338da88a923da116863\": container with ID starting with 05205bd34a1bb15a74e40a53e8ed80350765840b945e6338da88a923da116863 not found: ID does not 
exist" Nov 29 05:31:41 crc kubenswrapper[4594]: I1129 05:31:41.096817 4594 scope.go:117] "RemoveContainer" containerID="6b3c986dad39b8ebccd65bd98dba64461672fe866c1e13346ea50c44df6f2b37" Nov 29 05:31:41 crc kubenswrapper[4594]: I1129 05:31:41.109976 4594 scope.go:117] "RemoveContainer" containerID="138846ff53c4a760ac11ede014a43aa729c0cdfbbf0e6c2a7b82687322faabb6" Nov 29 05:31:41 crc kubenswrapper[4594]: I1129 05:31:41.122766 4594 scope.go:117] "RemoveContainer" containerID="7f7e5489c6ab32c0b9d48a930698798fd30d71cc5150bd422ecf68fbabf67b87" Nov 29 05:31:41 crc kubenswrapper[4594]: I1129 05:31:41.135925 4594 scope.go:117] "RemoveContainer" containerID="6b3c986dad39b8ebccd65bd98dba64461672fe866c1e13346ea50c44df6f2b37" Nov 29 05:31:41 crc kubenswrapper[4594]: E1129 05:31:41.136292 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b3c986dad39b8ebccd65bd98dba64461672fe866c1e13346ea50c44df6f2b37\": container with ID starting with 6b3c986dad39b8ebccd65bd98dba64461672fe866c1e13346ea50c44df6f2b37 not found: ID does not exist" containerID="6b3c986dad39b8ebccd65bd98dba64461672fe866c1e13346ea50c44df6f2b37" Nov 29 05:31:41 crc kubenswrapper[4594]: I1129 05:31:41.136333 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b3c986dad39b8ebccd65bd98dba64461672fe866c1e13346ea50c44df6f2b37"} err="failed to get container status \"6b3c986dad39b8ebccd65bd98dba64461672fe866c1e13346ea50c44df6f2b37\": rpc error: code = NotFound desc = could not find container \"6b3c986dad39b8ebccd65bd98dba64461672fe866c1e13346ea50c44df6f2b37\": container with ID starting with 6b3c986dad39b8ebccd65bd98dba64461672fe866c1e13346ea50c44df6f2b37 not found: ID does not exist" Nov 29 05:31:41 crc kubenswrapper[4594]: I1129 05:31:41.136363 4594 scope.go:117] "RemoveContainer" containerID="138846ff53c4a760ac11ede014a43aa729c0cdfbbf0e6c2a7b82687322faabb6" Nov 29 05:31:41 crc 
kubenswrapper[4594]: E1129 05:31:41.136664 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"138846ff53c4a760ac11ede014a43aa729c0cdfbbf0e6c2a7b82687322faabb6\": container with ID starting with 138846ff53c4a760ac11ede014a43aa729c0cdfbbf0e6c2a7b82687322faabb6 not found: ID does not exist" containerID="138846ff53c4a760ac11ede014a43aa729c0cdfbbf0e6c2a7b82687322faabb6" Nov 29 05:31:41 crc kubenswrapper[4594]: I1129 05:31:41.136696 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"138846ff53c4a760ac11ede014a43aa729c0cdfbbf0e6c2a7b82687322faabb6"} err="failed to get container status \"138846ff53c4a760ac11ede014a43aa729c0cdfbbf0e6c2a7b82687322faabb6\": rpc error: code = NotFound desc = could not find container \"138846ff53c4a760ac11ede014a43aa729c0cdfbbf0e6c2a7b82687322faabb6\": container with ID starting with 138846ff53c4a760ac11ede014a43aa729c0cdfbbf0e6c2a7b82687322faabb6 not found: ID does not exist" Nov 29 05:31:41 crc kubenswrapper[4594]: I1129 05:31:41.136714 4594 scope.go:117] "RemoveContainer" containerID="7f7e5489c6ab32c0b9d48a930698798fd30d71cc5150bd422ecf68fbabf67b87" Nov 29 05:31:41 crc kubenswrapper[4594]: E1129 05:31:41.136974 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f7e5489c6ab32c0b9d48a930698798fd30d71cc5150bd422ecf68fbabf67b87\": container with ID starting with 7f7e5489c6ab32c0b9d48a930698798fd30d71cc5150bd422ecf68fbabf67b87 not found: ID does not exist" containerID="7f7e5489c6ab32c0b9d48a930698798fd30d71cc5150bd422ecf68fbabf67b87" Nov 29 05:31:41 crc kubenswrapper[4594]: I1129 05:31:41.136997 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f7e5489c6ab32c0b9d48a930698798fd30d71cc5150bd422ecf68fbabf67b87"} err="failed to get container status 
\"7f7e5489c6ab32c0b9d48a930698798fd30d71cc5150bd422ecf68fbabf67b87\": rpc error: code = NotFound desc = could not find container \"7f7e5489c6ab32c0b9d48a930698798fd30d71cc5150bd422ecf68fbabf67b87\": container with ID starting with 7f7e5489c6ab32c0b9d48a930698798fd30d71cc5150bd422ecf68fbabf67b87 not found: ID does not exist" Nov 29 05:31:41 crc kubenswrapper[4594]: I1129 05:31:41.948814 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-clxws" event={"ID":"8b4bf1cb-440a-4e74-82c4-c122e9985bf3","Type":"ContainerStarted","Data":"e97f04b89272942fc4e9bdf722a117854f6088166c1561a909046f05508ed1e8"} Nov 29 05:31:41 crc kubenswrapper[4594]: I1129 05:31:41.949239 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-clxws" Nov 29 05:31:41 crc kubenswrapper[4594]: I1129 05:31:41.953913 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-clxws" Nov 29 05:31:41 crc kubenswrapper[4594]: I1129 05:31:41.965057 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-clxws" podStartSLOduration=1.9650476700000001 podStartE2EDuration="1.96504767s" podCreationTimestamp="2025-11-29 05:31:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:31:41.963511565 +0000 UTC m=+226.204020785" watchObservedRunningTime="2025-11-29 05:31:41.96504767 +0000 UTC m=+226.205556890" Nov 29 05:31:42 crc kubenswrapper[4594]: I1129 05:31:42.088692 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37a1effb-361a-4fcf-b9f1-f32692c38587" path="/var/lib/kubelet/pods/37a1effb-361a-4fcf-b9f1-f32692c38587/volumes" Nov 29 05:31:42 crc kubenswrapper[4594]: I1129 05:31:42.089301 4594 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="75b1173d-e8e6-45a7-bd9a-4a3cfafbb8a6" path="/var/lib/kubelet/pods/75b1173d-e8e6-45a7-bd9a-4a3cfafbb8a6/volumes" Nov 29 05:31:42 crc kubenswrapper[4594]: I1129 05:31:42.089828 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ec39acd-e54b-4bf5-99e7-dae22ecfceda" path="/var/lib/kubelet/pods/7ec39acd-e54b-4bf5-99e7-dae22ecfceda/volumes" Nov 29 05:31:42 crc kubenswrapper[4594]: I1129 05:31:42.090793 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c584b692-db15-4724-a208-491507a8474e" path="/var/lib/kubelet/pods/c584b692-db15-4724-a208-491507a8474e/volumes" Nov 29 05:31:42 crc kubenswrapper[4594]: I1129 05:31:42.091201 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9b3dc9b-0c66-4af2-977d-9264c559a827" path="/var/lib/kubelet/pods/c9b3dc9b-0c66-4af2-977d-9264c559a827/volumes" Nov 29 05:31:42 crc kubenswrapper[4594]: I1129 05:31:42.266340 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bk4kp"] Nov 29 05:31:42 crc kubenswrapper[4594]: E1129 05:31:42.266520 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9b3dc9b-0c66-4af2-977d-9264c559a827" containerName="extract-content" Nov 29 05:31:42 crc kubenswrapper[4594]: I1129 05:31:42.266534 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9b3dc9b-0c66-4af2-977d-9264c559a827" containerName="extract-content" Nov 29 05:31:42 crc kubenswrapper[4594]: E1129 05:31:42.266542 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9b3dc9b-0c66-4af2-977d-9264c559a827" containerName="extract-utilities" Nov 29 05:31:42 crc kubenswrapper[4594]: I1129 05:31:42.266548 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9b3dc9b-0c66-4af2-977d-9264c559a827" containerName="extract-utilities" Nov 29 05:31:42 crc kubenswrapper[4594]: E1129 05:31:42.266559 4594 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="75b1173d-e8e6-45a7-bd9a-4a3cfafbb8a6" containerName="extract-utilities" Nov 29 05:31:42 crc kubenswrapper[4594]: I1129 05:31:42.266565 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="75b1173d-e8e6-45a7-bd9a-4a3cfafbb8a6" containerName="extract-utilities" Nov 29 05:31:42 crc kubenswrapper[4594]: E1129 05:31:42.266576 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37a1effb-361a-4fcf-b9f1-f32692c38587" containerName="extract-utilities" Nov 29 05:31:42 crc kubenswrapper[4594]: I1129 05:31:42.266582 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="37a1effb-361a-4fcf-b9f1-f32692c38587" containerName="extract-utilities" Nov 29 05:31:42 crc kubenswrapper[4594]: E1129 05:31:42.266591 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37a1effb-361a-4fcf-b9f1-f32692c38587" containerName="extract-content" Nov 29 05:31:42 crc kubenswrapper[4594]: I1129 05:31:42.266597 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="37a1effb-361a-4fcf-b9f1-f32692c38587" containerName="extract-content" Nov 29 05:31:42 crc kubenswrapper[4594]: E1129 05:31:42.266606 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37a1effb-361a-4fcf-b9f1-f32692c38587" containerName="registry-server" Nov 29 05:31:42 crc kubenswrapper[4594]: I1129 05:31:42.266612 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="37a1effb-361a-4fcf-b9f1-f32692c38587" containerName="registry-server" Nov 29 05:31:42 crc kubenswrapper[4594]: E1129 05:31:42.266620 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c584b692-db15-4724-a208-491507a8474e" containerName="marketplace-operator" Nov 29 05:31:42 crc kubenswrapper[4594]: I1129 05:31:42.266625 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="c584b692-db15-4724-a208-491507a8474e" containerName="marketplace-operator" Nov 29 05:31:42 crc kubenswrapper[4594]: E1129 05:31:42.266631 4594 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="7ec39acd-e54b-4bf5-99e7-dae22ecfceda" containerName="extract-utilities" Nov 29 05:31:42 crc kubenswrapper[4594]: I1129 05:31:42.266636 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ec39acd-e54b-4bf5-99e7-dae22ecfceda" containerName="extract-utilities" Nov 29 05:31:42 crc kubenswrapper[4594]: E1129 05:31:42.266643 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75b1173d-e8e6-45a7-bd9a-4a3cfafbb8a6" containerName="extract-content" Nov 29 05:31:42 crc kubenswrapper[4594]: I1129 05:31:42.266648 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="75b1173d-e8e6-45a7-bd9a-4a3cfafbb8a6" containerName="extract-content" Nov 29 05:31:42 crc kubenswrapper[4594]: E1129 05:31:42.266655 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ec39acd-e54b-4bf5-99e7-dae22ecfceda" containerName="registry-server" Nov 29 05:31:42 crc kubenswrapper[4594]: I1129 05:31:42.266660 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ec39acd-e54b-4bf5-99e7-dae22ecfceda" containerName="registry-server" Nov 29 05:31:42 crc kubenswrapper[4594]: E1129 05:31:42.266667 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9b3dc9b-0c66-4af2-977d-9264c559a827" containerName="registry-server" Nov 29 05:31:42 crc kubenswrapper[4594]: I1129 05:31:42.266672 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9b3dc9b-0c66-4af2-977d-9264c559a827" containerName="registry-server" Nov 29 05:31:42 crc kubenswrapper[4594]: E1129 05:31:42.266681 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ec39acd-e54b-4bf5-99e7-dae22ecfceda" containerName="extract-content" Nov 29 05:31:42 crc kubenswrapper[4594]: I1129 05:31:42.266686 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ec39acd-e54b-4bf5-99e7-dae22ecfceda" containerName="extract-content" Nov 29 05:31:42 crc kubenswrapper[4594]: E1129 05:31:42.266693 4594 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="75b1173d-e8e6-45a7-bd9a-4a3cfafbb8a6" containerName="registry-server" Nov 29 05:31:42 crc kubenswrapper[4594]: I1129 05:31:42.266697 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="75b1173d-e8e6-45a7-bd9a-4a3cfafbb8a6" containerName="registry-server" Nov 29 05:31:42 crc kubenswrapper[4594]: I1129 05:31:42.266778 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="37a1effb-361a-4fcf-b9f1-f32692c38587" containerName="registry-server" Nov 29 05:31:42 crc kubenswrapper[4594]: I1129 05:31:42.266789 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="c584b692-db15-4724-a208-491507a8474e" containerName="marketplace-operator" Nov 29 05:31:42 crc kubenswrapper[4594]: I1129 05:31:42.266800 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ec39acd-e54b-4bf5-99e7-dae22ecfceda" containerName="registry-server" Nov 29 05:31:42 crc kubenswrapper[4594]: I1129 05:31:42.266808 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9b3dc9b-0c66-4af2-977d-9264c559a827" containerName="registry-server" Nov 29 05:31:42 crc kubenswrapper[4594]: I1129 05:31:42.266816 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="75b1173d-e8e6-45a7-bd9a-4a3cfafbb8a6" containerName="registry-server" Nov 29 05:31:42 crc kubenswrapper[4594]: I1129 05:31:42.267447 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bk4kp" Nov 29 05:31:42 crc kubenswrapper[4594]: I1129 05:31:42.268718 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Nov 29 05:31:42 crc kubenswrapper[4594]: I1129 05:31:42.278836 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bk4kp"] Nov 29 05:31:42 crc kubenswrapper[4594]: I1129 05:31:42.445983 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9bf183a2-75c4-4800-a888-f41d978b1c1d-catalog-content\") pod \"redhat-marketplace-bk4kp\" (UID: \"9bf183a2-75c4-4800-a888-f41d978b1c1d\") " pod="openshift-marketplace/redhat-marketplace-bk4kp" Nov 29 05:31:42 crc kubenswrapper[4594]: I1129 05:31:42.446065 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pzmq\" (UniqueName: \"kubernetes.io/projected/9bf183a2-75c4-4800-a888-f41d978b1c1d-kube-api-access-6pzmq\") pod \"redhat-marketplace-bk4kp\" (UID: \"9bf183a2-75c4-4800-a888-f41d978b1c1d\") " pod="openshift-marketplace/redhat-marketplace-bk4kp" Nov 29 05:31:42 crc kubenswrapper[4594]: I1129 05:31:42.446167 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9bf183a2-75c4-4800-a888-f41d978b1c1d-utilities\") pod \"redhat-marketplace-bk4kp\" (UID: \"9bf183a2-75c4-4800-a888-f41d978b1c1d\") " pod="openshift-marketplace/redhat-marketplace-bk4kp" Nov 29 05:31:42 crc kubenswrapper[4594]: I1129 05:31:42.461431 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9cznn"] Nov 29 05:31:42 crc kubenswrapper[4594]: I1129 05:31:42.462734 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9cznn" Nov 29 05:31:42 crc kubenswrapper[4594]: I1129 05:31:42.467197 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Nov 29 05:31:42 crc kubenswrapper[4594]: I1129 05:31:42.475115 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9cznn"] Nov 29 05:31:42 crc kubenswrapper[4594]: I1129 05:31:42.547287 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9bf183a2-75c4-4800-a888-f41d978b1c1d-catalog-content\") pod \"redhat-marketplace-bk4kp\" (UID: \"9bf183a2-75c4-4800-a888-f41d978b1c1d\") " pod="openshift-marketplace/redhat-marketplace-bk4kp" Nov 29 05:31:42 crc kubenswrapper[4594]: I1129 05:31:42.547350 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pzmq\" (UniqueName: \"kubernetes.io/projected/9bf183a2-75c4-4800-a888-f41d978b1c1d-kube-api-access-6pzmq\") pod \"redhat-marketplace-bk4kp\" (UID: \"9bf183a2-75c4-4800-a888-f41d978b1c1d\") " pod="openshift-marketplace/redhat-marketplace-bk4kp" Nov 29 05:31:42 crc kubenswrapper[4594]: I1129 05:31:42.547385 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9bf183a2-75c4-4800-a888-f41d978b1c1d-utilities\") pod \"redhat-marketplace-bk4kp\" (UID: \"9bf183a2-75c4-4800-a888-f41d978b1c1d\") " pod="openshift-marketplace/redhat-marketplace-bk4kp" Nov 29 05:31:42 crc kubenswrapper[4594]: I1129 05:31:42.547727 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9bf183a2-75c4-4800-a888-f41d978b1c1d-catalog-content\") pod \"redhat-marketplace-bk4kp\" (UID: \"9bf183a2-75c4-4800-a888-f41d978b1c1d\") " 
pod="openshift-marketplace/redhat-marketplace-bk4kp" Nov 29 05:31:42 crc kubenswrapper[4594]: I1129 05:31:42.547754 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9bf183a2-75c4-4800-a888-f41d978b1c1d-utilities\") pod \"redhat-marketplace-bk4kp\" (UID: \"9bf183a2-75c4-4800-a888-f41d978b1c1d\") " pod="openshift-marketplace/redhat-marketplace-bk4kp" Nov 29 05:31:42 crc kubenswrapper[4594]: I1129 05:31:42.563157 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pzmq\" (UniqueName: \"kubernetes.io/projected/9bf183a2-75c4-4800-a888-f41d978b1c1d-kube-api-access-6pzmq\") pod \"redhat-marketplace-bk4kp\" (UID: \"9bf183a2-75c4-4800-a888-f41d978b1c1d\") " pod="openshift-marketplace/redhat-marketplace-bk4kp" Nov 29 05:31:42 crc kubenswrapper[4594]: I1129 05:31:42.585900 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bk4kp" Nov 29 05:31:42 crc kubenswrapper[4594]: I1129 05:31:42.649647 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4s4z\" (UniqueName: \"kubernetes.io/projected/2f6c6bde-10f1-4dbf-863a-153a95f825b7-kube-api-access-w4s4z\") pod \"redhat-operators-9cznn\" (UID: \"2f6c6bde-10f1-4dbf-863a-153a95f825b7\") " pod="openshift-marketplace/redhat-operators-9cznn" Nov 29 05:31:42 crc kubenswrapper[4594]: I1129 05:31:42.649900 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f6c6bde-10f1-4dbf-863a-153a95f825b7-utilities\") pod \"redhat-operators-9cznn\" (UID: \"2f6c6bde-10f1-4dbf-863a-153a95f825b7\") " pod="openshift-marketplace/redhat-operators-9cznn" Nov 29 05:31:42 crc kubenswrapper[4594]: I1129 05:31:42.649966 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f6c6bde-10f1-4dbf-863a-153a95f825b7-catalog-content\") pod \"redhat-operators-9cznn\" (UID: \"2f6c6bde-10f1-4dbf-863a-153a95f825b7\") " pod="openshift-marketplace/redhat-operators-9cznn" Nov 29 05:31:42 crc kubenswrapper[4594]: I1129 05:31:42.751708 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f6c6bde-10f1-4dbf-863a-153a95f825b7-utilities\") pod \"redhat-operators-9cznn\" (UID: \"2f6c6bde-10f1-4dbf-863a-153a95f825b7\") " pod="openshift-marketplace/redhat-operators-9cznn" Nov 29 05:31:42 crc kubenswrapper[4594]: I1129 05:31:42.751758 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f6c6bde-10f1-4dbf-863a-153a95f825b7-catalog-content\") pod \"redhat-operators-9cznn\" (UID: \"2f6c6bde-10f1-4dbf-863a-153a95f825b7\") " pod="openshift-marketplace/redhat-operators-9cznn" Nov 29 05:31:42 crc kubenswrapper[4594]: I1129 05:31:42.752161 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f6c6bde-10f1-4dbf-863a-153a95f825b7-utilities\") pod \"redhat-operators-9cznn\" (UID: \"2f6c6bde-10f1-4dbf-863a-153a95f825b7\") " pod="openshift-marketplace/redhat-operators-9cznn" Nov 29 05:31:42 crc kubenswrapper[4594]: I1129 05:31:42.752618 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4s4z\" (UniqueName: \"kubernetes.io/projected/2f6c6bde-10f1-4dbf-863a-153a95f825b7-kube-api-access-w4s4z\") pod \"redhat-operators-9cznn\" (UID: \"2f6c6bde-10f1-4dbf-863a-153a95f825b7\") " pod="openshift-marketplace/redhat-operators-9cznn" Nov 29 05:31:42 crc kubenswrapper[4594]: I1129 05:31:42.752631 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/2f6c6bde-10f1-4dbf-863a-153a95f825b7-catalog-content\") pod \"redhat-operators-9cznn\" (UID: \"2f6c6bde-10f1-4dbf-863a-153a95f825b7\") " pod="openshift-marketplace/redhat-operators-9cznn" Nov 29 05:31:42 crc kubenswrapper[4594]: I1129 05:31:42.769655 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4s4z\" (UniqueName: \"kubernetes.io/projected/2f6c6bde-10f1-4dbf-863a-153a95f825b7-kube-api-access-w4s4z\") pod \"redhat-operators-9cznn\" (UID: \"2f6c6bde-10f1-4dbf-863a-153a95f825b7\") " pod="openshift-marketplace/redhat-operators-9cznn" Nov 29 05:31:42 crc kubenswrapper[4594]: I1129 05:31:42.776536 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9cznn" Nov 29 05:31:42 crc kubenswrapper[4594]: I1129 05:31:42.923816 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9cznn"] Nov 29 05:31:42 crc kubenswrapper[4594]: I1129 05:31:42.939529 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bk4kp"] Nov 29 05:31:42 crc kubenswrapper[4594]: W1129 05:31:42.945764 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9bf183a2_75c4_4800_a888_f41d978b1c1d.slice/crio-d285bb920fb5719e0986fdb9deb861cf90bce6a08302178b2eacc5163ee93865 WatchSource:0}: Error finding container d285bb920fb5719e0986fdb9deb861cf90bce6a08302178b2eacc5163ee93865: Status 404 returned error can't find the container with id d285bb920fb5719e0986fdb9deb861cf90bce6a08302178b2eacc5163ee93865 Nov 29 05:31:42 crc kubenswrapper[4594]: I1129 05:31:42.961817 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9cznn" 
event={"ID":"2f6c6bde-10f1-4dbf-863a-153a95f825b7","Type":"ContainerStarted","Data":"b876afce96684d27914f0043bea2be2a905a83cd46a457b4ae72034b93c4f49b"} Nov 29 05:31:42 crc kubenswrapper[4594]: I1129 05:31:42.963424 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bk4kp" event={"ID":"9bf183a2-75c4-4800-a888-f41d978b1c1d","Type":"ContainerStarted","Data":"d285bb920fb5719e0986fdb9deb861cf90bce6a08302178b2eacc5163ee93865"} Nov 29 05:31:43 crc kubenswrapper[4594]: I1129 05:31:43.977025 4594 generic.go:334] "Generic (PLEG): container finished" podID="9bf183a2-75c4-4800-a888-f41d978b1c1d" containerID="81610f07f6a04f263dac8ff035caa2f1e9815934e637d9c9e97cd9222d54e46b" exitCode=0 Nov 29 05:31:43 crc kubenswrapper[4594]: I1129 05:31:43.977734 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bk4kp" event={"ID":"9bf183a2-75c4-4800-a888-f41d978b1c1d","Type":"ContainerDied","Data":"81610f07f6a04f263dac8ff035caa2f1e9815934e637d9c9e97cd9222d54e46b"} Nov 29 05:31:43 crc kubenswrapper[4594]: I1129 05:31:43.980162 4594 generic.go:334] "Generic (PLEG): container finished" podID="2f6c6bde-10f1-4dbf-863a-153a95f825b7" containerID="7b8bfec404483cf2430a0c724383c9f1014698d655e56f3128439b13f006a0bb" exitCode=0 Nov 29 05:31:43 crc kubenswrapper[4594]: I1129 05:31:43.980470 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9cznn" event={"ID":"2f6c6bde-10f1-4dbf-863a-153a95f825b7","Type":"ContainerDied","Data":"7b8bfec404483cf2430a0c724383c9f1014698d655e56f3128439b13f006a0bb"} Nov 29 05:31:44 crc kubenswrapper[4594]: I1129 05:31:44.667853 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-kfzdq"] Nov 29 05:31:44 crc kubenswrapper[4594]: I1129 05:31:44.669408 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kfzdq" Nov 29 05:31:44 crc kubenswrapper[4594]: I1129 05:31:44.673359 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Nov 29 05:31:44 crc kubenswrapper[4594]: I1129 05:31:44.676068 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kfzdq"] Nov 29 05:31:44 crc kubenswrapper[4594]: I1129 05:31:44.777290 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa62a8c9-aa8c-42d9-a634-f3aea0992e00-utilities\") pod \"certified-operators-kfzdq\" (UID: \"fa62a8c9-aa8c-42d9-a634-f3aea0992e00\") " pod="openshift-marketplace/certified-operators-kfzdq" Nov 29 05:31:44 crc kubenswrapper[4594]: I1129 05:31:44.777588 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dh5qt\" (UniqueName: \"kubernetes.io/projected/fa62a8c9-aa8c-42d9-a634-f3aea0992e00-kube-api-access-dh5qt\") pod \"certified-operators-kfzdq\" (UID: \"fa62a8c9-aa8c-42d9-a634-f3aea0992e00\") " pod="openshift-marketplace/certified-operators-kfzdq" Nov 29 05:31:44 crc kubenswrapper[4594]: I1129 05:31:44.777623 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa62a8c9-aa8c-42d9-a634-f3aea0992e00-catalog-content\") pod \"certified-operators-kfzdq\" (UID: \"fa62a8c9-aa8c-42d9-a634-f3aea0992e00\") " pod="openshift-marketplace/certified-operators-kfzdq" Nov 29 05:31:44 crc kubenswrapper[4594]: I1129 05:31:44.862112 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-n4msp"] Nov 29 05:31:44 crc kubenswrapper[4594]: I1129 05:31:44.863026 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-n4msp" Nov 29 05:31:44 crc kubenswrapper[4594]: I1129 05:31:44.866034 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Nov 29 05:31:44 crc kubenswrapper[4594]: I1129 05:31:44.873141 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-n4msp"] Nov 29 05:31:44 crc kubenswrapper[4594]: I1129 05:31:44.878623 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dh5qt\" (UniqueName: \"kubernetes.io/projected/fa62a8c9-aa8c-42d9-a634-f3aea0992e00-kube-api-access-dh5qt\") pod \"certified-operators-kfzdq\" (UID: \"fa62a8c9-aa8c-42d9-a634-f3aea0992e00\") " pod="openshift-marketplace/certified-operators-kfzdq" Nov 29 05:31:44 crc kubenswrapper[4594]: I1129 05:31:44.878672 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa62a8c9-aa8c-42d9-a634-f3aea0992e00-catalog-content\") pod \"certified-operators-kfzdq\" (UID: \"fa62a8c9-aa8c-42d9-a634-f3aea0992e00\") " pod="openshift-marketplace/certified-operators-kfzdq" Nov 29 05:31:44 crc kubenswrapper[4594]: I1129 05:31:44.878716 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa62a8c9-aa8c-42d9-a634-f3aea0992e00-utilities\") pod \"certified-operators-kfzdq\" (UID: \"fa62a8c9-aa8c-42d9-a634-f3aea0992e00\") " pod="openshift-marketplace/certified-operators-kfzdq" Nov 29 05:31:44 crc kubenswrapper[4594]: I1129 05:31:44.879103 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa62a8c9-aa8c-42d9-a634-f3aea0992e00-utilities\") pod \"certified-operators-kfzdq\" (UID: \"fa62a8c9-aa8c-42d9-a634-f3aea0992e00\") " 
pod="openshift-marketplace/certified-operators-kfzdq" Nov 29 05:31:44 crc kubenswrapper[4594]: I1129 05:31:44.879557 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa62a8c9-aa8c-42d9-a634-f3aea0992e00-catalog-content\") pod \"certified-operators-kfzdq\" (UID: \"fa62a8c9-aa8c-42d9-a634-f3aea0992e00\") " pod="openshift-marketplace/certified-operators-kfzdq" Nov 29 05:31:44 crc kubenswrapper[4594]: I1129 05:31:44.894852 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dh5qt\" (UniqueName: \"kubernetes.io/projected/fa62a8c9-aa8c-42d9-a634-f3aea0992e00-kube-api-access-dh5qt\") pod \"certified-operators-kfzdq\" (UID: \"fa62a8c9-aa8c-42d9-a634-f3aea0992e00\") " pod="openshift-marketplace/certified-operators-kfzdq" Nov 29 05:31:44 crc kubenswrapper[4594]: I1129 05:31:44.980001 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-db9gt\" (UniqueName: \"kubernetes.io/projected/398463af-ebf5-4586-8a81-bc9b7cdce9c0-kube-api-access-db9gt\") pod \"community-operators-n4msp\" (UID: \"398463af-ebf5-4586-8a81-bc9b7cdce9c0\") " pod="openshift-marketplace/community-operators-n4msp" Nov 29 05:31:44 crc kubenswrapper[4594]: I1129 05:31:44.980061 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/398463af-ebf5-4586-8a81-bc9b7cdce9c0-utilities\") pod \"community-operators-n4msp\" (UID: \"398463af-ebf5-4586-8a81-bc9b7cdce9c0\") " pod="openshift-marketplace/community-operators-n4msp" Nov 29 05:31:44 crc kubenswrapper[4594]: I1129 05:31:44.980280 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/398463af-ebf5-4586-8a81-bc9b7cdce9c0-catalog-content\") pod \"community-operators-n4msp\" (UID: 
\"398463af-ebf5-4586-8a81-bc9b7cdce9c0\") " pod="openshift-marketplace/community-operators-n4msp" Nov 29 05:31:44 crc kubenswrapper[4594]: I1129 05:31:44.986355 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9cznn" event={"ID":"2f6c6bde-10f1-4dbf-863a-153a95f825b7","Type":"ContainerStarted","Data":"d24ac5ef9525111eddf5fb05005f2249aeb0dc229561a0e975f5ad37262696f8"} Nov 29 05:31:44 crc kubenswrapper[4594]: I1129 05:31:44.987945 4594 generic.go:334] "Generic (PLEG): container finished" podID="9bf183a2-75c4-4800-a888-f41d978b1c1d" containerID="44a7ff0ce9fcd6cb8c246af6154873b1b83200a95b6e513f11faa16b9340a5f1" exitCode=0 Nov 29 05:31:44 crc kubenswrapper[4594]: I1129 05:31:44.987993 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bk4kp" event={"ID":"9bf183a2-75c4-4800-a888-f41d978b1c1d","Type":"ContainerDied","Data":"44a7ff0ce9fcd6cb8c246af6154873b1b83200a95b6e513f11faa16b9340a5f1"} Nov 29 05:31:45 crc kubenswrapper[4594]: I1129 05:31:45.008825 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kfzdq" Nov 29 05:31:45 crc kubenswrapper[4594]: I1129 05:31:45.082350 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-db9gt\" (UniqueName: \"kubernetes.io/projected/398463af-ebf5-4586-8a81-bc9b7cdce9c0-kube-api-access-db9gt\") pod \"community-operators-n4msp\" (UID: \"398463af-ebf5-4586-8a81-bc9b7cdce9c0\") " pod="openshift-marketplace/community-operators-n4msp" Nov 29 05:31:45 crc kubenswrapper[4594]: I1129 05:31:45.082544 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/398463af-ebf5-4586-8a81-bc9b7cdce9c0-utilities\") pod \"community-operators-n4msp\" (UID: \"398463af-ebf5-4586-8a81-bc9b7cdce9c0\") " pod="openshift-marketplace/community-operators-n4msp" Nov 29 05:31:45 crc kubenswrapper[4594]: I1129 05:31:45.082593 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/398463af-ebf5-4586-8a81-bc9b7cdce9c0-catalog-content\") pod \"community-operators-n4msp\" (UID: \"398463af-ebf5-4586-8a81-bc9b7cdce9c0\") " pod="openshift-marketplace/community-operators-n4msp" Nov 29 05:31:45 crc kubenswrapper[4594]: I1129 05:31:45.083043 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/398463af-ebf5-4586-8a81-bc9b7cdce9c0-catalog-content\") pod \"community-operators-n4msp\" (UID: \"398463af-ebf5-4586-8a81-bc9b7cdce9c0\") " pod="openshift-marketplace/community-operators-n4msp" Nov 29 05:31:45 crc kubenswrapper[4594]: I1129 05:31:45.083131 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/398463af-ebf5-4586-8a81-bc9b7cdce9c0-utilities\") pod \"community-operators-n4msp\" (UID: \"398463af-ebf5-4586-8a81-bc9b7cdce9c0\") " 
pod="openshift-marketplace/community-operators-n4msp" Nov 29 05:31:45 crc kubenswrapper[4594]: I1129 05:31:45.098508 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-db9gt\" (UniqueName: \"kubernetes.io/projected/398463af-ebf5-4586-8a81-bc9b7cdce9c0-kube-api-access-db9gt\") pod \"community-operators-n4msp\" (UID: \"398463af-ebf5-4586-8a81-bc9b7cdce9c0\") " pod="openshift-marketplace/community-operators-n4msp" Nov 29 05:31:45 crc kubenswrapper[4594]: I1129 05:31:45.251791 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-n4msp" Nov 29 05:31:45 crc kubenswrapper[4594]: I1129 05:31:45.371605 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kfzdq"] Nov 29 05:31:45 crc kubenswrapper[4594]: W1129 05:31:45.381634 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa62a8c9_aa8c_42d9_a634_f3aea0992e00.slice/crio-5104b53137da3da138331da4aa1dfb6bf82dc1d9439b3785ccaeaede8de17a49 WatchSource:0}: Error finding container 5104b53137da3da138331da4aa1dfb6bf82dc1d9439b3785ccaeaede8de17a49: Status 404 returned error can't find the container with id 5104b53137da3da138331da4aa1dfb6bf82dc1d9439b3785ccaeaede8de17a49 Nov 29 05:31:45 crc kubenswrapper[4594]: I1129 05:31:45.393969 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-n4msp"] Nov 29 05:31:45 crc kubenswrapper[4594]: W1129 05:31:45.402923 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod398463af_ebf5_4586_8a81_bc9b7cdce9c0.slice/crio-487959b49f6406781968ec910e6ebb90842348f0573b4aa11bdb658f64713f12 WatchSource:0}: Error finding container 487959b49f6406781968ec910e6ebb90842348f0573b4aa11bdb658f64713f12: Status 404 returned error can't find the container 
with id 487959b49f6406781968ec910e6ebb90842348f0573b4aa11bdb658f64713f12 Nov 29 05:31:45 crc kubenswrapper[4594]: I1129 05:31:45.800751 4594 patch_prober.go:28] interesting pod/machine-config-daemon-ggz4n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 05:31:45 crc kubenswrapper[4594]: I1129 05:31:45.801040 4594 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 05:31:45 crc kubenswrapper[4594]: I1129 05:31:45.801092 4594 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" Nov 29 05:31:45 crc kubenswrapper[4594]: I1129 05:31:45.801748 4594 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"eb25cb1d2d2a3aa84bc6fca3087cdf54b3b8e75bc258441b2895f37594df44e5"} pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 29 05:31:45 crc kubenswrapper[4594]: I1129 05:31:45.801801 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" containerName="machine-config-daemon" containerID="cri-o://eb25cb1d2d2a3aa84bc6fca3087cdf54b3b8e75bc258441b2895f37594df44e5" gracePeriod=600 Nov 29 05:31:45 crc kubenswrapper[4594]: I1129 05:31:45.994888 4594 generic.go:334] "Generic (PLEG): container finished" 
podID="2f6c6bde-10f1-4dbf-863a-153a95f825b7" containerID="d24ac5ef9525111eddf5fb05005f2249aeb0dc229561a0e975f5ad37262696f8" exitCode=0 Nov 29 05:31:45 crc kubenswrapper[4594]: I1129 05:31:45.994957 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9cznn" event={"ID":"2f6c6bde-10f1-4dbf-863a-153a95f825b7","Type":"ContainerDied","Data":"d24ac5ef9525111eddf5fb05005f2249aeb0dc229561a0e975f5ad37262696f8"} Nov 29 05:31:46 crc kubenswrapper[4594]: I1129 05:31:45.999856 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bk4kp" event={"ID":"9bf183a2-75c4-4800-a888-f41d978b1c1d","Type":"ContainerStarted","Data":"9f13a23b821cc369a4d965889f34da7a8258f6c98a35ecf1816a950587510f0d"} Nov 29 05:31:46 crc kubenswrapper[4594]: I1129 05:31:46.001898 4594 generic.go:334] "Generic (PLEG): container finished" podID="398463af-ebf5-4586-8a81-bc9b7cdce9c0" containerID="e6656096bebb07d98138098c03032af517ed8e4b6dc43d03d7a57dd38c8d61d6" exitCode=0 Nov 29 05:31:46 crc kubenswrapper[4594]: I1129 05:31:46.001941 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n4msp" event={"ID":"398463af-ebf5-4586-8a81-bc9b7cdce9c0","Type":"ContainerDied","Data":"e6656096bebb07d98138098c03032af517ed8e4b6dc43d03d7a57dd38c8d61d6"} Nov 29 05:31:46 crc kubenswrapper[4594]: I1129 05:31:46.001957 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n4msp" event={"ID":"398463af-ebf5-4586-8a81-bc9b7cdce9c0","Type":"ContainerStarted","Data":"487959b49f6406781968ec910e6ebb90842348f0573b4aa11bdb658f64713f12"} Nov 29 05:31:46 crc kubenswrapper[4594]: I1129 05:31:46.004595 4594 generic.go:334] "Generic (PLEG): container finished" podID="fa62a8c9-aa8c-42d9-a634-f3aea0992e00" containerID="708aded331816a646286ee76ef874fd1cd463940b45232a86dab0b17f46d11bf" exitCode=0 Nov 29 05:31:46 crc kubenswrapper[4594]: I1129 05:31:46.004637 
4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kfzdq" event={"ID":"fa62a8c9-aa8c-42d9-a634-f3aea0992e00","Type":"ContainerDied","Data":"708aded331816a646286ee76ef874fd1cd463940b45232a86dab0b17f46d11bf"} Nov 29 05:31:46 crc kubenswrapper[4594]: I1129 05:31:46.004652 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kfzdq" event={"ID":"fa62a8c9-aa8c-42d9-a634-f3aea0992e00","Type":"ContainerStarted","Data":"5104b53137da3da138331da4aa1dfb6bf82dc1d9439b3785ccaeaede8de17a49"} Nov 29 05:31:46 crc kubenswrapper[4594]: I1129 05:31:46.007376 4594 generic.go:334] "Generic (PLEG): container finished" podID="509844d4-3ee1-4059-84bb-6e90200f50c5" containerID="eb25cb1d2d2a3aa84bc6fca3087cdf54b3b8e75bc258441b2895f37594df44e5" exitCode=0 Nov 29 05:31:46 crc kubenswrapper[4594]: I1129 05:31:46.007403 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" event={"ID":"509844d4-3ee1-4059-84bb-6e90200f50c5","Type":"ContainerDied","Data":"eb25cb1d2d2a3aa84bc6fca3087cdf54b3b8e75bc258441b2895f37594df44e5"} Nov 29 05:31:46 crc kubenswrapper[4594]: I1129 05:31:46.065242 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bk4kp" podStartSLOduration=2.467925769 podStartE2EDuration="4.065227498s" podCreationTimestamp="2025-11-29 05:31:42 +0000 UTC" firstStartedPulling="2025-11-29 05:31:43.979693672 +0000 UTC m=+228.220202893" lastFinishedPulling="2025-11-29 05:31:45.576995402 +0000 UTC m=+229.817504622" observedRunningTime="2025-11-29 05:31:46.065204285 +0000 UTC m=+230.305713505" watchObservedRunningTime="2025-11-29 05:31:46.065227498 +0000 UTC m=+230.305736718" Nov 29 05:31:47 crc kubenswrapper[4594]: I1129 05:31:47.015595 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" 
event={"ID":"509844d4-3ee1-4059-84bb-6e90200f50c5","Type":"ContainerStarted","Data":"8ca701b866c00b1fe102c52d2dacfbb9215d913bbee855fff76b7d8b7e0bf21d"} Nov 29 05:31:48 crc kubenswrapper[4594]: I1129 05:31:48.025459 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9cznn" event={"ID":"2f6c6bde-10f1-4dbf-863a-153a95f825b7","Type":"ContainerStarted","Data":"7781fc9b4821d03c96117f9afb4468e7c10187efd407d7ce7a93a4cdde9c4731"} Nov 29 05:31:48 crc kubenswrapper[4594]: I1129 05:31:48.028453 4594 generic.go:334] "Generic (PLEG): container finished" podID="398463af-ebf5-4586-8a81-bc9b7cdce9c0" containerID="63478de88d9dea415d385132bb93a6fb43f25f25f1a0cde8624603e685ca456e" exitCode=0 Nov 29 05:31:48 crc kubenswrapper[4594]: I1129 05:31:48.028584 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n4msp" event={"ID":"398463af-ebf5-4586-8a81-bc9b7cdce9c0","Type":"ContainerDied","Data":"63478de88d9dea415d385132bb93a6fb43f25f25f1a0cde8624603e685ca456e"} Nov 29 05:31:48 crc kubenswrapper[4594]: I1129 05:31:48.031923 4594 generic.go:334] "Generic (PLEG): container finished" podID="fa62a8c9-aa8c-42d9-a634-f3aea0992e00" containerID="915f75b14692adea389644e0e806df235fda6cdc79f0cc9917350f318ec37728" exitCode=0 Nov 29 05:31:48 crc kubenswrapper[4594]: I1129 05:31:48.031956 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kfzdq" event={"ID":"fa62a8c9-aa8c-42d9-a634-f3aea0992e00","Type":"ContainerDied","Data":"915f75b14692adea389644e0e806df235fda6cdc79f0cc9917350f318ec37728"} Nov 29 05:31:48 crc kubenswrapper[4594]: I1129 05:31:48.045422 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9cznn" podStartSLOduration=3.460916209 podStartE2EDuration="6.045408864s" podCreationTimestamp="2025-11-29 05:31:42 +0000 UTC" firstStartedPulling="2025-11-29 05:31:43.984484894 +0000 UTC 
m=+228.224994113" lastFinishedPulling="2025-11-29 05:31:46.568977547 +0000 UTC m=+230.809486768" observedRunningTime="2025-11-29 05:31:48.043615043 +0000 UTC m=+232.284124253" watchObservedRunningTime="2025-11-29 05:31:48.045408864 +0000 UTC m=+232.285918083" Nov 29 05:31:49 crc kubenswrapper[4594]: I1129 05:31:49.041157 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n4msp" event={"ID":"398463af-ebf5-4586-8a81-bc9b7cdce9c0","Type":"ContainerStarted","Data":"1e2b44cf933e93a3fe1935c83ebfa89fe04f0b8a9dc43cec070f6e4ac910802e"} Nov 29 05:31:49 crc kubenswrapper[4594]: I1129 05:31:49.044766 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kfzdq" event={"ID":"fa62a8c9-aa8c-42d9-a634-f3aea0992e00","Type":"ContainerStarted","Data":"5ec6b9fb930774d77e8ed9afb6e97c1ea4accd8f37e9e823824be428b0b7a46c"} Nov 29 05:31:49 crc kubenswrapper[4594]: I1129 05:31:49.056268 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-n4msp" podStartSLOduration=2.3003350830000002 podStartE2EDuration="5.056243156s" podCreationTimestamp="2025-11-29 05:31:44 +0000 UTC" firstStartedPulling="2025-11-29 05:31:46.002844212 +0000 UTC m=+230.243353432" lastFinishedPulling="2025-11-29 05:31:48.758752285 +0000 UTC m=+232.999261505" observedRunningTime="2025-11-29 05:31:49.054075483 +0000 UTC m=+233.294584704" watchObservedRunningTime="2025-11-29 05:31:49.056243156 +0000 UTC m=+233.296752376" Nov 29 05:31:49 crc kubenswrapper[4594]: I1129 05:31:49.072881 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-kfzdq" podStartSLOduration=2.5090929749999997 podStartE2EDuration="5.072854954s" podCreationTimestamp="2025-11-29 05:31:44 +0000 UTC" firstStartedPulling="2025-11-29 05:31:46.005670833 +0000 UTC m=+230.246180052" lastFinishedPulling="2025-11-29 05:31:48.569432811 +0000 
UTC m=+232.809942031" observedRunningTime="2025-11-29 05:31:49.070330802 +0000 UTC m=+233.310840012" watchObservedRunningTime="2025-11-29 05:31:49.072854954 +0000 UTC m=+233.313364174" Nov 29 05:31:52 crc kubenswrapper[4594]: I1129 05:31:52.586433 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bk4kp" Nov 29 05:31:52 crc kubenswrapper[4594]: I1129 05:31:52.586756 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bk4kp" Nov 29 05:31:52 crc kubenswrapper[4594]: I1129 05:31:52.622619 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bk4kp" Nov 29 05:31:52 crc kubenswrapper[4594]: I1129 05:31:52.776782 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9cznn" Nov 29 05:31:52 crc kubenswrapper[4594]: I1129 05:31:52.776845 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9cznn" Nov 29 05:31:52 crc kubenswrapper[4594]: I1129 05:31:52.806797 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9cznn" Nov 29 05:31:53 crc kubenswrapper[4594]: I1129 05:31:53.100616 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9cznn" Nov 29 05:31:53 crc kubenswrapper[4594]: I1129 05:31:53.101026 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bk4kp" Nov 29 05:31:54 crc kubenswrapper[4594]: I1129 05:31:54.827411 4594 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Nov 29 05:31:54 crc kubenswrapper[4594]: I1129 05:31:54.828604 4594 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://583ca6d59501822022d5d15a450d4403592d248bce7be4b22b1c7baef4e93000" gracePeriod=15 Nov 29 05:31:54 crc kubenswrapper[4594]: I1129 05:31:54.828753 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://a87a0320d47b3753baeb865c7ef9e024c65a96555fdd71c0f2a23c8c0b80d351" gracePeriod=15 Nov 29 05:31:54 crc kubenswrapper[4594]: I1129 05:31:54.829044 4594 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Nov 29 05:31:54 crc kubenswrapper[4594]: I1129 05:31:54.828799 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://7b8d3be68869fef8b377ce1d84e3edf447496a867c40e88d013ef0d1c5629de2" gracePeriod=15 Nov 29 05:31:54 crc kubenswrapper[4594]: I1129 05:31:54.828821 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://7e070e0ea87ae0cfa8fcc6aadd13e7ae3c2ab3eb1ea3daf6178c9d35796ba3ad" gracePeriod=15 Nov 29 05:31:54 crc kubenswrapper[4594]: I1129 05:31:54.828817 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://6655c8a5c56e4d2b3e8fbd88308d04ded2b41f7243bf90cb2dbbd6ad93f3153f" gracePeriod=15 Nov 29 05:31:54 crc kubenswrapper[4594]: E1129 05:31:54.829429 4594 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Nov 29 05:31:54 crc kubenswrapper[4594]: I1129 05:31:54.829984 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Nov 29 05:31:54 crc kubenswrapper[4594]: E1129 05:31:54.830071 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Nov 29 05:31:54 crc kubenswrapper[4594]: I1129 05:31:54.830121 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Nov 29 05:31:54 crc kubenswrapper[4594]: E1129 05:31:54.830189 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Nov 29 05:31:54 crc kubenswrapper[4594]: I1129 05:31:54.830284 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Nov 29 05:31:54 crc kubenswrapper[4594]: E1129 05:31:54.830952 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Nov 29 05:31:54 crc kubenswrapper[4594]: I1129 05:31:54.831063 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Nov 29 05:31:54 crc kubenswrapper[4594]: E1129 05:31:54.831174 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Nov 29 05:31:54 crc kubenswrapper[4594]: I1129 05:31:54.831268 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-cert-syncer" Nov 29 05:31:54 crc kubenswrapper[4594]: E1129 05:31:54.831325 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Nov 29 05:31:54 crc kubenswrapper[4594]: I1129 05:31:54.831373 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Nov 29 05:31:54 crc kubenswrapper[4594]: E1129 05:31:54.831464 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Nov 29 05:31:54 crc kubenswrapper[4594]: I1129 05:31:54.831517 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Nov 29 05:31:54 crc kubenswrapper[4594]: I1129 05:31:54.831861 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Nov 29 05:31:54 crc kubenswrapper[4594]: I1129 05:31:54.831981 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Nov 29 05:31:54 crc kubenswrapper[4594]: I1129 05:31:54.832045 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Nov 29 05:31:54 crc kubenswrapper[4594]: I1129 05:31:54.832095 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Nov 29 05:31:54 crc kubenswrapper[4594]: I1129 05:31:54.832147 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Nov 29 05:31:54 crc kubenswrapper[4594]: I1129 05:31:54.832477 4594 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Nov 29 05:31:54 crc kubenswrapper[4594]: I1129 05:31:54.833679 4594 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Nov 29 05:31:54 crc kubenswrapper[4594]: I1129 05:31:54.834316 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 29 05:31:54 crc kubenswrapper[4594]: I1129 05:31:54.848304 4594 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" Nov 29 05:31:54 crc kubenswrapper[4594]: E1129 05:31:54.876321 4594 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 192.168.25.120:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 29 05:31:55 crc kubenswrapper[4594]: I1129 05:31:55.010346 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-kfzdq" Nov 29 05:31:55 crc kubenswrapper[4594]: I1129 05:31:55.010421 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-kfzdq" Nov 29 05:31:55 crc kubenswrapper[4594]: I1129 05:31:55.014639 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 29 05:31:55 crc kubenswrapper[4594]: I1129 05:31:55.014674 
4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 05:31:55 crc kubenswrapper[4594]: I1129 05:31:55.014707 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 29 05:31:55 crc kubenswrapper[4594]: I1129 05:31:55.014736 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 29 05:31:55 crc kubenswrapper[4594]: I1129 05:31:55.014754 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 29 05:31:55 crc kubenswrapper[4594]: I1129 05:31:55.014843 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 29 05:31:55 crc kubenswrapper[4594]: I1129 05:31:55.014915 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 05:31:55 crc kubenswrapper[4594]: I1129 05:31:55.014996 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 05:31:55 crc kubenswrapper[4594]: I1129 05:31:55.042675 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-kfzdq" Nov 29 05:31:55 crc kubenswrapper[4594]: I1129 05:31:55.043157 4594 status_manager.go:851] "Failed to get status for pod" podUID="fa62a8c9-aa8c-42d9-a634-f3aea0992e00" pod="openshift-marketplace/certified-operators-kfzdq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-kfzdq\": dial tcp 192.168.25.120:6443: connect: connection refused" Nov 29 05:31:55 crc kubenswrapper[4594]: I1129 05:31:55.083068 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Nov 29 05:31:55 crc kubenswrapper[4594]: I1129 05:31:55.084179 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Nov 29 05:31:55 crc kubenswrapper[4594]: I1129 05:31:55.085095 4594 generic.go:334] 
"Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a87a0320d47b3753baeb865c7ef9e024c65a96555fdd71c0f2a23c8c0b80d351" exitCode=0 Nov 29 05:31:55 crc kubenswrapper[4594]: I1129 05:31:55.085127 4594 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="7e070e0ea87ae0cfa8fcc6aadd13e7ae3c2ab3eb1ea3daf6178c9d35796ba3ad" exitCode=0 Nov 29 05:31:55 crc kubenswrapper[4594]: I1129 05:31:55.085137 4594 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="7b8d3be68869fef8b377ce1d84e3edf447496a867c40e88d013ef0d1c5629de2" exitCode=0 Nov 29 05:31:55 crc kubenswrapper[4594]: I1129 05:31:55.085148 4594 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6655c8a5c56e4d2b3e8fbd88308d04ded2b41f7243bf90cb2dbbd6ad93f3153f" exitCode=2 Nov 29 05:31:55 crc kubenswrapper[4594]: I1129 05:31:55.085183 4594 scope.go:117] "RemoveContainer" containerID="3ba49c3a694bc245ce18b54b74156e6e7aaccd4fcf707264a278f86f52943392" Nov 29 05:31:55 crc kubenswrapper[4594]: I1129 05:31:55.086696 4594 generic.go:334] "Generic (PLEG): container finished" podID="031260c4-ed19-4221-b30f-d03a4abb131f" containerID="98422c59ed93ed10ba2e043b866c620d8e1b8180094575d0f33089b2beecd1ca" exitCode=0 Nov 29 05:31:55 crc kubenswrapper[4594]: I1129 05:31:55.086773 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"031260c4-ed19-4221-b30f-d03a4abb131f","Type":"ContainerDied","Data":"98422c59ed93ed10ba2e043b866c620d8e1b8180094575d0f33089b2beecd1ca"} Nov 29 05:31:55 crc kubenswrapper[4594]: I1129 05:31:55.087509 4594 status_manager.go:851] "Failed to get status for pod" podUID="031260c4-ed19-4221-b30f-d03a4abb131f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": 
dial tcp 192.168.25.120:6443: connect: connection refused" Nov 29 05:31:55 crc kubenswrapper[4594]: I1129 05:31:55.087923 4594 status_manager.go:851] "Failed to get status for pod" podUID="fa62a8c9-aa8c-42d9-a634-f3aea0992e00" pod="openshift-marketplace/certified-operators-kfzdq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-kfzdq\": dial tcp 192.168.25.120:6443: connect: connection refused" Nov 29 05:31:55 crc kubenswrapper[4594]: I1129 05:31:55.115871 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 05:31:55 crc kubenswrapper[4594]: I1129 05:31:55.115919 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 29 05:31:55 crc kubenswrapper[4594]: I1129 05:31:55.115952 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 29 05:31:55 crc kubenswrapper[4594]: I1129 05:31:55.115972 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 29 05:31:55 crc kubenswrapper[4594]: I1129 05:31:55.115988 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 29 05:31:55 crc kubenswrapper[4594]: I1129 05:31:55.116009 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 05:31:55 crc kubenswrapper[4594]: I1129 05:31:55.116023 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 05:31:55 crc kubenswrapper[4594]: I1129 05:31:55.116037 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 05:31:55 crc kubenswrapper[4594]: I1129 05:31:55.116096 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 29 05:31:55 crc kubenswrapper[4594]: 
I1129 05:31:55.116125 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 29 05:31:55 crc kubenswrapper[4594]: I1129 05:31:55.116146 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 29 05:31:55 crc kubenswrapper[4594]: I1129 05:31:55.116167 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 29 05:31:55 crc kubenswrapper[4594]: I1129 05:31:55.116189 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 05:31:55 crc kubenswrapper[4594]: I1129 05:31:55.116078 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 05:31:55 crc kubenswrapper[4594]: I1129 05:31:55.116351 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 29 05:31:55 crc kubenswrapper[4594]: I1129 05:31:55.116473 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 29 05:31:55 crc kubenswrapper[4594]: I1129 05:31:55.117173 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-kfzdq" Nov 29 05:31:55 crc kubenswrapper[4594]: I1129 05:31:55.117764 4594 status_manager.go:851] "Failed to get status for pod" podUID="031260c4-ed19-4221-b30f-d03a4abb131f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.120:6443: connect: connection refused" Nov 29 05:31:55 crc kubenswrapper[4594]: I1129 05:31:55.118075 4594 status_manager.go:851] "Failed to get status for pod" podUID="fa62a8c9-aa8c-42d9-a634-f3aea0992e00" pod="openshift-marketplace/certified-operators-kfzdq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-kfzdq\": dial tcp 192.168.25.120:6443: connect: connection refused" Nov 29 05:31:55 crc kubenswrapper[4594]: I1129 05:31:55.177472 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 29 05:31:55 crc kubenswrapper[4594]: E1129 05:31:55.196419 4594 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.25.120:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187c634810f8b67e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-11-29 05:31:55.195922046 +0000 UTC m=+239.436431265,LastTimestamp:2025-11-29 05:31:55.195922046 +0000 UTC m=+239.436431265,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Nov 29 05:31:55 crc kubenswrapper[4594]: I1129 05:31:55.252006 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-n4msp" Nov 29 05:31:55 crc kubenswrapper[4594]: I1129 05:31:55.252329 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-n4msp" Nov 29 05:31:55 crc kubenswrapper[4594]: I1129 05:31:55.279752 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-n4msp" Nov 29 05:31:55 crc kubenswrapper[4594]: I1129 05:31:55.280369 4594 status_manager.go:851] "Failed to get status for pod" 
podUID="398463af-ebf5-4586-8a81-bc9b7cdce9c0" pod="openshift-marketplace/community-operators-n4msp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-n4msp\": dial tcp 192.168.25.120:6443: connect: connection refused" Nov 29 05:31:55 crc kubenswrapper[4594]: I1129 05:31:55.281046 4594 status_manager.go:851] "Failed to get status for pod" podUID="031260c4-ed19-4221-b30f-d03a4abb131f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.120:6443: connect: connection refused" Nov 29 05:31:55 crc kubenswrapper[4594]: I1129 05:31:55.281237 4594 status_manager.go:851] "Failed to get status for pod" podUID="fa62a8c9-aa8c-42d9-a634-f3aea0992e00" pod="openshift-marketplace/certified-operators-kfzdq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-kfzdq\": dial tcp 192.168.25.120:6443: connect: connection refused" Nov 29 05:31:56 crc kubenswrapper[4594]: I1129 05:31:56.084831 4594 status_manager.go:851] "Failed to get status for pod" podUID="398463af-ebf5-4586-8a81-bc9b7cdce9c0" pod="openshift-marketplace/community-operators-n4msp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-n4msp\": dial tcp 192.168.25.120:6443: connect: connection refused" Nov 29 05:31:56 crc kubenswrapper[4594]: I1129 05:31:56.085072 4594 status_manager.go:851] "Failed to get status for pod" podUID="031260c4-ed19-4221-b30f-d03a4abb131f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.120:6443: connect: connection refused" Nov 29 05:31:56 crc kubenswrapper[4594]: I1129 05:31:56.085333 4594 status_manager.go:851] "Failed to get status for pod" 
podUID="fa62a8c9-aa8c-42d9-a634-f3aea0992e00" pod="openshift-marketplace/certified-operators-kfzdq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-kfzdq\": dial tcp 192.168.25.120:6443: connect: connection refused" Nov 29 05:31:56 crc kubenswrapper[4594]: I1129 05:31:56.102883 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Nov 29 05:31:56 crc kubenswrapper[4594]: I1129 05:31:56.105422 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"38aab5ae66eae5cac723593d965221c52600bc64d7798f2039061e0749a74df1"} Nov 29 05:31:56 crc kubenswrapper[4594]: I1129 05:31:56.105462 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"c81bda345ab5cda476d119653006e4a8783e22d782880c24659ca4a972ceb914"} Nov 29 05:31:56 crc kubenswrapper[4594]: I1129 05:31:56.106857 4594 status_manager.go:851] "Failed to get status for pod" podUID="398463af-ebf5-4586-8a81-bc9b7cdce9c0" pod="openshift-marketplace/community-operators-n4msp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-n4msp\": dial tcp 192.168.25.120:6443: connect: connection refused" Nov 29 05:31:56 crc kubenswrapper[4594]: E1129 05:31:56.107036 4594 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 192.168.25.120:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 29 05:31:56 crc kubenswrapper[4594]: I1129 05:31:56.107201 4594 
status_manager.go:851] "Failed to get status for pod" podUID="fa62a8c9-aa8c-42d9-a634-f3aea0992e00" pod="openshift-marketplace/certified-operators-kfzdq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-kfzdq\": dial tcp 192.168.25.120:6443: connect: connection refused" Nov 29 05:31:56 crc kubenswrapper[4594]: I1129 05:31:56.107573 4594 status_manager.go:851] "Failed to get status for pod" podUID="031260c4-ed19-4221-b30f-d03a4abb131f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.120:6443: connect: connection refused" Nov 29 05:31:56 crc kubenswrapper[4594]: I1129 05:31:56.135163 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-n4msp" Nov 29 05:31:56 crc kubenswrapper[4594]: I1129 05:31:56.135635 4594 status_manager.go:851] "Failed to get status for pod" podUID="fa62a8c9-aa8c-42d9-a634-f3aea0992e00" pod="openshift-marketplace/certified-operators-kfzdq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-kfzdq\": dial tcp 192.168.25.120:6443: connect: connection refused" Nov 29 05:31:56 crc kubenswrapper[4594]: I1129 05:31:56.135897 4594 status_manager.go:851] "Failed to get status for pod" podUID="031260c4-ed19-4221-b30f-d03a4abb131f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.120:6443: connect: connection refused" Nov 29 05:31:56 crc kubenswrapper[4594]: I1129 05:31:56.136051 4594 status_manager.go:851] "Failed to get status for pod" podUID="398463af-ebf5-4586-8a81-bc9b7cdce9c0" pod="openshift-marketplace/community-operators-n4msp" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-n4msp\": dial tcp 192.168.25.120:6443: connect: connection refused" Nov 29 05:31:56 crc kubenswrapper[4594]: I1129 05:31:56.314601 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Nov 29 05:31:56 crc kubenswrapper[4594]: I1129 05:31:56.316388 4594 status_manager.go:851] "Failed to get status for pod" podUID="398463af-ebf5-4586-8a81-bc9b7cdce9c0" pod="openshift-marketplace/community-operators-n4msp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-n4msp\": dial tcp 192.168.25.120:6443: connect: connection refused" Nov 29 05:31:56 crc kubenswrapper[4594]: I1129 05:31:56.316849 4594 status_manager.go:851] "Failed to get status for pod" podUID="fa62a8c9-aa8c-42d9-a634-f3aea0992e00" pod="openshift-marketplace/certified-operators-kfzdq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-kfzdq\": dial tcp 192.168.25.120:6443: connect: connection refused" Nov 29 05:31:56 crc kubenswrapper[4594]: I1129 05:31:56.317223 4594 status_manager.go:851] "Failed to get status for pod" podUID="031260c4-ed19-4221-b30f-d03a4abb131f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.120:6443: connect: connection refused" Nov 29 05:31:56 crc kubenswrapper[4594]: I1129 05:31:56.434474 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/031260c4-ed19-4221-b30f-d03a4abb131f-var-lock\") pod \"031260c4-ed19-4221-b30f-d03a4abb131f\" (UID: \"031260c4-ed19-4221-b30f-d03a4abb131f\") " Nov 29 05:31:56 crc kubenswrapper[4594]: I1129 05:31:56.434703 4594 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/031260c4-ed19-4221-b30f-d03a4abb131f-kube-api-access\") pod \"031260c4-ed19-4221-b30f-d03a4abb131f\" (UID: \"031260c4-ed19-4221-b30f-d03a4abb131f\") " Nov 29 05:31:56 crc kubenswrapper[4594]: I1129 05:31:56.434738 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/031260c4-ed19-4221-b30f-d03a4abb131f-kubelet-dir\") pod \"031260c4-ed19-4221-b30f-d03a4abb131f\" (UID: \"031260c4-ed19-4221-b30f-d03a4abb131f\") " Nov 29 05:31:56 crc kubenswrapper[4594]: I1129 05:31:56.434564 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/031260c4-ed19-4221-b30f-d03a4abb131f-var-lock" (OuterVolumeSpecName: "var-lock") pod "031260c4-ed19-4221-b30f-d03a4abb131f" (UID: "031260c4-ed19-4221-b30f-d03a4abb131f"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 05:31:56 crc kubenswrapper[4594]: I1129 05:31:56.434977 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/031260c4-ed19-4221-b30f-d03a4abb131f-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "031260c4-ed19-4221-b30f-d03a4abb131f" (UID: "031260c4-ed19-4221-b30f-d03a4abb131f"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 05:31:56 crc kubenswrapper[4594]: I1129 05:31:56.436090 4594 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/031260c4-ed19-4221-b30f-d03a4abb131f-kubelet-dir\") on node \"crc\" DevicePath \"\"" Nov 29 05:31:56 crc kubenswrapper[4594]: I1129 05:31:56.436116 4594 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/031260c4-ed19-4221-b30f-d03a4abb131f-var-lock\") on node \"crc\" DevicePath \"\"" Nov 29 05:31:56 crc kubenswrapper[4594]: I1129 05:31:56.441327 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/031260c4-ed19-4221-b30f-d03a4abb131f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "031260c4-ed19-4221-b30f-d03a4abb131f" (UID: "031260c4-ed19-4221-b30f-d03a4abb131f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:31:56 crc kubenswrapper[4594]: I1129 05:31:56.537728 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/031260c4-ed19-4221-b30f-d03a4abb131f-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 29 05:31:57 crc kubenswrapper[4594]: I1129 05:31:57.105275 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Nov 29 05:31:57 crc kubenswrapper[4594]: I1129 05:31:57.106470 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 05:31:57 crc kubenswrapper[4594]: I1129 05:31:57.107117 4594 status_manager.go:851] "Failed to get status for pod" podUID="fa62a8c9-aa8c-42d9-a634-f3aea0992e00" pod="openshift-marketplace/certified-operators-kfzdq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-kfzdq\": dial tcp 192.168.25.120:6443: connect: connection refused" Nov 29 05:31:57 crc kubenswrapper[4594]: I1129 05:31:57.107447 4594 status_manager.go:851] "Failed to get status for pod" podUID="031260c4-ed19-4221-b30f-d03a4abb131f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.120:6443: connect: connection refused" Nov 29 05:31:57 crc kubenswrapper[4594]: I1129 05:31:57.107665 4594 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.25.120:6443: connect: connection refused" Nov 29 05:31:57 crc kubenswrapper[4594]: I1129 05:31:57.107974 4594 status_manager.go:851] "Failed to get status for pod" podUID="398463af-ebf5-4586-8a81-bc9b7cdce9c0" pod="openshift-marketplace/community-operators-n4msp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-n4msp\": dial tcp 192.168.25.120:6443: connect: connection refused" Nov 29 05:31:57 crc kubenswrapper[4594]: I1129 05:31:57.112215 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Nov 29 05:31:57 crc kubenswrapper[4594]: I1129 05:31:57.112229 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"031260c4-ed19-4221-b30f-d03a4abb131f","Type":"ContainerDied","Data":"21ef458b66b8cfdecb6b7e6be47d1b64816cba96af88aad276b644f9e5b94a95"} Nov 29 05:31:57 crc kubenswrapper[4594]: I1129 05:31:57.112278 4594 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="21ef458b66b8cfdecb6b7e6be47d1b64816cba96af88aad276b644f9e5b94a95" Nov 29 05:31:57 crc kubenswrapper[4594]: I1129 05:31:57.115427 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Nov 29 05:31:57 crc kubenswrapper[4594]: I1129 05:31:57.116287 4594 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="583ca6d59501822022d5d15a450d4403592d248bce7be4b22b1c7baef4e93000" exitCode=0 Nov 29 05:31:57 crc kubenswrapper[4594]: I1129 05:31:57.116352 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 05:31:57 crc kubenswrapper[4594]: I1129 05:31:57.116402 4594 scope.go:117] "RemoveContainer" containerID="a87a0320d47b3753baeb865c7ef9e024c65a96555fdd71c0f2a23c8c0b80d351" Nov 29 05:31:57 crc kubenswrapper[4594]: I1129 05:31:57.123139 4594 status_manager.go:851] "Failed to get status for pod" podUID="398463af-ebf5-4586-8a81-bc9b7cdce9c0" pod="openshift-marketplace/community-operators-n4msp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-n4msp\": dial tcp 192.168.25.120:6443: connect: connection refused" Nov 29 05:31:57 crc kubenswrapper[4594]: I1129 05:31:57.123702 4594 status_manager.go:851] "Failed to get status for pod" podUID="fa62a8c9-aa8c-42d9-a634-f3aea0992e00" pod="openshift-marketplace/certified-operators-kfzdq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-kfzdq\": dial tcp 192.168.25.120:6443: connect: connection refused" Nov 29 05:31:57 crc kubenswrapper[4594]: I1129 05:31:57.123932 4594 status_manager.go:851] "Failed to get status for pod" podUID="031260c4-ed19-4221-b30f-d03a4abb131f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.120:6443: connect: connection refused" Nov 29 05:31:57 crc kubenswrapper[4594]: I1129 05:31:57.124289 4594 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.25.120:6443: connect: connection refused" Nov 29 05:31:57 crc kubenswrapper[4594]: I1129 05:31:57.130582 4594 scope.go:117] "RemoveContainer" 
containerID="7e070e0ea87ae0cfa8fcc6aadd13e7ae3c2ab3eb1ea3daf6178c9d35796ba3ad" Nov 29 05:31:57 crc kubenswrapper[4594]: I1129 05:31:57.145070 4594 scope.go:117] "RemoveContainer" containerID="7b8d3be68869fef8b377ce1d84e3edf447496a867c40e88d013ef0d1c5629de2" Nov 29 05:31:57 crc kubenswrapper[4594]: I1129 05:31:57.146967 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Nov 29 05:31:57 crc kubenswrapper[4594]: I1129 05:31:57.147077 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 05:31:57 crc kubenswrapper[4594]: I1129 05:31:57.147284 4594 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Nov 29 05:31:57 crc kubenswrapper[4594]: I1129 05:31:57.155445 4594 scope.go:117] "RemoveContainer" containerID="6655c8a5c56e4d2b3e8fbd88308d04ded2b41f7243bf90cb2dbbd6ad93f3153f" Nov 29 05:31:57 crc kubenswrapper[4594]: I1129 05:31:57.166694 4594 scope.go:117] "RemoveContainer" containerID="583ca6d59501822022d5d15a450d4403592d248bce7be4b22b1c7baef4e93000" Nov 29 05:31:57 crc kubenswrapper[4594]: I1129 05:31:57.178963 4594 scope.go:117] "RemoveContainer" containerID="306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699" Nov 29 05:31:57 crc kubenswrapper[4594]: I1129 05:31:57.194131 4594 scope.go:117] "RemoveContainer" containerID="a87a0320d47b3753baeb865c7ef9e024c65a96555fdd71c0f2a23c8c0b80d351" Nov 29 05:31:57 crc 
kubenswrapper[4594]: E1129 05:31:57.194616 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a87a0320d47b3753baeb865c7ef9e024c65a96555fdd71c0f2a23c8c0b80d351\": container with ID starting with a87a0320d47b3753baeb865c7ef9e024c65a96555fdd71c0f2a23c8c0b80d351 not found: ID does not exist" containerID="a87a0320d47b3753baeb865c7ef9e024c65a96555fdd71c0f2a23c8c0b80d351" Nov 29 05:31:57 crc kubenswrapper[4594]: I1129 05:31:57.194654 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a87a0320d47b3753baeb865c7ef9e024c65a96555fdd71c0f2a23c8c0b80d351"} err="failed to get container status \"a87a0320d47b3753baeb865c7ef9e024c65a96555fdd71c0f2a23c8c0b80d351\": rpc error: code = NotFound desc = could not find container \"a87a0320d47b3753baeb865c7ef9e024c65a96555fdd71c0f2a23c8c0b80d351\": container with ID starting with a87a0320d47b3753baeb865c7ef9e024c65a96555fdd71c0f2a23c8c0b80d351 not found: ID does not exist" Nov 29 05:31:57 crc kubenswrapper[4594]: I1129 05:31:57.194679 4594 scope.go:117] "RemoveContainer" containerID="7e070e0ea87ae0cfa8fcc6aadd13e7ae3c2ab3eb1ea3daf6178c9d35796ba3ad" Nov 29 05:31:57 crc kubenswrapper[4594]: E1129 05:31:57.195029 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e070e0ea87ae0cfa8fcc6aadd13e7ae3c2ab3eb1ea3daf6178c9d35796ba3ad\": container with ID starting with 7e070e0ea87ae0cfa8fcc6aadd13e7ae3c2ab3eb1ea3daf6178c9d35796ba3ad not found: ID does not exist" containerID="7e070e0ea87ae0cfa8fcc6aadd13e7ae3c2ab3eb1ea3daf6178c9d35796ba3ad" Nov 29 05:31:57 crc kubenswrapper[4594]: I1129 05:31:57.195060 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e070e0ea87ae0cfa8fcc6aadd13e7ae3c2ab3eb1ea3daf6178c9d35796ba3ad"} err="failed to get container status 
\"7e070e0ea87ae0cfa8fcc6aadd13e7ae3c2ab3eb1ea3daf6178c9d35796ba3ad\": rpc error: code = NotFound desc = could not find container \"7e070e0ea87ae0cfa8fcc6aadd13e7ae3c2ab3eb1ea3daf6178c9d35796ba3ad\": container with ID starting with 7e070e0ea87ae0cfa8fcc6aadd13e7ae3c2ab3eb1ea3daf6178c9d35796ba3ad not found: ID does not exist" Nov 29 05:31:57 crc kubenswrapper[4594]: I1129 05:31:57.195080 4594 scope.go:117] "RemoveContainer" containerID="7b8d3be68869fef8b377ce1d84e3edf447496a867c40e88d013ef0d1c5629de2" Nov 29 05:31:57 crc kubenswrapper[4594]: E1129 05:31:57.195424 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b8d3be68869fef8b377ce1d84e3edf447496a867c40e88d013ef0d1c5629de2\": container with ID starting with 7b8d3be68869fef8b377ce1d84e3edf447496a867c40e88d013ef0d1c5629de2 not found: ID does not exist" containerID="7b8d3be68869fef8b377ce1d84e3edf447496a867c40e88d013ef0d1c5629de2" Nov 29 05:31:57 crc kubenswrapper[4594]: I1129 05:31:57.195468 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b8d3be68869fef8b377ce1d84e3edf447496a867c40e88d013ef0d1c5629de2"} err="failed to get container status \"7b8d3be68869fef8b377ce1d84e3edf447496a867c40e88d013ef0d1c5629de2\": rpc error: code = NotFound desc = could not find container \"7b8d3be68869fef8b377ce1d84e3edf447496a867c40e88d013ef0d1c5629de2\": container with ID starting with 7b8d3be68869fef8b377ce1d84e3edf447496a867c40e88d013ef0d1c5629de2 not found: ID does not exist" Nov 29 05:31:57 crc kubenswrapper[4594]: I1129 05:31:57.195502 4594 scope.go:117] "RemoveContainer" containerID="6655c8a5c56e4d2b3e8fbd88308d04ded2b41f7243bf90cb2dbbd6ad93f3153f" Nov 29 05:31:57 crc kubenswrapper[4594]: E1129 05:31:57.196500 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"6655c8a5c56e4d2b3e8fbd88308d04ded2b41f7243bf90cb2dbbd6ad93f3153f\": container with ID starting with 6655c8a5c56e4d2b3e8fbd88308d04ded2b41f7243bf90cb2dbbd6ad93f3153f not found: ID does not exist" containerID="6655c8a5c56e4d2b3e8fbd88308d04ded2b41f7243bf90cb2dbbd6ad93f3153f" Nov 29 05:31:57 crc kubenswrapper[4594]: I1129 05:31:57.196532 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6655c8a5c56e4d2b3e8fbd88308d04ded2b41f7243bf90cb2dbbd6ad93f3153f"} err="failed to get container status \"6655c8a5c56e4d2b3e8fbd88308d04ded2b41f7243bf90cb2dbbd6ad93f3153f\": rpc error: code = NotFound desc = could not find container \"6655c8a5c56e4d2b3e8fbd88308d04ded2b41f7243bf90cb2dbbd6ad93f3153f\": container with ID starting with 6655c8a5c56e4d2b3e8fbd88308d04ded2b41f7243bf90cb2dbbd6ad93f3153f not found: ID does not exist" Nov 29 05:31:57 crc kubenswrapper[4594]: I1129 05:31:57.196550 4594 scope.go:117] "RemoveContainer" containerID="583ca6d59501822022d5d15a450d4403592d248bce7be4b22b1c7baef4e93000" Nov 29 05:31:57 crc kubenswrapper[4594]: E1129 05:31:57.196823 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"583ca6d59501822022d5d15a450d4403592d248bce7be4b22b1c7baef4e93000\": container with ID starting with 583ca6d59501822022d5d15a450d4403592d248bce7be4b22b1c7baef4e93000 not found: ID does not exist" containerID="583ca6d59501822022d5d15a450d4403592d248bce7be4b22b1c7baef4e93000" Nov 29 05:31:57 crc kubenswrapper[4594]: I1129 05:31:57.196855 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"583ca6d59501822022d5d15a450d4403592d248bce7be4b22b1c7baef4e93000"} err="failed to get container status \"583ca6d59501822022d5d15a450d4403592d248bce7be4b22b1c7baef4e93000\": rpc error: code = NotFound desc = could not find container \"583ca6d59501822022d5d15a450d4403592d248bce7be4b22b1c7baef4e93000\": container with ID 
starting with 583ca6d59501822022d5d15a450d4403592d248bce7be4b22b1c7baef4e93000 not found: ID does not exist" Nov 29 05:31:57 crc kubenswrapper[4594]: I1129 05:31:57.196875 4594 scope.go:117] "RemoveContainer" containerID="306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699" Nov 29 05:31:57 crc kubenswrapper[4594]: E1129 05:31:57.197132 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699\": container with ID starting with 306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699 not found: ID does not exist" containerID="306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699" Nov 29 05:31:57 crc kubenswrapper[4594]: I1129 05:31:57.197159 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699"} err="failed to get container status \"306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699\": rpc error: code = NotFound desc = could not find container \"306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699\": container with ID starting with 306c2ee6f8204da653775c897fd81f88c1df5e1d92da18a301843a3ab150a699 not found: ID does not exist" Nov 29 05:31:57 crc kubenswrapper[4594]: I1129 05:31:57.248608 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Nov 29 05:31:57 crc kubenswrapper[4594]: I1129 05:31:57.248725 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). 
InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 05:31:57 crc kubenswrapper[4594]: I1129 05:31:57.249062 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Nov 29 05:31:57 crc kubenswrapper[4594]: I1129 05:31:57.249116 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 05:31:57 crc kubenswrapper[4594]: I1129 05:31:57.249599 4594 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Nov 29 05:31:57 crc kubenswrapper[4594]: I1129 05:31:57.249675 4594 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Nov 29 05:31:57 crc kubenswrapper[4594]: I1129 05:31:57.429078 4594 status_manager.go:851] "Failed to get status for pod" podUID="398463af-ebf5-4586-8a81-bc9b7cdce9c0" pod="openshift-marketplace/community-operators-n4msp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-n4msp\": dial tcp 192.168.25.120:6443: connect: connection refused" Nov 29 05:31:57 crc kubenswrapper[4594]: I1129 05:31:57.429476 4594 status_manager.go:851] "Failed to get status for pod" podUID="031260c4-ed19-4221-b30f-d03a4abb131f" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.120:6443: connect: connection refused" Nov 29 05:31:57 crc kubenswrapper[4594]: I1129 05:31:57.429720 4594 status_manager.go:851] "Failed to get status for pod" podUID="fa62a8c9-aa8c-42d9-a634-f3aea0992e00" pod="openshift-marketplace/certified-operators-kfzdq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-kfzdq\": dial tcp 192.168.25.120:6443: connect: connection refused" Nov 29 05:31:57 crc kubenswrapper[4594]: I1129 05:31:57.429964 4594 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.25.120:6443: connect: connection refused" Nov 29 05:31:58 crc kubenswrapper[4594]: I1129 05:31:58.094920 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Nov 29 05:32:00 crc kubenswrapper[4594]: E1129 05:32:00.553221 4594 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.25.120:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187c634810f8b67e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-11-29 05:31:55.195922046 +0000 UTC m=+239.436431265,LastTimestamp:2025-11-29 05:31:55.195922046 +0000 UTC m=+239.436431265,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Nov 29 05:32:00 crc kubenswrapper[4594]: E1129 05:32:00.981111 4594 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.120:6443: connect: connection refused" Nov 29 05:32:00 crc kubenswrapper[4594]: E1129 05:32:00.981597 4594 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.120:6443: connect: connection refused" Nov 29 05:32:00 crc kubenswrapper[4594]: E1129 05:32:00.981889 4594 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.120:6443: connect: connection refused" Nov 29 05:32:00 crc kubenswrapper[4594]: E1129 05:32:00.982229 4594 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.120:6443: connect: connection refused" Nov 29 05:32:00 crc kubenswrapper[4594]: E1129 05:32:00.982465 4594 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.120:6443: connect: connection refused" Nov 29 05:32:00 crc 
kubenswrapper[4594]: I1129 05:32:00.982487 4594 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Nov 29 05:32:00 crc kubenswrapper[4594]: E1129 05:32:00.982687 4594 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.120:6443: connect: connection refused" interval="200ms" Nov 29 05:32:01 crc kubenswrapper[4594]: E1129 05:32:01.183621 4594 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.120:6443: connect: connection refused" interval="400ms" Nov 29 05:32:01 crc kubenswrapper[4594]: E1129 05:32:01.584510 4594 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.120:6443: connect: connection refused" interval="800ms" Nov 29 05:32:01 crc kubenswrapper[4594]: E1129 05:32:01.902495 4594 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:32:01Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:32:01Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:32:01Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T05:32:01Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 192.168.25.120:6443: connect: connection refused" Nov 29 05:32:01 crc kubenswrapper[4594]: E1129 05:32:01.902695 4594 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 192.168.25.120:6443: connect: connection refused" Nov 29 05:32:01 crc kubenswrapper[4594]: E1129 05:32:01.902863 4594 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 192.168.25.120:6443: connect: connection refused" Nov 29 05:32:01 crc kubenswrapper[4594]: E1129 05:32:01.903021 4594 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 192.168.25.120:6443: connect: connection refused" Nov 
29 05:32:01 crc kubenswrapper[4594]: E1129 05:32:01.903182 4594 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 192.168.25.120:6443: connect: connection refused" Nov 29 05:32:01 crc kubenswrapper[4594]: E1129 05:32:01.903213 4594 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 29 05:32:02 crc kubenswrapper[4594]: E1129 05:32:02.385537 4594 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.120:6443: connect: connection refused" interval="1.6s" Nov 29 05:32:03 crc kubenswrapper[4594]: E1129 05:32:03.986518 4594 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.120:6443: connect: connection refused" interval="3.2s" Nov 29 05:32:06 crc kubenswrapper[4594]: I1129 05:32:06.082581 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 05:32:06 crc kubenswrapper[4594]: I1129 05:32:06.084510 4594 status_manager.go:851] "Failed to get status for pod" podUID="398463af-ebf5-4586-8a81-bc9b7cdce9c0" pod="openshift-marketplace/community-operators-n4msp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-n4msp\": dial tcp 192.168.25.120:6443: connect: connection refused" Nov 29 05:32:06 crc kubenswrapper[4594]: I1129 05:32:06.084782 4594 status_manager.go:851] "Failed to get status for pod" podUID="031260c4-ed19-4221-b30f-d03a4abb131f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.120:6443: connect: connection refused" Nov 29 05:32:06 crc kubenswrapper[4594]: I1129 05:32:06.085009 4594 status_manager.go:851] "Failed to get status for pod" podUID="fa62a8c9-aa8c-42d9-a634-f3aea0992e00" pod="openshift-marketplace/certified-operators-kfzdq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-kfzdq\": dial tcp 192.168.25.120:6443: connect: connection refused" Nov 29 05:32:06 crc kubenswrapper[4594]: I1129 05:32:06.085592 4594 status_manager.go:851] "Failed to get status for pod" podUID="031260c4-ed19-4221-b30f-d03a4abb131f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.120:6443: connect: connection refused" Nov 29 05:32:06 crc kubenswrapper[4594]: I1129 05:32:06.085802 4594 status_manager.go:851] "Failed to get status for pod" podUID="fa62a8c9-aa8c-42d9-a634-f3aea0992e00" pod="openshift-marketplace/certified-operators-kfzdq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-kfzdq\": dial tcp 
192.168.25.120:6443: connect: connection refused" Nov 29 05:32:06 crc kubenswrapper[4594]: I1129 05:32:06.086023 4594 status_manager.go:851] "Failed to get status for pod" podUID="398463af-ebf5-4586-8a81-bc9b7cdce9c0" pod="openshift-marketplace/community-operators-n4msp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-n4msp\": dial tcp 192.168.25.120:6443: connect: connection refused" Nov 29 05:32:06 crc kubenswrapper[4594]: I1129 05:32:06.095142 4594 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="1629901e-7133-4eb0-b634-5e458da9f205" Nov 29 05:32:06 crc kubenswrapper[4594]: I1129 05:32:06.095162 4594 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="1629901e-7133-4eb0-b634-5e458da9f205" Nov 29 05:32:06 crc kubenswrapper[4594]: E1129 05:32:06.095434 4594 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.25.120:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 05:32:06 crc kubenswrapper[4594]: I1129 05:32:06.095840 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 05:32:06 crc kubenswrapper[4594]: W1129 05:32:06.108882 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-b5efeaba2dc76c3043402343e29940d4650974e180d4c251f34b87d8bdad2fd7 WatchSource:0}: Error finding container b5efeaba2dc76c3043402343e29940d4650974e180d4c251f34b87d8bdad2fd7: Status 404 returned error can't find the container with id b5efeaba2dc76c3043402343e29940d4650974e180d4c251f34b87d8bdad2fd7 Nov 29 05:32:06 crc kubenswrapper[4594]: I1129 05:32:06.153926 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"b5efeaba2dc76c3043402343e29940d4650974e180d4c251f34b87d8bdad2fd7"} Nov 29 05:32:07 crc kubenswrapper[4594]: I1129 05:32:07.159338 4594 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="0568a3761c0a9a99d156fc9b3a953920df07948371f908daca8416e7d1f8c44c" exitCode=0 Nov 29 05:32:07 crc kubenswrapper[4594]: I1129 05:32:07.159374 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"0568a3761c0a9a99d156fc9b3a953920df07948371f908daca8416e7d1f8c44c"} Nov 29 05:32:07 crc kubenswrapper[4594]: I1129 05:32:07.159549 4594 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="1629901e-7133-4eb0-b634-5e458da9f205" Nov 29 05:32:07 crc kubenswrapper[4594]: I1129 05:32:07.159572 4594 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="1629901e-7133-4eb0-b634-5e458da9f205" Nov 29 05:32:07 crc kubenswrapper[4594]: E1129 05:32:07.159795 4594 mirror_client.go:138] 
"Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.25.120:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 05:32:07 crc kubenswrapper[4594]: I1129 05:32:07.159808 4594 status_manager.go:851] "Failed to get status for pod" podUID="398463af-ebf5-4586-8a81-bc9b7cdce9c0" pod="openshift-marketplace/community-operators-n4msp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-n4msp\": dial tcp 192.168.25.120:6443: connect: connection refused" Nov 29 05:32:07 crc kubenswrapper[4594]: I1129 05:32:07.160055 4594 status_manager.go:851] "Failed to get status for pod" podUID="fa62a8c9-aa8c-42d9-a634-f3aea0992e00" pod="openshift-marketplace/certified-operators-kfzdq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-kfzdq\": dial tcp 192.168.25.120:6443: connect: connection refused" Nov 29 05:32:07 crc kubenswrapper[4594]: I1129 05:32:07.160669 4594 status_manager.go:851] "Failed to get status for pod" podUID="031260c4-ed19-4221-b30f-d03a4abb131f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.120:6443: connect: connection refused" Nov 29 05:32:07 crc kubenswrapper[4594]: E1129 05:32:07.187841 4594 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.120:6443: connect: connection refused" interval="6.4s" Nov 29 05:32:08 crc kubenswrapper[4594]: I1129 05:32:08.166723 4594 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Nov 29 05:32:08 crc kubenswrapper[4594]: I1129 05:32:08.167053 4594 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="a960cfe07ae731cc2d07957dba63be2897328d662dd2672bd61344c8baac0228" exitCode=1 Nov 29 05:32:08 crc kubenswrapper[4594]: I1129 05:32:08.167117 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"a960cfe07ae731cc2d07957dba63be2897328d662dd2672bd61344c8baac0228"} Nov 29 05:32:08 crc kubenswrapper[4594]: I1129 05:32:08.167423 4594 scope.go:117] "RemoveContainer" containerID="a960cfe07ae731cc2d07957dba63be2897328d662dd2672bd61344c8baac0228" Nov 29 05:32:08 crc kubenswrapper[4594]: I1129 05:32:08.181650 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"9139ed1cf567953af61f511719fd5babceedc3aa11c981cbaf9f3fa0d81640aa"} Nov 29 05:32:08 crc kubenswrapper[4594]: I1129 05:32:08.181697 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"1e8d93088a9da6bc1530ac51f01a70301149928ae23b4453971e7c2d9e083e86"} Nov 29 05:32:08 crc kubenswrapper[4594]: I1129 05:32:08.181708 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"530cdb00456db8ae16fc88898934d1bc8248c6bad222cfdb2211b2e612288f1e"} Nov 29 05:32:08 crc kubenswrapper[4594]: I1129 05:32:08.181723 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"5c9fabb30162be00d6dfa3b914369bb6578396d3e071e78fc6203bd08d199ad4"} Nov 29 05:32:08 crc kubenswrapper[4594]: I1129 05:32:08.181733 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"d14553ddf24c06fc89f9c52e3caa848752b5b3baf9ae1a8ccc909d0be0e4eb0a"} Nov 29 05:32:08 crc kubenswrapper[4594]: I1129 05:32:08.181969 4594 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="1629901e-7133-4eb0-b634-5e458da9f205" Nov 29 05:32:08 crc kubenswrapper[4594]: I1129 05:32:08.181989 4594 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="1629901e-7133-4eb0-b634-5e458da9f205" Nov 29 05:32:08 crc kubenswrapper[4594]: I1129 05:32:08.182269 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 05:32:09 crc kubenswrapper[4594]: I1129 05:32:09.191315 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Nov 29 05:32:09 crc kubenswrapper[4594]: I1129 05:32:09.191403 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"19bb351b050d76960d2049ec21adfaaf9a10c4204e608d18942de8f3b873a321"} Nov 29 05:32:11 crc kubenswrapper[4594]: I1129 05:32:11.095958 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 05:32:11 crc kubenswrapper[4594]: I1129 05:32:11.096166 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 05:32:11 crc kubenswrapper[4594]: I1129 05:32:11.099941 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 05:32:13 crc kubenswrapper[4594]: I1129 05:32:13.571280 4594 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 05:32:14 crc kubenswrapper[4594]: I1129 05:32:14.209562 4594 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="1629901e-7133-4eb0-b634-5e458da9f205" Nov 29 05:32:14 crc kubenswrapper[4594]: I1129 05:32:14.209591 4594 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="1629901e-7133-4eb0-b634-5e458da9f205" Nov 29 05:32:14 crc kubenswrapper[4594]: I1129 05:32:14.212517 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 05:32:14 crc kubenswrapper[4594]: I1129 05:32:14.382394 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 29 05:32:14 crc kubenswrapper[4594]: I1129 05:32:14.385018 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 29 05:32:15 crc kubenswrapper[4594]: I1129 05:32:15.213223 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 29 05:32:15 crc kubenswrapper[4594]: I1129 05:32:15.213489 4594 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="1629901e-7133-4eb0-b634-5e458da9f205" Nov 29 05:32:15 crc kubenswrapper[4594]: I1129 05:32:15.213512 4594 mirror_client.go:130] "Deleting a mirror pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="1629901e-7133-4eb0-b634-5e458da9f205" Nov 29 05:32:16 crc kubenswrapper[4594]: I1129 05:32:16.095237 4594 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="1f7e20fc-d941-4612-a393-a1b07d545239" Nov 29 05:32:21 crc kubenswrapper[4594]: I1129 05:32:21.288753 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Nov 29 05:32:23 crc kubenswrapper[4594]: I1129 05:32:23.299487 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Nov 29 05:32:23 crc kubenswrapper[4594]: I1129 05:32:23.686146 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Nov 29 05:32:25 crc kubenswrapper[4594]: I1129 05:32:25.026672 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Nov 29 05:32:25 crc kubenswrapper[4594]: I1129 05:32:25.601218 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Nov 29 05:32:26 crc kubenswrapper[4594]: I1129 05:32:26.011205 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Nov 29 05:32:26 crc kubenswrapper[4594]: I1129 05:32:26.187279 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Nov 29 05:32:26 crc kubenswrapper[4594]: I1129 05:32:26.359592 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Nov 29 05:32:26 crc kubenswrapper[4594]: I1129 
05:32:26.459024 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Nov 29 05:32:26 crc kubenswrapper[4594]: I1129 05:32:26.475534 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Nov 29 05:32:26 crc kubenswrapper[4594]: I1129 05:32:26.795438 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Nov 29 05:32:26 crc kubenswrapper[4594]: I1129 05:32:26.829440 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Nov 29 05:32:27 crc kubenswrapper[4594]: I1129 05:32:27.232603 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Nov 29 05:32:27 crc kubenswrapper[4594]: I1129 05:32:27.329468 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Nov 29 05:32:27 crc kubenswrapper[4594]: I1129 05:32:27.502078 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Nov 29 05:32:27 crc kubenswrapper[4594]: I1129 05:32:27.507951 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 29 05:32:27 crc kubenswrapper[4594]: I1129 05:32:27.591824 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Nov 29 05:32:27 crc kubenswrapper[4594]: I1129 05:32:27.655483 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Nov 29 05:32:27 crc kubenswrapper[4594]: I1129 05:32:27.807121 4594 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication-operator"/"openshift-service-ca.crt" Nov 29 05:32:27 crc kubenswrapper[4594]: I1129 05:32:27.907398 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Nov 29 05:32:27 crc kubenswrapper[4594]: I1129 05:32:27.970941 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Nov 29 05:32:28 crc kubenswrapper[4594]: I1129 05:32:28.090658 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Nov 29 05:32:28 crc kubenswrapper[4594]: I1129 05:32:28.135081 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Nov 29 05:32:28 crc kubenswrapper[4594]: I1129 05:32:28.161830 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Nov 29 05:32:28 crc kubenswrapper[4594]: I1129 05:32:28.180866 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Nov 29 05:32:28 crc kubenswrapper[4594]: I1129 05:32:28.187998 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Nov 29 05:32:28 crc kubenswrapper[4594]: I1129 05:32:28.361915 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Nov 29 05:32:28 crc kubenswrapper[4594]: I1129 05:32:28.377359 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Nov 29 05:32:28 crc kubenswrapper[4594]: I1129 05:32:28.479577 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Nov 29 05:32:28 crc kubenswrapper[4594]: I1129 05:32:28.489305 4594 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Nov 29 05:32:28 crc kubenswrapper[4594]: I1129 05:32:28.558946 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Nov 29 05:32:28 crc kubenswrapper[4594]: I1129 05:32:28.647142 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Nov 29 05:32:28 crc kubenswrapper[4594]: I1129 05:32:28.674677 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Nov 29 05:32:28 crc kubenswrapper[4594]: I1129 05:32:28.694999 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Nov 29 05:32:28 crc kubenswrapper[4594]: I1129 05:32:28.751539 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Nov 29 05:32:28 crc kubenswrapper[4594]: I1129 05:32:28.881077 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Nov 29 05:32:28 crc kubenswrapper[4594]: I1129 05:32:28.913580 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Nov 29 05:32:28 crc kubenswrapper[4594]: I1129 05:32:28.921333 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Nov 29 05:32:28 crc kubenswrapper[4594]: I1129 05:32:28.937491 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Nov 29 05:32:29 crc kubenswrapper[4594]: I1129 05:32:29.088767 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Nov 29 05:32:29 crc kubenswrapper[4594]: I1129 
05:32:29.114939 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Nov 29 05:32:29 crc kubenswrapper[4594]: I1129 05:32:29.159142 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Nov 29 05:32:29 crc kubenswrapper[4594]: I1129 05:32:29.182034 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Nov 29 05:32:29 crc kubenswrapper[4594]: I1129 05:32:29.562002 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Nov 29 05:32:29 crc kubenswrapper[4594]: I1129 05:32:29.624414 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Nov 29 05:32:29 crc kubenswrapper[4594]: I1129 05:32:29.627704 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Nov 29 05:32:29 crc kubenswrapper[4594]: I1129 05:32:29.770216 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Nov 29 05:32:29 crc kubenswrapper[4594]: I1129 05:32:29.785434 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Nov 29 05:32:29 crc kubenswrapper[4594]: I1129 05:32:29.806357 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Nov 29 05:32:29 crc kubenswrapper[4594]: I1129 05:32:29.901107 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Nov 29 05:32:29 crc kubenswrapper[4594]: I1129 05:32:29.992152 4594 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-network-operator"/"metrics-tls" Nov 29 05:32:30 crc kubenswrapper[4594]: I1129 05:32:30.085462 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Nov 29 05:32:30 crc kubenswrapper[4594]: I1129 05:32:30.208664 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Nov 29 05:32:30 crc kubenswrapper[4594]: I1129 05:32:30.217675 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Nov 29 05:32:30 crc kubenswrapper[4594]: I1129 05:32:30.224512 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Nov 29 05:32:30 crc kubenswrapper[4594]: I1129 05:32:30.290093 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Nov 29 05:32:30 crc kubenswrapper[4594]: I1129 05:32:30.336016 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Nov 29 05:32:30 crc kubenswrapper[4594]: I1129 05:32:30.536341 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Nov 29 05:32:30 crc kubenswrapper[4594]: I1129 05:32:30.715522 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Nov 29 05:32:30 crc kubenswrapper[4594]: I1129 05:32:30.828135 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Nov 29 05:32:30 crc kubenswrapper[4594]: I1129 05:32:30.833316 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Nov 29 05:32:31 crc kubenswrapper[4594]: I1129 05:32:31.156612 4594 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Nov 29 05:32:31 crc kubenswrapper[4594]: I1129 05:32:31.164611 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Nov 29 05:32:31 crc kubenswrapper[4594]: I1129 05:32:31.176667 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Nov 29 05:32:31 crc kubenswrapper[4594]: I1129 05:32:31.265308 4594 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Nov 29 05:32:31 crc kubenswrapper[4594]: I1129 05:32:31.369336 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Nov 29 05:32:31 crc kubenswrapper[4594]: I1129 05:32:31.407081 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Nov 29 05:32:31 crc kubenswrapper[4594]: I1129 05:32:31.531138 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Nov 29 05:32:31 crc kubenswrapper[4594]: I1129 05:32:31.750890 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Nov 29 05:32:31 crc kubenswrapper[4594]: I1129 05:32:31.779778 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Nov 29 05:32:31 crc kubenswrapper[4594]: I1129 05:32:31.788677 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Nov 29 05:32:31 crc kubenswrapper[4594]: I1129 05:32:31.790279 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Nov 29 05:32:31 crc 
kubenswrapper[4594]: I1129 05:32:31.846010 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Nov 29 05:32:31 crc kubenswrapper[4594]: I1129 05:32:31.869935 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Nov 29 05:32:31 crc kubenswrapper[4594]: I1129 05:32:31.929438 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Nov 29 05:32:31 crc kubenswrapper[4594]: I1129 05:32:31.945503 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Nov 29 05:32:32 crc kubenswrapper[4594]: I1129 05:32:32.026229 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Nov 29 05:32:32 crc kubenswrapper[4594]: I1129 05:32:32.044656 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Nov 29 05:32:32 crc kubenswrapper[4594]: I1129 05:32:32.104038 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Nov 29 05:32:32 crc kubenswrapper[4594]: I1129 05:32:32.120322 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Nov 29 05:32:32 crc kubenswrapper[4594]: I1129 05:32:32.202073 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Nov 29 05:32:32 crc kubenswrapper[4594]: I1129 05:32:32.210435 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Nov 29 05:32:32 crc kubenswrapper[4594]: I1129 05:32:32.234867 4594 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-config-operator"/"config-operator-serving-cert" Nov 29 05:32:32 crc kubenswrapper[4594]: I1129 05:32:32.239588 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Nov 29 05:32:32 crc kubenswrapper[4594]: I1129 05:32:32.525993 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Nov 29 05:32:32 crc kubenswrapper[4594]: I1129 05:32:32.544461 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Nov 29 05:32:32 crc kubenswrapper[4594]: I1129 05:32:32.687115 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Nov 29 05:32:32 crc kubenswrapper[4594]: I1129 05:32:32.865067 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Nov 29 05:32:32 crc kubenswrapper[4594]: I1129 05:32:32.935648 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Nov 29 05:32:33 crc kubenswrapper[4594]: I1129 05:32:33.135766 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Nov 29 05:32:33 crc kubenswrapper[4594]: I1129 05:32:33.153342 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Nov 29 05:32:33 crc kubenswrapper[4594]: I1129 05:32:33.172471 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Nov 29 05:32:33 crc kubenswrapper[4594]: I1129 05:32:33.185616 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Nov 29 05:32:33 crc 
kubenswrapper[4594]: I1129 05:32:33.187436 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Nov 29 05:32:33 crc kubenswrapper[4594]: I1129 05:32:33.188629 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Nov 29 05:32:33 crc kubenswrapper[4594]: I1129 05:32:33.213550 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Nov 29 05:32:33 crc kubenswrapper[4594]: I1129 05:32:33.282494 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Nov 29 05:32:33 crc kubenswrapper[4594]: I1129 05:32:33.287360 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Nov 29 05:32:33 crc kubenswrapper[4594]: I1129 05:32:33.295559 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Nov 29 05:32:33 crc kubenswrapper[4594]: I1129 05:32:33.367042 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Nov 29 05:32:33 crc kubenswrapper[4594]: I1129 05:32:33.383790 4594 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Nov 29 05:32:33 crc kubenswrapper[4594]: I1129 05:32:33.387042 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Nov 29 05:32:33 crc kubenswrapper[4594]: I1129 05:32:33.387090 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Nov 29 05:32:33 crc kubenswrapper[4594]: I1129 05:32:33.391824 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 05:32:33 crc kubenswrapper[4594]: I1129 05:32:33.402752 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=20.402738848 podStartE2EDuration="20.402738848s" podCreationTimestamp="2025-11-29 05:32:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:32:33.400306715 +0000 UTC m=+277.640815935" watchObservedRunningTime="2025-11-29 05:32:33.402738848 +0000 UTC m=+277.643248068" Nov 29 05:32:33 crc kubenswrapper[4594]: I1129 05:32:33.404947 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Nov 29 05:32:33 crc kubenswrapper[4594]: I1129 05:32:33.429239 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Nov 29 05:32:33 crc kubenswrapper[4594]: I1129 05:32:33.431935 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Nov 29 05:32:33 crc kubenswrapper[4594]: I1129 05:32:33.489279 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Nov 29 05:32:33 crc kubenswrapper[4594]: I1129 05:32:33.519886 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Nov 29 05:32:33 crc kubenswrapper[4594]: I1129 05:32:33.558241 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Nov 29 05:32:33 crc kubenswrapper[4594]: I1129 05:32:33.604568 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Nov 29 05:32:33 crc kubenswrapper[4594]: I1129 05:32:33.614579 
4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Nov 29 05:32:33 crc kubenswrapper[4594]: I1129 05:32:33.690009 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Nov 29 05:32:33 crc kubenswrapper[4594]: I1129 05:32:33.717774 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Nov 29 05:32:33 crc kubenswrapper[4594]: I1129 05:32:33.758986 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Nov 29 05:32:33 crc kubenswrapper[4594]: I1129 05:32:33.794074 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Nov 29 05:32:33 crc kubenswrapper[4594]: I1129 05:32:33.859902 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Nov 29 05:32:34 crc kubenswrapper[4594]: I1129 05:32:34.097015 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Nov 29 05:32:34 crc kubenswrapper[4594]: I1129 05:32:34.106953 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Nov 29 05:32:34 crc kubenswrapper[4594]: I1129 05:32:34.122587 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Nov 29 05:32:34 crc kubenswrapper[4594]: I1129 05:32:34.149440 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Nov 29 05:32:34 crc kubenswrapper[4594]: I1129 05:32:34.166094 4594 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Nov 29 05:32:34 crc kubenswrapper[4594]: I1129 05:32:34.166326 4594 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Nov 29 05:32:34 crc kubenswrapper[4594]: I1129 05:32:34.190520 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Nov 29 05:32:34 crc kubenswrapper[4594]: I1129 05:32:34.291223 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Nov 29 05:32:34 crc kubenswrapper[4594]: I1129 05:32:34.485390 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Nov 29 05:32:34 crc kubenswrapper[4594]: I1129 05:32:34.513430 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Nov 29 05:32:34 crc kubenswrapper[4594]: I1129 05:32:34.526710 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Nov 29 05:32:34 crc kubenswrapper[4594]: I1129 05:32:34.540875 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Nov 29 05:32:34 crc kubenswrapper[4594]: I1129 05:32:34.545446 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Nov 29 05:32:34 crc kubenswrapper[4594]: I1129 05:32:34.638584 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Nov 29 05:32:34 crc kubenswrapper[4594]: I1129 05:32:34.687187 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Nov 29 05:32:34 crc kubenswrapper[4594]: I1129 05:32:34.694067 4594 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Nov 29 05:32:34 crc kubenswrapper[4594]: I1129 05:32:34.740302 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Nov 29 05:32:34 crc kubenswrapper[4594]: I1129 05:32:34.795019 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Nov 29 05:32:34 crc kubenswrapper[4594]: I1129 05:32:34.803788 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Nov 29 05:32:34 crc kubenswrapper[4594]: I1129 05:32:34.820577 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Nov 29 05:32:34 crc kubenswrapper[4594]: I1129 05:32:34.837665 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Nov 29 05:32:34 crc kubenswrapper[4594]: I1129 05:32:34.962313 4594 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Nov 29 05:32:34 crc kubenswrapper[4594]: I1129 05:32:34.962504 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://38aab5ae66eae5cac723593d965221c52600bc64d7798f2039061e0749a74df1" gracePeriod=5 Nov 29 05:32:35 crc kubenswrapper[4594]: I1129 05:32:35.006832 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Nov 29 05:32:35 crc kubenswrapper[4594]: I1129 05:32:35.078471 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" 
Nov 29 05:32:35 crc kubenswrapper[4594]: I1129 05:32:35.111116 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Nov 29 05:32:35 crc kubenswrapper[4594]: I1129 05:32:35.145231 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Nov 29 05:32:35 crc kubenswrapper[4594]: I1129 05:32:35.187165 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Nov 29 05:32:35 crc kubenswrapper[4594]: I1129 05:32:35.194383 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Nov 29 05:32:35 crc kubenswrapper[4594]: I1129 05:32:35.329032 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Nov 29 05:32:35 crc kubenswrapper[4594]: I1129 05:32:35.390123 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Nov 29 05:32:35 crc kubenswrapper[4594]: I1129 05:32:35.426237 4594 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Nov 29 05:32:35 crc kubenswrapper[4594]: I1129 05:32:35.433130 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Nov 29 05:32:35 crc kubenswrapper[4594]: I1129 05:32:35.585843 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Nov 29 05:32:35 crc kubenswrapper[4594]: I1129 05:32:35.662247 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Nov 29 05:32:35 crc kubenswrapper[4594]: I1129 05:32:35.699429 4594 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Nov 29 05:32:35 crc kubenswrapper[4594]: I1129 05:32:35.712670 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Nov 29 05:32:35 crc kubenswrapper[4594]: I1129 05:32:35.749967 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Nov 29 05:32:35 crc kubenswrapper[4594]: I1129 05:32:35.753785 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Nov 29 05:32:35 crc kubenswrapper[4594]: I1129 05:32:35.762221 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Nov 29 05:32:35 crc kubenswrapper[4594]: I1129 05:32:35.777643 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Nov 29 05:32:35 crc kubenswrapper[4594]: I1129 05:32:35.778276 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Nov 29 05:32:35 crc kubenswrapper[4594]: I1129 05:32:35.804799 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Nov 29 05:32:35 crc kubenswrapper[4594]: I1129 05:32:35.808516 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Nov 29 05:32:35 crc kubenswrapper[4594]: I1129 05:32:35.944935 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Nov 29 05:32:35 crc kubenswrapper[4594]: I1129 05:32:35.997417 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Nov 29 05:32:36 crc kubenswrapper[4594]: I1129 05:32:36.038109 4594 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Nov 29 05:32:36 crc kubenswrapper[4594]: I1129 05:32:36.047190 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Nov 29 05:32:36 crc kubenswrapper[4594]: I1129 05:32:36.071321 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Nov 29 05:32:36 crc kubenswrapper[4594]: I1129 05:32:36.107461 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Nov 29 05:32:36 crc kubenswrapper[4594]: I1129 05:32:36.119808 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Nov 29 05:32:36 crc kubenswrapper[4594]: I1129 05:32:36.161869 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Nov 29 05:32:36 crc kubenswrapper[4594]: I1129 05:32:36.171586 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Nov 29 05:32:36 crc kubenswrapper[4594]: I1129 05:32:36.286747 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Nov 29 05:32:36 crc kubenswrapper[4594]: I1129 05:32:36.302080 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Nov 29 05:32:36 crc kubenswrapper[4594]: I1129 05:32:36.307119 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Nov 29 05:32:36 crc kubenswrapper[4594]: I1129 05:32:36.340868 4594 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver"/"kube-root-ca.crt" Nov 29 05:32:36 crc kubenswrapper[4594]: I1129 05:32:36.470344 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Nov 29 05:32:36 crc kubenswrapper[4594]: I1129 05:32:36.484123 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Nov 29 05:32:36 crc kubenswrapper[4594]: I1129 05:32:36.504829 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Nov 29 05:32:36 crc kubenswrapper[4594]: I1129 05:32:36.569172 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Nov 29 05:32:36 crc kubenswrapper[4594]: I1129 05:32:36.675699 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Nov 29 05:32:36 crc kubenswrapper[4594]: I1129 05:32:36.692632 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Nov 29 05:32:36 crc kubenswrapper[4594]: I1129 05:32:36.701759 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Nov 29 05:32:36 crc kubenswrapper[4594]: I1129 05:32:36.702623 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Nov 29 05:32:36 crc kubenswrapper[4594]: I1129 05:32:36.741656 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Nov 29 05:32:36 crc kubenswrapper[4594]: I1129 05:32:36.761905 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Nov 29 05:32:36 crc kubenswrapper[4594]: 
I1129 05:32:36.853599 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Nov 29 05:32:37 crc kubenswrapper[4594]: I1129 05:32:37.003416 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Nov 29 05:32:37 crc kubenswrapper[4594]: I1129 05:32:37.222642 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Nov 29 05:32:37 crc kubenswrapper[4594]: I1129 05:32:37.223048 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Nov 29 05:32:37 crc kubenswrapper[4594]: I1129 05:32:37.345618 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Nov 29 05:32:37 crc kubenswrapper[4594]: I1129 05:32:37.364482 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Nov 29 05:32:37 crc kubenswrapper[4594]: I1129 05:32:37.399972 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Nov 29 05:32:37 crc kubenswrapper[4594]: I1129 05:32:37.585170 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Nov 29 05:32:37 crc kubenswrapper[4594]: I1129 05:32:37.700345 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Nov 29 05:32:37 crc kubenswrapper[4594]: I1129 05:32:37.796958 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Nov 29 05:32:37 crc kubenswrapper[4594]: I1129 05:32:37.807692 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Nov 29 05:32:37 
crc kubenswrapper[4594]: I1129 05:32:37.875686 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Nov 29 05:32:37 crc kubenswrapper[4594]: I1129 05:32:37.908744 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Nov 29 05:32:37 crc kubenswrapper[4594]: I1129 05:32:37.919296 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Nov 29 05:32:38 crc kubenswrapper[4594]: I1129 05:32:38.013777 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Nov 29 05:32:38 crc kubenswrapper[4594]: I1129 05:32:38.072395 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Nov 29 05:32:38 crc kubenswrapper[4594]: I1129 05:32:38.191719 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Nov 29 05:32:38 crc kubenswrapper[4594]: I1129 05:32:38.195018 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Nov 29 05:32:38 crc kubenswrapper[4594]: I1129 05:32:38.302733 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Nov 29 05:32:38 crc kubenswrapper[4594]: I1129 05:32:38.353082 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Nov 29 05:32:38 crc kubenswrapper[4594]: I1129 05:32:38.401146 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Nov 29 05:32:38 crc kubenswrapper[4594]: I1129 05:32:38.401344 4594 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Nov 29 05:32:38 crc kubenswrapper[4594]: I1129 05:32:38.440016 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Nov 29 05:32:38 crc kubenswrapper[4594]: I1129 05:32:38.454963 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Nov 29 05:32:38 crc kubenswrapper[4594]: I1129 05:32:38.462376 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Nov 29 05:32:38 crc kubenswrapper[4594]: I1129 05:32:38.468179 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Nov 29 05:32:38 crc kubenswrapper[4594]: I1129 05:32:38.591190 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Nov 29 05:32:38 crc kubenswrapper[4594]: I1129 05:32:38.691304 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Nov 29 05:32:38 crc kubenswrapper[4594]: I1129 05:32:38.728473 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Nov 29 05:32:38 crc kubenswrapper[4594]: I1129 05:32:38.760327 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Nov 29 05:32:38 crc kubenswrapper[4594]: I1129 05:32:38.779632 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Nov 29 05:32:38 crc kubenswrapper[4594]: I1129 05:32:38.805345 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Nov 29 05:32:38 crc kubenswrapper[4594]: 
I1129 05:32:38.840787 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Nov 29 05:32:38 crc kubenswrapper[4594]: I1129 05:32:38.910006 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Nov 29 05:32:38 crc kubenswrapper[4594]: I1129 05:32:38.930769 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Nov 29 05:32:38 crc kubenswrapper[4594]: I1129 05:32:38.970839 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Nov 29 05:32:39 crc kubenswrapper[4594]: I1129 05:32:39.008041 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Nov 29 05:32:39 crc kubenswrapper[4594]: I1129 05:32:39.143499 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Nov 29 05:32:39 crc kubenswrapper[4594]: I1129 05:32:39.219299 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Nov 29 05:32:39 crc kubenswrapper[4594]: I1129 05:32:39.220083 4594 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Nov 29 05:32:39 crc kubenswrapper[4594]: I1129 05:32:39.285111 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Nov 29 05:32:39 crc kubenswrapper[4594]: I1129 05:32:39.407075 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Nov 29 05:32:39 crc kubenswrapper[4594]: I1129 05:32:39.409666 4594 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"hostpath-provisioner"/"kube-root-ca.crt" Nov 29 05:32:39 crc kubenswrapper[4594]: I1129 05:32:39.415305 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Nov 29 05:32:39 crc kubenswrapper[4594]: I1129 05:32:39.604560 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Nov 29 05:32:39 crc kubenswrapper[4594]: I1129 05:32:39.763883 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Nov 29 05:32:39 crc kubenswrapper[4594]: I1129 05:32:39.914911 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Nov 29 05:32:39 crc kubenswrapper[4594]: I1129 05:32:39.920543 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Nov 29 05:32:40 crc kubenswrapper[4594]: I1129 05:32:40.149284 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Nov 29 05:32:40 crc kubenswrapper[4594]: I1129 05:32:40.178502 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Nov 29 05:32:40 crc kubenswrapper[4594]: I1129 05:32:40.309533 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Nov 29 05:32:40 crc kubenswrapper[4594]: I1129 05:32:40.309762 4594 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="38aab5ae66eae5cac723593d965221c52600bc64d7798f2039061e0749a74df1" exitCode=137 Nov 29 05:32:40 crc kubenswrapper[4594]: I1129 05:32:40.358888 4594 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-authentication"/"v4-0-config-user-template-login" Nov 29 05:32:40 crc kubenswrapper[4594]: I1129 05:32:40.473507 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Nov 29 05:32:40 crc kubenswrapper[4594]: I1129 05:32:40.495346 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Nov 29 05:32:40 crc kubenswrapper[4594]: I1129 05:32:40.507612 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Nov 29 05:32:40 crc kubenswrapper[4594]: I1129 05:32:40.517734 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Nov 29 05:32:40 crc kubenswrapper[4594]: I1129 05:32:40.517808 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 29 05:32:40 crc kubenswrapper[4594]: I1129 05:32:40.539589 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Nov 29 05:32:40 crc kubenswrapper[4594]: I1129 05:32:40.573665 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Nov 29 05:32:40 crc kubenswrapper[4594]: I1129 05:32:40.657954 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Nov 29 05:32:40 crc kubenswrapper[4594]: I1129 05:32:40.672607 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Nov 29 05:32:40 crc kubenswrapper[4594]: I1129 05:32:40.673533 4594 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Nov 29 05:32:40 crc kubenswrapper[4594]: I1129 05:32:40.681678 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Nov 29 05:32:40 crc kubenswrapper[4594]: I1129 05:32:40.697960 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Nov 29 05:32:40 crc kubenswrapper[4594]: I1129 05:32:40.698045 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Nov 29 05:32:40 crc kubenswrapper[4594]: I1129 05:32:40.698066 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Nov 29 05:32:40 crc kubenswrapper[4594]: I1129 05:32:40.698082 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Nov 29 05:32:40 crc kubenswrapper[4594]: I1129 05:32:40.698101 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 05:32:40 crc kubenswrapper[4594]: I1129 05:32:40.698123 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Nov 29 05:32:40 crc kubenswrapper[4594]: I1129 05:32:40.698199 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 05:32:40 crc kubenswrapper[4594]: I1129 05:32:40.698201 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 05:32:40 crc kubenswrapper[4594]: I1129 05:32:40.698276 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 05:32:40 crc kubenswrapper[4594]: I1129 05:32:40.698461 4594 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Nov 29 05:32:40 crc kubenswrapper[4594]: I1129 05:32:40.698474 4594 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Nov 29 05:32:40 crc kubenswrapper[4594]: I1129 05:32:40.698483 4594 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Nov 29 05:32:40 crc kubenswrapper[4594]: I1129 05:32:40.698491 4594 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Nov 29 05:32:40 crc kubenswrapper[4594]: I1129 05:32:40.708093 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 05:32:40 crc kubenswrapper[4594]: I1129 05:32:40.799307 4594 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Nov 29 05:32:40 crc kubenswrapper[4594]: I1129 05:32:40.882372 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Nov 29 05:32:40 crc kubenswrapper[4594]: I1129 05:32:40.955923 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Nov 29 05:32:41 crc kubenswrapper[4594]: I1129 05:32:41.051858 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Nov 29 05:32:41 crc kubenswrapper[4594]: I1129 05:32:41.285156 4594 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Nov 29 05:32:41 crc kubenswrapper[4594]: I1129 05:32:41.328715 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Nov 29 05:32:41 crc kubenswrapper[4594]: I1129 05:32:41.328783 4594 scope.go:117] "RemoveContainer" containerID="38aab5ae66eae5cac723593d965221c52600bc64d7798f2039061e0749a74df1" Nov 29 05:32:41 crc kubenswrapper[4594]: I1129 05:32:41.328851 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 29 05:32:41 crc kubenswrapper[4594]: I1129 05:32:41.375483 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Nov 29 05:32:41 crc kubenswrapper[4594]: I1129 05:32:41.396225 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Nov 29 05:32:41 crc kubenswrapper[4594]: I1129 05:32:41.507754 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Nov 29 05:32:41 crc kubenswrapper[4594]: I1129 05:32:41.538016 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Nov 29 05:32:41 crc kubenswrapper[4594]: I1129 05:32:41.878151 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Nov 29 05:32:42 crc kubenswrapper[4594]: I1129 05:32:42.090227 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Nov 29 05:32:42 crc kubenswrapper[4594]: I1129 05:32:42.099333 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Nov 29 05:32:42 crc kubenswrapper[4594]: I1129 05:32:42.271740 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Nov 29 05:32:43 crc kubenswrapper[4594]: I1129 05:32:43.626994 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Nov 29 05:32:43 crc kubenswrapper[4594]: I1129 05:32:43.685908 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Nov 29 05:32:43 crc 
kubenswrapper[4594]: I1129 05:32:43.875623 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Nov 29 05:33:10 crc kubenswrapper[4594]: I1129 05:33:10.296043 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-scxzk"] Nov 29 05:33:10 crc kubenswrapper[4594]: I1129 05:33:10.296693 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-scxzk" podUID="864d1c0b-3c85-4472-9d16-c8d5c574e02a" containerName="controller-manager" containerID="cri-o://4ef591495522298ee55bf70323d52dd483bbe98056d92534ed3a10c74ff2f22d" gracePeriod=30 Nov 29 05:33:10 crc kubenswrapper[4594]: I1129 05:33:10.298556 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-zvvn6"] Nov 29 05:33:10 crc kubenswrapper[4594]: I1129 05:33:10.298670 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zvvn6" podUID="652260eb-911a-4b92-8cea-73c9e8f156c4" containerName="route-controller-manager" containerID="cri-o://0e0e3257b5f508dae9810188a0ad1ca6db901a70732dcc72b62da7e20d1eb528" gracePeriod=30 Nov 29 05:33:10 crc kubenswrapper[4594]: I1129 05:33:10.462793 4594 generic.go:334] "Generic (PLEG): container finished" podID="864d1c0b-3c85-4472-9d16-c8d5c574e02a" containerID="4ef591495522298ee55bf70323d52dd483bbe98056d92534ed3a10c74ff2f22d" exitCode=0 Nov 29 05:33:10 crc kubenswrapper[4594]: I1129 05:33:10.462863 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-scxzk" event={"ID":"864d1c0b-3c85-4472-9d16-c8d5c574e02a","Type":"ContainerDied","Data":"4ef591495522298ee55bf70323d52dd483bbe98056d92534ed3a10c74ff2f22d"} Nov 29 05:33:10 crc kubenswrapper[4594]: I1129 
05:33:10.466052 4594 generic.go:334] "Generic (PLEG): container finished" podID="652260eb-911a-4b92-8cea-73c9e8f156c4" containerID="0e0e3257b5f508dae9810188a0ad1ca6db901a70732dcc72b62da7e20d1eb528" exitCode=0 Nov 29 05:33:10 crc kubenswrapper[4594]: I1129 05:33:10.466085 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zvvn6" event={"ID":"652260eb-911a-4b92-8cea-73c9e8f156c4","Type":"ContainerDied","Data":"0e0e3257b5f508dae9810188a0ad1ca6db901a70732dcc72b62da7e20d1eb528"} Nov 29 05:33:10 crc kubenswrapper[4594]: I1129 05:33:10.601423 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zvvn6" Nov 29 05:33:10 crc kubenswrapper[4594]: I1129 05:33:10.605279 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-scxzk" Nov 29 05:33:10 crc kubenswrapper[4594]: I1129 05:33:10.738934 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zxbv4\" (UniqueName: \"kubernetes.io/projected/864d1c0b-3c85-4472-9d16-c8d5c574e02a-kube-api-access-zxbv4\") pod \"864d1c0b-3c85-4472-9d16-c8d5c574e02a\" (UID: \"864d1c0b-3c85-4472-9d16-c8d5c574e02a\") " Nov 29 05:33:10 crc kubenswrapper[4594]: I1129 05:33:10.739109 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mvjv\" (UniqueName: \"kubernetes.io/projected/652260eb-911a-4b92-8cea-73c9e8f156c4-kube-api-access-7mvjv\") pod \"652260eb-911a-4b92-8cea-73c9e8f156c4\" (UID: \"652260eb-911a-4b92-8cea-73c9e8f156c4\") " Nov 29 05:33:10 crc kubenswrapper[4594]: I1129 05:33:10.739192 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/652260eb-911a-4b92-8cea-73c9e8f156c4-config\") pod 
\"652260eb-911a-4b92-8cea-73c9e8f156c4\" (UID: \"652260eb-911a-4b92-8cea-73c9e8f156c4\") " Nov 29 05:33:10 crc kubenswrapper[4594]: I1129 05:33:10.739296 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/864d1c0b-3c85-4472-9d16-c8d5c574e02a-serving-cert\") pod \"864d1c0b-3c85-4472-9d16-c8d5c574e02a\" (UID: \"864d1c0b-3c85-4472-9d16-c8d5c574e02a\") " Nov 29 05:33:10 crc kubenswrapper[4594]: I1129 05:33:10.739379 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/864d1c0b-3c85-4472-9d16-c8d5c574e02a-proxy-ca-bundles\") pod \"864d1c0b-3c85-4472-9d16-c8d5c574e02a\" (UID: \"864d1c0b-3c85-4472-9d16-c8d5c574e02a\") " Nov 29 05:33:10 crc kubenswrapper[4594]: I1129 05:33:10.739444 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/652260eb-911a-4b92-8cea-73c9e8f156c4-client-ca\") pod \"652260eb-911a-4b92-8cea-73c9e8f156c4\" (UID: \"652260eb-911a-4b92-8cea-73c9e8f156c4\") " Nov 29 05:33:10 crc kubenswrapper[4594]: I1129 05:33:10.739510 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/652260eb-911a-4b92-8cea-73c9e8f156c4-serving-cert\") pod \"652260eb-911a-4b92-8cea-73c9e8f156c4\" (UID: \"652260eb-911a-4b92-8cea-73c9e8f156c4\") " Nov 29 05:33:10 crc kubenswrapper[4594]: I1129 05:33:10.739590 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/864d1c0b-3c85-4472-9d16-c8d5c574e02a-config\") pod \"864d1c0b-3c85-4472-9d16-c8d5c574e02a\" (UID: \"864d1c0b-3c85-4472-9d16-c8d5c574e02a\") " Nov 29 05:33:10 crc kubenswrapper[4594]: I1129 05:33:10.739735 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/864d1c0b-3c85-4472-9d16-c8d5c574e02a-client-ca\") pod \"864d1c0b-3c85-4472-9d16-c8d5c574e02a\" (UID: \"864d1c0b-3c85-4472-9d16-c8d5c574e02a\") " Nov 29 05:33:10 crc kubenswrapper[4594]: I1129 05:33:10.740180 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/864d1c0b-3c85-4472-9d16-c8d5c574e02a-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "864d1c0b-3c85-4472-9d16-c8d5c574e02a" (UID: "864d1c0b-3c85-4472-9d16-c8d5c574e02a"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:33:10 crc kubenswrapper[4594]: I1129 05:33:10.740209 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/864d1c0b-3c85-4472-9d16-c8d5c574e02a-client-ca" (OuterVolumeSpecName: "client-ca") pod "864d1c0b-3c85-4472-9d16-c8d5c574e02a" (UID: "864d1c0b-3c85-4472-9d16-c8d5c574e02a"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:33:10 crc kubenswrapper[4594]: I1129 05:33:10.740436 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/652260eb-911a-4b92-8cea-73c9e8f156c4-client-ca" (OuterVolumeSpecName: "client-ca") pod "652260eb-911a-4b92-8cea-73c9e8f156c4" (UID: "652260eb-911a-4b92-8cea-73c9e8f156c4"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:33:10 crc kubenswrapper[4594]: I1129 05:33:10.740591 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/652260eb-911a-4b92-8cea-73c9e8f156c4-config" (OuterVolumeSpecName: "config") pod "652260eb-911a-4b92-8cea-73c9e8f156c4" (UID: "652260eb-911a-4b92-8cea-73c9e8f156c4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:33:10 crc kubenswrapper[4594]: I1129 05:33:10.740613 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/864d1c0b-3c85-4472-9d16-c8d5c574e02a-config" (OuterVolumeSpecName: "config") pod "864d1c0b-3c85-4472-9d16-c8d5c574e02a" (UID: "864d1c0b-3c85-4472-9d16-c8d5c574e02a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:33:10 crc kubenswrapper[4594]: I1129 05:33:10.744937 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/652260eb-911a-4b92-8cea-73c9e8f156c4-kube-api-access-7mvjv" (OuterVolumeSpecName: "kube-api-access-7mvjv") pod "652260eb-911a-4b92-8cea-73c9e8f156c4" (UID: "652260eb-911a-4b92-8cea-73c9e8f156c4"). InnerVolumeSpecName "kube-api-access-7mvjv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:33:10 crc kubenswrapper[4594]: I1129 05:33:10.745070 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/652260eb-911a-4b92-8cea-73c9e8f156c4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "652260eb-911a-4b92-8cea-73c9e8f156c4" (UID: "652260eb-911a-4b92-8cea-73c9e8f156c4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:33:10 crc kubenswrapper[4594]: I1129 05:33:10.745156 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/864d1c0b-3c85-4472-9d16-c8d5c574e02a-kube-api-access-zxbv4" (OuterVolumeSpecName: "kube-api-access-zxbv4") pod "864d1c0b-3c85-4472-9d16-c8d5c574e02a" (UID: "864d1c0b-3c85-4472-9d16-c8d5c574e02a"). InnerVolumeSpecName "kube-api-access-zxbv4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:33:10 crc kubenswrapper[4594]: I1129 05:33:10.745213 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/864d1c0b-3c85-4472-9d16-c8d5c574e02a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "864d1c0b-3c85-4472-9d16-c8d5c574e02a" (UID: "864d1c0b-3c85-4472-9d16-c8d5c574e02a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:33:10 crc kubenswrapper[4594]: I1129 05:33:10.840682 4594 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/864d1c0b-3c85-4472-9d16-c8d5c574e02a-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 29 05:33:10 crc kubenswrapper[4594]: I1129 05:33:10.840711 4594 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/864d1c0b-3c85-4472-9d16-c8d5c574e02a-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Nov 29 05:33:10 crc kubenswrapper[4594]: I1129 05:33:10.840723 4594 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/652260eb-911a-4b92-8cea-73c9e8f156c4-client-ca\") on node \"crc\" DevicePath \"\"" Nov 29 05:33:10 crc kubenswrapper[4594]: I1129 05:33:10.840731 4594 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/652260eb-911a-4b92-8cea-73c9e8f156c4-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 29 05:33:10 crc kubenswrapper[4594]: I1129 05:33:10.840740 4594 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/864d1c0b-3c85-4472-9d16-c8d5c574e02a-config\") on node \"crc\" DevicePath \"\"" Nov 29 05:33:10 crc kubenswrapper[4594]: I1129 05:33:10.840748 4594 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/864d1c0b-3c85-4472-9d16-c8d5c574e02a-client-ca\") on node \"crc\" DevicePath \"\"" Nov 29 05:33:10 crc kubenswrapper[4594]: I1129 05:33:10.840759 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zxbv4\" (UniqueName: \"kubernetes.io/projected/864d1c0b-3c85-4472-9d16-c8d5c574e02a-kube-api-access-zxbv4\") on node \"crc\" DevicePath \"\"" Nov 29 05:33:10 crc kubenswrapper[4594]: I1129 05:33:10.840768 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mvjv\" (UniqueName: \"kubernetes.io/projected/652260eb-911a-4b92-8cea-73c9e8f156c4-kube-api-access-7mvjv\") on node \"crc\" DevicePath \"\"" Nov 29 05:33:10 crc kubenswrapper[4594]: I1129 05:33:10.840775 4594 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/652260eb-911a-4b92-8cea-73c9e8f156c4-config\") on node \"crc\" DevicePath \"\"" Nov 29 05:33:11 crc kubenswrapper[4594]: I1129 05:33:11.472885 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zvvn6" event={"ID":"652260eb-911a-4b92-8cea-73c9e8f156c4","Type":"ContainerDied","Data":"1fc0fb7ba1ed2dec40f5f72319fc2364bdf8c835465b1fd239fed6cdeb203b21"} Nov 29 05:33:11 crc kubenswrapper[4594]: I1129 05:33:11.472941 4594 scope.go:117] "RemoveContainer" containerID="0e0e3257b5f508dae9810188a0ad1ca6db901a70732dcc72b62da7e20d1eb528" Nov 29 05:33:11 crc kubenswrapper[4594]: I1129 05:33:11.473044 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zvvn6" Nov 29 05:33:11 crc kubenswrapper[4594]: I1129 05:33:11.475088 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-scxzk" event={"ID":"864d1c0b-3c85-4472-9d16-c8d5c574e02a","Type":"ContainerDied","Data":"f7eccd1754457a8922936f3bbc837f9ea4d0400d7db11c4dc6f8ab09b7feedcd"} Nov 29 05:33:11 crc kubenswrapper[4594]: I1129 05:33:11.475136 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-scxzk" Nov 29 05:33:11 crc kubenswrapper[4594]: I1129 05:33:11.487666 4594 scope.go:117] "RemoveContainer" containerID="4ef591495522298ee55bf70323d52dd483bbe98056d92534ed3a10c74ff2f22d" Nov 29 05:33:11 crc kubenswrapper[4594]: I1129 05:33:11.513080 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-zvvn6"] Nov 29 05:33:11 crc kubenswrapper[4594]: I1129 05:33:11.519180 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-zvvn6"] Nov 29 05:33:11 crc kubenswrapper[4594]: I1129 05:33:11.521883 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-scxzk"] Nov 29 05:33:11 crc kubenswrapper[4594]: I1129 05:33:11.524144 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-scxzk"] Nov 29 05:33:11 crc kubenswrapper[4594]: I1129 05:33:11.638153 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-579775c947-hjvbf"] Nov 29 05:33:11 crc kubenswrapper[4594]: E1129 05:33:11.638461 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="652260eb-911a-4b92-8cea-73c9e8f156c4" 
containerName="route-controller-manager" Nov 29 05:33:11 crc kubenswrapper[4594]: I1129 05:33:11.638482 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="652260eb-911a-4b92-8cea-73c9e8f156c4" containerName="route-controller-manager" Nov 29 05:33:11 crc kubenswrapper[4594]: E1129 05:33:11.638496 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="864d1c0b-3c85-4472-9d16-c8d5c574e02a" containerName="controller-manager" Nov 29 05:33:11 crc kubenswrapper[4594]: I1129 05:33:11.638505 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="864d1c0b-3c85-4472-9d16-c8d5c574e02a" containerName="controller-manager" Nov 29 05:33:11 crc kubenswrapper[4594]: E1129 05:33:11.638520 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="031260c4-ed19-4221-b30f-d03a4abb131f" containerName="installer" Nov 29 05:33:11 crc kubenswrapper[4594]: I1129 05:33:11.638526 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="031260c4-ed19-4221-b30f-d03a4abb131f" containerName="installer" Nov 29 05:33:11 crc kubenswrapper[4594]: E1129 05:33:11.638535 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Nov 29 05:33:11 crc kubenswrapper[4594]: I1129 05:33:11.638542 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Nov 29 05:33:11 crc kubenswrapper[4594]: I1129 05:33:11.638669 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Nov 29 05:33:11 crc kubenswrapper[4594]: I1129 05:33:11.638687 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="864d1c0b-3c85-4472-9d16-c8d5c574e02a" containerName="controller-manager" Nov 29 05:33:11 crc kubenswrapper[4594]: I1129 05:33:11.638695 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="652260eb-911a-4b92-8cea-73c9e8f156c4" 
containerName="route-controller-manager" Nov 29 05:33:11 crc kubenswrapper[4594]: I1129 05:33:11.638703 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="031260c4-ed19-4221-b30f-d03a4abb131f" containerName="installer" Nov 29 05:33:11 crc kubenswrapper[4594]: I1129 05:33:11.639218 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-579775c947-hjvbf" Nov 29 05:33:11 crc kubenswrapper[4594]: I1129 05:33:11.640775 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Nov 29 05:33:11 crc kubenswrapper[4594]: I1129 05:33:11.640803 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86bcf5fdcc-56hsf"] Nov 29 05:33:11 crc kubenswrapper[4594]: I1129 05:33:11.641403 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Nov 29 05:33:11 crc kubenswrapper[4594]: I1129 05:33:11.641487 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-86bcf5fdcc-56hsf" Nov 29 05:33:11 crc kubenswrapper[4594]: I1129 05:33:11.641500 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Nov 29 05:33:11 crc kubenswrapper[4594]: I1129 05:33:11.641583 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Nov 29 05:33:11 crc kubenswrapper[4594]: I1129 05:33:11.641956 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Nov 29 05:33:11 crc kubenswrapper[4594]: I1129 05:33:11.644051 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Nov 29 05:33:11 crc kubenswrapper[4594]: I1129 05:33:11.644414 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Nov 29 05:33:11 crc kubenswrapper[4594]: I1129 05:33:11.644475 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Nov 29 05:33:11 crc kubenswrapper[4594]: I1129 05:33:11.645342 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Nov 29 05:33:11 crc kubenswrapper[4594]: I1129 05:33:11.645711 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Nov 29 05:33:11 crc kubenswrapper[4594]: I1129 05:33:11.645848 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Nov 29 05:33:11 crc kubenswrapper[4594]: I1129 05:33:11.647847 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Nov 29 05:33:11 crc 
kubenswrapper[4594]: I1129 05:33:11.648890 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Nov 29 05:33:11 crc kubenswrapper[4594]: I1129 05:33:11.649940 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/393366d0-af17-4143-ae04-d42c84a300ec-client-ca\") pod \"route-controller-manager-86bcf5fdcc-56hsf\" (UID: \"393366d0-af17-4143-ae04-d42c84a300ec\") " pod="openshift-route-controller-manager/route-controller-manager-86bcf5fdcc-56hsf" Nov 29 05:33:11 crc kubenswrapper[4594]: I1129 05:33:11.650030 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5a31ebcd-de10-478c-9c35-ed4e22084801-client-ca\") pod \"controller-manager-579775c947-hjvbf\" (UID: \"5a31ebcd-de10-478c-9c35-ed4e22084801\") " pod="openshift-controller-manager/controller-manager-579775c947-hjvbf" Nov 29 05:33:11 crc kubenswrapper[4594]: I1129 05:33:11.650195 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqqz6\" (UniqueName: \"kubernetes.io/projected/5a31ebcd-de10-478c-9c35-ed4e22084801-kube-api-access-lqqz6\") pod \"controller-manager-579775c947-hjvbf\" (UID: \"5a31ebcd-de10-478c-9c35-ed4e22084801\") " pod="openshift-controller-manager/controller-manager-579775c947-hjvbf" Nov 29 05:33:11 crc kubenswrapper[4594]: I1129 05:33:11.650271 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqdjx\" (UniqueName: \"kubernetes.io/projected/393366d0-af17-4143-ae04-d42c84a300ec-kube-api-access-nqdjx\") pod \"route-controller-manager-86bcf5fdcc-56hsf\" (UID: \"393366d0-af17-4143-ae04-d42c84a300ec\") " pod="openshift-route-controller-manager/route-controller-manager-86bcf5fdcc-56hsf" Nov 29 05:33:11 crc 
kubenswrapper[4594]: I1129 05:33:11.650300 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/393366d0-af17-4143-ae04-d42c84a300ec-serving-cert\") pod \"route-controller-manager-86bcf5fdcc-56hsf\" (UID: \"393366d0-af17-4143-ae04-d42c84a300ec\") " pod="openshift-route-controller-manager/route-controller-manager-86bcf5fdcc-56hsf" Nov 29 05:33:11 crc kubenswrapper[4594]: I1129 05:33:11.650327 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5a31ebcd-de10-478c-9c35-ed4e22084801-proxy-ca-bundles\") pod \"controller-manager-579775c947-hjvbf\" (UID: \"5a31ebcd-de10-478c-9c35-ed4e22084801\") " pod="openshift-controller-manager/controller-manager-579775c947-hjvbf" Nov 29 05:33:11 crc kubenswrapper[4594]: I1129 05:33:11.650350 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a31ebcd-de10-478c-9c35-ed4e22084801-serving-cert\") pod \"controller-manager-579775c947-hjvbf\" (UID: \"5a31ebcd-de10-478c-9c35-ed4e22084801\") " pod="openshift-controller-manager/controller-manager-579775c947-hjvbf" Nov 29 05:33:11 crc kubenswrapper[4594]: I1129 05:33:11.650382 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/393366d0-af17-4143-ae04-d42c84a300ec-config\") pod \"route-controller-manager-86bcf5fdcc-56hsf\" (UID: \"393366d0-af17-4143-ae04-d42c84a300ec\") " pod="openshift-route-controller-manager/route-controller-manager-86bcf5fdcc-56hsf" Nov 29 05:33:11 crc kubenswrapper[4594]: I1129 05:33:11.650433 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/5a31ebcd-de10-478c-9c35-ed4e22084801-config\") pod \"controller-manager-579775c947-hjvbf\" (UID: \"5a31ebcd-de10-478c-9c35-ed4e22084801\") " pod="openshift-controller-manager/controller-manager-579775c947-hjvbf" Nov 29 05:33:11 crc kubenswrapper[4594]: I1129 05:33:11.650526 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-579775c947-hjvbf"] Nov 29 05:33:11 crc kubenswrapper[4594]: I1129 05:33:11.652885 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86bcf5fdcc-56hsf"] Nov 29 05:33:11 crc kubenswrapper[4594]: I1129 05:33:11.751894 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/393366d0-af17-4143-ae04-d42c84a300ec-serving-cert\") pod \"route-controller-manager-86bcf5fdcc-56hsf\" (UID: \"393366d0-af17-4143-ae04-d42c84a300ec\") " pod="openshift-route-controller-manager/route-controller-manager-86bcf5fdcc-56hsf" Nov 29 05:33:11 crc kubenswrapper[4594]: I1129 05:33:11.751936 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqdjx\" (UniqueName: \"kubernetes.io/projected/393366d0-af17-4143-ae04-d42c84a300ec-kube-api-access-nqdjx\") pod \"route-controller-manager-86bcf5fdcc-56hsf\" (UID: \"393366d0-af17-4143-ae04-d42c84a300ec\") " pod="openshift-route-controller-manager/route-controller-manager-86bcf5fdcc-56hsf" Nov 29 05:33:11 crc kubenswrapper[4594]: I1129 05:33:11.751966 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5a31ebcd-de10-478c-9c35-ed4e22084801-proxy-ca-bundles\") pod \"controller-manager-579775c947-hjvbf\" (UID: \"5a31ebcd-de10-478c-9c35-ed4e22084801\") " pod="openshift-controller-manager/controller-manager-579775c947-hjvbf" Nov 29 05:33:11 crc kubenswrapper[4594]: I1129 
05:33:11.751986 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a31ebcd-de10-478c-9c35-ed4e22084801-serving-cert\") pod \"controller-manager-579775c947-hjvbf\" (UID: \"5a31ebcd-de10-478c-9c35-ed4e22084801\") " pod="openshift-controller-manager/controller-manager-579775c947-hjvbf" Nov 29 05:33:11 crc kubenswrapper[4594]: I1129 05:33:11.752006 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/393366d0-af17-4143-ae04-d42c84a300ec-config\") pod \"route-controller-manager-86bcf5fdcc-56hsf\" (UID: \"393366d0-af17-4143-ae04-d42c84a300ec\") " pod="openshift-route-controller-manager/route-controller-manager-86bcf5fdcc-56hsf" Nov 29 05:33:11 crc kubenswrapper[4594]: I1129 05:33:11.752049 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a31ebcd-de10-478c-9c35-ed4e22084801-config\") pod \"controller-manager-579775c947-hjvbf\" (UID: \"5a31ebcd-de10-478c-9c35-ed4e22084801\") " pod="openshift-controller-manager/controller-manager-579775c947-hjvbf" Nov 29 05:33:11 crc kubenswrapper[4594]: I1129 05:33:11.752095 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/393366d0-af17-4143-ae04-d42c84a300ec-client-ca\") pod \"route-controller-manager-86bcf5fdcc-56hsf\" (UID: \"393366d0-af17-4143-ae04-d42c84a300ec\") " pod="openshift-route-controller-manager/route-controller-manager-86bcf5fdcc-56hsf" Nov 29 05:33:11 crc kubenswrapper[4594]: I1129 05:33:11.752133 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5a31ebcd-de10-478c-9c35-ed4e22084801-client-ca\") pod \"controller-manager-579775c947-hjvbf\" (UID: \"5a31ebcd-de10-478c-9c35-ed4e22084801\") " 
pod="openshift-controller-manager/controller-manager-579775c947-hjvbf" Nov 29 05:33:11 crc kubenswrapper[4594]: I1129 05:33:11.752183 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqqz6\" (UniqueName: \"kubernetes.io/projected/5a31ebcd-de10-478c-9c35-ed4e22084801-kube-api-access-lqqz6\") pod \"controller-manager-579775c947-hjvbf\" (UID: \"5a31ebcd-de10-478c-9c35-ed4e22084801\") " pod="openshift-controller-manager/controller-manager-579775c947-hjvbf" Nov 29 05:33:11 crc kubenswrapper[4594]: I1129 05:33:11.753183 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/393366d0-af17-4143-ae04-d42c84a300ec-client-ca\") pod \"route-controller-manager-86bcf5fdcc-56hsf\" (UID: \"393366d0-af17-4143-ae04-d42c84a300ec\") " pod="openshift-route-controller-manager/route-controller-manager-86bcf5fdcc-56hsf" Nov 29 05:33:11 crc kubenswrapper[4594]: I1129 05:33:11.753423 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/393366d0-af17-4143-ae04-d42c84a300ec-config\") pod \"route-controller-manager-86bcf5fdcc-56hsf\" (UID: \"393366d0-af17-4143-ae04-d42c84a300ec\") " pod="openshift-route-controller-manager/route-controller-manager-86bcf5fdcc-56hsf" Nov 29 05:33:11 crc kubenswrapper[4594]: I1129 05:33:11.753537 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5a31ebcd-de10-478c-9c35-ed4e22084801-proxy-ca-bundles\") pod \"controller-manager-579775c947-hjvbf\" (UID: \"5a31ebcd-de10-478c-9c35-ed4e22084801\") " pod="openshift-controller-manager/controller-manager-579775c947-hjvbf" Nov 29 05:33:11 crc kubenswrapper[4594]: I1129 05:33:11.754014 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/5a31ebcd-de10-478c-9c35-ed4e22084801-client-ca\") pod \"controller-manager-579775c947-hjvbf\" (UID: \"5a31ebcd-de10-478c-9c35-ed4e22084801\") " pod="openshift-controller-manager/controller-manager-579775c947-hjvbf" Nov 29 05:33:11 crc kubenswrapper[4594]: I1129 05:33:11.754868 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a31ebcd-de10-478c-9c35-ed4e22084801-config\") pod \"controller-manager-579775c947-hjvbf\" (UID: \"5a31ebcd-de10-478c-9c35-ed4e22084801\") " pod="openshift-controller-manager/controller-manager-579775c947-hjvbf" Nov 29 05:33:11 crc kubenswrapper[4594]: I1129 05:33:11.757398 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a31ebcd-de10-478c-9c35-ed4e22084801-serving-cert\") pod \"controller-manager-579775c947-hjvbf\" (UID: \"5a31ebcd-de10-478c-9c35-ed4e22084801\") " pod="openshift-controller-manager/controller-manager-579775c947-hjvbf" Nov 29 05:33:11 crc kubenswrapper[4594]: I1129 05:33:11.758007 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/393366d0-af17-4143-ae04-d42c84a300ec-serving-cert\") pod \"route-controller-manager-86bcf5fdcc-56hsf\" (UID: \"393366d0-af17-4143-ae04-d42c84a300ec\") " pod="openshift-route-controller-manager/route-controller-manager-86bcf5fdcc-56hsf" Nov 29 05:33:11 crc kubenswrapper[4594]: I1129 05:33:11.765186 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqdjx\" (UniqueName: \"kubernetes.io/projected/393366d0-af17-4143-ae04-d42c84a300ec-kube-api-access-nqdjx\") pod \"route-controller-manager-86bcf5fdcc-56hsf\" (UID: \"393366d0-af17-4143-ae04-d42c84a300ec\") " pod="openshift-route-controller-manager/route-controller-manager-86bcf5fdcc-56hsf" Nov 29 05:33:11 crc kubenswrapper[4594]: I1129 05:33:11.766637 4594 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqqz6\" (UniqueName: \"kubernetes.io/projected/5a31ebcd-de10-478c-9c35-ed4e22084801-kube-api-access-lqqz6\") pod \"controller-manager-579775c947-hjvbf\" (UID: \"5a31ebcd-de10-478c-9c35-ed4e22084801\") " pod="openshift-controller-manager/controller-manager-579775c947-hjvbf" Nov 29 05:33:11 crc kubenswrapper[4594]: I1129 05:33:11.956562 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-579775c947-hjvbf" Nov 29 05:33:11 crc kubenswrapper[4594]: I1129 05:33:11.961603 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-86bcf5fdcc-56hsf" Nov 29 05:33:12 crc kubenswrapper[4594]: I1129 05:33:12.089492 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="652260eb-911a-4b92-8cea-73c9e8f156c4" path="/var/lib/kubelet/pods/652260eb-911a-4b92-8cea-73c9e8f156c4/volumes" Nov 29 05:33:12 crc kubenswrapper[4594]: I1129 05:33:12.090246 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="864d1c0b-3c85-4472-9d16-c8d5c574e02a" path="/var/lib/kubelet/pods/864d1c0b-3c85-4472-9d16-c8d5c574e02a/volumes" Nov 29 05:33:12 crc kubenswrapper[4594]: I1129 05:33:12.322361 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86bcf5fdcc-56hsf"] Nov 29 05:33:12 crc kubenswrapper[4594]: I1129 05:33:12.353621 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-579775c947-hjvbf"] Nov 29 05:33:12 crc kubenswrapper[4594]: W1129 05:33:12.373407 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a31ebcd_de10_478c_9c35_ed4e22084801.slice/crio-bb0438e217cb22af5101d87ada9b4a759a52ff05dc41c7e31df194ada75d1b6b 
WatchSource:0}: Error finding container bb0438e217cb22af5101d87ada9b4a759a52ff05dc41c7e31df194ada75d1b6b: Status 404 returned error can't find the container with id bb0438e217cb22af5101d87ada9b4a759a52ff05dc41c7e31df194ada75d1b6b Nov 29 05:33:12 crc kubenswrapper[4594]: I1129 05:33:12.484180 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-86bcf5fdcc-56hsf" event={"ID":"393366d0-af17-4143-ae04-d42c84a300ec","Type":"ContainerStarted","Data":"d3eba8a4fb2e63f7e5285a35a9a5bb95af9194f4b5d888fa7202524de0b0501c"} Nov 29 05:33:12 crc kubenswrapper[4594]: I1129 05:33:12.484549 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-86bcf5fdcc-56hsf" Nov 29 05:33:12 crc kubenswrapper[4594]: I1129 05:33:12.484563 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-86bcf5fdcc-56hsf" event={"ID":"393366d0-af17-4143-ae04-d42c84a300ec","Type":"ContainerStarted","Data":"b4226d0503814e8008ad16843c7f8fa94d86e115cd73379ba87d317ceab197ca"} Nov 29 05:33:12 crc kubenswrapper[4594]: I1129 05:33:12.485937 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-579775c947-hjvbf" event={"ID":"5a31ebcd-de10-478c-9c35-ed4e22084801","Type":"ContainerStarted","Data":"112376069ebfe690ffae1a1b2db136f37475b824e2dcc04daddc066062409f81"} Nov 29 05:33:12 crc kubenswrapper[4594]: I1129 05:33:12.485968 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-579775c947-hjvbf" event={"ID":"5a31ebcd-de10-478c-9c35-ed4e22084801","Type":"ContainerStarted","Data":"bb0438e217cb22af5101d87ada9b4a759a52ff05dc41c7e31df194ada75d1b6b"} Nov 29 05:33:12 crc kubenswrapper[4594]: I1129 05:33:12.486158 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-controller-manager/controller-manager-579775c947-hjvbf" Nov 29 05:33:12 crc kubenswrapper[4594]: I1129 05:33:12.487285 4594 patch_prober.go:28] interesting pod/controller-manager-579775c947-hjvbf container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.63:8443/healthz\": dial tcp 10.217.0.63:8443: connect: connection refused" start-of-body= Nov 29 05:33:12 crc kubenswrapper[4594]: I1129 05:33:12.487344 4594 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-579775c947-hjvbf" podUID="5a31ebcd-de10-478c-9c35-ed4e22084801" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.63:8443/healthz\": dial tcp 10.217.0.63:8443: connect: connection refused" Nov 29 05:33:12 crc kubenswrapper[4594]: I1129 05:33:12.500946 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-86bcf5fdcc-56hsf" podStartSLOduration=2.50093114 podStartE2EDuration="2.50093114s" podCreationTimestamp="2025-11-29 05:33:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:33:12.498010538 +0000 UTC m=+316.738519758" watchObservedRunningTime="2025-11-29 05:33:12.50093114 +0000 UTC m=+316.741440359" Nov 29 05:33:12 crc kubenswrapper[4594]: I1129 05:33:12.510487 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-579775c947-hjvbf" podStartSLOduration=2.510474645 podStartE2EDuration="2.510474645s" podCreationTimestamp="2025-11-29 05:33:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:33:12.509598558 +0000 UTC m=+316.750107778" watchObservedRunningTime="2025-11-29 
05:33:12.510474645 +0000 UTC m=+316.750983865" Nov 29 05:33:12 crc kubenswrapper[4594]: I1129 05:33:12.555074 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-579775c947-hjvbf"] Nov 29 05:33:12 crc kubenswrapper[4594]: I1129 05:33:12.559606 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86bcf5fdcc-56hsf"] Nov 29 05:33:12 crc kubenswrapper[4594]: I1129 05:33:12.732114 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-86bcf5fdcc-56hsf" Nov 29 05:33:13 crc kubenswrapper[4594]: I1129 05:33:13.503454 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-579775c947-hjvbf" Nov 29 05:33:14 crc kubenswrapper[4594]: I1129 05:33:14.501231 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-579775c947-hjvbf" podUID="5a31ebcd-de10-478c-9c35-ed4e22084801" containerName="controller-manager" containerID="cri-o://112376069ebfe690ffae1a1b2db136f37475b824e2dcc04daddc066062409f81" gracePeriod=30 Nov 29 05:33:14 crc kubenswrapper[4594]: I1129 05:33:14.501636 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-86bcf5fdcc-56hsf" podUID="393366d0-af17-4143-ae04-d42c84a300ec" containerName="route-controller-manager" containerID="cri-o://d3eba8a4fb2e63f7e5285a35a9a5bb95af9194f4b5d888fa7202524de0b0501c" gracePeriod=30 Nov 29 05:33:14 crc kubenswrapper[4594]: I1129 05:33:14.833633 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-579775c947-hjvbf" Nov 29 05:33:14 crc kubenswrapper[4594]: I1129 05:33:14.838105 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-86bcf5fdcc-56hsf" Nov 29 05:33:14 crc kubenswrapper[4594]: I1129 05:33:14.860223 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-64f4f8899d-5wl7j"] Nov 29 05:33:14 crc kubenswrapper[4594]: E1129 05:33:14.860445 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="393366d0-af17-4143-ae04-d42c84a300ec" containerName="route-controller-manager" Nov 29 05:33:14 crc kubenswrapper[4594]: I1129 05:33:14.860460 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="393366d0-af17-4143-ae04-d42c84a300ec" containerName="route-controller-manager" Nov 29 05:33:14 crc kubenswrapper[4594]: E1129 05:33:14.860475 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a31ebcd-de10-478c-9c35-ed4e22084801" containerName="controller-manager" Nov 29 05:33:14 crc kubenswrapper[4594]: I1129 05:33:14.860533 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a31ebcd-de10-478c-9c35-ed4e22084801" containerName="controller-manager" Nov 29 05:33:14 crc kubenswrapper[4594]: I1129 05:33:14.860684 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="393366d0-af17-4143-ae04-d42c84a300ec" containerName="route-controller-manager" Nov 29 05:33:14 crc kubenswrapper[4594]: I1129 05:33:14.860696 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a31ebcd-de10-478c-9c35-ed4e22084801" containerName="controller-manager" Nov 29 05:33:14 crc kubenswrapper[4594]: I1129 05:33:14.861018 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-64f4f8899d-5wl7j" Nov 29 05:33:14 crc kubenswrapper[4594]: I1129 05:33:14.873409 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-64f4f8899d-5wl7j"] Nov 29 05:33:14 crc kubenswrapper[4594]: I1129 05:33:14.889267 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/393366d0-af17-4143-ae04-d42c84a300ec-serving-cert\") pod \"393366d0-af17-4143-ae04-d42c84a300ec\" (UID: \"393366d0-af17-4143-ae04-d42c84a300ec\") " Nov 29 05:33:14 crc kubenswrapper[4594]: I1129 05:33:14.889336 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/393366d0-af17-4143-ae04-d42c84a300ec-config\") pod \"393366d0-af17-4143-ae04-d42c84a300ec\" (UID: \"393366d0-af17-4143-ae04-d42c84a300ec\") " Nov 29 05:33:14 crc kubenswrapper[4594]: I1129 05:33:14.889389 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a31ebcd-de10-478c-9c35-ed4e22084801-serving-cert\") pod \"5a31ebcd-de10-478c-9c35-ed4e22084801\" (UID: \"5a31ebcd-de10-478c-9c35-ed4e22084801\") " Nov 29 05:33:14 crc kubenswrapper[4594]: I1129 05:33:14.889410 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a31ebcd-de10-478c-9c35-ed4e22084801-config\") pod \"5a31ebcd-de10-478c-9c35-ed4e22084801\" (UID: \"5a31ebcd-de10-478c-9c35-ed4e22084801\") " Nov 29 05:33:14 crc kubenswrapper[4594]: I1129 05:33:14.889442 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5a31ebcd-de10-478c-9c35-ed4e22084801-proxy-ca-bundles\") pod \"5a31ebcd-de10-478c-9c35-ed4e22084801\" (UID: 
\"5a31ebcd-de10-478c-9c35-ed4e22084801\") " Nov 29 05:33:14 crc kubenswrapper[4594]: I1129 05:33:14.889478 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/393366d0-af17-4143-ae04-d42c84a300ec-client-ca\") pod \"393366d0-af17-4143-ae04-d42c84a300ec\" (UID: \"393366d0-af17-4143-ae04-d42c84a300ec\") " Nov 29 05:33:14 crc kubenswrapper[4594]: I1129 05:33:14.889495 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5a31ebcd-de10-478c-9c35-ed4e22084801-client-ca\") pod \"5a31ebcd-de10-478c-9c35-ed4e22084801\" (UID: \"5a31ebcd-de10-478c-9c35-ed4e22084801\") " Nov 29 05:33:14 crc kubenswrapper[4594]: I1129 05:33:14.889511 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqdjx\" (UniqueName: \"kubernetes.io/projected/393366d0-af17-4143-ae04-d42c84a300ec-kube-api-access-nqdjx\") pod \"393366d0-af17-4143-ae04-d42c84a300ec\" (UID: \"393366d0-af17-4143-ae04-d42c84a300ec\") " Nov 29 05:33:14 crc kubenswrapper[4594]: I1129 05:33:14.889543 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqqz6\" (UniqueName: \"kubernetes.io/projected/5a31ebcd-de10-478c-9c35-ed4e22084801-kube-api-access-lqqz6\") pod \"5a31ebcd-de10-478c-9c35-ed4e22084801\" (UID: \"5a31ebcd-de10-478c-9c35-ed4e22084801\") " Nov 29 05:33:14 crc kubenswrapper[4594]: I1129 05:33:14.889657 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/66e4aafc-b2a5-4f3f-926d-818c48d99f10-proxy-ca-bundles\") pod \"controller-manager-64f4f8899d-5wl7j\" (UID: \"66e4aafc-b2a5-4f3f-926d-818c48d99f10\") " pod="openshift-controller-manager/controller-manager-64f4f8899d-5wl7j" Nov 29 05:33:14 crc kubenswrapper[4594]: I1129 05:33:14.889691 4594 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkn7z\" (UniqueName: \"kubernetes.io/projected/66e4aafc-b2a5-4f3f-926d-818c48d99f10-kube-api-access-rkn7z\") pod \"controller-manager-64f4f8899d-5wl7j\" (UID: \"66e4aafc-b2a5-4f3f-926d-818c48d99f10\") " pod="openshift-controller-manager/controller-manager-64f4f8899d-5wl7j" Nov 29 05:33:14 crc kubenswrapper[4594]: I1129 05:33:14.889716 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66e4aafc-b2a5-4f3f-926d-818c48d99f10-config\") pod \"controller-manager-64f4f8899d-5wl7j\" (UID: \"66e4aafc-b2a5-4f3f-926d-818c48d99f10\") " pod="openshift-controller-manager/controller-manager-64f4f8899d-5wl7j" Nov 29 05:33:14 crc kubenswrapper[4594]: I1129 05:33:14.889734 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/66e4aafc-b2a5-4f3f-926d-818c48d99f10-client-ca\") pod \"controller-manager-64f4f8899d-5wl7j\" (UID: \"66e4aafc-b2a5-4f3f-926d-818c48d99f10\") " pod="openshift-controller-manager/controller-manager-64f4f8899d-5wl7j" Nov 29 05:33:14 crc kubenswrapper[4594]: I1129 05:33:14.889789 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/66e4aafc-b2a5-4f3f-926d-818c48d99f10-serving-cert\") pod \"controller-manager-64f4f8899d-5wl7j\" (UID: \"66e4aafc-b2a5-4f3f-926d-818c48d99f10\") " pod="openshift-controller-manager/controller-manager-64f4f8899d-5wl7j" Nov 29 05:33:14 crc kubenswrapper[4594]: I1129 05:33:14.890492 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a31ebcd-de10-478c-9c35-ed4e22084801-client-ca" (OuterVolumeSpecName: "client-ca") pod "5a31ebcd-de10-478c-9c35-ed4e22084801" (UID: 
"5a31ebcd-de10-478c-9c35-ed4e22084801"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:33:14 crc kubenswrapper[4594]: I1129 05:33:14.890503 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a31ebcd-de10-478c-9c35-ed4e22084801-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "5a31ebcd-de10-478c-9c35-ed4e22084801" (UID: "5a31ebcd-de10-478c-9c35-ed4e22084801"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:33:14 crc kubenswrapper[4594]: I1129 05:33:14.890661 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a31ebcd-de10-478c-9c35-ed4e22084801-config" (OuterVolumeSpecName: "config") pod "5a31ebcd-de10-478c-9c35-ed4e22084801" (UID: "5a31ebcd-de10-478c-9c35-ed4e22084801"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:33:14 crc kubenswrapper[4594]: I1129 05:33:14.890670 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/393366d0-af17-4143-ae04-d42c84a300ec-client-ca" (OuterVolumeSpecName: "client-ca") pod "393366d0-af17-4143-ae04-d42c84a300ec" (UID: "393366d0-af17-4143-ae04-d42c84a300ec"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:33:14 crc kubenswrapper[4594]: I1129 05:33:14.891643 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/393366d0-af17-4143-ae04-d42c84a300ec-config" (OuterVolumeSpecName: "config") pod "393366d0-af17-4143-ae04-d42c84a300ec" (UID: "393366d0-af17-4143-ae04-d42c84a300ec"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:33:14 crc kubenswrapper[4594]: I1129 05:33:14.894310 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a31ebcd-de10-478c-9c35-ed4e22084801-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5a31ebcd-de10-478c-9c35-ed4e22084801" (UID: "5a31ebcd-de10-478c-9c35-ed4e22084801"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:33:14 crc kubenswrapper[4594]: I1129 05:33:14.895468 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a31ebcd-de10-478c-9c35-ed4e22084801-kube-api-access-lqqz6" (OuterVolumeSpecName: "kube-api-access-lqqz6") pod "5a31ebcd-de10-478c-9c35-ed4e22084801" (UID: "5a31ebcd-de10-478c-9c35-ed4e22084801"). InnerVolumeSpecName "kube-api-access-lqqz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:33:14 crc kubenswrapper[4594]: I1129 05:33:14.896412 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/393366d0-af17-4143-ae04-d42c84a300ec-kube-api-access-nqdjx" (OuterVolumeSpecName: "kube-api-access-nqdjx") pod "393366d0-af17-4143-ae04-d42c84a300ec" (UID: "393366d0-af17-4143-ae04-d42c84a300ec"). InnerVolumeSpecName "kube-api-access-nqdjx". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:33:14 crc kubenswrapper[4594]: I1129 05:33:14.905179 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/393366d0-af17-4143-ae04-d42c84a300ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "393366d0-af17-4143-ae04-d42c84a300ec" (UID: "393366d0-af17-4143-ae04-d42c84a300ec"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:33:14 crc kubenswrapper[4594]: I1129 05:33:14.990203 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66e4aafc-b2a5-4f3f-926d-818c48d99f10-config\") pod \"controller-manager-64f4f8899d-5wl7j\" (UID: \"66e4aafc-b2a5-4f3f-926d-818c48d99f10\") " pod="openshift-controller-manager/controller-manager-64f4f8899d-5wl7j" Nov 29 05:33:14 crc kubenswrapper[4594]: I1129 05:33:14.990289 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/66e4aafc-b2a5-4f3f-926d-818c48d99f10-client-ca\") pod \"controller-manager-64f4f8899d-5wl7j\" (UID: \"66e4aafc-b2a5-4f3f-926d-818c48d99f10\") " pod="openshift-controller-manager/controller-manager-64f4f8899d-5wl7j" Nov 29 05:33:14 crc kubenswrapper[4594]: I1129 05:33:14.990349 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/66e4aafc-b2a5-4f3f-926d-818c48d99f10-serving-cert\") pod \"controller-manager-64f4f8899d-5wl7j\" (UID: \"66e4aafc-b2a5-4f3f-926d-818c48d99f10\") " pod="openshift-controller-manager/controller-manager-64f4f8899d-5wl7j" Nov 29 05:33:14 crc kubenswrapper[4594]: I1129 05:33:14.990378 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/66e4aafc-b2a5-4f3f-926d-818c48d99f10-proxy-ca-bundles\") pod \"controller-manager-64f4f8899d-5wl7j\" (UID: \"66e4aafc-b2a5-4f3f-926d-818c48d99f10\") " pod="openshift-controller-manager/controller-manager-64f4f8899d-5wl7j" Nov 29 05:33:14 crc kubenswrapper[4594]: I1129 05:33:14.990406 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkn7z\" (UniqueName: \"kubernetes.io/projected/66e4aafc-b2a5-4f3f-926d-818c48d99f10-kube-api-access-rkn7z\") pod 
\"controller-manager-64f4f8899d-5wl7j\" (UID: \"66e4aafc-b2a5-4f3f-926d-818c48d99f10\") " pod="openshift-controller-manager/controller-manager-64f4f8899d-5wl7j" Nov 29 05:33:14 crc kubenswrapper[4594]: I1129 05:33:14.990442 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqqz6\" (UniqueName: \"kubernetes.io/projected/5a31ebcd-de10-478c-9c35-ed4e22084801-kube-api-access-lqqz6\") on node \"crc\" DevicePath \"\"" Nov 29 05:33:14 crc kubenswrapper[4594]: I1129 05:33:14.990454 4594 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/393366d0-af17-4143-ae04-d42c84a300ec-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 29 05:33:14 crc kubenswrapper[4594]: I1129 05:33:14.990463 4594 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/393366d0-af17-4143-ae04-d42c84a300ec-config\") on node \"crc\" DevicePath \"\"" Nov 29 05:33:14 crc kubenswrapper[4594]: I1129 05:33:14.990474 4594 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a31ebcd-de10-478c-9c35-ed4e22084801-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 29 05:33:14 crc kubenswrapper[4594]: I1129 05:33:14.990482 4594 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a31ebcd-de10-478c-9c35-ed4e22084801-config\") on node \"crc\" DevicePath \"\"" Nov 29 05:33:14 crc kubenswrapper[4594]: I1129 05:33:14.990492 4594 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5a31ebcd-de10-478c-9c35-ed4e22084801-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Nov 29 05:33:14 crc kubenswrapper[4594]: I1129 05:33:14.990500 4594 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/393366d0-af17-4143-ae04-d42c84a300ec-client-ca\") on node \"crc\" DevicePath \"\"" 
Nov 29 05:33:14 crc kubenswrapper[4594]: I1129 05:33:14.990510 4594 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5a31ebcd-de10-478c-9c35-ed4e22084801-client-ca\") on node \"crc\" DevicePath \"\"" Nov 29 05:33:14 crc kubenswrapper[4594]: I1129 05:33:14.990518 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqdjx\" (UniqueName: \"kubernetes.io/projected/393366d0-af17-4143-ae04-d42c84a300ec-kube-api-access-nqdjx\") on node \"crc\" DevicePath \"\"" Nov 29 05:33:14 crc kubenswrapper[4594]: I1129 05:33:14.991677 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66e4aafc-b2a5-4f3f-926d-818c48d99f10-config\") pod \"controller-manager-64f4f8899d-5wl7j\" (UID: \"66e4aafc-b2a5-4f3f-926d-818c48d99f10\") " pod="openshift-controller-manager/controller-manager-64f4f8899d-5wl7j" Nov 29 05:33:14 crc kubenswrapper[4594]: I1129 05:33:14.991842 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/66e4aafc-b2a5-4f3f-926d-818c48d99f10-client-ca\") pod \"controller-manager-64f4f8899d-5wl7j\" (UID: \"66e4aafc-b2a5-4f3f-926d-818c48d99f10\") " pod="openshift-controller-manager/controller-manager-64f4f8899d-5wl7j" Nov 29 05:33:14 crc kubenswrapper[4594]: I1129 05:33:14.991850 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/66e4aafc-b2a5-4f3f-926d-818c48d99f10-proxy-ca-bundles\") pod \"controller-manager-64f4f8899d-5wl7j\" (UID: \"66e4aafc-b2a5-4f3f-926d-818c48d99f10\") " pod="openshift-controller-manager/controller-manager-64f4f8899d-5wl7j" Nov 29 05:33:14 crc kubenswrapper[4594]: I1129 05:33:14.995302 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/66e4aafc-b2a5-4f3f-926d-818c48d99f10-serving-cert\") pod \"controller-manager-64f4f8899d-5wl7j\" (UID: \"66e4aafc-b2a5-4f3f-926d-818c48d99f10\") " pod="openshift-controller-manager/controller-manager-64f4f8899d-5wl7j" Nov 29 05:33:15 crc kubenswrapper[4594]: I1129 05:33:15.005358 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkn7z\" (UniqueName: \"kubernetes.io/projected/66e4aafc-b2a5-4f3f-926d-818c48d99f10-kube-api-access-rkn7z\") pod \"controller-manager-64f4f8899d-5wl7j\" (UID: \"66e4aafc-b2a5-4f3f-926d-818c48d99f10\") " pod="openshift-controller-manager/controller-manager-64f4f8899d-5wl7j" Nov 29 05:33:15 crc kubenswrapper[4594]: I1129 05:33:15.173315 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-64f4f8899d-5wl7j" Nov 29 05:33:15 crc kubenswrapper[4594]: I1129 05:33:15.509758 4594 generic.go:334] "Generic (PLEG): container finished" podID="393366d0-af17-4143-ae04-d42c84a300ec" containerID="d3eba8a4fb2e63f7e5285a35a9a5bb95af9194f4b5d888fa7202524de0b0501c" exitCode=0 Nov 29 05:33:15 crc kubenswrapper[4594]: I1129 05:33:15.509846 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-86bcf5fdcc-56hsf" event={"ID":"393366d0-af17-4143-ae04-d42c84a300ec","Type":"ContainerDied","Data":"d3eba8a4fb2e63f7e5285a35a9a5bb95af9194f4b5d888fa7202524de0b0501c"} Nov 29 05:33:15 crc kubenswrapper[4594]: I1129 05:33:15.509883 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-86bcf5fdcc-56hsf" Nov 29 05:33:15 crc kubenswrapper[4594]: I1129 05:33:15.509926 4594 scope.go:117] "RemoveContainer" containerID="d3eba8a4fb2e63f7e5285a35a9a5bb95af9194f4b5d888fa7202524de0b0501c" Nov 29 05:33:15 crc kubenswrapper[4594]: I1129 05:33:15.509910 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-86bcf5fdcc-56hsf" event={"ID":"393366d0-af17-4143-ae04-d42c84a300ec","Type":"ContainerDied","Data":"b4226d0503814e8008ad16843c7f8fa94d86e115cd73379ba87d317ceab197ca"} Nov 29 05:33:15 crc kubenswrapper[4594]: I1129 05:33:15.512873 4594 generic.go:334] "Generic (PLEG): container finished" podID="5a31ebcd-de10-478c-9c35-ed4e22084801" containerID="112376069ebfe690ffae1a1b2db136f37475b824e2dcc04daddc066062409f81" exitCode=0 Nov 29 05:33:15 crc kubenswrapper[4594]: I1129 05:33:15.512915 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-579775c947-hjvbf" event={"ID":"5a31ebcd-de10-478c-9c35-ed4e22084801","Type":"ContainerDied","Data":"112376069ebfe690ffae1a1b2db136f37475b824e2dcc04daddc066062409f81"} Nov 29 05:33:15 crc kubenswrapper[4594]: I1129 05:33:15.512946 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-579775c947-hjvbf" event={"ID":"5a31ebcd-de10-478c-9c35-ed4e22084801","Type":"ContainerDied","Data":"bb0438e217cb22af5101d87ada9b4a759a52ff05dc41c7e31df194ada75d1b6b"} Nov 29 05:33:15 crc kubenswrapper[4594]: I1129 05:33:15.513007 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-579775c947-hjvbf" Nov 29 05:33:15 crc kubenswrapper[4594]: I1129 05:33:15.527667 4594 scope.go:117] "RemoveContainer" containerID="d3eba8a4fb2e63f7e5285a35a9a5bb95af9194f4b5d888fa7202524de0b0501c" Nov 29 05:33:15 crc kubenswrapper[4594]: E1129 05:33:15.528040 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3eba8a4fb2e63f7e5285a35a9a5bb95af9194f4b5d888fa7202524de0b0501c\": container with ID starting with d3eba8a4fb2e63f7e5285a35a9a5bb95af9194f4b5d888fa7202524de0b0501c not found: ID does not exist" containerID="d3eba8a4fb2e63f7e5285a35a9a5bb95af9194f4b5d888fa7202524de0b0501c" Nov 29 05:33:15 crc kubenswrapper[4594]: I1129 05:33:15.528082 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3eba8a4fb2e63f7e5285a35a9a5bb95af9194f4b5d888fa7202524de0b0501c"} err="failed to get container status \"d3eba8a4fb2e63f7e5285a35a9a5bb95af9194f4b5d888fa7202524de0b0501c\": rpc error: code = NotFound desc = could not find container \"d3eba8a4fb2e63f7e5285a35a9a5bb95af9194f4b5d888fa7202524de0b0501c\": container with ID starting with d3eba8a4fb2e63f7e5285a35a9a5bb95af9194f4b5d888fa7202524de0b0501c not found: ID does not exist" Nov 29 05:33:15 crc kubenswrapper[4594]: I1129 05:33:15.528106 4594 scope.go:117] "RemoveContainer" containerID="112376069ebfe690ffae1a1b2db136f37475b824e2dcc04daddc066062409f81" Nov 29 05:33:15 crc kubenswrapper[4594]: I1129 05:33:15.540069 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86bcf5fdcc-56hsf"] Nov 29 05:33:15 crc kubenswrapper[4594]: I1129 05:33:15.542867 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-64f4f8899d-5wl7j"] Nov 29 05:33:15 crc kubenswrapper[4594]: I1129 05:33:15.546765 4594 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86bcf5fdcc-56hsf"] Nov 29 05:33:15 crc kubenswrapper[4594]: I1129 05:33:15.548788 4594 scope.go:117] "RemoveContainer" containerID="112376069ebfe690ffae1a1b2db136f37475b824e2dcc04daddc066062409f81" Nov 29 05:33:15 crc kubenswrapper[4594]: I1129 05:33:15.549016 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-579775c947-hjvbf"] Nov 29 05:33:15 crc kubenswrapper[4594]: E1129 05:33:15.549133 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"112376069ebfe690ffae1a1b2db136f37475b824e2dcc04daddc066062409f81\": container with ID starting with 112376069ebfe690ffae1a1b2db136f37475b824e2dcc04daddc066062409f81 not found: ID does not exist" containerID="112376069ebfe690ffae1a1b2db136f37475b824e2dcc04daddc066062409f81" Nov 29 05:33:15 crc kubenswrapper[4594]: I1129 05:33:15.549196 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"112376069ebfe690ffae1a1b2db136f37475b824e2dcc04daddc066062409f81"} err="failed to get container status \"112376069ebfe690ffae1a1b2db136f37475b824e2dcc04daddc066062409f81\": rpc error: code = NotFound desc = could not find container \"112376069ebfe690ffae1a1b2db136f37475b824e2dcc04daddc066062409f81\": container with ID starting with 112376069ebfe690ffae1a1b2db136f37475b824e2dcc04daddc066062409f81 not found: ID does not exist" Nov 29 05:33:15 crc kubenswrapper[4594]: I1129 05:33:15.551596 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-579775c947-hjvbf"] Nov 29 05:33:15 crc kubenswrapper[4594]: W1129 05:33:15.553104 4594 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod66e4aafc_b2a5_4f3f_926d_818c48d99f10.slice/crio-c65a85798c5df935ba72148adfb4659dd4677af3100414888f311661011b8509 WatchSource:0}: Error finding container c65a85798c5df935ba72148adfb4659dd4677af3100414888f311661011b8509: Status 404 returned error can't find the container with id c65a85798c5df935ba72148adfb4659dd4677af3100414888f311661011b8509 Nov 29 05:33:16 crc kubenswrapper[4594]: I1129 05:33:16.090030 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="393366d0-af17-4143-ae04-d42c84a300ec" path="/var/lib/kubelet/pods/393366d0-af17-4143-ae04-d42c84a300ec/volumes" Nov 29 05:33:16 crc kubenswrapper[4594]: I1129 05:33:16.090980 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a31ebcd-de10-478c-9c35-ed4e22084801" path="/var/lib/kubelet/pods/5a31ebcd-de10-478c-9c35-ed4e22084801/volumes" Nov 29 05:33:16 crc kubenswrapper[4594]: I1129 05:33:16.522409 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-64f4f8899d-5wl7j" event={"ID":"66e4aafc-b2a5-4f3f-926d-818c48d99f10","Type":"ContainerStarted","Data":"07d22b96555173e9d3a3c7215c31f5003162e03098bdf696f8a29fb877b229c9"} Nov 29 05:33:16 crc kubenswrapper[4594]: I1129 05:33:16.522631 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-64f4f8899d-5wl7j" Nov 29 05:33:16 crc kubenswrapper[4594]: I1129 05:33:16.522696 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-64f4f8899d-5wl7j" event={"ID":"66e4aafc-b2a5-4f3f-926d-818c48d99f10","Type":"ContainerStarted","Data":"c65a85798c5df935ba72148adfb4659dd4677af3100414888f311661011b8509"} Nov 29 05:33:16 crc kubenswrapper[4594]: I1129 05:33:16.527318 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-controller-manager/controller-manager-64f4f8899d-5wl7j" Nov 29 05:33:16 crc kubenswrapper[4594]: I1129 05:33:16.539024 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-64f4f8899d-5wl7j" podStartSLOduration=4.538985872 podStartE2EDuration="4.538985872s" podCreationTimestamp="2025-11-29 05:33:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:33:16.535692771 +0000 UTC m=+320.776201990" watchObservedRunningTime="2025-11-29 05:33:16.538985872 +0000 UTC m=+320.779495093" Nov 29 05:33:17 crc kubenswrapper[4594]: I1129 05:33:17.643350 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-77bbdf4988-r6g96"] Nov 29 05:33:17 crc kubenswrapper[4594]: I1129 05:33:17.644593 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-77bbdf4988-r6g96" Nov 29 05:33:17 crc kubenswrapper[4594]: I1129 05:33:17.646656 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Nov 29 05:33:17 crc kubenswrapper[4594]: I1129 05:33:17.646975 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Nov 29 05:33:17 crc kubenswrapper[4594]: I1129 05:33:17.651865 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Nov 29 05:33:17 crc kubenswrapper[4594]: I1129 05:33:17.652026 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Nov 29 05:33:17 crc kubenswrapper[4594]: I1129 05:33:17.652589 4594 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Nov 29 05:33:17 crc kubenswrapper[4594]: I1129 05:33:17.652653 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Nov 29 05:33:17 crc kubenswrapper[4594]: I1129 05:33:17.654595 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-77bbdf4988-r6g96"] Nov 29 05:33:17 crc kubenswrapper[4594]: I1129 05:33:17.820419 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f74e0bd8-e2f7-4ce6-9fb0-538d0b45a5c8-serving-cert\") pod \"route-controller-manager-77bbdf4988-r6g96\" (UID: \"f74e0bd8-e2f7-4ce6-9fb0-538d0b45a5c8\") " pod="openshift-route-controller-manager/route-controller-manager-77bbdf4988-r6g96" Nov 29 05:33:17 crc kubenswrapper[4594]: I1129 05:33:17.820845 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvfwf\" (UniqueName: \"kubernetes.io/projected/f74e0bd8-e2f7-4ce6-9fb0-538d0b45a5c8-kube-api-access-nvfwf\") pod \"route-controller-manager-77bbdf4988-r6g96\" (UID: \"f74e0bd8-e2f7-4ce6-9fb0-538d0b45a5c8\") " pod="openshift-route-controller-manager/route-controller-manager-77bbdf4988-r6g96" Nov 29 05:33:17 crc kubenswrapper[4594]: I1129 05:33:17.820882 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f74e0bd8-e2f7-4ce6-9fb0-538d0b45a5c8-config\") pod \"route-controller-manager-77bbdf4988-r6g96\" (UID: \"f74e0bd8-e2f7-4ce6-9fb0-538d0b45a5c8\") " pod="openshift-route-controller-manager/route-controller-manager-77bbdf4988-r6g96" Nov 29 05:33:17 crc kubenswrapper[4594]: I1129 05:33:17.820911 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f74e0bd8-e2f7-4ce6-9fb0-538d0b45a5c8-client-ca\") pod \"route-controller-manager-77bbdf4988-r6g96\" (UID: \"f74e0bd8-e2f7-4ce6-9fb0-538d0b45a5c8\") " pod="openshift-route-controller-manager/route-controller-manager-77bbdf4988-r6g96" Nov 29 05:33:17 crc kubenswrapper[4594]: I1129 05:33:17.922400 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvfwf\" (UniqueName: \"kubernetes.io/projected/f74e0bd8-e2f7-4ce6-9fb0-538d0b45a5c8-kube-api-access-nvfwf\") pod \"route-controller-manager-77bbdf4988-r6g96\" (UID: \"f74e0bd8-e2f7-4ce6-9fb0-538d0b45a5c8\") " pod="openshift-route-controller-manager/route-controller-manager-77bbdf4988-r6g96" Nov 29 05:33:17 crc kubenswrapper[4594]: I1129 05:33:17.922482 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f74e0bd8-e2f7-4ce6-9fb0-538d0b45a5c8-config\") pod \"route-controller-manager-77bbdf4988-r6g96\" (UID: \"f74e0bd8-e2f7-4ce6-9fb0-538d0b45a5c8\") " pod="openshift-route-controller-manager/route-controller-manager-77bbdf4988-r6g96" Nov 29 05:33:17 crc kubenswrapper[4594]: I1129 05:33:17.922526 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f74e0bd8-e2f7-4ce6-9fb0-538d0b45a5c8-client-ca\") pod \"route-controller-manager-77bbdf4988-r6g96\" (UID: \"f74e0bd8-e2f7-4ce6-9fb0-538d0b45a5c8\") " pod="openshift-route-controller-manager/route-controller-manager-77bbdf4988-r6g96" Nov 29 05:33:17 crc kubenswrapper[4594]: I1129 05:33:17.922585 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f74e0bd8-e2f7-4ce6-9fb0-538d0b45a5c8-serving-cert\") pod \"route-controller-manager-77bbdf4988-r6g96\" (UID: \"f74e0bd8-e2f7-4ce6-9fb0-538d0b45a5c8\") " 
pod="openshift-route-controller-manager/route-controller-manager-77bbdf4988-r6g96" Nov 29 05:33:17 crc kubenswrapper[4594]: I1129 05:33:17.923372 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f74e0bd8-e2f7-4ce6-9fb0-538d0b45a5c8-client-ca\") pod \"route-controller-manager-77bbdf4988-r6g96\" (UID: \"f74e0bd8-e2f7-4ce6-9fb0-538d0b45a5c8\") " pod="openshift-route-controller-manager/route-controller-manager-77bbdf4988-r6g96" Nov 29 05:33:17 crc kubenswrapper[4594]: I1129 05:33:17.923612 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f74e0bd8-e2f7-4ce6-9fb0-538d0b45a5c8-config\") pod \"route-controller-manager-77bbdf4988-r6g96\" (UID: \"f74e0bd8-e2f7-4ce6-9fb0-538d0b45a5c8\") " pod="openshift-route-controller-manager/route-controller-manager-77bbdf4988-r6g96" Nov 29 05:33:17 crc kubenswrapper[4594]: I1129 05:33:17.928094 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f74e0bd8-e2f7-4ce6-9fb0-538d0b45a5c8-serving-cert\") pod \"route-controller-manager-77bbdf4988-r6g96\" (UID: \"f74e0bd8-e2f7-4ce6-9fb0-538d0b45a5c8\") " pod="openshift-route-controller-manager/route-controller-manager-77bbdf4988-r6g96" Nov 29 05:33:17 crc kubenswrapper[4594]: I1129 05:33:17.936708 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvfwf\" (UniqueName: \"kubernetes.io/projected/f74e0bd8-e2f7-4ce6-9fb0-538d0b45a5c8-kube-api-access-nvfwf\") pod \"route-controller-manager-77bbdf4988-r6g96\" (UID: \"f74e0bd8-e2f7-4ce6-9fb0-538d0b45a5c8\") " pod="openshift-route-controller-manager/route-controller-manager-77bbdf4988-r6g96" Nov 29 05:33:17 crc kubenswrapper[4594]: I1129 05:33:17.959726 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-77bbdf4988-r6g96" Nov 29 05:33:18 crc kubenswrapper[4594]: I1129 05:33:18.355421 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-77bbdf4988-r6g96"] Nov 29 05:33:18 crc kubenswrapper[4594]: W1129 05:33:18.356328 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf74e0bd8_e2f7_4ce6_9fb0_538d0b45a5c8.slice/crio-ac4db64c7186f85cf7531c72c973d07a6b0b6f6d350beed054ec6d32226f1edd WatchSource:0}: Error finding container ac4db64c7186f85cf7531c72c973d07a6b0b6f6d350beed054ec6d32226f1edd: Status 404 returned error can't find the container with id ac4db64c7186f85cf7531c72c973d07a6b0b6f6d350beed054ec6d32226f1edd Nov 29 05:33:18 crc kubenswrapper[4594]: I1129 05:33:18.534314 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-77bbdf4988-r6g96" event={"ID":"f74e0bd8-e2f7-4ce6-9fb0-538d0b45a5c8","Type":"ContainerStarted","Data":"04c26e20154ef4a2ba0bbc07603608e950df0923cbc874d0ca0f05ec82ac02b8"} Nov 29 05:33:18 crc kubenswrapper[4594]: I1129 05:33:18.534629 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-77bbdf4988-r6g96" Nov 29 05:33:18 crc kubenswrapper[4594]: I1129 05:33:18.534644 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-77bbdf4988-r6g96" event={"ID":"f74e0bd8-e2f7-4ce6-9fb0-538d0b45a5c8","Type":"ContainerStarted","Data":"ac4db64c7186f85cf7531c72c973d07a6b0b6f6d350beed054ec6d32226f1edd"} Nov 29 05:33:18 crc kubenswrapper[4594]: I1129 05:33:18.554023 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-77bbdf4988-r6g96" 
podStartSLOduration=6.5540096949999995 podStartE2EDuration="6.554009695s" podCreationTimestamp="2025-11-29 05:33:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:33:18.551873037 +0000 UTC m=+322.792382257" watchObservedRunningTime="2025-11-29 05:33:18.554009695 +0000 UTC m=+322.794518906" Nov 29 05:33:18 crc kubenswrapper[4594]: I1129 05:33:18.717602 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-77bbdf4988-r6g96" Nov 29 05:34:06 crc kubenswrapper[4594]: I1129 05:34:06.099450 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-h5bvf"] Nov 29 05:34:06 crc kubenswrapper[4594]: I1129 05:34:06.100702 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-h5bvf" Nov 29 05:34:06 crc kubenswrapper[4594]: I1129 05:34:06.111061 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-h5bvf"] Nov 29 05:34:06 crc kubenswrapper[4594]: I1129 05:34:06.272656 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2cae78cb-df6e-4fae-bc0d-32f2ffaf605f-registry-tls\") pod \"image-registry-66df7c8f76-h5bvf\" (UID: \"2cae78cb-df6e-4fae-bc0d-32f2ffaf605f\") " pod="openshift-image-registry/image-registry-66df7c8f76-h5bvf" Nov 29 05:34:06 crc kubenswrapper[4594]: I1129 05:34:06.272835 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2cae78cb-df6e-4fae-bc0d-32f2ffaf605f-bound-sa-token\") pod \"image-registry-66df7c8f76-h5bvf\" (UID: \"2cae78cb-df6e-4fae-bc0d-32f2ffaf605f\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-h5bvf" Nov 29 05:34:06 crc kubenswrapper[4594]: I1129 05:34:06.272928 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2cae78cb-df6e-4fae-bc0d-32f2ffaf605f-trusted-ca\") pod \"image-registry-66df7c8f76-h5bvf\" (UID: \"2cae78cb-df6e-4fae-bc0d-32f2ffaf605f\") " pod="openshift-image-registry/image-registry-66df7c8f76-h5bvf" Nov 29 05:34:06 crc kubenswrapper[4594]: I1129 05:34:06.273003 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2cae78cb-df6e-4fae-bc0d-32f2ffaf605f-installation-pull-secrets\") pod \"image-registry-66df7c8f76-h5bvf\" (UID: \"2cae78cb-df6e-4fae-bc0d-32f2ffaf605f\") " pod="openshift-image-registry/image-registry-66df7c8f76-h5bvf" Nov 29 05:34:06 crc kubenswrapper[4594]: I1129 05:34:06.273040 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzx2w\" (UniqueName: \"kubernetes.io/projected/2cae78cb-df6e-4fae-bc0d-32f2ffaf605f-kube-api-access-xzx2w\") pod \"image-registry-66df7c8f76-h5bvf\" (UID: \"2cae78cb-df6e-4fae-bc0d-32f2ffaf605f\") " pod="openshift-image-registry/image-registry-66df7c8f76-h5bvf" Nov 29 05:34:06 crc kubenswrapper[4594]: I1129 05:34:06.273084 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2cae78cb-df6e-4fae-bc0d-32f2ffaf605f-ca-trust-extracted\") pod \"image-registry-66df7c8f76-h5bvf\" (UID: \"2cae78cb-df6e-4fae-bc0d-32f2ffaf605f\") " pod="openshift-image-registry/image-registry-66df7c8f76-h5bvf" Nov 29 05:34:06 crc kubenswrapper[4594]: I1129 05:34:06.273135 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-h5bvf\" (UID: \"2cae78cb-df6e-4fae-bc0d-32f2ffaf605f\") " pod="openshift-image-registry/image-registry-66df7c8f76-h5bvf" Nov 29 05:34:06 crc kubenswrapper[4594]: I1129 05:34:06.273172 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2cae78cb-df6e-4fae-bc0d-32f2ffaf605f-registry-certificates\") pod \"image-registry-66df7c8f76-h5bvf\" (UID: \"2cae78cb-df6e-4fae-bc0d-32f2ffaf605f\") " pod="openshift-image-registry/image-registry-66df7c8f76-h5bvf" Nov 29 05:34:06 crc kubenswrapper[4594]: I1129 05:34:06.311505 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-h5bvf\" (UID: \"2cae78cb-df6e-4fae-bc0d-32f2ffaf605f\") " pod="openshift-image-registry/image-registry-66df7c8f76-h5bvf" Nov 29 05:34:06 crc kubenswrapper[4594]: I1129 05:34:06.374349 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2cae78cb-df6e-4fae-bc0d-32f2ffaf605f-trusted-ca\") pod \"image-registry-66df7c8f76-h5bvf\" (UID: \"2cae78cb-df6e-4fae-bc0d-32f2ffaf605f\") " pod="openshift-image-registry/image-registry-66df7c8f76-h5bvf" Nov 29 05:34:06 crc kubenswrapper[4594]: I1129 05:34:06.374404 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2cae78cb-df6e-4fae-bc0d-32f2ffaf605f-installation-pull-secrets\") pod \"image-registry-66df7c8f76-h5bvf\" (UID: \"2cae78cb-df6e-4fae-bc0d-32f2ffaf605f\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-h5bvf" Nov 29 05:34:06 crc kubenswrapper[4594]: I1129 05:34:06.375474 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzx2w\" (UniqueName: \"kubernetes.io/projected/2cae78cb-df6e-4fae-bc0d-32f2ffaf605f-kube-api-access-xzx2w\") pod \"image-registry-66df7c8f76-h5bvf\" (UID: \"2cae78cb-df6e-4fae-bc0d-32f2ffaf605f\") " pod="openshift-image-registry/image-registry-66df7c8f76-h5bvf" Nov 29 05:34:06 crc kubenswrapper[4594]: I1129 05:34:06.375521 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2cae78cb-df6e-4fae-bc0d-32f2ffaf605f-ca-trust-extracted\") pod \"image-registry-66df7c8f76-h5bvf\" (UID: \"2cae78cb-df6e-4fae-bc0d-32f2ffaf605f\") " pod="openshift-image-registry/image-registry-66df7c8f76-h5bvf" Nov 29 05:34:06 crc kubenswrapper[4594]: I1129 05:34:06.375565 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2cae78cb-df6e-4fae-bc0d-32f2ffaf605f-registry-certificates\") pod \"image-registry-66df7c8f76-h5bvf\" (UID: \"2cae78cb-df6e-4fae-bc0d-32f2ffaf605f\") " pod="openshift-image-registry/image-registry-66df7c8f76-h5bvf" Nov 29 05:34:06 crc kubenswrapper[4594]: I1129 05:34:06.375599 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2cae78cb-df6e-4fae-bc0d-32f2ffaf605f-registry-tls\") pod \"image-registry-66df7c8f76-h5bvf\" (UID: \"2cae78cb-df6e-4fae-bc0d-32f2ffaf605f\") " pod="openshift-image-registry/image-registry-66df7c8f76-h5bvf" Nov 29 05:34:06 crc kubenswrapper[4594]: I1129 05:34:06.375672 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2cae78cb-df6e-4fae-bc0d-32f2ffaf605f-bound-sa-token\") pod 
\"image-registry-66df7c8f76-h5bvf\" (UID: \"2cae78cb-df6e-4fae-bc0d-32f2ffaf605f\") " pod="openshift-image-registry/image-registry-66df7c8f76-h5bvf" Nov 29 05:34:06 crc kubenswrapper[4594]: I1129 05:34:06.376354 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2cae78cb-df6e-4fae-bc0d-32f2ffaf605f-ca-trust-extracted\") pod \"image-registry-66df7c8f76-h5bvf\" (UID: \"2cae78cb-df6e-4fae-bc0d-32f2ffaf605f\") " pod="openshift-image-registry/image-registry-66df7c8f76-h5bvf" Nov 29 05:34:06 crc kubenswrapper[4594]: I1129 05:34:06.376659 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2cae78cb-df6e-4fae-bc0d-32f2ffaf605f-trusted-ca\") pod \"image-registry-66df7c8f76-h5bvf\" (UID: \"2cae78cb-df6e-4fae-bc0d-32f2ffaf605f\") " pod="openshift-image-registry/image-registry-66df7c8f76-h5bvf" Nov 29 05:34:06 crc kubenswrapper[4594]: I1129 05:34:06.377175 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2cae78cb-df6e-4fae-bc0d-32f2ffaf605f-registry-certificates\") pod \"image-registry-66df7c8f76-h5bvf\" (UID: \"2cae78cb-df6e-4fae-bc0d-32f2ffaf605f\") " pod="openshift-image-registry/image-registry-66df7c8f76-h5bvf" Nov 29 05:34:06 crc kubenswrapper[4594]: I1129 05:34:06.380655 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2cae78cb-df6e-4fae-bc0d-32f2ffaf605f-installation-pull-secrets\") pod \"image-registry-66df7c8f76-h5bvf\" (UID: \"2cae78cb-df6e-4fae-bc0d-32f2ffaf605f\") " pod="openshift-image-registry/image-registry-66df7c8f76-h5bvf" Nov 29 05:34:06 crc kubenswrapper[4594]: I1129 05:34:06.385765 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/2cae78cb-df6e-4fae-bc0d-32f2ffaf605f-registry-tls\") pod \"image-registry-66df7c8f76-h5bvf\" (UID: \"2cae78cb-df6e-4fae-bc0d-32f2ffaf605f\") " pod="openshift-image-registry/image-registry-66df7c8f76-h5bvf" Nov 29 05:34:06 crc kubenswrapper[4594]: I1129 05:34:06.402876 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzx2w\" (UniqueName: \"kubernetes.io/projected/2cae78cb-df6e-4fae-bc0d-32f2ffaf605f-kube-api-access-xzx2w\") pod \"image-registry-66df7c8f76-h5bvf\" (UID: \"2cae78cb-df6e-4fae-bc0d-32f2ffaf605f\") " pod="openshift-image-registry/image-registry-66df7c8f76-h5bvf" Nov 29 05:34:06 crc kubenswrapper[4594]: I1129 05:34:06.403486 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2cae78cb-df6e-4fae-bc0d-32f2ffaf605f-bound-sa-token\") pod \"image-registry-66df7c8f76-h5bvf\" (UID: \"2cae78cb-df6e-4fae-bc0d-32f2ffaf605f\") " pod="openshift-image-registry/image-registry-66df7c8f76-h5bvf" Nov 29 05:34:06 crc kubenswrapper[4594]: I1129 05:34:06.412418 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-h5bvf" Nov 29 05:34:06 crc kubenswrapper[4594]: I1129 05:34:06.774862 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-h5bvf"] Nov 29 05:34:07 crc kubenswrapper[4594]: I1129 05:34:07.782310 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-h5bvf" event={"ID":"2cae78cb-df6e-4fae-bc0d-32f2ffaf605f","Type":"ContainerStarted","Data":"164eccfe1cf77920d8247d67558313bfb06ce057cf3c88e607eebe08ba83b127"} Nov 29 05:34:07 crc kubenswrapper[4594]: I1129 05:34:07.782736 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-h5bvf" Nov 29 05:34:07 crc kubenswrapper[4594]: I1129 05:34:07.782757 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-h5bvf" event={"ID":"2cae78cb-df6e-4fae-bc0d-32f2ffaf605f","Type":"ContainerStarted","Data":"d57354bb22fd04eaf8336cc9aac2eaefadb260a6072ecb2d49137b7535eeb472"} Nov 29 05:34:07 crc kubenswrapper[4594]: I1129 05:34:07.799748 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-h5bvf" podStartSLOduration=1.799725639 podStartE2EDuration="1.799725639s" podCreationTimestamp="2025-11-29 05:34:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:34:07.796719202 +0000 UTC m=+372.037228422" watchObservedRunningTime="2025-11-29 05:34:07.799725639 +0000 UTC m=+372.040234850" Nov 29 05:34:10 crc kubenswrapper[4594]: I1129 05:34:10.267214 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-64f4f8899d-5wl7j"] Nov 29 05:34:10 crc kubenswrapper[4594]: I1129 05:34:10.267776 4594 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-64f4f8899d-5wl7j" podUID="66e4aafc-b2a5-4f3f-926d-818c48d99f10" containerName="controller-manager" containerID="cri-o://07d22b96555173e9d3a3c7215c31f5003162e03098bdf696f8a29fb877b229c9" gracePeriod=30 Nov 29 05:34:10 crc kubenswrapper[4594]: I1129 05:34:10.277174 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-77bbdf4988-r6g96"] Nov 29 05:34:10 crc kubenswrapper[4594]: I1129 05:34:10.278356 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-77bbdf4988-r6g96" podUID="f74e0bd8-e2f7-4ce6-9fb0-538d0b45a5c8" containerName="route-controller-manager" containerID="cri-o://04c26e20154ef4a2ba0bbc07603608e950df0923cbc874d0ca0f05ec82ac02b8" gracePeriod=30 Nov 29 05:34:10 crc kubenswrapper[4594]: I1129 05:34:10.644867 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-77bbdf4988-r6g96" Nov 29 05:34:10 crc kubenswrapper[4594]: I1129 05:34:10.648915 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-64f4f8899d-5wl7j" Nov 29 05:34:10 crc kubenswrapper[4594]: I1129 05:34:10.737481 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f74e0bd8-e2f7-4ce6-9fb0-538d0b45a5c8-config\") pod \"f74e0bd8-e2f7-4ce6-9fb0-538d0b45a5c8\" (UID: \"f74e0bd8-e2f7-4ce6-9fb0-538d0b45a5c8\") " Nov 29 05:34:10 crc kubenswrapper[4594]: I1129 05:34:10.737548 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/66e4aafc-b2a5-4f3f-926d-818c48d99f10-proxy-ca-bundles\") pod \"66e4aafc-b2a5-4f3f-926d-818c48d99f10\" (UID: \"66e4aafc-b2a5-4f3f-926d-818c48d99f10\") " Nov 29 05:34:10 crc kubenswrapper[4594]: I1129 05:34:10.737580 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rkn7z\" (UniqueName: \"kubernetes.io/projected/66e4aafc-b2a5-4f3f-926d-818c48d99f10-kube-api-access-rkn7z\") pod \"66e4aafc-b2a5-4f3f-926d-818c48d99f10\" (UID: \"66e4aafc-b2a5-4f3f-926d-818c48d99f10\") " Nov 29 05:34:10 crc kubenswrapper[4594]: I1129 05:34:10.737633 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f74e0bd8-e2f7-4ce6-9fb0-538d0b45a5c8-client-ca\") pod \"f74e0bd8-e2f7-4ce6-9fb0-538d0b45a5c8\" (UID: \"f74e0bd8-e2f7-4ce6-9fb0-538d0b45a5c8\") " Nov 29 05:34:10 crc kubenswrapper[4594]: I1129 05:34:10.737695 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f74e0bd8-e2f7-4ce6-9fb0-538d0b45a5c8-serving-cert\") pod \"f74e0bd8-e2f7-4ce6-9fb0-538d0b45a5c8\" (UID: \"f74e0bd8-e2f7-4ce6-9fb0-538d0b45a5c8\") " Nov 29 05:34:10 crc kubenswrapper[4594]: I1129 05:34:10.737714 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/66e4aafc-b2a5-4f3f-926d-818c48d99f10-config\") pod \"66e4aafc-b2a5-4f3f-926d-818c48d99f10\" (UID: \"66e4aafc-b2a5-4f3f-926d-818c48d99f10\") " Nov 29 05:34:10 crc kubenswrapper[4594]: I1129 05:34:10.737741 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nvfwf\" (UniqueName: \"kubernetes.io/projected/f74e0bd8-e2f7-4ce6-9fb0-538d0b45a5c8-kube-api-access-nvfwf\") pod \"f74e0bd8-e2f7-4ce6-9fb0-538d0b45a5c8\" (UID: \"f74e0bd8-e2f7-4ce6-9fb0-538d0b45a5c8\") " Nov 29 05:34:10 crc kubenswrapper[4594]: I1129 05:34:10.737772 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/66e4aafc-b2a5-4f3f-926d-818c48d99f10-serving-cert\") pod \"66e4aafc-b2a5-4f3f-926d-818c48d99f10\" (UID: \"66e4aafc-b2a5-4f3f-926d-818c48d99f10\") " Nov 29 05:34:10 crc kubenswrapper[4594]: I1129 05:34:10.737798 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/66e4aafc-b2a5-4f3f-926d-818c48d99f10-client-ca\") pod \"66e4aafc-b2a5-4f3f-926d-818c48d99f10\" (UID: \"66e4aafc-b2a5-4f3f-926d-818c48d99f10\") " Nov 29 05:34:10 crc kubenswrapper[4594]: I1129 05:34:10.738380 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66e4aafc-b2a5-4f3f-926d-818c48d99f10-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "66e4aafc-b2a5-4f3f-926d-818c48d99f10" (UID: "66e4aafc-b2a5-4f3f-926d-818c48d99f10"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:34:10 crc kubenswrapper[4594]: I1129 05:34:10.738469 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f74e0bd8-e2f7-4ce6-9fb0-538d0b45a5c8-client-ca" (OuterVolumeSpecName: "client-ca") pod "f74e0bd8-e2f7-4ce6-9fb0-538d0b45a5c8" (UID: "f74e0bd8-e2f7-4ce6-9fb0-538d0b45a5c8"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:34:10 crc kubenswrapper[4594]: I1129 05:34:10.738481 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f74e0bd8-e2f7-4ce6-9fb0-538d0b45a5c8-config" (OuterVolumeSpecName: "config") pod "f74e0bd8-e2f7-4ce6-9fb0-538d0b45a5c8" (UID: "f74e0bd8-e2f7-4ce6-9fb0-538d0b45a5c8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:34:10 crc kubenswrapper[4594]: I1129 05:34:10.738947 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66e4aafc-b2a5-4f3f-926d-818c48d99f10-client-ca" (OuterVolumeSpecName: "client-ca") pod "66e4aafc-b2a5-4f3f-926d-818c48d99f10" (UID: "66e4aafc-b2a5-4f3f-926d-818c48d99f10"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:34:10 crc kubenswrapper[4594]: I1129 05:34:10.739161 4594 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f74e0bd8-e2f7-4ce6-9fb0-538d0b45a5c8-client-ca\") on node \"crc\" DevicePath \"\"" Nov 29 05:34:10 crc kubenswrapper[4594]: I1129 05:34:10.739189 4594 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/66e4aafc-b2a5-4f3f-926d-818c48d99f10-client-ca\") on node \"crc\" DevicePath \"\"" Nov 29 05:34:10 crc kubenswrapper[4594]: I1129 05:34:10.739202 4594 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f74e0bd8-e2f7-4ce6-9fb0-538d0b45a5c8-config\") on node \"crc\" DevicePath \"\"" Nov 29 05:34:10 crc kubenswrapper[4594]: I1129 05:34:10.739175 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66e4aafc-b2a5-4f3f-926d-818c48d99f10-config" (OuterVolumeSpecName: "config") pod "66e4aafc-b2a5-4f3f-926d-818c48d99f10" (UID: "66e4aafc-b2a5-4f3f-926d-818c48d99f10"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:34:10 crc kubenswrapper[4594]: I1129 05:34:10.739210 4594 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/66e4aafc-b2a5-4f3f-926d-818c48d99f10-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Nov 29 05:34:10 crc kubenswrapper[4594]: I1129 05:34:10.743220 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f74e0bd8-e2f7-4ce6-9fb0-538d0b45a5c8-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f74e0bd8-e2f7-4ce6-9fb0-538d0b45a5c8" (UID: "f74e0bd8-e2f7-4ce6-9fb0-538d0b45a5c8"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:34:10 crc kubenswrapper[4594]: I1129 05:34:10.743458 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66e4aafc-b2a5-4f3f-926d-818c48d99f10-kube-api-access-rkn7z" (OuterVolumeSpecName: "kube-api-access-rkn7z") pod "66e4aafc-b2a5-4f3f-926d-818c48d99f10" (UID: "66e4aafc-b2a5-4f3f-926d-818c48d99f10"). InnerVolumeSpecName "kube-api-access-rkn7z". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:34:10 crc kubenswrapper[4594]: I1129 05:34:10.743686 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f74e0bd8-e2f7-4ce6-9fb0-538d0b45a5c8-kube-api-access-nvfwf" (OuterVolumeSpecName: "kube-api-access-nvfwf") pod "f74e0bd8-e2f7-4ce6-9fb0-538d0b45a5c8" (UID: "f74e0bd8-e2f7-4ce6-9fb0-538d0b45a5c8"). InnerVolumeSpecName "kube-api-access-nvfwf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:34:10 crc kubenswrapper[4594]: I1129 05:34:10.743972 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66e4aafc-b2a5-4f3f-926d-818c48d99f10-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "66e4aafc-b2a5-4f3f-926d-818c48d99f10" (UID: "66e4aafc-b2a5-4f3f-926d-818c48d99f10"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:34:10 crc kubenswrapper[4594]: I1129 05:34:10.800123 4594 generic.go:334] "Generic (PLEG): container finished" podID="f74e0bd8-e2f7-4ce6-9fb0-538d0b45a5c8" containerID="04c26e20154ef4a2ba0bbc07603608e950df0923cbc874d0ca0f05ec82ac02b8" exitCode=0 Nov 29 05:34:10 crc kubenswrapper[4594]: I1129 05:34:10.800193 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-77bbdf4988-r6g96" Nov 29 05:34:10 crc kubenswrapper[4594]: I1129 05:34:10.800226 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-77bbdf4988-r6g96" event={"ID":"f74e0bd8-e2f7-4ce6-9fb0-538d0b45a5c8","Type":"ContainerDied","Data":"04c26e20154ef4a2ba0bbc07603608e950df0923cbc874d0ca0f05ec82ac02b8"} Nov 29 05:34:10 crc kubenswrapper[4594]: I1129 05:34:10.800308 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-77bbdf4988-r6g96" event={"ID":"f74e0bd8-e2f7-4ce6-9fb0-538d0b45a5c8","Type":"ContainerDied","Data":"ac4db64c7186f85cf7531c72c973d07a6b0b6f6d350beed054ec6d32226f1edd"} Nov 29 05:34:10 crc kubenswrapper[4594]: I1129 05:34:10.800330 4594 scope.go:117] "RemoveContainer" containerID="04c26e20154ef4a2ba0bbc07603608e950df0923cbc874d0ca0f05ec82ac02b8" Nov 29 05:34:10 crc kubenswrapper[4594]: I1129 05:34:10.802411 4594 generic.go:334] "Generic (PLEG): container finished" podID="66e4aafc-b2a5-4f3f-926d-818c48d99f10" containerID="07d22b96555173e9d3a3c7215c31f5003162e03098bdf696f8a29fb877b229c9" exitCode=0 Nov 29 05:34:10 crc kubenswrapper[4594]: I1129 05:34:10.802448 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-64f4f8899d-5wl7j" event={"ID":"66e4aafc-b2a5-4f3f-926d-818c48d99f10","Type":"ContainerDied","Data":"07d22b96555173e9d3a3c7215c31f5003162e03098bdf696f8a29fb877b229c9"} Nov 29 05:34:10 crc kubenswrapper[4594]: I1129 05:34:10.802470 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-64f4f8899d-5wl7j" Nov 29 05:34:10 crc kubenswrapper[4594]: I1129 05:34:10.802478 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-64f4f8899d-5wl7j" event={"ID":"66e4aafc-b2a5-4f3f-926d-818c48d99f10","Type":"ContainerDied","Data":"c65a85798c5df935ba72148adfb4659dd4677af3100414888f311661011b8509"} Nov 29 05:34:10 crc kubenswrapper[4594]: I1129 05:34:10.817688 4594 scope.go:117] "RemoveContainer" containerID="04c26e20154ef4a2ba0bbc07603608e950df0923cbc874d0ca0f05ec82ac02b8" Nov 29 05:34:10 crc kubenswrapper[4594]: E1129 05:34:10.818155 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04c26e20154ef4a2ba0bbc07603608e950df0923cbc874d0ca0f05ec82ac02b8\": container with ID starting with 04c26e20154ef4a2ba0bbc07603608e950df0923cbc874d0ca0f05ec82ac02b8 not found: ID does not exist" containerID="04c26e20154ef4a2ba0bbc07603608e950df0923cbc874d0ca0f05ec82ac02b8" Nov 29 05:34:10 crc kubenswrapper[4594]: I1129 05:34:10.818309 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04c26e20154ef4a2ba0bbc07603608e950df0923cbc874d0ca0f05ec82ac02b8"} err="failed to get container status \"04c26e20154ef4a2ba0bbc07603608e950df0923cbc874d0ca0f05ec82ac02b8\": rpc error: code = NotFound desc = could not find container \"04c26e20154ef4a2ba0bbc07603608e950df0923cbc874d0ca0f05ec82ac02b8\": container with ID starting with 04c26e20154ef4a2ba0bbc07603608e950df0923cbc874d0ca0f05ec82ac02b8 not found: ID does not exist" Nov 29 05:34:10 crc kubenswrapper[4594]: I1129 05:34:10.820383 4594 scope.go:117] "RemoveContainer" containerID="07d22b96555173e9d3a3c7215c31f5003162e03098bdf696f8a29fb877b229c9" Nov 29 05:34:10 crc kubenswrapper[4594]: I1129 05:34:10.827213 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-77bbdf4988-r6g96"] Nov 29 05:34:10 crc kubenswrapper[4594]: I1129 05:34:10.833740 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-77bbdf4988-r6g96"] Nov 29 05:34:10 crc kubenswrapper[4594]: I1129 05:34:10.838537 4594 scope.go:117] "RemoveContainer" containerID="07d22b96555173e9d3a3c7215c31f5003162e03098bdf696f8a29fb877b229c9" Nov 29 05:34:10 crc kubenswrapper[4594]: E1129 05:34:10.838872 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07d22b96555173e9d3a3c7215c31f5003162e03098bdf696f8a29fb877b229c9\": container with ID starting with 07d22b96555173e9d3a3c7215c31f5003162e03098bdf696f8a29fb877b229c9 not found: ID does not exist" containerID="07d22b96555173e9d3a3c7215c31f5003162e03098bdf696f8a29fb877b229c9" Nov 29 05:34:10 crc kubenswrapper[4594]: I1129 05:34:10.838904 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07d22b96555173e9d3a3c7215c31f5003162e03098bdf696f8a29fb877b229c9"} err="failed to get container status \"07d22b96555173e9d3a3c7215c31f5003162e03098bdf696f8a29fb877b229c9\": rpc error: code = NotFound desc = could not find container \"07d22b96555173e9d3a3c7215c31f5003162e03098bdf696f8a29fb877b229c9\": container with ID starting with 07d22b96555173e9d3a3c7215c31f5003162e03098bdf696f8a29fb877b229c9 not found: ID does not exist" Nov 29 05:34:10 crc kubenswrapper[4594]: I1129 05:34:10.840228 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-64f4f8899d-5wl7j"] Nov 29 05:34:10 crc kubenswrapper[4594]: I1129 05:34:10.840664 4594 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/66e4aafc-b2a5-4f3f-926d-818c48d99f10-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 29 05:34:10 crc 
kubenswrapper[4594]: I1129 05:34:10.840701 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rkn7z\" (UniqueName: \"kubernetes.io/projected/66e4aafc-b2a5-4f3f-926d-818c48d99f10-kube-api-access-rkn7z\") on node \"crc\" DevicePath \"\"" Nov 29 05:34:10 crc kubenswrapper[4594]: I1129 05:34:10.840717 4594 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f74e0bd8-e2f7-4ce6-9fb0-538d0b45a5c8-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 29 05:34:10 crc kubenswrapper[4594]: I1129 05:34:10.840728 4594 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66e4aafc-b2a5-4f3f-926d-818c48d99f10-config\") on node \"crc\" DevicePath \"\"" Nov 29 05:34:10 crc kubenswrapper[4594]: I1129 05:34:10.840738 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nvfwf\" (UniqueName: \"kubernetes.io/projected/f74e0bd8-e2f7-4ce6-9fb0-538d0b45a5c8-kube-api-access-nvfwf\") on node \"crc\" DevicePath \"\"" Nov 29 05:34:10 crc kubenswrapper[4594]: I1129 05:34:10.844759 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-64f4f8899d-5wl7j"] Nov 29 05:34:11 crc kubenswrapper[4594]: I1129 05:34:11.674869 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b5c45c56-hdzhn"] Nov 29 05:34:11 crc kubenswrapper[4594]: E1129 05:34:11.675347 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f74e0bd8-e2f7-4ce6-9fb0-538d0b45a5c8" containerName="route-controller-manager" Nov 29 05:34:11 crc kubenswrapper[4594]: I1129 05:34:11.675369 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="f74e0bd8-e2f7-4ce6-9fb0-538d0b45a5c8" containerName="route-controller-manager" Nov 29 05:34:11 crc kubenswrapper[4594]: E1129 05:34:11.675390 4594 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="66e4aafc-b2a5-4f3f-926d-818c48d99f10" containerName="controller-manager" Nov 29 05:34:11 crc kubenswrapper[4594]: I1129 05:34:11.675401 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="66e4aafc-b2a5-4f3f-926d-818c48d99f10" containerName="controller-manager" Nov 29 05:34:11 crc kubenswrapper[4594]: I1129 05:34:11.675577 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="f74e0bd8-e2f7-4ce6-9fb0-538d0b45a5c8" containerName="route-controller-manager" Nov 29 05:34:11 crc kubenswrapper[4594]: I1129 05:34:11.675599 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="66e4aafc-b2a5-4f3f-926d-818c48d99f10" containerName="controller-manager" Nov 29 05:34:11 crc kubenswrapper[4594]: I1129 05:34:11.676381 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6b5c45c56-hdzhn" Nov 29 05:34:11 crc kubenswrapper[4594]: I1129 05:34:11.678365 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Nov 29 05:34:11 crc kubenswrapper[4594]: I1129 05:34:11.678581 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Nov 29 05:34:11 crc kubenswrapper[4594]: I1129 05:34:11.679351 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Nov 29 05:34:11 crc kubenswrapper[4594]: I1129 05:34:11.679398 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Nov 29 05:34:11 crc kubenswrapper[4594]: I1129 05:34:11.681426 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Nov 29 05:34:11 crc kubenswrapper[4594]: I1129 05:34:11.681453 4594 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Nov 29 05:34:11 crc kubenswrapper[4594]: I1129 05:34:11.683890 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-cf4c4949d-dzvnd"] Nov 29 05:34:11 crc kubenswrapper[4594]: I1129 05:34:11.684468 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-cf4c4949d-dzvnd" Nov 29 05:34:11 crc kubenswrapper[4594]: I1129 05:34:11.686984 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Nov 29 05:34:11 crc kubenswrapper[4594]: I1129 05:34:11.687395 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Nov 29 05:34:11 crc kubenswrapper[4594]: I1129 05:34:11.687414 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Nov 29 05:34:11 crc kubenswrapper[4594]: I1129 05:34:11.687737 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-cf4c4949d-dzvnd"] Nov 29 05:34:11 crc kubenswrapper[4594]: I1129 05:34:11.687748 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Nov 29 05:34:11 crc kubenswrapper[4594]: I1129 05:34:11.688805 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Nov 29 05:34:11 crc kubenswrapper[4594]: I1129 05:34:11.688809 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Nov 29 05:34:11 crc kubenswrapper[4594]: I1129 05:34:11.689984 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b5c45c56-hdzhn"] Nov 29 05:34:11 crc 
kubenswrapper[4594]: I1129 05:34:11.691546 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Nov 29 05:34:11 crc kubenswrapper[4594]: I1129 05:34:11.851800 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b681576-9448-4067-86af-4845ce487034-config\") pod \"controller-manager-cf4c4949d-dzvnd\" (UID: \"1b681576-9448-4067-86af-4845ce487034\") " pod="openshift-controller-manager/controller-manager-cf4c4949d-dzvnd" Nov 29 05:34:11 crc kubenswrapper[4594]: I1129 05:34:11.852038 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1b681576-9448-4067-86af-4845ce487034-proxy-ca-bundles\") pod \"controller-manager-cf4c4949d-dzvnd\" (UID: \"1b681576-9448-4067-86af-4845ce487034\") " pod="openshift-controller-manager/controller-manager-cf4c4949d-dzvnd" Nov 29 05:34:11 crc kubenswrapper[4594]: I1129 05:34:11.852166 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/26f72ebb-236d-48df-8a92-a3c8a81ad5ac-client-ca\") pod \"route-controller-manager-6b5c45c56-hdzhn\" (UID: \"26f72ebb-236d-48df-8a92-a3c8a81ad5ac\") " pod="openshift-route-controller-manager/route-controller-manager-6b5c45c56-hdzhn" Nov 29 05:34:11 crc kubenswrapper[4594]: I1129 05:34:11.852280 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1b681576-9448-4067-86af-4845ce487034-client-ca\") pod \"controller-manager-cf4c4949d-dzvnd\" (UID: \"1b681576-9448-4067-86af-4845ce487034\") " pod="openshift-controller-manager/controller-manager-cf4c4949d-dzvnd" Nov 29 05:34:11 crc kubenswrapper[4594]: I1129 05:34:11.852363 4594 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/26f72ebb-236d-48df-8a92-a3c8a81ad5ac-serving-cert\") pod \"route-controller-manager-6b5c45c56-hdzhn\" (UID: \"26f72ebb-236d-48df-8a92-a3c8a81ad5ac\") " pod="openshift-route-controller-manager/route-controller-manager-6b5c45c56-hdzhn" Nov 29 05:34:11 crc kubenswrapper[4594]: I1129 05:34:11.852468 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnkgs\" (UniqueName: \"kubernetes.io/projected/26f72ebb-236d-48df-8a92-a3c8a81ad5ac-kube-api-access-pnkgs\") pod \"route-controller-manager-6b5c45c56-hdzhn\" (UID: \"26f72ebb-236d-48df-8a92-a3c8a81ad5ac\") " pod="openshift-route-controller-manager/route-controller-manager-6b5c45c56-hdzhn" Nov 29 05:34:11 crc kubenswrapper[4594]: I1129 05:34:11.852548 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnkkd\" (UniqueName: \"kubernetes.io/projected/1b681576-9448-4067-86af-4845ce487034-kube-api-access-cnkkd\") pod \"controller-manager-cf4c4949d-dzvnd\" (UID: \"1b681576-9448-4067-86af-4845ce487034\") " pod="openshift-controller-manager/controller-manager-cf4c4949d-dzvnd" Nov 29 05:34:11 crc kubenswrapper[4594]: I1129 05:34:11.852622 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26f72ebb-236d-48df-8a92-a3c8a81ad5ac-config\") pod \"route-controller-manager-6b5c45c56-hdzhn\" (UID: \"26f72ebb-236d-48df-8a92-a3c8a81ad5ac\") " pod="openshift-route-controller-manager/route-controller-manager-6b5c45c56-hdzhn" Nov 29 05:34:11 crc kubenswrapper[4594]: I1129 05:34:11.852706 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1b681576-9448-4067-86af-4845ce487034-serving-cert\") pod \"controller-manager-cf4c4949d-dzvnd\" (UID: \"1b681576-9448-4067-86af-4845ce487034\") " pod="openshift-controller-manager/controller-manager-cf4c4949d-dzvnd" Nov 29 05:34:11 crc kubenswrapper[4594]: I1129 05:34:11.953818 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b681576-9448-4067-86af-4845ce487034-serving-cert\") pod \"controller-manager-cf4c4949d-dzvnd\" (UID: \"1b681576-9448-4067-86af-4845ce487034\") " pod="openshift-controller-manager/controller-manager-cf4c4949d-dzvnd" Nov 29 05:34:11 crc kubenswrapper[4594]: I1129 05:34:11.954560 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b681576-9448-4067-86af-4845ce487034-config\") pod \"controller-manager-cf4c4949d-dzvnd\" (UID: \"1b681576-9448-4067-86af-4845ce487034\") " pod="openshift-controller-manager/controller-manager-cf4c4949d-dzvnd" Nov 29 05:34:11 crc kubenswrapper[4594]: I1129 05:34:11.954715 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1b681576-9448-4067-86af-4845ce487034-proxy-ca-bundles\") pod \"controller-manager-cf4c4949d-dzvnd\" (UID: \"1b681576-9448-4067-86af-4845ce487034\") " pod="openshift-controller-manager/controller-manager-cf4c4949d-dzvnd" Nov 29 05:34:11 crc kubenswrapper[4594]: I1129 05:34:11.954809 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/26f72ebb-236d-48df-8a92-a3c8a81ad5ac-client-ca\") pod \"route-controller-manager-6b5c45c56-hdzhn\" (UID: \"26f72ebb-236d-48df-8a92-a3c8a81ad5ac\") " pod="openshift-route-controller-manager/route-controller-manager-6b5c45c56-hdzhn" Nov 29 05:34:11 crc kubenswrapper[4594]: I1129 05:34:11.954949 4594 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1b681576-9448-4067-86af-4845ce487034-client-ca\") pod \"controller-manager-cf4c4949d-dzvnd\" (UID: \"1b681576-9448-4067-86af-4845ce487034\") " pod="openshift-controller-manager/controller-manager-cf4c4949d-dzvnd" Nov 29 05:34:11 crc kubenswrapper[4594]: I1129 05:34:11.955045 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/26f72ebb-236d-48df-8a92-a3c8a81ad5ac-serving-cert\") pod \"route-controller-manager-6b5c45c56-hdzhn\" (UID: \"26f72ebb-236d-48df-8a92-a3c8a81ad5ac\") " pod="openshift-route-controller-manager/route-controller-manager-6b5c45c56-hdzhn" Nov 29 05:34:11 crc kubenswrapper[4594]: I1129 05:34:11.955133 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnkgs\" (UniqueName: \"kubernetes.io/projected/26f72ebb-236d-48df-8a92-a3c8a81ad5ac-kube-api-access-pnkgs\") pod \"route-controller-manager-6b5c45c56-hdzhn\" (UID: \"26f72ebb-236d-48df-8a92-a3c8a81ad5ac\") " pod="openshift-route-controller-manager/route-controller-manager-6b5c45c56-hdzhn" Nov 29 05:34:11 crc kubenswrapper[4594]: I1129 05:34:11.955211 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnkkd\" (UniqueName: \"kubernetes.io/projected/1b681576-9448-4067-86af-4845ce487034-kube-api-access-cnkkd\") pod \"controller-manager-cf4c4949d-dzvnd\" (UID: \"1b681576-9448-4067-86af-4845ce487034\") " pod="openshift-controller-manager/controller-manager-cf4c4949d-dzvnd" Nov 29 05:34:11 crc kubenswrapper[4594]: I1129 05:34:11.955344 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26f72ebb-236d-48df-8a92-a3c8a81ad5ac-config\") pod \"route-controller-manager-6b5c45c56-hdzhn\" (UID: \"26f72ebb-236d-48df-8a92-a3c8a81ad5ac\") " 
pod="openshift-route-controller-manager/route-controller-manager-6b5c45c56-hdzhn" Nov 29 05:34:11 crc kubenswrapper[4594]: I1129 05:34:11.956731 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/26f72ebb-236d-48df-8a92-a3c8a81ad5ac-client-ca\") pod \"route-controller-manager-6b5c45c56-hdzhn\" (UID: \"26f72ebb-236d-48df-8a92-a3c8a81ad5ac\") " pod="openshift-route-controller-manager/route-controller-manager-6b5c45c56-hdzhn" Nov 29 05:34:11 crc kubenswrapper[4594]: I1129 05:34:11.956884 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1b681576-9448-4067-86af-4845ce487034-client-ca\") pod \"controller-manager-cf4c4949d-dzvnd\" (UID: \"1b681576-9448-4067-86af-4845ce487034\") " pod="openshift-controller-manager/controller-manager-cf4c4949d-dzvnd" Nov 29 05:34:11 crc kubenswrapper[4594]: I1129 05:34:11.957157 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26f72ebb-236d-48df-8a92-a3c8a81ad5ac-config\") pod \"route-controller-manager-6b5c45c56-hdzhn\" (UID: \"26f72ebb-236d-48df-8a92-a3c8a81ad5ac\") " pod="openshift-route-controller-manager/route-controller-manager-6b5c45c56-hdzhn" Nov 29 05:34:11 crc kubenswrapper[4594]: I1129 05:34:11.957193 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1b681576-9448-4067-86af-4845ce487034-proxy-ca-bundles\") pod \"controller-manager-cf4c4949d-dzvnd\" (UID: \"1b681576-9448-4067-86af-4845ce487034\") " pod="openshift-controller-manager/controller-manager-cf4c4949d-dzvnd" Nov 29 05:34:11 crc kubenswrapper[4594]: I1129 05:34:11.958102 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b681576-9448-4067-86af-4845ce487034-config\") pod 
\"controller-manager-cf4c4949d-dzvnd\" (UID: \"1b681576-9448-4067-86af-4845ce487034\") " pod="openshift-controller-manager/controller-manager-cf4c4949d-dzvnd" Nov 29 05:34:11 crc kubenswrapper[4594]: I1129 05:34:11.960243 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/26f72ebb-236d-48df-8a92-a3c8a81ad5ac-serving-cert\") pod \"route-controller-manager-6b5c45c56-hdzhn\" (UID: \"26f72ebb-236d-48df-8a92-a3c8a81ad5ac\") " pod="openshift-route-controller-manager/route-controller-manager-6b5c45c56-hdzhn" Nov 29 05:34:11 crc kubenswrapper[4594]: I1129 05:34:11.961739 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b681576-9448-4067-86af-4845ce487034-serving-cert\") pod \"controller-manager-cf4c4949d-dzvnd\" (UID: \"1b681576-9448-4067-86af-4845ce487034\") " pod="openshift-controller-manager/controller-manager-cf4c4949d-dzvnd" Nov 29 05:34:11 crc kubenswrapper[4594]: I1129 05:34:11.971083 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnkgs\" (UniqueName: \"kubernetes.io/projected/26f72ebb-236d-48df-8a92-a3c8a81ad5ac-kube-api-access-pnkgs\") pod \"route-controller-manager-6b5c45c56-hdzhn\" (UID: \"26f72ebb-236d-48df-8a92-a3c8a81ad5ac\") " pod="openshift-route-controller-manager/route-controller-manager-6b5c45c56-hdzhn" Nov 29 05:34:11 crc kubenswrapper[4594]: I1129 05:34:11.973097 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnkkd\" (UniqueName: \"kubernetes.io/projected/1b681576-9448-4067-86af-4845ce487034-kube-api-access-cnkkd\") pod \"controller-manager-cf4c4949d-dzvnd\" (UID: \"1b681576-9448-4067-86af-4845ce487034\") " pod="openshift-controller-manager/controller-manager-cf4c4949d-dzvnd" Nov 29 05:34:11 crc kubenswrapper[4594]: I1129 05:34:11.994476 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6b5c45c56-hdzhn" Nov 29 05:34:11 crc kubenswrapper[4594]: I1129 05:34:11.999438 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-cf4c4949d-dzvnd" Nov 29 05:34:12 crc kubenswrapper[4594]: I1129 05:34:12.091616 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66e4aafc-b2a5-4f3f-926d-818c48d99f10" path="/var/lib/kubelet/pods/66e4aafc-b2a5-4f3f-926d-818c48d99f10/volumes" Nov 29 05:34:12 crc kubenswrapper[4594]: I1129 05:34:12.092421 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f74e0bd8-e2f7-4ce6-9fb0-538d0b45a5c8" path="/var/lib/kubelet/pods/f74e0bd8-e2f7-4ce6-9fb0-538d0b45a5c8/volumes" Nov 29 05:34:12 crc kubenswrapper[4594]: I1129 05:34:12.362193 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b5c45c56-hdzhn"] Nov 29 05:34:12 crc kubenswrapper[4594]: I1129 05:34:12.401498 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-cf4c4949d-dzvnd"] Nov 29 05:34:12 crc kubenswrapper[4594]: W1129 05:34:12.410523 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b681576_9448_4067_86af_4845ce487034.slice/crio-93b0ce5397f01a67c13953eda19b5d436464230aee4f9a8e8757c21616647aa2 WatchSource:0}: Error finding container 93b0ce5397f01a67c13953eda19b5d436464230aee4f9a8e8757c21616647aa2: Status 404 returned error can't find the container with id 93b0ce5397f01a67c13953eda19b5d436464230aee4f9a8e8757c21616647aa2 Nov 29 05:34:12 crc kubenswrapper[4594]: I1129 05:34:12.816350 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6b5c45c56-hdzhn" 
event={"ID":"26f72ebb-236d-48df-8a92-a3c8a81ad5ac","Type":"ContainerStarted","Data":"c0118da3a68eb78089c6e8becc3ab1001e19fb8e044e0d0651ecc1953b9689e1"} Nov 29 05:34:12 crc kubenswrapper[4594]: I1129 05:34:12.816407 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6b5c45c56-hdzhn" event={"ID":"26f72ebb-236d-48df-8a92-a3c8a81ad5ac","Type":"ContainerStarted","Data":"71729f9669a1f6ab8dee61bf35ecc72b0ac31539b55374d200746c66f0a2ae2c"} Nov 29 05:34:12 crc kubenswrapper[4594]: I1129 05:34:12.816574 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6b5c45c56-hdzhn" Nov 29 05:34:12 crc kubenswrapper[4594]: I1129 05:34:12.818133 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-cf4c4949d-dzvnd" event={"ID":"1b681576-9448-4067-86af-4845ce487034","Type":"ContainerStarted","Data":"8da74c3e01cb237aea03e43aefeba3ce023110e66cdf2001a5ef80211c1a6536"} Nov 29 05:34:12 crc kubenswrapper[4594]: I1129 05:34:12.818160 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-cf4c4949d-dzvnd" event={"ID":"1b681576-9448-4067-86af-4845ce487034","Type":"ContainerStarted","Data":"93b0ce5397f01a67c13953eda19b5d436464230aee4f9a8e8757c21616647aa2"} Nov 29 05:34:12 crc kubenswrapper[4594]: I1129 05:34:12.818376 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-cf4c4949d-dzvnd" Nov 29 05:34:12 crc kubenswrapper[4594]: I1129 05:34:12.823659 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6b5c45c56-hdzhn" Nov 29 05:34:12 crc kubenswrapper[4594]: I1129 05:34:12.825929 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-controller-manager/controller-manager-cf4c4949d-dzvnd" Nov 29 05:34:12 crc kubenswrapper[4594]: I1129 05:34:12.835903 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6b5c45c56-hdzhn" podStartSLOduration=2.8358914520000003 podStartE2EDuration="2.835891452s" podCreationTimestamp="2025-11-29 05:34:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:34:12.833873855 +0000 UTC m=+377.074383075" watchObservedRunningTime="2025-11-29 05:34:12.835891452 +0000 UTC m=+377.076400662" Nov 29 05:34:12 crc kubenswrapper[4594]: I1129 05:34:12.850191 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-cf4c4949d-dzvnd" podStartSLOduration=2.85017884 podStartE2EDuration="2.85017884s" podCreationTimestamp="2025-11-29 05:34:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:34:12.84600241 +0000 UTC m=+377.086511620" watchObservedRunningTime="2025-11-29 05:34:12.85017884 +0000 UTC m=+377.090688060" Nov 29 05:34:15 crc kubenswrapper[4594]: I1129 05:34:15.800396 4594 patch_prober.go:28] interesting pod/machine-config-daemon-ggz4n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 05:34:15 crc kubenswrapper[4594]: I1129 05:34:15.800748 4594 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" Nov 29 05:34:26 crc kubenswrapper[4594]: I1129 05:34:26.418820 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-h5bvf" Nov 29 05:34:26 crc kubenswrapper[4594]: I1129 05:34:26.459454 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-chxbx"] Nov 29 05:34:45 crc kubenswrapper[4594]: I1129 05:34:45.800595 4594 patch_prober.go:28] interesting pod/machine-config-daemon-ggz4n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 05:34:45 crc kubenswrapper[4594]: I1129 05:34:45.801447 4594 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 05:34:51 crc kubenswrapper[4594]: I1129 05:34:51.927640 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-chxbx" podUID="0945b059-dd2b-4550-8b8a-49ea4e94ffeb" containerName="registry" containerID="cri-o://a35fea000c510be2f158dc805929a5e48550046a16d8fec2a100dc5eed9250ca" gracePeriod=30 Nov 29 05:34:52 crc kubenswrapper[4594]: I1129 05:34:52.275295 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-chxbx" Nov 29 05:34:52 crc kubenswrapper[4594]: I1129 05:34:52.469820 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0945b059-dd2b-4550-8b8a-49ea4e94ffeb-trusted-ca\") pod \"0945b059-dd2b-4550-8b8a-49ea4e94ffeb\" (UID: \"0945b059-dd2b-4550-8b8a-49ea4e94ffeb\") " Nov 29 05:34:52 crc kubenswrapper[4594]: I1129 05:34:52.469858 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0945b059-dd2b-4550-8b8a-49ea4e94ffeb-ca-trust-extracted\") pod \"0945b059-dd2b-4550-8b8a-49ea4e94ffeb\" (UID: \"0945b059-dd2b-4550-8b8a-49ea4e94ffeb\") " Nov 29 05:34:52 crc kubenswrapper[4594]: I1129 05:34:52.469888 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0945b059-dd2b-4550-8b8a-49ea4e94ffeb-registry-tls\") pod \"0945b059-dd2b-4550-8b8a-49ea4e94ffeb\" (UID: \"0945b059-dd2b-4550-8b8a-49ea4e94ffeb\") " Nov 29 05:34:52 crc kubenswrapper[4594]: I1129 05:34:52.469913 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zzfst\" (UniqueName: \"kubernetes.io/projected/0945b059-dd2b-4550-8b8a-49ea4e94ffeb-kube-api-access-zzfst\") pod \"0945b059-dd2b-4550-8b8a-49ea4e94ffeb\" (UID: \"0945b059-dd2b-4550-8b8a-49ea4e94ffeb\") " Nov 29 05:34:52 crc kubenswrapper[4594]: I1129 05:34:52.469931 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0945b059-dd2b-4550-8b8a-49ea4e94ffeb-installation-pull-secrets\") pod \"0945b059-dd2b-4550-8b8a-49ea4e94ffeb\" (UID: \"0945b059-dd2b-4550-8b8a-49ea4e94ffeb\") " Nov 29 05:34:52 crc kubenswrapper[4594]: I1129 05:34:52.469946 4594 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0945b059-dd2b-4550-8b8a-49ea4e94ffeb-bound-sa-token\") pod \"0945b059-dd2b-4550-8b8a-49ea4e94ffeb\" (UID: \"0945b059-dd2b-4550-8b8a-49ea4e94ffeb\") " Nov 29 05:34:52 crc kubenswrapper[4594]: I1129 05:34:52.470188 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"0945b059-dd2b-4550-8b8a-49ea4e94ffeb\" (UID: \"0945b059-dd2b-4550-8b8a-49ea4e94ffeb\") " Nov 29 05:34:52 crc kubenswrapper[4594]: I1129 05:34:52.470213 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0945b059-dd2b-4550-8b8a-49ea4e94ffeb-registry-certificates\") pod \"0945b059-dd2b-4550-8b8a-49ea4e94ffeb\" (UID: \"0945b059-dd2b-4550-8b8a-49ea4e94ffeb\") " Nov 29 05:34:52 crc kubenswrapper[4594]: I1129 05:34:52.470891 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0945b059-dd2b-4550-8b8a-49ea4e94ffeb-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "0945b059-dd2b-4550-8b8a-49ea4e94ffeb" (UID: "0945b059-dd2b-4550-8b8a-49ea4e94ffeb"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:34:52 crc kubenswrapper[4594]: I1129 05:34:52.470925 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0945b059-dd2b-4550-8b8a-49ea4e94ffeb-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "0945b059-dd2b-4550-8b8a-49ea4e94ffeb" (UID: "0945b059-dd2b-4550-8b8a-49ea4e94ffeb"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:34:52 crc kubenswrapper[4594]: I1129 05:34:52.471327 4594 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0945b059-dd2b-4550-8b8a-49ea4e94ffeb-registry-certificates\") on node \"crc\" DevicePath \"\"" Nov 29 05:34:52 crc kubenswrapper[4594]: I1129 05:34:52.471366 4594 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0945b059-dd2b-4550-8b8a-49ea4e94ffeb-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 29 05:34:52 crc kubenswrapper[4594]: I1129 05:34:52.475186 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0945b059-dd2b-4550-8b8a-49ea4e94ffeb-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "0945b059-dd2b-4550-8b8a-49ea4e94ffeb" (UID: "0945b059-dd2b-4550-8b8a-49ea4e94ffeb"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:34:52 crc kubenswrapper[4594]: I1129 05:34:52.475473 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0945b059-dd2b-4550-8b8a-49ea4e94ffeb-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "0945b059-dd2b-4550-8b8a-49ea4e94ffeb" (UID: "0945b059-dd2b-4550-8b8a-49ea4e94ffeb"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:34:52 crc kubenswrapper[4594]: I1129 05:34:52.475551 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0945b059-dd2b-4550-8b8a-49ea4e94ffeb-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "0945b059-dd2b-4550-8b8a-49ea4e94ffeb" (UID: "0945b059-dd2b-4550-8b8a-49ea4e94ffeb"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:34:52 crc kubenswrapper[4594]: I1129 05:34:52.475847 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0945b059-dd2b-4550-8b8a-49ea4e94ffeb-kube-api-access-zzfst" (OuterVolumeSpecName: "kube-api-access-zzfst") pod "0945b059-dd2b-4550-8b8a-49ea4e94ffeb" (UID: "0945b059-dd2b-4550-8b8a-49ea4e94ffeb"). InnerVolumeSpecName "kube-api-access-zzfst". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:34:52 crc kubenswrapper[4594]: I1129 05:34:52.477308 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "0945b059-dd2b-4550-8b8a-49ea4e94ffeb" (UID: "0945b059-dd2b-4550-8b8a-49ea4e94ffeb"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Nov 29 05:34:52 crc kubenswrapper[4594]: I1129 05:34:52.482441 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0945b059-dd2b-4550-8b8a-49ea4e94ffeb-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "0945b059-dd2b-4550-8b8a-49ea4e94ffeb" (UID: "0945b059-dd2b-4550-8b8a-49ea4e94ffeb"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 05:34:52 crc kubenswrapper[4594]: I1129 05:34:52.572587 4594 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0945b059-dd2b-4550-8b8a-49ea4e94ffeb-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Nov 29 05:34:52 crc kubenswrapper[4594]: I1129 05:34:52.572615 4594 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0945b059-dd2b-4550-8b8a-49ea4e94ffeb-registry-tls\") on node \"crc\" DevicePath \"\"" Nov 29 05:34:52 crc kubenswrapper[4594]: I1129 05:34:52.572624 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zzfst\" (UniqueName: \"kubernetes.io/projected/0945b059-dd2b-4550-8b8a-49ea4e94ffeb-kube-api-access-zzfst\") on node \"crc\" DevicePath \"\"" Nov 29 05:34:52 crc kubenswrapper[4594]: I1129 05:34:52.572634 4594 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0945b059-dd2b-4550-8b8a-49ea4e94ffeb-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Nov 29 05:34:52 crc kubenswrapper[4594]: I1129 05:34:52.572644 4594 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0945b059-dd2b-4550-8b8a-49ea4e94ffeb-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 29 05:34:53 crc kubenswrapper[4594]: I1129 05:34:53.033492 4594 generic.go:334] "Generic (PLEG): container finished" podID="0945b059-dd2b-4550-8b8a-49ea4e94ffeb" containerID="a35fea000c510be2f158dc805929a5e48550046a16d8fec2a100dc5eed9250ca" exitCode=0 Nov 29 05:34:53 crc kubenswrapper[4594]: I1129 05:34:53.033546 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-chxbx" 
event={"ID":"0945b059-dd2b-4550-8b8a-49ea4e94ffeb","Type":"ContainerDied","Data":"a35fea000c510be2f158dc805929a5e48550046a16d8fec2a100dc5eed9250ca"} Nov 29 05:34:53 crc kubenswrapper[4594]: I1129 05:34:53.033582 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-chxbx" event={"ID":"0945b059-dd2b-4550-8b8a-49ea4e94ffeb","Type":"ContainerDied","Data":"0909f865542aa26ef26c7070d0c942b66efa64e42e0f14ac0e800c78fdb025a0"} Nov 29 05:34:53 crc kubenswrapper[4594]: I1129 05:34:53.033603 4594 scope.go:117] "RemoveContainer" containerID="a35fea000c510be2f158dc805929a5e48550046a16d8fec2a100dc5eed9250ca" Nov 29 05:34:53 crc kubenswrapper[4594]: I1129 05:34:53.034330 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-chxbx" Nov 29 05:34:53 crc kubenswrapper[4594]: I1129 05:34:53.051482 4594 scope.go:117] "RemoveContainer" containerID="a35fea000c510be2f158dc805929a5e48550046a16d8fec2a100dc5eed9250ca" Nov 29 05:34:53 crc kubenswrapper[4594]: E1129 05:34:53.051956 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a35fea000c510be2f158dc805929a5e48550046a16d8fec2a100dc5eed9250ca\": container with ID starting with a35fea000c510be2f158dc805929a5e48550046a16d8fec2a100dc5eed9250ca not found: ID does not exist" containerID="a35fea000c510be2f158dc805929a5e48550046a16d8fec2a100dc5eed9250ca" Nov 29 05:34:53 crc kubenswrapper[4594]: I1129 05:34:53.051991 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a35fea000c510be2f158dc805929a5e48550046a16d8fec2a100dc5eed9250ca"} err="failed to get container status \"a35fea000c510be2f158dc805929a5e48550046a16d8fec2a100dc5eed9250ca\": rpc error: code = NotFound desc = could not find container \"a35fea000c510be2f158dc805929a5e48550046a16d8fec2a100dc5eed9250ca\": container with ID 
starting with a35fea000c510be2f158dc805929a5e48550046a16d8fec2a100dc5eed9250ca not found: ID does not exist" Nov 29 05:34:53 crc kubenswrapper[4594]: I1129 05:34:53.059315 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-chxbx"] Nov 29 05:34:53 crc kubenswrapper[4594]: I1129 05:34:53.062062 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-chxbx"] Nov 29 05:34:54 crc kubenswrapper[4594]: I1129 05:34:54.088834 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0945b059-dd2b-4550-8b8a-49ea4e94ffeb" path="/var/lib/kubelet/pods/0945b059-dd2b-4550-8b8a-49ea4e94ffeb/volumes" Nov 29 05:35:15 crc kubenswrapper[4594]: I1129 05:35:15.800046 4594 patch_prober.go:28] interesting pod/machine-config-daemon-ggz4n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 05:35:15 crc kubenswrapper[4594]: I1129 05:35:15.800593 4594 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 05:35:15 crc kubenswrapper[4594]: I1129 05:35:15.800628 4594 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" Nov 29 05:35:15 crc kubenswrapper[4594]: I1129 05:35:15.801013 4594 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8ca701b866c00b1fe102c52d2dacfbb9215d913bbee855fff76b7d8b7e0bf21d"} 
pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 29 05:35:15 crc kubenswrapper[4594]: I1129 05:35:15.801059 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" containerName="machine-config-daemon" containerID="cri-o://8ca701b866c00b1fe102c52d2dacfbb9215d913bbee855fff76b7d8b7e0bf21d" gracePeriod=600 Nov 29 05:35:16 crc kubenswrapper[4594]: I1129 05:35:16.142636 4594 generic.go:334] "Generic (PLEG): container finished" podID="509844d4-3ee1-4059-84bb-6e90200f50c5" containerID="8ca701b866c00b1fe102c52d2dacfbb9215d913bbee855fff76b7d8b7e0bf21d" exitCode=0 Nov 29 05:35:16 crc kubenswrapper[4594]: I1129 05:35:16.142714 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" event={"ID":"509844d4-3ee1-4059-84bb-6e90200f50c5","Type":"ContainerDied","Data":"8ca701b866c00b1fe102c52d2dacfbb9215d913bbee855fff76b7d8b7e0bf21d"} Nov 29 05:35:16 crc kubenswrapper[4594]: I1129 05:35:16.142920 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" event={"ID":"509844d4-3ee1-4059-84bb-6e90200f50c5","Type":"ContainerStarted","Data":"7993c3c5c143399d1d6e2423a11b6f37b4521819ab268ad6d3a203f1a0f238a6"} Nov 29 05:35:16 crc kubenswrapper[4594]: I1129 05:35:16.142968 4594 scope.go:117] "RemoveContainer" containerID="eb25cb1d2d2a3aa84bc6fca3087cdf54b3b8e75bc258441b2895f37594df44e5" Nov 29 05:36:15 crc kubenswrapper[4594]: I1129 05:36:15.587901 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-btrfm"] Nov 29 05:36:15 crc kubenswrapper[4594]: E1129 05:36:15.588737 4594 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="0945b059-dd2b-4550-8b8a-49ea4e94ffeb" containerName="registry" Nov 29 05:36:15 crc kubenswrapper[4594]: I1129 05:36:15.588751 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="0945b059-dd2b-4550-8b8a-49ea4e94ffeb" containerName="registry" Nov 29 05:36:15 crc kubenswrapper[4594]: I1129 05:36:15.588840 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="0945b059-dd2b-4550-8b8a-49ea4e94ffeb" containerName="registry" Nov 29 05:36:15 crc kubenswrapper[4594]: I1129 05:36:15.589180 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-btrfm" Nov 29 05:36:15 crc kubenswrapper[4594]: I1129 05:36:15.590566 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Nov 29 05:36:15 crc kubenswrapper[4594]: I1129 05:36:15.590679 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Nov 29 05:36:15 crc kubenswrapper[4594]: I1129 05:36:15.591463 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-4z2w6"] Nov 29 05:36:15 crc kubenswrapper[4594]: I1129 05:36:15.591873 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-4z2w6" Nov 29 05:36:15 crc kubenswrapper[4594]: I1129 05:36:15.592161 4594 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-kkngt" Nov 29 05:36:15 crc kubenswrapper[4594]: I1129 05:36:15.593809 4594 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-4j575" Nov 29 05:36:15 crc kubenswrapper[4594]: I1129 05:36:15.616277 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmrh8\" (UniqueName: \"kubernetes.io/projected/edf7ce03-4be9-42d1-8a58-e0f132c43299-kube-api-access-gmrh8\") pod \"cert-manager-5b446d88c5-4z2w6\" (UID: \"edf7ce03-4be9-42d1-8a58-e0f132c43299\") " pod="cert-manager/cert-manager-5b446d88c5-4z2w6" Nov 29 05:36:15 crc kubenswrapper[4594]: I1129 05:36:15.616334 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mx2rr\" (UniqueName: \"kubernetes.io/projected/f4838603-21ba-451a-a9d4-d5415bc4b52a-kube-api-access-mx2rr\") pod \"cert-manager-cainjector-7f985d654d-btrfm\" (UID: \"f4838603-21ba-451a-a9d4-d5415bc4b52a\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-btrfm" Nov 29 05:36:15 crc kubenswrapper[4594]: I1129 05:36:15.639745 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-4srms"] Nov 29 05:36:15 crc kubenswrapper[4594]: I1129 05:36:15.640408 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-4srms" Nov 29 05:36:15 crc kubenswrapper[4594]: I1129 05:36:15.649552 4594 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-8hpxr" Nov 29 05:36:15 crc kubenswrapper[4594]: I1129 05:36:15.657912 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-4z2w6"] Nov 29 05:36:15 crc kubenswrapper[4594]: I1129 05:36:15.677222 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-4srms"] Nov 29 05:36:15 crc kubenswrapper[4594]: I1129 05:36:15.699873 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-btrfm"] Nov 29 05:36:15 crc kubenswrapper[4594]: I1129 05:36:15.717435 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmrh8\" (UniqueName: \"kubernetes.io/projected/edf7ce03-4be9-42d1-8a58-e0f132c43299-kube-api-access-gmrh8\") pod \"cert-manager-5b446d88c5-4z2w6\" (UID: \"edf7ce03-4be9-42d1-8a58-e0f132c43299\") " pod="cert-manager/cert-manager-5b446d88c5-4z2w6" Nov 29 05:36:15 crc kubenswrapper[4594]: I1129 05:36:15.717481 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mx2rr\" (UniqueName: \"kubernetes.io/projected/f4838603-21ba-451a-a9d4-d5415bc4b52a-kube-api-access-mx2rr\") pod \"cert-manager-cainjector-7f985d654d-btrfm\" (UID: \"f4838603-21ba-451a-a9d4-d5415bc4b52a\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-btrfm" Nov 29 05:36:15 crc kubenswrapper[4594]: I1129 05:36:15.734012 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mx2rr\" (UniqueName: \"kubernetes.io/projected/f4838603-21ba-451a-a9d4-d5415bc4b52a-kube-api-access-mx2rr\") pod \"cert-manager-cainjector-7f985d654d-btrfm\" (UID: \"f4838603-21ba-451a-a9d4-d5415bc4b52a\") " 
pod="cert-manager/cert-manager-cainjector-7f985d654d-btrfm" Nov 29 05:36:15 crc kubenswrapper[4594]: I1129 05:36:15.734369 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmrh8\" (UniqueName: \"kubernetes.io/projected/edf7ce03-4be9-42d1-8a58-e0f132c43299-kube-api-access-gmrh8\") pod \"cert-manager-5b446d88c5-4z2w6\" (UID: \"edf7ce03-4be9-42d1-8a58-e0f132c43299\") " pod="cert-manager/cert-manager-5b446d88c5-4z2w6" Nov 29 05:36:15 crc kubenswrapper[4594]: I1129 05:36:15.818920 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcccc\" (UniqueName: \"kubernetes.io/projected/f288eb1f-58a0-4e9b-9b63-ba15bedb38ec-kube-api-access-hcccc\") pod \"cert-manager-webhook-5655c58dd6-4srms\" (UID: \"f288eb1f-58a0-4e9b-9b63-ba15bedb38ec\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-4srms" Nov 29 05:36:15 crc kubenswrapper[4594]: I1129 05:36:15.908582 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-btrfm" Nov 29 05:36:15 crc kubenswrapper[4594]: I1129 05:36:15.920041 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-4z2w6" Nov 29 05:36:15 crc kubenswrapper[4594]: I1129 05:36:15.920838 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcccc\" (UniqueName: \"kubernetes.io/projected/f288eb1f-58a0-4e9b-9b63-ba15bedb38ec-kube-api-access-hcccc\") pod \"cert-manager-webhook-5655c58dd6-4srms\" (UID: \"f288eb1f-58a0-4e9b-9b63-ba15bedb38ec\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-4srms" Nov 29 05:36:15 crc kubenswrapper[4594]: I1129 05:36:15.938809 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcccc\" (UniqueName: \"kubernetes.io/projected/f288eb1f-58a0-4e9b-9b63-ba15bedb38ec-kube-api-access-hcccc\") pod \"cert-manager-webhook-5655c58dd6-4srms\" (UID: \"f288eb1f-58a0-4e9b-9b63-ba15bedb38ec\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-4srms" Nov 29 05:36:15 crc kubenswrapper[4594]: I1129 05:36:15.950701 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-4srms" Nov 29 05:36:16 crc kubenswrapper[4594]: I1129 05:36:16.281681 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-4z2w6"] Nov 29 05:36:16 crc kubenswrapper[4594]: I1129 05:36:16.284638 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-btrfm"] Nov 29 05:36:16 crc kubenswrapper[4594]: I1129 05:36:16.294351 4594 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 29 05:36:16 crc kubenswrapper[4594]: I1129 05:36:16.353649 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-4srms"] Nov 29 05:36:16 crc kubenswrapper[4594]: W1129 05:36:16.357051 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf288eb1f_58a0_4e9b_9b63_ba15bedb38ec.slice/crio-459908a32433098efe08f7752b0edccd868123c2ed7d7545f8c61e3f1c6a8e36 WatchSource:0}: Error finding container 459908a32433098efe08f7752b0edccd868123c2ed7d7545f8c61e3f1c6a8e36: Status 404 returned error can't find the container with id 459908a32433098efe08f7752b0edccd868123c2ed7d7545f8c61e3f1c6a8e36 Nov 29 05:36:16 crc kubenswrapper[4594]: I1129 05:36:16.392038 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-4srms" event={"ID":"f288eb1f-58a0-4e9b-9b63-ba15bedb38ec","Type":"ContainerStarted","Data":"459908a32433098efe08f7752b0edccd868123c2ed7d7545f8c61e3f1c6a8e36"} Nov 29 05:36:16 crc kubenswrapper[4594]: I1129 05:36:16.393046 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-btrfm" event={"ID":"f4838603-21ba-451a-a9d4-d5415bc4b52a","Type":"ContainerStarted","Data":"847076b10b8a80c8d38d98c192a72f3345918e183b83e84b422ce8667627798d"} Nov 29 05:36:16 crc 
kubenswrapper[4594]: I1129 05:36:16.393738 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-4z2w6" event={"ID":"edf7ce03-4be9-42d1-8a58-e0f132c43299","Type":"ContainerStarted","Data":"b15ba354ef209ae6c6df6ba835badda2b6faf18f1e94dc73c06acdb3c929d9fa"} Nov 29 05:36:19 crc kubenswrapper[4594]: I1129 05:36:19.409340 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-btrfm" event={"ID":"f4838603-21ba-451a-a9d4-d5415bc4b52a","Type":"ContainerStarted","Data":"fe92a4ddde8dcb65060bb3a5eba28aab691b93a44de8272c0e0e8a959362d04c"} Nov 29 05:36:19 crc kubenswrapper[4594]: I1129 05:36:19.430514 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-btrfm" podStartSLOduration=2.20951387 podStartE2EDuration="4.430495444s" podCreationTimestamp="2025-11-29 05:36:15 +0000 UTC" firstStartedPulling="2025-11-29 05:36:16.294225347 +0000 UTC m=+500.534734567" lastFinishedPulling="2025-11-29 05:36:18.515206921 +0000 UTC m=+502.755716141" observedRunningTime="2025-11-29 05:36:19.429491916 +0000 UTC m=+503.670001136" watchObservedRunningTime="2025-11-29 05:36:19.430495444 +0000 UTC m=+503.671004665" Nov 29 05:36:20 crc kubenswrapper[4594]: I1129 05:36:20.414152 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-4z2w6" event={"ID":"edf7ce03-4be9-42d1-8a58-e0f132c43299","Type":"ContainerStarted","Data":"d0d1304bac789b7858a2c0a6c643febe16982e61944508f6171c13fdc3ad5399"} Nov 29 05:36:20 crc kubenswrapper[4594]: I1129 05:36:20.416268 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-4srms" event={"ID":"f288eb1f-58a0-4e9b-9b63-ba15bedb38ec","Type":"ContainerStarted","Data":"05812b63eedd5c99062f403fcdfba84bd61314b1cd322790e0557fedd3375b7b"} Nov 29 05:36:20 crc kubenswrapper[4594]: I1129 05:36:20.416590 4594 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-4srms" Nov 29 05:36:20 crc kubenswrapper[4594]: I1129 05:36:20.426840 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-4z2w6" podStartSLOduration=2.338999318 podStartE2EDuration="5.426813134s" podCreationTimestamp="2025-11-29 05:36:15 +0000 UTC" firstStartedPulling="2025-11-29 05:36:16.293994532 +0000 UTC m=+500.534503753" lastFinishedPulling="2025-11-29 05:36:19.381808349 +0000 UTC m=+503.622317569" observedRunningTime="2025-11-29 05:36:20.425488032 +0000 UTC m=+504.665997252" watchObservedRunningTime="2025-11-29 05:36:20.426813134 +0000 UTC m=+504.667322354" Nov 29 05:36:20 crc kubenswrapper[4594]: I1129 05:36:20.441600 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-4srms" podStartSLOduration=2.3928387620000002 podStartE2EDuration="5.441581361s" podCreationTimestamp="2025-11-29 05:36:15 +0000 UTC" firstStartedPulling="2025-11-29 05:36:16.35918556 +0000 UTC m=+500.599694769" lastFinishedPulling="2025-11-29 05:36:19.407928147 +0000 UTC m=+503.648437368" observedRunningTime="2025-11-29 05:36:20.441195295 +0000 UTC m=+504.681704515" watchObservedRunningTime="2025-11-29 05:36:20.441581361 +0000 UTC m=+504.682090581" Nov 29 05:36:25 crc kubenswrapper[4594]: I1129 05:36:25.953520 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-4srms" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.124181 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-lp4zm"] Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.124553 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" podUID="08de5891-72ca-488c-80b3-6b54c8c1a66e" containerName="ovn-controller" 
containerID="cri-o://9db25dbee0954a026ecc2945dda40eb48c9caf78d31c44013055e9765024794d" gracePeriod=30 Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.124610 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" podUID="08de5891-72ca-488c-80b3-6b54c8c1a66e" containerName="nbdb" containerID="cri-o://7a67e6b494618e268d6f5d75507c1cfe0e8e99c0e3cfff7548756a64f4f3a3ae" gracePeriod=30 Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.124664 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" podUID="08de5891-72ca-488c-80b3-6b54c8c1a66e" containerName="kube-rbac-proxy-node" containerID="cri-o://b9f96824f977ec455622772e91ed8e6534dc93a71868e2b5e2aa9302ff9be135" gracePeriod=30 Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.124668 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" podUID="08de5891-72ca-488c-80b3-6b54c8c1a66e" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://4cf2a15e6478f9f461a9927f570db4a81a8d752c8be46c7970b4956ae2031a38" gracePeriod=30 Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.124779 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" podUID="08de5891-72ca-488c-80b3-6b54c8c1a66e" containerName="northd" containerID="cri-o://77543b04b619c48bf800a3d9dda1a969e4e32aa113a49287717a9e83f9cff96f" gracePeriod=30 Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.124844 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" podUID="08de5891-72ca-488c-80b3-6b54c8c1a66e" containerName="ovn-acl-logging" containerID="cri-o://78fd1c6bdf08a557f0c2223c538a946709b9a225032c92ea75277140441a59c6" gracePeriod=30 Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 
05:36:27.126663 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" podUID="08de5891-72ca-488c-80b3-6b54c8c1a66e" containerName="sbdb" containerID="cri-o://7cc8b34a52032d4716652fc9edcc35dd797309bf6886242671c068e65399d004" gracePeriod=30 Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.147692 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" podUID="08de5891-72ca-488c-80b3-6b54c8c1a66e" containerName="ovnkube-controller" containerID="cri-o://c128df511eae5df862fc536a509d044880a30a799281e2e5abda4ed38518be53" gracePeriod=30 Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.390531 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lp4zm_08de5891-72ca-488c-80b3-6b54c8c1a66e/ovnkube-controller/3.log" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.392716 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lp4zm_08de5891-72ca-488c-80b3-6b54c8c1a66e/ovn-acl-logging/0.log" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.393110 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lp4zm_08de5891-72ca-488c-80b3-6b54c8c1a66e/ovn-controller/0.log" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.393546 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.436563 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-z2tw9"] Nov 29 05:36:27 crc kubenswrapper[4594]: E1129 05:36:27.436804 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08de5891-72ca-488c-80b3-6b54c8c1a66e" containerName="ovn-controller" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.436822 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="08de5891-72ca-488c-80b3-6b54c8c1a66e" containerName="ovn-controller" Nov 29 05:36:27 crc kubenswrapper[4594]: E1129 05:36:27.436838 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08de5891-72ca-488c-80b3-6b54c8c1a66e" containerName="kubecfg-setup" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.436844 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="08de5891-72ca-488c-80b3-6b54c8c1a66e" containerName="kubecfg-setup" Nov 29 05:36:27 crc kubenswrapper[4594]: E1129 05:36:27.436850 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08de5891-72ca-488c-80b3-6b54c8c1a66e" containerName="nbdb" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.436857 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="08de5891-72ca-488c-80b3-6b54c8c1a66e" containerName="nbdb" Nov 29 05:36:27 crc kubenswrapper[4594]: E1129 05:36:27.436867 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08de5891-72ca-488c-80b3-6b54c8c1a66e" containerName="northd" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.436872 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="08de5891-72ca-488c-80b3-6b54c8c1a66e" containerName="northd" Nov 29 05:36:27 crc kubenswrapper[4594]: E1129 05:36:27.436879 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08de5891-72ca-488c-80b3-6b54c8c1a66e" containerName="ovnkube-controller" Nov 29 
05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.436885 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="08de5891-72ca-488c-80b3-6b54c8c1a66e" containerName="ovnkube-controller" Nov 29 05:36:27 crc kubenswrapper[4594]: E1129 05:36:27.436894 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08de5891-72ca-488c-80b3-6b54c8c1a66e" containerName="sbdb" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.436900 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="08de5891-72ca-488c-80b3-6b54c8c1a66e" containerName="sbdb" Nov 29 05:36:27 crc kubenswrapper[4594]: E1129 05:36:27.436910 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08de5891-72ca-488c-80b3-6b54c8c1a66e" containerName="kube-rbac-proxy-node" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.436916 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="08de5891-72ca-488c-80b3-6b54c8c1a66e" containerName="kube-rbac-proxy-node" Nov 29 05:36:27 crc kubenswrapper[4594]: E1129 05:36:27.436927 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08de5891-72ca-488c-80b3-6b54c8c1a66e" containerName="ovnkube-controller" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.436935 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="08de5891-72ca-488c-80b3-6b54c8c1a66e" containerName="ovnkube-controller" Nov 29 05:36:27 crc kubenswrapper[4594]: E1129 05:36:27.436943 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08de5891-72ca-488c-80b3-6b54c8c1a66e" containerName="ovn-acl-logging" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.436949 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="08de5891-72ca-488c-80b3-6b54c8c1a66e" containerName="ovn-acl-logging" Nov 29 05:36:27 crc kubenswrapper[4594]: E1129 05:36:27.436956 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08de5891-72ca-488c-80b3-6b54c8c1a66e" containerName="ovnkube-controller" Nov 29 05:36:27 crc 
kubenswrapper[4594]: I1129 05:36:27.436962 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="08de5891-72ca-488c-80b3-6b54c8c1a66e" containerName="ovnkube-controller" Nov 29 05:36:27 crc kubenswrapper[4594]: E1129 05:36:27.436970 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08de5891-72ca-488c-80b3-6b54c8c1a66e" containerName="kube-rbac-proxy-ovn-metrics" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.436975 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="08de5891-72ca-488c-80b3-6b54c8c1a66e" containerName="kube-rbac-proxy-ovn-metrics" Nov 29 05:36:27 crc kubenswrapper[4594]: E1129 05:36:27.436982 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08de5891-72ca-488c-80b3-6b54c8c1a66e" containerName="ovnkube-controller" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.436988 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="08de5891-72ca-488c-80b3-6b54c8c1a66e" containerName="ovnkube-controller" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.437108 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="08de5891-72ca-488c-80b3-6b54c8c1a66e" containerName="ovnkube-controller" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.437117 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="08de5891-72ca-488c-80b3-6b54c8c1a66e" containerName="ovn-controller" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.437125 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="08de5891-72ca-488c-80b3-6b54c8c1a66e" containerName="ovnkube-controller" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.437132 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="08de5891-72ca-488c-80b3-6b54c8c1a66e" containerName="sbdb" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.437139 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="08de5891-72ca-488c-80b3-6b54c8c1a66e" containerName="nbdb" 
Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.437146 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="08de5891-72ca-488c-80b3-6b54c8c1a66e" containerName="kube-rbac-proxy-ovn-metrics" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.437153 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="08de5891-72ca-488c-80b3-6b54c8c1a66e" containerName="ovnkube-controller" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.437160 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="08de5891-72ca-488c-80b3-6b54c8c1a66e" containerName="ovn-acl-logging" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.437167 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="08de5891-72ca-488c-80b3-6b54c8c1a66e" containerName="northd" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.437174 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="08de5891-72ca-488c-80b3-6b54c8c1a66e" containerName="ovnkube-controller" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.437181 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="08de5891-72ca-488c-80b3-6b54c8c1a66e" containerName="ovnkube-controller" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.437190 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="08de5891-72ca-488c-80b3-6b54c8c1a66e" containerName="kube-rbac-proxy-node" Nov 29 05:36:27 crc kubenswrapper[4594]: E1129 05:36:27.437333 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08de5891-72ca-488c-80b3-6b54c8c1a66e" containerName="ovnkube-controller" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.437341 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="08de5891-72ca-488c-80b3-6b54c8c1a66e" containerName="ovnkube-controller" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.439171 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-z2tw9" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.453860 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/08de5891-72ca-488c-80b3-6b54c8c1a66e-run-openvswitch\") pod \"08de5891-72ca-488c-80b3-6b54c8c1a66e\" (UID: \"08de5891-72ca-488c-80b3-6b54c8c1a66e\") " Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.453983 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/08de5891-72ca-488c-80b3-6b54c8c1a66e-etc-openvswitch\") pod \"08de5891-72ca-488c-80b3-6b54c8c1a66e\" (UID: \"08de5891-72ca-488c-80b3-6b54c8c1a66e\") " Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.453987 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/08de5891-72ca-488c-80b3-6b54c8c1a66e-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "08de5891-72ca-488c-80b3-6b54c8c1a66e" (UID: "08de5891-72ca-488c-80b3-6b54c8c1a66e"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.454027 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/08de5891-72ca-488c-80b3-6b54c8c1a66e-run-systemd\") pod \"08de5891-72ca-488c-80b3-6b54c8c1a66e\" (UID: \"08de5891-72ca-488c-80b3-6b54c8c1a66e\") " Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.454080 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/08de5891-72ca-488c-80b3-6b54c8c1a66e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"08de5891-72ca-488c-80b3-6b54c8c1a66e\" (UID: \"08de5891-72ca-488c-80b3-6b54c8c1a66e\") " Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.454102 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/08de5891-72ca-488c-80b3-6b54c8c1a66e-host-cni-netd\") pod \"08de5891-72ca-488c-80b3-6b54c8c1a66e\" (UID: \"08de5891-72ca-488c-80b3-6b54c8c1a66e\") " Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.454133 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/08de5891-72ca-488c-80b3-6b54c8c1a66e-host-kubelet\") pod \"08de5891-72ca-488c-80b3-6b54c8c1a66e\" (UID: \"08de5891-72ca-488c-80b3-6b54c8c1a66e\") " Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.454158 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/08de5891-72ca-488c-80b3-6b54c8c1a66e-host-slash\") pod \"08de5891-72ca-488c-80b3-6b54c8c1a66e\" (UID: \"08de5891-72ca-488c-80b3-6b54c8c1a66e\") " Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.454200 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/08de5891-72ca-488c-80b3-6b54c8c1a66e-env-overrides\") pod \"08de5891-72ca-488c-80b3-6b54c8c1a66e\" (UID: \"08de5891-72ca-488c-80b3-6b54c8c1a66e\") " Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.454233 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/08de5891-72ca-488c-80b3-6b54c8c1a66e-ovnkube-config\") pod \"08de5891-72ca-488c-80b3-6b54c8c1a66e\" (UID: \"08de5891-72ca-488c-80b3-6b54c8c1a66e\") " Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.454277 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/08de5891-72ca-488c-80b3-6b54c8c1a66e-systemd-units\") pod \"08de5891-72ca-488c-80b3-6b54c8c1a66e\" (UID: \"08de5891-72ca-488c-80b3-6b54c8c1a66e\") " Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.454362 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lp4zm_08de5891-72ca-488c-80b3-6b54c8c1a66e/ovnkube-controller/3.log" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.454417 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/08de5891-72ca-488c-80b3-6b54c8c1a66e-host-run-netns\") pod \"08de5891-72ca-488c-80b3-6b54c8c1a66e\" (UID: \"08de5891-72ca-488c-80b3-6b54c8c1a66e\") " Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.454469 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/08de5891-72ca-488c-80b3-6b54c8c1a66e-log-socket\") pod \"08de5891-72ca-488c-80b3-6b54c8c1a66e\" (UID: \"08de5891-72ca-488c-80b3-6b54c8c1a66e\") " Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.454469 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/08de5891-72ca-488c-80b3-6b54c8c1a66e-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "08de5891-72ca-488c-80b3-6b54c8c1a66e" (UID: "08de5891-72ca-488c-80b3-6b54c8c1a66e"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.454501 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/08de5891-72ca-488c-80b3-6b54c8c1a66e-var-lib-openvswitch\") pod \"08de5891-72ca-488c-80b3-6b54c8c1a66e\" (UID: \"08de5891-72ca-488c-80b3-6b54c8c1a66e\") " Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.454524 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/08de5891-72ca-488c-80b3-6b54c8c1a66e-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "08de5891-72ca-488c-80b3-6b54c8c1a66e" (UID: "08de5891-72ca-488c-80b3-6b54c8c1a66e"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.454552 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/08de5891-72ca-488c-80b3-6b54c8c1a66e-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "08de5891-72ca-488c-80b3-6b54c8c1a66e" (UID: "08de5891-72ca-488c-80b3-6b54c8c1a66e"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.454561 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/08de5891-72ca-488c-80b3-6b54c8c1a66e-run-ovn\") pod \"08de5891-72ca-488c-80b3-6b54c8c1a66e\" (UID: \"08de5891-72ca-488c-80b3-6b54c8c1a66e\") " Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.454625 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/08de5891-72ca-488c-80b3-6b54c8c1a66e-ovn-node-metrics-cert\") pod \"08de5891-72ca-488c-80b3-6b54c8c1a66e\" (UID: \"08de5891-72ca-488c-80b3-6b54c8c1a66e\") " Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.454652 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/08de5891-72ca-488c-80b3-6b54c8c1a66e-host-run-ovn-kubernetes\") pod \"08de5891-72ca-488c-80b3-6b54c8c1a66e\" (UID: \"08de5891-72ca-488c-80b3-6b54c8c1a66e\") " Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.454702 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/08de5891-72ca-488c-80b3-6b54c8c1a66e-ovnkube-script-lib\") pod \"08de5891-72ca-488c-80b3-6b54c8c1a66e\" (UID: \"08de5891-72ca-488c-80b3-6b54c8c1a66e\") " Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.454722 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/08de5891-72ca-488c-80b3-6b54c8c1a66e-host-cni-bin\") pod \"08de5891-72ca-488c-80b3-6b54c8c1a66e\" (UID: \"08de5891-72ca-488c-80b3-6b54c8c1a66e\") " Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.454770 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-cjws8\" (UniqueName: \"kubernetes.io/projected/08de5891-72ca-488c-80b3-6b54c8c1a66e-kube-api-access-cjws8\") pod \"08de5891-72ca-488c-80b3-6b54c8c1a66e\" (UID: \"08de5891-72ca-488c-80b3-6b54c8c1a66e\") " Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.454818 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/08de5891-72ca-488c-80b3-6b54c8c1a66e-node-log\") pod \"08de5891-72ca-488c-80b3-6b54c8c1a66e\" (UID: \"08de5891-72ca-488c-80b3-6b54c8c1a66e\") " Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.454646 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/08de5891-72ca-488c-80b3-6b54c8c1a66e-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "08de5891-72ca-488c-80b3-6b54c8c1a66e" (UID: "08de5891-72ca-488c-80b3-6b54c8c1a66e"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.454676 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/08de5891-72ca-488c-80b3-6b54c8c1a66e-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "08de5891-72ca-488c-80b3-6b54c8c1a66e" (UID: "08de5891-72ca-488c-80b3-6b54c8c1a66e"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.454710 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/08de5891-72ca-488c-80b3-6b54c8c1a66e-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "08de5891-72ca-488c-80b3-6b54c8c1a66e" (UID: "08de5891-72ca-488c-80b3-6b54c8c1a66e"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.454713 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/08de5891-72ca-488c-80b3-6b54c8c1a66e-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "08de5891-72ca-488c-80b3-6b54c8c1a66e" (UID: "08de5891-72ca-488c-80b3-6b54c8c1a66e"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.454739 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/08de5891-72ca-488c-80b3-6b54c8c1a66e-host-slash" (OuterVolumeSpecName: "host-slash") pod "08de5891-72ca-488c-80b3-6b54c8c1a66e" (UID: "08de5891-72ca-488c-80b3-6b54c8c1a66e"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.454740 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/08de5891-72ca-488c-80b3-6b54c8c1a66e-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "08de5891-72ca-488c-80b3-6b54c8c1a66e" (UID: "08de5891-72ca-488c-80b3-6b54c8c1a66e"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.454738 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/08de5891-72ca-488c-80b3-6b54c8c1a66e-log-socket" (OuterVolumeSpecName: "log-socket") pod "08de5891-72ca-488c-80b3-6b54c8c1a66e" (UID: "08de5891-72ca-488c-80b3-6b54c8c1a66e"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.454760 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/08de5891-72ca-488c-80b3-6b54c8c1a66e-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "08de5891-72ca-488c-80b3-6b54c8c1a66e" (UID: "08de5891-72ca-488c-80b3-6b54c8c1a66e"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.454795 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/08de5891-72ca-488c-80b3-6b54c8c1a66e-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "08de5891-72ca-488c-80b3-6b54c8c1a66e" (UID: "08de5891-72ca-488c-80b3-6b54c8c1a66e"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.455007 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/08de5891-72ca-488c-80b3-6b54c8c1a66e-node-log" (OuterVolumeSpecName: "node-log") pod "08de5891-72ca-488c-80b3-6b54c8c1a66e" (UID: "08de5891-72ca-488c-80b3-6b54c8c1a66e"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.455176 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3d9055c2-3e35-408e-9426-2f471a991a3e-run-openvswitch\") pod \"ovnkube-node-z2tw9\" (UID: \"3d9055c2-3e35-408e-9426-2f471a991a3e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2tw9" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.455225 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08de5891-72ca-488c-80b3-6b54c8c1a66e-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "08de5891-72ca-488c-80b3-6b54c8c1a66e" (UID: "08de5891-72ca-488c-80b3-6b54c8c1a66e"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.455283 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3d9055c2-3e35-408e-9426-2f471a991a3e-log-socket\") pod \"ovnkube-node-z2tw9\" (UID: \"3d9055c2-3e35-408e-9426-2f471a991a3e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2tw9" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.455360 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3d9055c2-3e35-408e-9426-2f471a991a3e-node-log\") pod \"ovnkube-node-z2tw9\" (UID: \"3d9055c2-3e35-408e-9426-2f471a991a3e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2tw9" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.455412 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3d9055c2-3e35-408e-9426-2f471a991a3e-run-ovn\") pod \"ovnkube-node-z2tw9\" (UID: 
\"3d9055c2-3e35-408e-9426-2f471a991a3e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2tw9" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.455470 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3d9055c2-3e35-408e-9426-2f471a991a3e-ovnkube-script-lib\") pod \"ovnkube-node-z2tw9\" (UID: \"3d9055c2-3e35-408e-9426-2f471a991a3e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2tw9" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.455426 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08de5891-72ca-488c-80b3-6b54c8c1a66e-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "08de5891-72ca-488c-80b3-6b54c8c1a66e" (UID: "08de5891-72ca-488c-80b3-6b54c8c1a66e"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.455438 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08de5891-72ca-488c-80b3-6b54c8c1a66e-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "08de5891-72ca-488c-80b3-6b54c8c1a66e" (UID: "08de5891-72ca-488c-80b3-6b54c8c1a66e"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.455501 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3d9055c2-3e35-408e-9426-2f471a991a3e-etc-openvswitch\") pod \"ovnkube-node-z2tw9\" (UID: \"3d9055c2-3e35-408e-9426-2f471a991a3e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2tw9" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.455619 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3d9055c2-3e35-408e-9426-2f471a991a3e-host-cni-netd\") pod \"ovnkube-node-z2tw9\" (UID: \"3d9055c2-3e35-408e-9426-2f471a991a3e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2tw9" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.455656 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3d9055c2-3e35-408e-9426-2f471a991a3e-systemd-units\") pod \"ovnkube-node-z2tw9\" (UID: \"3d9055c2-3e35-408e-9426-2f471a991a3e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2tw9" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.455697 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3d9055c2-3e35-408e-9426-2f471a991a3e-env-overrides\") pod \"ovnkube-node-z2tw9\" (UID: \"3d9055c2-3e35-408e-9426-2f471a991a3e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2tw9" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.455807 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3d9055c2-3e35-408e-9426-2f471a991a3e-host-slash\") pod \"ovnkube-node-z2tw9\" (UID: 
\"3d9055c2-3e35-408e-9426-2f471a991a3e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2tw9" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.455869 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3d9055c2-3e35-408e-9426-2f471a991a3e-host-cni-bin\") pod \"ovnkube-node-z2tw9\" (UID: \"3d9055c2-3e35-408e-9426-2f471a991a3e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2tw9" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.455890 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3d9055c2-3e35-408e-9426-2f471a991a3e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-z2tw9\" (UID: \"3d9055c2-3e35-408e-9426-2f471a991a3e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2tw9" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.455915 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fb7qz\" (UniqueName: \"kubernetes.io/projected/3d9055c2-3e35-408e-9426-2f471a991a3e-kube-api-access-fb7qz\") pod \"ovnkube-node-z2tw9\" (UID: \"3d9055c2-3e35-408e-9426-2f471a991a3e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2tw9" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.455941 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3d9055c2-3e35-408e-9426-2f471a991a3e-var-lib-openvswitch\") pod \"ovnkube-node-z2tw9\" (UID: \"3d9055c2-3e35-408e-9426-2f471a991a3e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2tw9" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.455995 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/3d9055c2-3e35-408e-9426-2f471a991a3e-ovnkube-config\") pod \"ovnkube-node-z2tw9\" (UID: \"3d9055c2-3e35-408e-9426-2f471a991a3e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2tw9" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.456034 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3d9055c2-3e35-408e-9426-2f471a991a3e-host-run-netns\") pod \"ovnkube-node-z2tw9\" (UID: \"3d9055c2-3e35-408e-9426-2f471a991a3e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2tw9" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.456090 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3d9055c2-3e35-408e-9426-2f471a991a3e-ovn-node-metrics-cert\") pod \"ovnkube-node-z2tw9\" (UID: \"3d9055c2-3e35-408e-9426-2f471a991a3e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2tw9" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.456138 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3d9055c2-3e35-408e-9426-2f471a991a3e-run-systemd\") pod \"ovnkube-node-z2tw9\" (UID: \"3d9055c2-3e35-408e-9426-2f471a991a3e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2tw9" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.456198 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3d9055c2-3e35-408e-9426-2f471a991a3e-host-kubelet\") pod \"ovnkube-node-z2tw9\" (UID: \"3d9055c2-3e35-408e-9426-2f471a991a3e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2tw9" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.456308 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3d9055c2-3e35-408e-9426-2f471a991a3e-host-run-ovn-kubernetes\") pod \"ovnkube-node-z2tw9\" (UID: \"3d9055c2-3e35-408e-9426-2f471a991a3e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2tw9" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.456424 4594 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/08de5891-72ca-488c-80b3-6b54c8c1a66e-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.456442 4594 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/08de5891-72ca-488c-80b3-6b54c8c1a66e-host-cni-netd\") on node \"crc\" DevicePath \"\"" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.456456 4594 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/08de5891-72ca-488c-80b3-6b54c8c1a66e-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.456467 4594 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/08de5891-72ca-488c-80b3-6b54c8c1a66e-host-kubelet\") on node \"crc\" DevicePath \"\"" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.456477 4594 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/08de5891-72ca-488c-80b3-6b54c8c1a66e-host-slash\") on node \"crc\" DevicePath \"\"" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.456487 4594 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/08de5891-72ca-488c-80b3-6b54c8c1a66e-env-overrides\") on node \"crc\" DevicePath \"\"" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.456496 4594 reconciler_common.go:293] 
"Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/08de5891-72ca-488c-80b3-6b54c8c1a66e-ovnkube-config\") on node \"crc\" DevicePath \"\"" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.456505 4594 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/08de5891-72ca-488c-80b3-6b54c8c1a66e-systemd-units\") on node \"crc\" DevicePath \"\"" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.456516 4594 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/08de5891-72ca-488c-80b3-6b54c8c1a66e-host-run-netns\") on node \"crc\" DevicePath \"\"" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.456525 4594 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/08de5891-72ca-488c-80b3-6b54c8c1a66e-log-socket\") on node \"crc\" DevicePath \"\"" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.456534 4594 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/08de5891-72ca-488c-80b3-6b54c8c1a66e-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.456544 4594 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/08de5891-72ca-488c-80b3-6b54c8c1a66e-run-ovn\") on node \"crc\" DevicePath \"\"" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.456552 4594 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/08de5891-72ca-488c-80b3-6b54c8c1a66e-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.456561 4594 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/08de5891-72ca-488c-80b3-6b54c8c1a66e-host-cni-bin\") on node \"crc\" DevicePath \"\"" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.456569 4594 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/08de5891-72ca-488c-80b3-6b54c8c1a66e-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.456578 4594 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/08de5891-72ca-488c-80b3-6b54c8c1a66e-node-log\") on node \"crc\" DevicePath \"\"" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.456589 4594 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/08de5891-72ca-488c-80b3-6b54c8c1a66e-run-openvswitch\") on node \"crc\" DevicePath \"\"" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.458131 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lp4zm_08de5891-72ca-488c-80b3-6b54c8c1a66e/ovn-acl-logging/0.log" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.458569 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lp4zm_08de5891-72ca-488c-80b3-6b54c8c1a66e/ovn-controller/0.log" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.459021 4594 generic.go:334] "Generic (PLEG): container finished" podID="08de5891-72ca-488c-80b3-6b54c8c1a66e" containerID="c128df511eae5df862fc536a509d044880a30a799281e2e5abda4ed38518be53" exitCode=0 Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.459049 4594 generic.go:334] "Generic (PLEG): container finished" podID="08de5891-72ca-488c-80b3-6b54c8c1a66e" containerID="7cc8b34a52032d4716652fc9edcc35dd797309bf6886242671c068e65399d004" exitCode=0 Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.459057 4594 generic.go:334] "Generic (PLEG): container 
finished" podID="08de5891-72ca-488c-80b3-6b54c8c1a66e" containerID="7a67e6b494618e268d6f5d75507c1cfe0e8e99c0e3cfff7548756a64f4f3a3ae" exitCode=0 Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.459065 4594 generic.go:334] "Generic (PLEG): container finished" podID="08de5891-72ca-488c-80b3-6b54c8c1a66e" containerID="77543b04b619c48bf800a3d9dda1a969e4e32aa113a49287717a9e83f9cff96f" exitCode=0 Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.459072 4594 generic.go:334] "Generic (PLEG): container finished" podID="08de5891-72ca-488c-80b3-6b54c8c1a66e" containerID="4cf2a15e6478f9f461a9927f570db4a81a8d752c8be46c7970b4956ae2031a38" exitCode=0 Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.459080 4594 generic.go:334] "Generic (PLEG): container finished" podID="08de5891-72ca-488c-80b3-6b54c8c1a66e" containerID="b9f96824f977ec455622772e91ed8e6534dc93a71868e2b5e2aa9302ff9be135" exitCode=0 Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.459088 4594 generic.go:334] "Generic (PLEG): container finished" podID="08de5891-72ca-488c-80b3-6b54c8c1a66e" containerID="78fd1c6bdf08a557f0c2223c538a946709b9a225032c92ea75277140441a59c6" exitCode=143 Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.459096 4594 generic.go:334] "Generic (PLEG): container finished" podID="08de5891-72ca-488c-80b3-6b54c8c1a66e" containerID="9db25dbee0954a026ecc2945dda40eb48c9caf78d31c44013055e9765024794d" exitCode=143 Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.459115 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.459108 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" event={"ID":"08de5891-72ca-488c-80b3-6b54c8c1a66e","Type":"ContainerDied","Data":"c128df511eae5df862fc536a509d044880a30a799281e2e5abda4ed38518be53"} Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.460159 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" event={"ID":"08de5891-72ca-488c-80b3-6b54c8c1a66e","Type":"ContainerDied","Data":"7cc8b34a52032d4716652fc9edcc35dd797309bf6886242671c068e65399d004"} Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.460185 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" event={"ID":"08de5891-72ca-488c-80b3-6b54c8c1a66e","Type":"ContainerDied","Data":"7a67e6b494618e268d6f5d75507c1cfe0e8e99c0e3cfff7548756a64f4f3a3ae"} Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.460200 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" event={"ID":"08de5891-72ca-488c-80b3-6b54c8c1a66e","Type":"ContainerDied","Data":"77543b04b619c48bf800a3d9dda1a969e4e32aa113a49287717a9e83f9cff96f"} Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.460212 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" event={"ID":"08de5891-72ca-488c-80b3-6b54c8c1a66e","Type":"ContainerDied","Data":"4cf2a15e6478f9f461a9927f570db4a81a8d752c8be46c7970b4956ae2031a38"} Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.460227 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" event={"ID":"08de5891-72ca-488c-80b3-6b54c8c1a66e","Type":"ContainerDied","Data":"b9f96824f977ec455622772e91ed8e6534dc93a71868e2b5e2aa9302ff9be135"} Nov 29 05:36:27 crc 
kubenswrapper[4594]: I1129 05:36:27.460246 4594 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1fb430ec13486fa454ce5a026c9035daca3b18f8f31f2304bb6c5eb544778a85"} Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.460288 4594 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7cc8b34a52032d4716652fc9edcc35dd797309bf6886242671c068e65399d004"} Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.460296 4594 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7a67e6b494618e268d6f5d75507c1cfe0e8e99c0e3cfff7548756a64f4f3a3ae"} Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.460304 4594 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"77543b04b619c48bf800a3d9dda1a969e4e32aa113a49287717a9e83f9cff96f"} Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.460310 4594 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4cf2a15e6478f9f461a9927f570db4a81a8d752c8be46c7970b4956ae2031a38"} Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.460318 4594 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b9f96824f977ec455622772e91ed8e6534dc93a71868e2b5e2aa9302ff9be135"} Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.460325 4594 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"78fd1c6bdf08a557f0c2223c538a946709b9a225032c92ea75277140441a59c6"} Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.460333 4594 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9db25dbee0954a026ecc2945dda40eb48c9caf78d31c44013055e9765024794d"} Nov 29 05:36:27 crc 
kubenswrapper[4594]: I1129 05:36:27.460339 4594 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b86e524b600a831fe6b8bb71a513172f974b457d39363da90cd6152139eae898"} Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.460346 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" event={"ID":"08de5891-72ca-488c-80b3-6b54c8c1a66e","Type":"ContainerDied","Data":"78fd1c6bdf08a557f0c2223c538a946709b9a225032c92ea75277140441a59c6"} Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.460357 4594 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c128df511eae5df862fc536a509d044880a30a799281e2e5abda4ed38518be53"} Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.460369 4594 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1fb430ec13486fa454ce5a026c9035daca3b18f8f31f2304bb6c5eb544778a85"} Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.460374 4594 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7cc8b34a52032d4716652fc9edcc35dd797309bf6886242671c068e65399d004"} Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.460390 4594 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7a67e6b494618e268d6f5d75507c1cfe0e8e99c0e3cfff7548756a64f4f3a3ae"} Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.460268 4594 scope.go:117] "RemoveContainer" containerID="c128df511eae5df862fc536a509d044880a30a799281e2e5abda4ed38518be53" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.460400 4594 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"77543b04b619c48bf800a3d9dda1a969e4e32aa113a49287717a9e83f9cff96f"} Nov 29 05:36:27 crc 
kubenswrapper[4594]: I1129 05:36:27.460514 4594 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4cf2a15e6478f9f461a9927f570db4a81a8d752c8be46c7970b4956ae2031a38"} Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.460535 4594 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b9f96824f977ec455622772e91ed8e6534dc93a71868e2b5e2aa9302ff9be135"} Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.460541 4594 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"78fd1c6bdf08a557f0c2223c538a946709b9a225032c92ea75277140441a59c6"} Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.460548 4594 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9db25dbee0954a026ecc2945dda40eb48c9caf78d31c44013055e9765024794d"} Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.460553 4594 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b86e524b600a831fe6b8bb71a513172f974b457d39363da90cd6152139eae898"} Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.460576 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" event={"ID":"08de5891-72ca-488c-80b3-6b54c8c1a66e","Type":"ContainerDied","Data":"9db25dbee0954a026ecc2945dda40eb48c9caf78d31c44013055e9765024794d"} Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.460600 4594 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c128df511eae5df862fc536a509d044880a30a799281e2e5abda4ed38518be53"} Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.460607 4594 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"1fb430ec13486fa454ce5a026c9035daca3b18f8f31f2304bb6c5eb544778a85"} Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.460612 4594 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7cc8b34a52032d4716652fc9edcc35dd797309bf6886242671c068e65399d004"} Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.460617 4594 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7a67e6b494618e268d6f5d75507c1cfe0e8e99c0e3cfff7548756a64f4f3a3ae"} Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.460623 4594 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"77543b04b619c48bf800a3d9dda1a969e4e32aa113a49287717a9e83f9cff96f"} Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.460628 4594 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4cf2a15e6478f9f461a9927f570db4a81a8d752c8be46c7970b4956ae2031a38"} Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.460632 4594 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b9f96824f977ec455622772e91ed8e6534dc93a71868e2b5e2aa9302ff9be135"} Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.460637 4594 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"78fd1c6bdf08a557f0c2223c538a946709b9a225032c92ea75277140441a59c6"} Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.460644 4594 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9db25dbee0954a026ecc2945dda40eb48c9caf78d31c44013055e9765024794d"} Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.460649 4594 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"b86e524b600a831fe6b8bb71a513172f974b457d39363da90cd6152139eae898"} Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.460656 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lp4zm" event={"ID":"08de5891-72ca-488c-80b3-6b54c8c1a66e","Type":"ContainerDied","Data":"ebad62949c66a06bbc352e12d8a068e3240c97cfceef191222d7dc266badfcd2"} Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.460666 4594 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c128df511eae5df862fc536a509d044880a30a799281e2e5abda4ed38518be53"} Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.460673 4594 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1fb430ec13486fa454ce5a026c9035daca3b18f8f31f2304bb6c5eb544778a85"} Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.460678 4594 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7cc8b34a52032d4716652fc9edcc35dd797309bf6886242671c068e65399d004"} Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.460683 4594 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7a67e6b494618e268d6f5d75507c1cfe0e8e99c0e3cfff7548756a64f4f3a3ae"} Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.460688 4594 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"77543b04b619c48bf800a3d9dda1a969e4e32aa113a49287717a9e83f9cff96f"} Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.460693 4594 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4cf2a15e6478f9f461a9927f570db4a81a8d752c8be46c7970b4956ae2031a38"} Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.460698 4594 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b9f96824f977ec455622772e91ed8e6534dc93a71868e2b5e2aa9302ff9be135"} Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.460704 4594 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"78fd1c6bdf08a557f0c2223c538a946709b9a225032c92ea75277140441a59c6"} Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.460709 4594 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9db25dbee0954a026ecc2945dda40eb48c9caf78d31c44013055e9765024794d"} Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.460714 4594 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b86e524b600a831fe6b8bb71a513172f974b457d39363da90cd6152139eae898"} Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.464588 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7plzz_e5052790-d231-4f97-802c-c7de3cd72561/kube-multus/2.log" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.465054 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7plzz_e5052790-d231-4f97-802c-c7de3cd72561/kube-multus/1.log" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.465119 4594 generic.go:334] "Generic (PLEG): container finished" podID="e5052790-d231-4f97-802c-c7de3cd72561" containerID="fe1439b8aee18c36fcc41d5a5289107a458927f46cfb33947018b6cd3f7bee68" exitCode=2 Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.465167 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7plzz" event={"ID":"e5052790-d231-4f97-802c-c7de3cd72561","Type":"ContainerDied","Data":"fe1439b8aee18c36fcc41d5a5289107a458927f46cfb33947018b6cd3f7bee68"} Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.465197 4594 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"abc4e229bc1c5698fec5a7e0da8decb0167770496e7f8dce7a9a9276721ee26c"} Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.465818 4594 scope.go:117] "RemoveContainer" containerID="fe1439b8aee18c36fcc41d5a5289107a458927f46cfb33947018b6cd3f7bee68" Nov 29 05:36:27 crc kubenswrapper[4594]: E1129 05:36:27.466011 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-7plzz_openshift-multus(e5052790-d231-4f97-802c-c7de3cd72561)\"" pod="openshift-multus/multus-7plzz" podUID="e5052790-d231-4f97-802c-c7de3cd72561" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.466927 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08de5891-72ca-488c-80b3-6b54c8c1a66e-kube-api-access-cjws8" (OuterVolumeSpecName: "kube-api-access-cjws8") pod "08de5891-72ca-488c-80b3-6b54c8c1a66e" (UID: "08de5891-72ca-488c-80b3-6b54c8c1a66e"). InnerVolumeSpecName "kube-api-access-cjws8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.466951 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08de5891-72ca-488c-80b3-6b54c8c1a66e-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "08de5891-72ca-488c-80b3-6b54c8c1a66e" (UID: "08de5891-72ca-488c-80b3-6b54c8c1a66e"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.472779 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/08de5891-72ca-488c-80b3-6b54c8c1a66e-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "08de5891-72ca-488c-80b3-6b54c8c1a66e" (UID: "08de5891-72ca-488c-80b3-6b54c8c1a66e"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.489310 4594 scope.go:117] "RemoveContainer" containerID="1fb430ec13486fa454ce5a026c9035daca3b18f8f31f2304bb6c5eb544778a85" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.501441 4594 scope.go:117] "RemoveContainer" containerID="7cc8b34a52032d4716652fc9edcc35dd797309bf6886242671c068e65399d004" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.515440 4594 scope.go:117] "RemoveContainer" containerID="7a67e6b494618e268d6f5d75507c1cfe0e8e99c0e3cfff7548756a64f4f3a3ae" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.528473 4594 scope.go:117] "RemoveContainer" containerID="77543b04b619c48bf800a3d9dda1a969e4e32aa113a49287717a9e83f9cff96f" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.540551 4594 scope.go:117] "RemoveContainer" containerID="4cf2a15e6478f9f461a9927f570db4a81a8d752c8be46c7970b4956ae2031a38" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.552168 4594 scope.go:117] "RemoveContainer" containerID="b9f96824f977ec455622772e91ed8e6534dc93a71868e2b5e2aa9302ff9be135" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.557185 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3d9055c2-3e35-408e-9426-2f471a991a3e-host-slash\") pod \"ovnkube-node-z2tw9\" (UID: \"3d9055c2-3e35-408e-9426-2f471a991a3e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2tw9" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.557237 
4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fb7qz\" (UniqueName: \"kubernetes.io/projected/3d9055c2-3e35-408e-9426-2f471a991a3e-kube-api-access-fb7qz\") pod \"ovnkube-node-z2tw9\" (UID: \"3d9055c2-3e35-408e-9426-2f471a991a3e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2tw9" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.557287 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3d9055c2-3e35-408e-9426-2f471a991a3e-var-lib-openvswitch\") pod \"ovnkube-node-z2tw9\" (UID: \"3d9055c2-3e35-408e-9426-2f471a991a3e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2tw9" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.557306 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3d9055c2-3e35-408e-9426-2f471a991a3e-host-cni-bin\") pod \"ovnkube-node-z2tw9\" (UID: \"3d9055c2-3e35-408e-9426-2f471a991a3e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2tw9" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.557355 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3d9055c2-3e35-408e-9426-2f471a991a3e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-z2tw9\" (UID: \"3d9055c2-3e35-408e-9426-2f471a991a3e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2tw9" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.557373 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3d9055c2-3e35-408e-9426-2f471a991a3e-ovnkube-config\") pod \"ovnkube-node-z2tw9\" (UID: \"3d9055c2-3e35-408e-9426-2f471a991a3e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2tw9" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.557399 4594 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3d9055c2-3e35-408e-9426-2f471a991a3e-host-run-netns\") pod \"ovnkube-node-z2tw9\" (UID: \"3d9055c2-3e35-408e-9426-2f471a991a3e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2tw9" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.557440 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3d9055c2-3e35-408e-9426-2f471a991a3e-ovn-node-metrics-cert\") pod \"ovnkube-node-z2tw9\" (UID: \"3d9055c2-3e35-408e-9426-2f471a991a3e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2tw9" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.557461 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3d9055c2-3e35-408e-9426-2f471a991a3e-run-systemd\") pod \"ovnkube-node-z2tw9\" (UID: \"3d9055c2-3e35-408e-9426-2f471a991a3e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2tw9" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.557479 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3d9055c2-3e35-408e-9426-2f471a991a3e-host-kubelet\") pod \"ovnkube-node-z2tw9\" (UID: \"3d9055c2-3e35-408e-9426-2f471a991a3e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2tw9" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.557533 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3d9055c2-3e35-408e-9426-2f471a991a3e-host-run-ovn-kubernetes\") pod \"ovnkube-node-z2tw9\" (UID: \"3d9055c2-3e35-408e-9426-2f471a991a3e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2tw9" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.557556 4594 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3d9055c2-3e35-408e-9426-2f471a991a3e-run-openvswitch\") pod \"ovnkube-node-z2tw9\" (UID: \"3d9055c2-3e35-408e-9426-2f471a991a3e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2tw9" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.557601 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3d9055c2-3e35-408e-9426-2f471a991a3e-log-socket\") pod \"ovnkube-node-z2tw9\" (UID: \"3d9055c2-3e35-408e-9426-2f471a991a3e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2tw9" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.557619 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3d9055c2-3e35-408e-9426-2f471a991a3e-node-log\") pod \"ovnkube-node-z2tw9\" (UID: \"3d9055c2-3e35-408e-9426-2f471a991a3e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2tw9" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.557636 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3d9055c2-3e35-408e-9426-2f471a991a3e-run-ovn\") pod \"ovnkube-node-z2tw9\" (UID: \"3d9055c2-3e35-408e-9426-2f471a991a3e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2tw9" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.557666 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3d9055c2-3e35-408e-9426-2f471a991a3e-etc-openvswitch\") pod \"ovnkube-node-z2tw9\" (UID: \"3d9055c2-3e35-408e-9426-2f471a991a3e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2tw9" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.557683 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/3d9055c2-3e35-408e-9426-2f471a991a3e-ovnkube-script-lib\") pod \"ovnkube-node-z2tw9\" (UID: \"3d9055c2-3e35-408e-9426-2f471a991a3e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2tw9" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.557725 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3d9055c2-3e35-408e-9426-2f471a991a3e-host-cni-netd\") pod \"ovnkube-node-z2tw9\" (UID: \"3d9055c2-3e35-408e-9426-2f471a991a3e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2tw9" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.557758 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3d9055c2-3e35-408e-9426-2f471a991a3e-systemd-units\") pod \"ovnkube-node-z2tw9\" (UID: \"3d9055c2-3e35-408e-9426-2f471a991a3e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2tw9" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.557760 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3d9055c2-3e35-408e-9426-2f471a991a3e-run-systemd\") pod \"ovnkube-node-z2tw9\" (UID: \"3d9055c2-3e35-408e-9426-2f471a991a3e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2tw9" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.557774 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3d9055c2-3e35-408e-9426-2f471a991a3e-env-overrides\") pod \"ovnkube-node-z2tw9\" (UID: \"3d9055c2-3e35-408e-9426-2f471a991a3e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2tw9" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.557936 4594 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/08de5891-72ca-488c-80b3-6b54c8c1a66e-run-systemd\") on node \"crc\" DevicePath \"\"" Nov 29 
05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.557956 4594 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/08de5891-72ca-488c-80b3-6b54c8c1a66e-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.557969 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cjws8\" (UniqueName: \"kubernetes.io/projected/08de5891-72ca-488c-80b3-6b54c8c1a66e-kube-api-access-cjws8\") on node \"crc\" DevicePath \"\"" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.557981 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3d9055c2-3e35-408e-9426-2f471a991a3e-host-cni-bin\") pod \"ovnkube-node-z2tw9\" (UID: \"3d9055c2-3e35-408e-9426-2f471a991a3e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2tw9" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.557995 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3d9055c2-3e35-408e-9426-2f471a991a3e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-z2tw9\" (UID: \"3d9055c2-3e35-408e-9426-2f471a991a3e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2tw9" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.558033 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3d9055c2-3e35-408e-9426-2f471a991a3e-host-slash\") pod \"ovnkube-node-z2tw9\" (UID: \"3d9055c2-3e35-408e-9426-2f471a991a3e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2tw9" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.558353 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3d9055c2-3e35-408e-9426-2f471a991a3e-env-overrides\") pod \"ovnkube-node-z2tw9\" 
(UID: \"3d9055c2-3e35-408e-9426-2f471a991a3e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2tw9" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.558411 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3d9055c2-3e35-408e-9426-2f471a991a3e-host-kubelet\") pod \"ovnkube-node-z2tw9\" (UID: \"3d9055c2-3e35-408e-9426-2f471a991a3e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2tw9" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.558727 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3d9055c2-3e35-408e-9426-2f471a991a3e-run-ovn\") pod \"ovnkube-node-z2tw9\" (UID: \"3d9055c2-3e35-408e-9426-2f471a991a3e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2tw9" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.558744 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3d9055c2-3e35-408e-9426-2f471a991a3e-var-lib-openvswitch\") pod \"ovnkube-node-z2tw9\" (UID: \"3d9055c2-3e35-408e-9426-2f471a991a3e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2tw9" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.558811 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3d9055c2-3e35-408e-9426-2f471a991a3e-host-run-netns\") pod \"ovnkube-node-z2tw9\" (UID: \"3d9055c2-3e35-408e-9426-2f471a991a3e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2tw9" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.558840 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3d9055c2-3e35-408e-9426-2f471a991a3e-run-openvswitch\") pod \"ovnkube-node-z2tw9\" (UID: \"3d9055c2-3e35-408e-9426-2f471a991a3e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2tw9" Nov 29 05:36:27 crc 
kubenswrapper[4594]: I1129 05:36:27.558809 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3d9055c2-3e35-408e-9426-2f471a991a3e-host-run-ovn-kubernetes\") pod \"ovnkube-node-z2tw9\" (UID: \"3d9055c2-3e35-408e-9426-2f471a991a3e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2tw9" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.558982 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3d9055c2-3e35-408e-9426-2f471a991a3e-host-cni-netd\") pod \"ovnkube-node-z2tw9\" (UID: \"3d9055c2-3e35-408e-9426-2f471a991a3e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2tw9" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.559012 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3d9055c2-3e35-408e-9426-2f471a991a3e-systemd-units\") pod \"ovnkube-node-z2tw9\" (UID: \"3d9055c2-3e35-408e-9426-2f471a991a3e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2tw9" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.558813 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3d9055c2-3e35-408e-9426-2f471a991a3e-etc-openvswitch\") pod \"ovnkube-node-z2tw9\" (UID: \"3d9055c2-3e35-408e-9426-2f471a991a3e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2tw9" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.559151 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3d9055c2-3e35-408e-9426-2f471a991a3e-log-socket\") pod \"ovnkube-node-z2tw9\" (UID: \"3d9055c2-3e35-408e-9426-2f471a991a3e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2tw9" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.559368 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3d9055c2-3e35-408e-9426-2f471a991a3e-ovnkube-script-lib\") pod \"ovnkube-node-z2tw9\" (UID: \"3d9055c2-3e35-408e-9426-2f471a991a3e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2tw9" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.560513 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3d9055c2-3e35-408e-9426-2f471a991a3e-node-log\") pod \"ovnkube-node-z2tw9\" (UID: \"3d9055c2-3e35-408e-9426-2f471a991a3e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2tw9" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.561467 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3d9055c2-3e35-408e-9426-2f471a991a3e-ovnkube-config\") pod \"ovnkube-node-z2tw9\" (UID: \"3d9055c2-3e35-408e-9426-2f471a991a3e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2tw9" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.561909 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3d9055c2-3e35-408e-9426-2f471a991a3e-ovn-node-metrics-cert\") pod \"ovnkube-node-z2tw9\" (UID: \"3d9055c2-3e35-408e-9426-2f471a991a3e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2tw9" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.568580 4594 scope.go:117] "RemoveContainer" containerID="78fd1c6bdf08a557f0c2223c538a946709b9a225032c92ea75277140441a59c6" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.573884 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fb7qz\" (UniqueName: \"kubernetes.io/projected/3d9055c2-3e35-408e-9426-2f471a991a3e-kube-api-access-fb7qz\") pod \"ovnkube-node-z2tw9\" (UID: \"3d9055c2-3e35-408e-9426-2f471a991a3e\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2tw9" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 
05:36:27.580124 4594 scope.go:117] "RemoveContainer" containerID="9db25dbee0954a026ecc2945dda40eb48c9caf78d31c44013055e9765024794d" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.590509 4594 scope.go:117] "RemoveContainer" containerID="b86e524b600a831fe6b8bb71a513172f974b457d39363da90cd6152139eae898" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.600138 4594 scope.go:117] "RemoveContainer" containerID="c128df511eae5df862fc536a509d044880a30a799281e2e5abda4ed38518be53" Nov 29 05:36:27 crc kubenswrapper[4594]: E1129 05:36:27.600519 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c128df511eae5df862fc536a509d044880a30a799281e2e5abda4ed38518be53\": container with ID starting with c128df511eae5df862fc536a509d044880a30a799281e2e5abda4ed38518be53 not found: ID does not exist" containerID="c128df511eae5df862fc536a509d044880a30a799281e2e5abda4ed38518be53" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.600558 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c128df511eae5df862fc536a509d044880a30a799281e2e5abda4ed38518be53"} err="failed to get container status \"c128df511eae5df862fc536a509d044880a30a799281e2e5abda4ed38518be53\": rpc error: code = NotFound desc = could not find container \"c128df511eae5df862fc536a509d044880a30a799281e2e5abda4ed38518be53\": container with ID starting with c128df511eae5df862fc536a509d044880a30a799281e2e5abda4ed38518be53 not found: ID does not exist" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.600591 4594 scope.go:117] "RemoveContainer" containerID="1fb430ec13486fa454ce5a026c9035daca3b18f8f31f2304bb6c5eb544778a85" Nov 29 05:36:27 crc kubenswrapper[4594]: E1129 05:36:27.600856 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1fb430ec13486fa454ce5a026c9035daca3b18f8f31f2304bb6c5eb544778a85\": container 
with ID starting with 1fb430ec13486fa454ce5a026c9035daca3b18f8f31f2304bb6c5eb544778a85 not found: ID does not exist" containerID="1fb430ec13486fa454ce5a026c9035daca3b18f8f31f2304bb6c5eb544778a85" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.600894 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fb430ec13486fa454ce5a026c9035daca3b18f8f31f2304bb6c5eb544778a85"} err="failed to get container status \"1fb430ec13486fa454ce5a026c9035daca3b18f8f31f2304bb6c5eb544778a85\": rpc error: code = NotFound desc = could not find container \"1fb430ec13486fa454ce5a026c9035daca3b18f8f31f2304bb6c5eb544778a85\": container with ID starting with 1fb430ec13486fa454ce5a026c9035daca3b18f8f31f2304bb6c5eb544778a85 not found: ID does not exist" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.600916 4594 scope.go:117] "RemoveContainer" containerID="7cc8b34a52032d4716652fc9edcc35dd797309bf6886242671c068e65399d004" Nov 29 05:36:27 crc kubenswrapper[4594]: E1129 05:36:27.601096 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7cc8b34a52032d4716652fc9edcc35dd797309bf6886242671c068e65399d004\": container with ID starting with 7cc8b34a52032d4716652fc9edcc35dd797309bf6886242671c068e65399d004 not found: ID does not exist" containerID="7cc8b34a52032d4716652fc9edcc35dd797309bf6886242671c068e65399d004" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.601116 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cc8b34a52032d4716652fc9edcc35dd797309bf6886242671c068e65399d004"} err="failed to get container status \"7cc8b34a52032d4716652fc9edcc35dd797309bf6886242671c068e65399d004\": rpc error: code = NotFound desc = could not find container \"7cc8b34a52032d4716652fc9edcc35dd797309bf6886242671c068e65399d004\": container with ID starting with 7cc8b34a52032d4716652fc9edcc35dd797309bf6886242671c068e65399d004 not 
found: ID does not exist" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.601129 4594 scope.go:117] "RemoveContainer" containerID="7a67e6b494618e268d6f5d75507c1cfe0e8e99c0e3cfff7548756a64f4f3a3ae" Nov 29 05:36:27 crc kubenswrapper[4594]: E1129 05:36:27.601495 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a67e6b494618e268d6f5d75507c1cfe0e8e99c0e3cfff7548756a64f4f3a3ae\": container with ID starting with 7a67e6b494618e268d6f5d75507c1cfe0e8e99c0e3cfff7548756a64f4f3a3ae not found: ID does not exist" containerID="7a67e6b494618e268d6f5d75507c1cfe0e8e99c0e3cfff7548756a64f4f3a3ae" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.601521 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a67e6b494618e268d6f5d75507c1cfe0e8e99c0e3cfff7548756a64f4f3a3ae"} err="failed to get container status \"7a67e6b494618e268d6f5d75507c1cfe0e8e99c0e3cfff7548756a64f4f3a3ae\": rpc error: code = NotFound desc = could not find container \"7a67e6b494618e268d6f5d75507c1cfe0e8e99c0e3cfff7548756a64f4f3a3ae\": container with ID starting with 7a67e6b494618e268d6f5d75507c1cfe0e8e99c0e3cfff7548756a64f4f3a3ae not found: ID does not exist" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.601537 4594 scope.go:117] "RemoveContainer" containerID="77543b04b619c48bf800a3d9dda1a969e4e32aa113a49287717a9e83f9cff96f" Nov 29 05:36:27 crc kubenswrapper[4594]: E1129 05:36:27.601791 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77543b04b619c48bf800a3d9dda1a969e4e32aa113a49287717a9e83f9cff96f\": container with ID starting with 77543b04b619c48bf800a3d9dda1a969e4e32aa113a49287717a9e83f9cff96f not found: ID does not exist" containerID="77543b04b619c48bf800a3d9dda1a969e4e32aa113a49287717a9e83f9cff96f" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.601819 4594 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77543b04b619c48bf800a3d9dda1a969e4e32aa113a49287717a9e83f9cff96f"} err="failed to get container status \"77543b04b619c48bf800a3d9dda1a969e4e32aa113a49287717a9e83f9cff96f\": rpc error: code = NotFound desc = could not find container \"77543b04b619c48bf800a3d9dda1a969e4e32aa113a49287717a9e83f9cff96f\": container with ID starting with 77543b04b619c48bf800a3d9dda1a969e4e32aa113a49287717a9e83f9cff96f not found: ID does not exist" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.601838 4594 scope.go:117] "RemoveContainer" containerID="4cf2a15e6478f9f461a9927f570db4a81a8d752c8be46c7970b4956ae2031a38" Nov 29 05:36:27 crc kubenswrapper[4594]: E1129 05:36:27.602042 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4cf2a15e6478f9f461a9927f570db4a81a8d752c8be46c7970b4956ae2031a38\": container with ID starting with 4cf2a15e6478f9f461a9927f570db4a81a8d752c8be46c7970b4956ae2031a38 not found: ID does not exist" containerID="4cf2a15e6478f9f461a9927f570db4a81a8d752c8be46c7970b4956ae2031a38" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.602067 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cf2a15e6478f9f461a9927f570db4a81a8d752c8be46c7970b4956ae2031a38"} err="failed to get container status \"4cf2a15e6478f9f461a9927f570db4a81a8d752c8be46c7970b4956ae2031a38\": rpc error: code = NotFound desc = could not find container \"4cf2a15e6478f9f461a9927f570db4a81a8d752c8be46c7970b4956ae2031a38\": container with ID starting with 4cf2a15e6478f9f461a9927f570db4a81a8d752c8be46c7970b4956ae2031a38 not found: ID does not exist" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.602081 4594 scope.go:117] "RemoveContainer" containerID="b9f96824f977ec455622772e91ed8e6534dc93a71868e2b5e2aa9302ff9be135" Nov 29 05:36:27 crc kubenswrapper[4594]: E1129 
05:36:27.602300 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9f96824f977ec455622772e91ed8e6534dc93a71868e2b5e2aa9302ff9be135\": container with ID starting with b9f96824f977ec455622772e91ed8e6534dc93a71868e2b5e2aa9302ff9be135 not found: ID does not exist" containerID="b9f96824f977ec455622772e91ed8e6534dc93a71868e2b5e2aa9302ff9be135" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.602325 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9f96824f977ec455622772e91ed8e6534dc93a71868e2b5e2aa9302ff9be135"} err="failed to get container status \"b9f96824f977ec455622772e91ed8e6534dc93a71868e2b5e2aa9302ff9be135\": rpc error: code = NotFound desc = could not find container \"b9f96824f977ec455622772e91ed8e6534dc93a71868e2b5e2aa9302ff9be135\": container with ID starting with b9f96824f977ec455622772e91ed8e6534dc93a71868e2b5e2aa9302ff9be135 not found: ID does not exist" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.602342 4594 scope.go:117] "RemoveContainer" containerID="78fd1c6bdf08a557f0c2223c538a946709b9a225032c92ea75277140441a59c6" Nov 29 05:36:27 crc kubenswrapper[4594]: E1129 05:36:27.602556 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78fd1c6bdf08a557f0c2223c538a946709b9a225032c92ea75277140441a59c6\": container with ID starting with 78fd1c6bdf08a557f0c2223c538a946709b9a225032c92ea75277140441a59c6 not found: ID does not exist" containerID="78fd1c6bdf08a557f0c2223c538a946709b9a225032c92ea75277140441a59c6" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.602579 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78fd1c6bdf08a557f0c2223c538a946709b9a225032c92ea75277140441a59c6"} err="failed to get container status \"78fd1c6bdf08a557f0c2223c538a946709b9a225032c92ea75277140441a59c6\": rpc 
error: code = NotFound desc = could not find container \"78fd1c6bdf08a557f0c2223c538a946709b9a225032c92ea75277140441a59c6\": container with ID starting with 78fd1c6bdf08a557f0c2223c538a946709b9a225032c92ea75277140441a59c6 not found: ID does not exist" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.602592 4594 scope.go:117] "RemoveContainer" containerID="9db25dbee0954a026ecc2945dda40eb48c9caf78d31c44013055e9765024794d" Nov 29 05:36:27 crc kubenswrapper[4594]: E1129 05:36:27.602858 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9db25dbee0954a026ecc2945dda40eb48c9caf78d31c44013055e9765024794d\": container with ID starting with 9db25dbee0954a026ecc2945dda40eb48c9caf78d31c44013055e9765024794d not found: ID does not exist" containerID="9db25dbee0954a026ecc2945dda40eb48c9caf78d31c44013055e9765024794d" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.602880 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9db25dbee0954a026ecc2945dda40eb48c9caf78d31c44013055e9765024794d"} err="failed to get container status \"9db25dbee0954a026ecc2945dda40eb48c9caf78d31c44013055e9765024794d\": rpc error: code = NotFound desc = could not find container \"9db25dbee0954a026ecc2945dda40eb48c9caf78d31c44013055e9765024794d\": container with ID starting with 9db25dbee0954a026ecc2945dda40eb48c9caf78d31c44013055e9765024794d not found: ID does not exist" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.602896 4594 scope.go:117] "RemoveContainer" containerID="b86e524b600a831fe6b8bb71a513172f974b457d39363da90cd6152139eae898" Nov 29 05:36:27 crc kubenswrapper[4594]: E1129 05:36:27.603222 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b86e524b600a831fe6b8bb71a513172f974b457d39363da90cd6152139eae898\": container with ID starting with 
b86e524b600a831fe6b8bb71a513172f974b457d39363da90cd6152139eae898 not found: ID does not exist" containerID="b86e524b600a831fe6b8bb71a513172f974b457d39363da90cd6152139eae898" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.603304 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b86e524b600a831fe6b8bb71a513172f974b457d39363da90cd6152139eae898"} err="failed to get container status \"b86e524b600a831fe6b8bb71a513172f974b457d39363da90cd6152139eae898\": rpc error: code = NotFound desc = could not find container \"b86e524b600a831fe6b8bb71a513172f974b457d39363da90cd6152139eae898\": container with ID starting with b86e524b600a831fe6b8bb71a513172f974b457d39363da90cd6152139eae898 not found: ID does not exist" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.603332 4594 scope.go:117] "RemoveContainer" containerID="c128df511eae5df862fc536a509d044880a30a799281e2e5abda4ed38518be53" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.603613 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c128df511eae5df862fc536a509d044880a30a799281e2e5abda4ed38518be53"} err="failed to get container status \"c128df511eae5df862fc536a509d044880a30a799281e2e5abda4ed38518be53\": rpc error: code = NotFound desc = could not find container \"c128df511eae5df862fc536a509d044880a30a799281e2e5abda4ed38518be53\": container with ID starting with c128df511eae5df862fc536a509d044880a30a799281e2e5abda4ed38518be53 not found: ID does not exist" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.603632 4594 scope.go:117] "RemoveContainer" containerID="1fb430ec13486fa454ce5a026c9035daca3b18f8f31f2304bb6c5eb544778a85" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.603964 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fb430ec13486fa454ce5a026c9035daca3b18f8f31f2304bb6c5eb544778a85"} err="failed to get container status 
\"1fb430ec13486fa454ce5a026c9035daca3b18f8f31f2304bb6c5eb544778a85\": rpc error: code = NotFound desc = could not find container \"1fb430ec13486fa454ce5a026c9035daca3b18f8f31f2304bb6c5eb544778a85\": container with ID starting with 1fb430ec13486fa454ce5a026c9035daca3b18f8f31f2304bb6c5eb544778a85 not found: ID does not exist" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.603984 4594 scope.go:117] "RemoveContainer" containerID="7cc8b34a52032d4716652fc9edcc35dd797309bf6886242671c068e65399d004" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.604194 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cc8b34a52032d4716652fc9edcc35dd797309bf6886242671c068e65399d004"} err="failed to get container status \"7cc8b34a52032d4716652fc9edcc35dd797309bf6886242671c068e65399d004\": rpc error: code = NotFound desc = could not find container \"7cc8b34a52032d4716652fc9edcc35dd797309bf6886242671c068e65399d004\": container with ID starting with 7cc8b34a52032d4716652fc9edcc35dd797309bf6886242671c068e65399d004 not found: ID does not exist" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.604212 4594 scope.go:117] "RemoveContainer" containerID="7a67e6b494618e268d6f5d75507c1cfe0e8e99c0e3cfff7548756a64f4f3a3ae" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.604505 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a67e6b494618e268d6f5d75507c1cfe0e8e99c0e3cfff7548756a64f4f3a3ae"} err="failed to get container status \"7a67e6b494618e268d6f5d75507c1cfe0e8e99c0e3cfff7548756a64f4f3a3ae\": rpc error: code = NotFound desc = could not find container \"7a67e6b494618e268d6f5d75507c1cfe0e8e99c0e3cfff7548756a64f4f3a3ae\": container with ID starting with 7a67e6b494618e268d6f5d75507c1cfe0e8e99c0e3cfff7548756a64f4f3a3ae not found: ID does not exist" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.604526 4594 scope.go:117] "RemoveContainer" 
containerID="77543b04b619c48bf800a3d9dda1a969e4e32aa113a49287717a9e83f9cff96f" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.604793 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77543b04b619c48bf800a3d9dda1a969e4e32aa113a49287717a9e83f9cff96f"} err="failed to get container status \"77543b04b619c48bf800a3d9dda1a969e4e32aa113a49287717a9e83f9cff96f\": rpc error: code = NotFound desc = could not find container \"77543b04b619c48bf800a3d9dda1a969e4e32aa113a49287717a9e83f9cff96f\": container with ID starting with 77543b04b619c48bf800a3d9dda1a969e4e32aa113a49287717a9e83f9cff96f not found: ID does not exist" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.604821 4594 scope.go:117] "RemoveContainer" containerID="4cf2a15e6478f9f461a9927f570db4a81a8d752c8be46c7970b4956ae2031a38" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.605042 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cf2a15e6478f9f461a9927f570db4a81a8d752c8be46c7970b4956ae2031a38"} err="failed to get container status \"4cf2a15e6478f9f461a9927f570db4a81a8d752c8be46c7970b4956ae2031a38\": rpc error: code = NotFound desc = could not find container \"4cf2a15e6478f9f461a9927f570db4a81a8d752c8be46c7970b4956ae2031a38\": container with ID starting with 4cf2a15e6478f9f461a9927f570db4a81a8d752c8be46c7970b4956ae2031a38 not found: ID does not exist" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.605090 4594 scope.go:117] "RemoveContainer" containerID="b9f96824f977ec455622772e91ed8e6534dc93a71868e2b5e2aa9302ff9be135" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.605313 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9f96824f977ec455622772e91ed8e6534dc93a71868e2b5e2aa9302ff9be135"} err="failed to get container status \"b9f96824f977ec455622772e91ed8e6534dc93a71868e2b5e2aa9302ff9be135\": rpc error: code = NotFound desc = could 
not find container \"b9f96824f977ec455622772e91ed8e6534dc93a71868e2b5e2aa9302ff9be135\": container with ID starting with b9f96824f977ec455622772e91ed8e6534dc93a71868e2b5e2aa9302ff9be135 not found: ID does not exist" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.605332 4594 scope.go:117] "RemoveContainer" containerID="78fd1c6bdf08a557f0c2223c538a946709b9a225032c92ea75277140441a59c6" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.605542 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78fd1c6bdf08a557f0c2223c538a946709b9a225032c92ea75277140441a59c6"} err="failed to get container status \"78fd1c6bdf08a557f0c2223c538a946709b9a225032c92ea75277140441a59c6\": rpc error: code = NotFound desc = could not find container \"78fd1c6bdf08a557f0c2223c538a946709b9a225032c92ea75277140441a59c6\": container with ID starting with 78fd1c6bdf08a557f0c2223c538a946709b9a225032c92ea75277140441a59c6 not found: ID does not exist" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.605563 4594 scope.go:117] "RemoveContainer" containerID="9db25dbee0954a026ecc2945dda40eb48c9caf78d31c44013055e9765024794d" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.605744 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9db25dbee0954a026ecc2945dda40eb48c9caf78d31c44013055e9765024794d"} err="failed to get container status \"9db25dbee0954a026ecc2945dda40eb48c9caf78d31c44013055e9765024794d\": rpc error: code = NotFound desc = could not find container \"9db25dbee0954a026ecc2945dda40eb48c9caf78d31c44013055e9765024794d\": container with ID starting with 9db25dbee0954a026ecc2945dda40eb48c9caf78d31c44013055e9765024794d not found: ID does not exist" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.605761 4594 scope.go:117] "RemoveContainer" containerID="b86e524b600a831fe6b8bb71a513172f974b457d39363da90cd6152139eae898" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 
05:36:27.605926 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b86e524b600a831fe6b8bb71a513172f974b457d39363da90cd6152139eae898"} err="failed to get container status \"b86e524b600a831fe6b8bb71a513172f974b457d39363da90cd6152139eae898\": rpc error: code = NotFound desc = could not find container \"b86e524b600a831fe6b8bb71a513172f974b457d39363da90cd6152139eae898\": container with ID starting with b86e524b600a831fe6b8bb71a513172f974b457d39363da90cd6152139eae898 not found: ID does not exist" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.605943 4594 scope.go:117] "RemoveContainer" containerID="c128df511eae5df862fc536a509d044880a30a799281e2e5abda4ed38518be53" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.606098 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c128df511eae5df862fc536a509d044880a30a799281e2e5abda4ed38518be53"} err="failed to get container status \"c128df511eae5df862fc536a509d044880a30a799281e2e5abda4ed38518be53\": rpc error: code = NotFound desc = could not find container \"c128df511eae5df862fc536a509d044880a30a799281e2e5abda4ed38518be53\": container with ID starting with c128df511eae5df862fc536a509d044880a30a799281e2e5abda4ed38518be53 not found: ID does not exist" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.606116 4594 scope.go:117] "RemoveContainer" containerID="1fb430ec13486fa454ce5a026c9035daca3b18f8f31f2304bb6c5eb544778a85" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.606411 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fb430ec13486fa454ce5a026c9035daca3b18f8f31f2304bb6c5eb544778a85"} err="failed to get container status \"1fb430ec13486fa454ce5a026c9035daca3b18f8f31f2304bb6c5eb544778a85\": rpc error: code = NotFound desc = could not find container \"1fb430ec13486fa454ce5a026c9035daca3b18f8f31f2304bb6c5eb544778a85\": container with ID starting with 
1fb430ec13486fa454ce5a026c9035daca3b18f8f31f2304bb6c5eb544778a85 not found: ID does not exist" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.606431 4594 scope.go:117] "RemoveContainer" containerID="7cc8b34a52032d4716652fc9edcc35dd797309bf6886242671c068e65399d004" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.606622 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cc8b34a52032d4716652fc9edcc35dd797309bf6886242671c068e65399d004"} err="failed to get container status \"7cc8b34a52032d4716652fc9edcc35dd797309bf6886242671c068e65399d004\": rpc error: code = NotFound desc = could not find container \"7cc8b34a52032d4716652fc9edcc35dd797309bf6886242671c068e65399d004\": container with ID starting with 7cc8b34a52032d4716652fc9edcc35dd797309bf6886242671c068e65399d004 not found: ID does not exist" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.606640 4594 scope.go:117] "RemoveContainer" containerID="7a67e6b494618e268d6f5d75507c1cfe0e8e99c0e3cfff7548756a64f4f3a3ae" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.606819 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a67e6b494618e268d6f5d75507c1cfe0e8e99c0e3cfff7548756a64f4f3a3ae"} err="failed to get container status \"7a67e6b494618e268d6f5d75507c1cfe0e8e99c0e3cfff7548756a64f4f3a3ae\": rpc error: code = NotFound desc = could not find container \"7a67e6b494618e268d6f5d75507c1cfe0e8e99c0e3cfff7548756a64f4f3a3ae\": container with ID starting with 7a67e6b494618e268d6f5d75507c1cfe0e8e99c0e3cfff7548756a64f4f3a3ae not found: ID does not exist" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.606840 4594 scope.go:117] "RemoveContainer" containerID="77543b04b619c48bf800a3d9dda1a969e4e32aa113a49287717a9e83f9cff96f" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.607172 4594 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"77543b04b619c48bf800a3d9dda1a969e4e32aa113a49287717a9e83f9cff96f"} err="failed to get container status \"77543b04b619c48bf800a3d9dda1a969e4e32aa113a49287717a9e83f9cff96f\": rpc error: code = NotFound desc = could not find container \"77543b04b619c48bf800a3d9dda1a969e4e32aa113a49287717a9e83f9cff96f\": container with ID starting with 77543b04b619c48bf800a3d9dda1a969e4e32aa113a49287717a9e83f9cff96f not found: ID does not exist" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.607193 4594 scope.go:117] "RemoveContainer" containerID="4cf2a15e6478f9f461a9927f570db4a81a8d752c8be46c7970b4956ae2031a38" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.607399 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cf2a15e6478f9f461a9927f570db4a81a8d752c8be46c7970b4956ae2031a38"} err="failed to get container status \"4cf2a15e6478f9f461a9927f570db4a81a8d752c8be46c7970b4956ae2031a38\": rpc error: code = NotFound desc = could not find container \"4cf2a15e6478f9f461a9927f570db4a81a8d752c8be46c7970b4956ae2031a38\": container with ID starting with 4cf2a15e6478f9f461a9927f570db4a81a8d752c8be46c7970b4956ae2031a38 not found: ID does not exist" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.607418 4594 scope.go:117] "RemoveContainer" containerID="b9f96824f977ec455622772e91ed8e6534dc93a71868e2b5e2aa9302ff9be135" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.607590 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9f96824f977ec455622772e91ed8e6534dc93a71868e2b5e2aa9302ff9be135"} err="failed to get container status \"b9f96824f977ec455622772e91ed8e6534dc93a71868e2b5e2aa9302ff9be135\": rpc error: code = NotFound desc = could not find container \"b9f96824f977ec455622772e91ed8e6534dc93a71868e2b5e2aa9302ff9be135\": container with ID starting with b9f96824f977ec455622772e91ed8e6534dc93a71868e2b5e2aa9302ff9be135 not found: ID does not 
exist" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.607613 4594 scope.go:117] "RemoveContainer" containerID="78fd1c6bdf08a557f0c2223c538a946709b9a225032c92ea75277140441a59c6" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.607798 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78fd1c6bdf08a557f0c2223c538a946709b9a225032c92ea75277140441a59c6"} err="failed to get container status \"78fd1c6bdf08a557f0c2223c538a946709b9a225032c92ea75277140441a59c6\": rpc error: code = NotFound desc = could not find container \"78fd1c6bdf08a557f0c2223c538a946709b9a225032c92ea75277140441a59c6\": container with ID starting with 78fd1c6bdf08a557f0c2223c538a946709b9a225032c92ea75277140441a59c6 not found: ID does not exist" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.607817 4594 scope.go:117] "RemoveContainer" containerID="9db25dbee0954a026ecc2945dda40eb48c9caf78d31c44013055e9765024794d" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.608003 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9db25dbee0954a026ecc2945dda40eb48c9caf78d31c44013055e9765024794d"} err="failed to get container status \"9db25dbee0954a026ecc2945dda40eb48c9caf78d31c44013055e9765024794d\": rpc error: code = NotFound desc = could not find container \"9db25dbee0954a026ecc2945dda40eb48c9caf78d31c44013055e9765024794d\": container with ID starting with 9db25dbee0954a026ecc2945dda40eb48c9caf78d31c44013055e9765024794d not found: ID does not exist" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.608022 4594 scope.go:117] "RemoveContainer" containerID="b86e524b600a831fe6b8bb71a513172f974b457d39363da90cd6152139eae898" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.608198 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b86e524b600a831fe6b8bb71a513172f974b457d39363da90cd6152139eae898"} err="failed to get container status 
\"b86e524b600a831fe6b8bb71a513172f974b457d39363da90cd6152139eae898\": rpc error: code = NotFound desc = could not find container \"b86e524b600a831fe6b8bb71a513172f974b457d39363da90cd6152139eae898\": container with ID starting with b86e524b600a831fe6b8bb71a513172f974b457d39363da90cd6152139eae898 not found: ID does not exist" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.608215 4594 scope.go:117] "RemoveContainer" containerID="c128df511eae5df862fc536a509d044880a30a799281e2e5abda4ed38518be53" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.608396 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c128df511eae5df862fc536a509d044880a30a799281e2e5abda4ed38518be53"} err="failed to get container status \"c128df511eae5df862fc536a509d044880a30a799281e2e5abda4ed38518be53\": rpc error: code = NotFound desc = could not find container \"c128df511eae5df862fc536a509d044880a30a799281e2e5abda4ed38518be53\": container with ID starting with c128df511eae5df862fc536a509d044880a30a799281e2e5abda4ed38518be53 not found: ID does not exist" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.608413 4594 scope.go:117] "RemoveContainer" containerID="1fb430ec13486fa454ce5a026c9035daca3b18f8f31f2304bb6c5eb544778a85" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.608576 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fb430ec13486fa454ce5a026c9035daca3b18f8f31f2304bb6c5eb544778a85"} err="failed to get container status \"1fb430ec13486fa454ce5a026c9035daca3b18f8f31f2304bb6c5eb544778a85\": rpc error: code = NotFound desc = could not find container \"1fb430ec13486fa454ce5a026c9035daca3b18f8f31f2304bb6c5eb544778a85\": container with ID starting with 1fb430ec13486fa454ce5a026c9035daca3b18f8f31f2304bb6c5eb544778a85 not found: ID does not exist" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.608597 4594 scope.go:117] "RemoveContainer" 
containerID="7cc8b34a52032d4716652fc9edcc35dd797309bf6886242671c068e65399d004" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.608767 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cc8b34a52032d4716652fc9edcc35dd797309bf6886242671c068e65399d004"} err="failed to get container status \"7cc8b34a52032d4716652fc9edcc35dd797309bf6886242671c068e65399d004\": rpc error: code = NotFound desc = could not find container \"7cc8b34a52032d4716652fc9edcc35dd797309bf6886242671c068e65399d004\": container with ID starting with 7cc8b34a52032d4716652fc9edcc35dd797309bf6886242671c068e65399d004 not found: ID does not exist" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.608784 4594 scope.go:117] "RemoveContainer" containerID="7a67e6b494618e268d6f5d75507c1cfe0e8e99c0e3cfff7548756a64f4f3a3ae" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.608935 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a67e6b494618e268d6f5d75507c1cfe0e8e99c0e3cfff7548756a64f4f3a3ae"} err="failed to get container status \"7a67e6b494618e268d6f5d75507c1cfe0e8e99c0e3cfff7548756a64f4f3a3ae\": rpc error: code = NotFound desc = could not find container \"7a67e6b494618e268d6f5d75507c1cfe0e8e99c0e3cfff7548756a64f4f3a3ae\": container with ID starting with 7a67e6b494618e268d6f5d75507c1cfe0e8e99c0e3cfff7548756a64f4f3a3ae not found: ID does not exist" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.608952 4594 scope.go:117] "RemoveContainer" containerID="77543b04b619c48bf800a3d9dda1a969e4e32aa113a49287717a9e83f9cff96f" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.609128 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77543b04b619c48bf800a3d9dda1a969e4e32aa113a49287717a9e83f9cff96f"} err="failed to get container status \"77543b04b619c48bf800a3d9dda1a969e4e32aa113a49287717a9e83f9cff96f\": rpc error: code = NotFound desc = could 
not find container \"77543b04b619c48bf800a3d9dda1a969e4e32aa113a49287717a9e83f9cff96f\": container with ID starting with 77543b04b619c48bf800a3d9dda1a969e4e32aa113a49287717a9e83f9cff96f not found: ID does not exist" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.609148 4594 scope.go:117] "RemoveContainer" containerID="4cf2a15e6478f9f461a9927f570db4a81a8d752c8be46c7970b4956ae2031a38" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.609337 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cf2a15e6478f9f461a9927f570db4a81a8d752c8be46c7970b4956ae2031a38"} err="failed to get container status \"4cf2a15e6478f9f461a9927f570db4a81a8d752c8be46c7970b4956ae2031a38\": rpc error: code = NotFound desc = could not find container \"4cf2a15e6478f9f461a9927f570db4a81a8d752c8be46c7970b4956ae2031a38\": container with ID starting with 4cf2a15e6478f9f461a9927f570db4a81a8d752c8be46c7970b4956ae2031a38 not found: ID does not exist" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.609355 4594 scope.go:117] "RemoveContainer" containerID="b9f96824f977ec455622772e91ed8e6534dc93a71868e2b5e2aa9302ff9be135" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.609554 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9f96824f977ec455622772e91ed8e6534dc93a71868e2b5e2aa9302ff9be135"} err="failed to get container status \"b9f96824f977ec455622772e91ed8e6534dc93a71868e2b5e2aa9302ff9be135\": rpc error: code = NotFound desc = could not find container \"b9f96824f977ec455622772e91ed8e6534dc93a71868e2b5e2aa9302ff9be135\": container with ID starting with b9f96824f977ec455622772e91ed8e6534dc93a71868e2b5e2aa9302ff9be135 not found: ID does not exist" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.609574 4594 scope.go:117] "RemoveContainer" containerID="78fd1c6bdf08a557f0c2223c538a946709b9a225032c92ea75277140441a59c6" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 
05:36:27.609731 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78fd1c6bdf08a557f0c2223c538a946709b9a225032c92ea75277140441a59c6"} err="failed to get container status \"78fd1c6bdf08a557f0c2223c538a946709b9a225032c92ea75277140441a59c6\": rpc error: code = NotFound desc = could not find container \"78fd1c6bdf08a557f0c2223c538a946709b9a225032c92ea75277140441a59c6\": container with ID starting with 78fd1c6bdf08a557f0c2223c538a946709b9a225032c92ea75277140441a59c6 not found: ID does not exist" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.609749 4594 scope.go:117] "RemoveContainer" containerID="9db25dbee0954a026ecc2945dda40eb48c9caf78d31c44013055e9765024794d" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.609921 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9db25dbee0954a026ecc2945dda40eb48c9caf78d31c44013055e9765024794d"} err="failed to get container status \"9db25dbee0954a026ecc2945dda40eb48c9caf78d31c44013055e9765024794d\": rpc error: code = NotFound desc = could not find container \"9db25dbee0954a026ecc2945dda40eb48c9caf78d31c44013055e9765024794d\": container with ID starting with 9db25dbee0954a026ecc2945dda40eb48c9caf78d31c44013055e9765024794d not found: ID does not exist" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.609940 4594 scope.go:117] "RemoveContainer" containerID="b86e524b600a831fe6b8bb71a513172f974b457d39363da90cd6152139eae898" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.610092 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b86e524b600a831fe6b8bb71a513172f974b457d39363da90cd6152139eae898"} err="failed to get container status \"b86e524b600a831fe6b8bb71a513172f974b457d39363da90cd6152139eae898\": rpc error: code = NotFound desc = could not find container \"b86e524b600a831fe6b8bb71a513172f974b457d39363da90cd6152139eae898\": container with ID starting with 
b86e524b600a831fe6b8bb71a513172f974b457d39363da90cd6152139eae898 not found: ID does not exist" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.610111 4594 scope.go:117] "RemoveContainer" containerID="c128df511eae5df862fc536a509d044880a30a799281e2e5abda4ed38518be53" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.610285 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c128df511eae5df862fc536a509d044880a30a799281e2e5abda4ed38518be53"} err="failed to get container status \"c128df511eae5df862fc536a509d044880a30a799281e2e5abda4ed38518be53\": rpc error: code = NotFound desc = could not find container \"c128df511eae5df862fc536a509d044880a30a799281e2e5abda4ed38518be53\": container with ID starting with c128df511eae5df862fc536a509d044880a30a799281e2e5abda4ed38518be53 not found: ID does not exist" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.753472 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-z2tw9" Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.783577 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-lp4zm"] Nov 29 05:36:27 crc kubenswrapper[4594]: I1129 05:36:27.789193 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-lp4zm"] Nov 29 05:36:28 crc kubenswrapper[4594]: I1129 05:36:28.090085 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08de5891-72ca-488c-80b3-6b54c8c1a66e" path="/var/lib/kubelet/pods/08de5891-72ca-488c-80b3-6b54c8c1a66e/volumes" Nov 29 05:36:28 crc kubenswrapper[4594]: I1129 05:36:28.474269 4594 generic.go:334] "Generic (PLEG): container finished" podID="3d9055c2-3e35-408e-9426-2f471a991a3e" containerID="64ee7270cff212f1664b1df945cff889a1bcb7329e93a977a4f4741dcc5a4cce" exitCode=0 Nov 29 05:36:28 crc kubenswrapper[4594]: I1129 05:36:28.474298 4594 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z2tw9" event={"ID":"3d9055c2-3e35-408e-9426-2f471a991a3e","Type":"ContainerDied","Data":"64ee7270cff212f1664b1df945cff889a1bcb7329e93a977a4f4741dcc5a4cce"} Nov 29 05:36:28 crc kubenswrapper[4594]: I1129 05:36:28.474342 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z2tw9" event={"ID":"3d9055c2-3e35-408e-9426-2f471a991a3e","Type":"ContainerStarted","Data":"f91969de677c6e9b424f731fca8f6482556abf4e59d0548bf39108f22e18dba6"} Nov 29 05:36:29 crc kubenswrapper[4594]: I1129 05:36:29.485523 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z2tw9" event={"ID":"3d9055c2-3e35-408e-9426-2f471a991a3e","Type":"ContainerStarted","Data":"eee9b6ebfc802c9744242cd61785f9e9a0326f39206ee2275121bad6a58915d7"} Nov 29 05:36:29 crc kubenswrapper[4594]: I1129 05:36:29.485829 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z2tw9" event={"ID":"3d9055c2-3e35-408e-9426-2f471a991a3e","Type":"ContainerStarted","Data":"1eaddbffdc6d3781992ec0df70c898d88d24748f44b969b23f41de50b9792d79"} Nov 29 05:36:29 crc kubenswrapper[4594]: I1129 05:36:29.485844 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z2tw9" event={"ID":"3d9055c2-3e35-408e-9426-2f471a991a3e","Type":"ContainerStarted","Data":"ebab7f99567c02fe3da205ace490531699f27d33a5e562b18c83a1ce0806db9a"} Nov 29 05:36:29 crc kubenswrapper[4594]: I1129 05:36:29.485852 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z2tw9" event={"ID":"3d9055c2-3e35-408e-9426-2f471a991a3e","Type":"ContainerStarted","Data":"252fdc4035b189eee8c9b1d6949dba1faada151560fded6702a2f6b88b732136"} Nov 29 05:36:29 crc kubenswrapper[4594]: I1129 05:36:29.485861 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z2tw9" 
event={"ID":"3d9055c2-3e35-408e-9426-2f471a991a3e","Type":"ContainerStarted","Data":"450e3162db7a0fdec7b6d242211de59de7e65e9c8678e39843839fce159e726f"} Nov 29 05:36:29 crc kubenswrapper[4594]: I1129 05:36:29.485872 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z2tw9" event={"ID":"3d9055c2-3e35-408e-9426-2f471a991a3e","Type":"ContainerStarted","Data":"a814e7f487356273cd6c4c26b5419cb0f034de47408b77bbb910a712347b79de"} Nov 29 05:36:31 crc kubenswrapper[4594]: I1129 05:36:31.496729 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z2tw9" event={"ID":"3d9055c2-3e35-408e-9426-2f471a991a3e","Type":"ContainerStarted","Data":"991f5d2ac7e012fce7c9d7fb013e423fadc3d4ef4cbb03082f7a9e8fb4a62e3e"} Nov 29 05:36:33 crc kubenswrapper[4594]: I1129 05:36:33.510508 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z2tw9" event={"ID":"3d9055c2-3e35-408e-9426-2f471a991a3e","Type":"ContainerStarted","Data":"d1c83c4104d136bb97833812b3087f914478e54d7730667a12d83db973374b55"} Nov 29 05:36:33 crc kubenswrapper[4594]: I1129 05:36:33.511019 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-z2tw9" Nov 29 05:36:33 crc kubenswrapper[4594]: I1129 05:36:33.511033 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-z2tw9" Nov 29 05:36:33 crc kubenswrapper[4594]: I1129 05:36:33.529853 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-z2tw9" Nov 29 05:36:33 crc kubenswrapper[4594]: I1129 05:36:33.535515 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-z2tw9" podStartSLOduration=6.535497231 podStartE2EDuration="6.535497231s" podCreationTimestamp="2025-11-29 05:36:27 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:36:33.532004501 +0000 UTC m=+517.772513721" watchObservedRunningTime="2025-11-29 05:36:33.535497231 +0000 UTC m=+517.776006451" Nov 29 05:36:34 crc kubenswrapper[4594]: I1129 05:36:34.518760 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-z2tw9" Nov 29 05:36:34 crc kubenswrapper[4594]: I1129 05:36:34.540770 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-z2tw9" Nov 29 05:36:42 crc kubenswrapper[4594]: I1129 05:36:42.083541 4594 scope.go:117] "RemoveContainer" containerID="fe1439b8aee18c36fcc41d5a5289107a458927f46cfb33947018b6cd3f7bee68" Nov 29 05:36:42 crc kubenswrapper[4594]: E1129 05:36:42.084181 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-7plzz_openshift-multus(e5052790-d231-4f97-802c-c7de3cd72561)\"" pod="openshift-multus/multus-7plzz" podUID="e5052790-d231-4f97-802c-c7de3cd72561" Nov 29 05:36:47 crc kubenswrapper[4594]: I1129 05:36:47.903682 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jzp72"] Nov 29 05:36:47 crc kubenswrapper[4594]: I1129 05:36:47.904761 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jzp72" Nov 29 05:36:47 crc kubenswrapper[4594]: I1129 05:36:47.906494 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Nov 29 05:36:47 crc kubenswrapper[4594]: I1129 05:36:47.908503 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jzp72"] Nov 29 05:36:47 crc kubenswrapper[4594]: I1129 05:36:47.970752 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3b9451fa-96ca-42c8-888f-beba143e0850-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jzp72\" (UID: \"3b9451fa-96ca-42c8-888f-beba143e0850\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jzp72" Nov 29 05:36:47 crc kubenswrapper[4594]: I1129 05:36:47.970787 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3b9451fa-96ca-42c8-888f-beba143e0850-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jzp72\" (UID: \"3b9451fa-96ca-42c8-888f-beba143e0850\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jzp72" Nov 29 05:36:47 crc kubenswrapper[4594]: I1129 05:36:47.970822 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nndj9\" (UniqueName: \"kubernetes.io/projected/3b9451fa-96ca-42c8-888f-beba143e0850-kube-api-access-nndj9\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jzp72\" (UID: \"3b9451fa-96ca-42c8-888f-beba143e0850\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jzp72" Nov 29 05:36:48 crc kubenswrapper[4594]: 
I1129 05:36:48.071433 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3b9451fa-96ca-42c8-888f-beba143e0850-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jzp72\" (UID: \"3b9451fa-96ca-42c8-888f-beba143e0850\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jzp72" Nov 29 05:36:48 crc kubenswrapper[4594]: I1129 05:36:48.071475 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3b9451fa-96ca-42c8-888f-beba143e0850-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jzp72\" (UID: \"3b9451fa-96ca-42c8-888f-beba143e0850\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jzp72" Nov 29 05:36:48 crc kubenswrapper[4594]: I1129 05:36:48.071506 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nndj9\" (UniqueName: \"kubernetes.io/projected/3b9451fa-96ca-42c8-888f-beba143e0850-kube-api-access-nndj9\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jzp72\" (UID: \"3b9451fa-96ca-42c8-888f-beba143e0850\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jzp72" Nov 29 05:36:48 crc kubenswrapper[4594]: I1129 05:36:48.071809 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3b9451fa-96ca-42c8-888f-beba143e0850-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jzp72\" (UID: \"3b9451fa-96ca-42c8-888f-beba143e0850\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jzp72" Nov 29 05:36:48 crc kubenswrapper[4594]: I1129 05:36:48.071953 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/3b9451fa-96ca-42c8-888f-beba143e0850-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jzp72\" (UID: \"3b9451fa-96ca-42c8-888f-beba143e0850\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jzp72" Nov 29 05:36:48 crc kubenswrapper[4594]: I1129 05:36:48.087686 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nndj9\" (UniqueName: \"kubernetes.io/projected/3b9451fa-96ca-42c8-888f-beba143e0850-kube-api-access-nndj9\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jzp72\" (UID: \"3b9451fa-96ca-42c8-888f-beba143e0850\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jzp72" Nov 29 05:36:48 crc kubenswrapper[4594]: I1129 05:36:48.218094 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jzp72" Nov 29 05:36:48 crc kubenswrapper[4594]: E1129 05:36:48.236647 4594 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jzp72_openshift-marketplace_3b9451fa-96ca-42c8-888f-beba143e0850_0(c32d19ba1f5046eb19419a3ef5bcaf4bf919d26b477911faf6ed617e41b27213): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Nov 29 05:36:48 crc kubenswrapper[4594]: E1129 05:36:48.236700 4594 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jzp72_openshift-marketplace_3b9451fa-96ca-42c8-888f-beba143e0850_0(c32d19ba1f5046eb19419a3ef5bcaf4bf919d26b477911faf6ed617e41b27213): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jzp72" Nov 29 05:36:48 crc kubenswrapper[4594]: E1129 05:36:48.236719 4594 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jzp72_openshift-marketplace_3b9451fa-96ca-42c8-888f-beba143e0850_0(c32d19ba1f5046eb19419a3ef5bcaf4bf919d26b477911faf6ed617e41b27213): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jzp72" Nov 29 05:36:48 crc kubenswrapper[4594]: E1129 05:36:48.236765 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jzp72_openshift-marketplace(3b9451fa-96ca-42c8-888f-beba143e0850)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jzp72_openshift-marketplace(3b9451fa-96ca-42c8-888f-beba143e0850)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jzp72_openshift-marketplace_3b9451fa-96ca-42c8-888f-beba143e0850_0(c32d19ba1f5046eb19419a3ef5bcaf4bf919d26b477911faf6ed617e41b27213): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jzp72" podUID="3b9451fa-96ca-42c8-888f-beba143e0850" Nov 29 05:36:48 crc kubenswrapper[4594]: I1129 05:36:48.580742 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jzp72" Nov 29 05:36:48 crc kubenswrapper[4594]: I1129 05:36:48.581092 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jzp72" Nov 29 05:36:48 crc kubenswrapper[4594]: E1129 05:36:48.597838 4594 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jzp72_openshift-marketplace_3b9451fa-96ca-42c8-888f-beba143e0850_0(4047be1149864110b726dad2b51fd3aa9964bff95e29fef5a6a7737536d7ba5b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Nov 29 05:36:48 crc kubenswrapper[4594]: E1129 05:36:48.597891 4594 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jzp72_openshift-marketplace_3b9451fa-96ca-42c8-888f-beba143e0850_0(4047be1149864110b726dad2b51fd3aa9964bff95e29fef5a6a7737536d7ba5b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jzp72" Nov 29 05:36:48 crc kubenswrapper[4594]: E1129 05:36:48.597913 4594 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jzp72_openshift-marketplace_3b9451fa-96ca-42c8-888f-beba143e0850_0(4047be1149864110b726dad2b51fd3aa9964bff95e29fef5a6a7737536d7ba5b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jzp72" Nov 29 05:36:48 crc kubenswrapper[4594]: E1129 05:36:48.597963 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jzp72_openshift-marketplace(3b9451fa-96ca-42c8-888f-beba143e0850)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jzp72_openshift-marketplace(3b9451fa-96ca-42c8-888f-beba143e0850)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jzp72_openshift-marketplace_3b9451fa-96ca-42c8-888f-beba143e0850_0(4047be1149864110b726dad2b51fd3aa9964bff95e29fef5a6a7737536d7ba5b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jzp72" podUID="3b9451fa-96ca-42c8-888f-beba143e0850" Nov 29 05:36:56 crc kubenswrapper[4594]: I1129 05:36:56.307943 4594 scope.go:117] "RemoveContainer" containerID="abc4e229bc1c5698fec5a7e0da8decb0167770496e7f8dce7a9a9276721ee26c" Nov 29 05:36:56 crc kubenswrapper[4594]: I1129 05:36:56.616480 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7plzz_e5052790-d231-4f97-802c-c7de3cd72561/kube-multus/2.log" Nov 29 05:36:57 crc kubenswrapper[4594]: I1129 05:36:57.082929 4594 scope.go:117] "RemoveContainer" containerID="fe1439b8aee18c36fcc41d5a5289107a458927f46cfb33947018b6cd3f7bee68" Nov 29 05:36:57 crc kubenswrapper[4594]: I1129 05:36:57.621820 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7plzz_e5052790-d231-4f97-802c-c7de3cd72561/kube-multus/2.log" Nov 29 05:36:57 crc kubenswrapper[4594]: I1129 05:36:57.621868 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-7plzz" event={"ID":"e5052790-d231-4f97-802c-c7de3cd72561","Type":"ContainerStarted","Data":"e6027fc26813ef5883317663122b7de192608120b65a74af2389671833e77b0f"} Nov 29 05:36:57 crc kubenswrapper[4594]: I1129 05:36:57.769651 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-z2tw9" Nov 29 05:36:59 crc kubenswrapper[4594]: I1129 05:36:59.082515 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jzp72" Nov 29 05:36:59 crc kubenswrapper[4594]: I1129 05:36:59.083054 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jzp72" Nov 29 05:36:59 crc kubenswrapper[4594]: I1129 05:36:59.428830 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jzp72"] Nov 29 05:36:59 crc kubenswrapper[4594]: W1129 05:36:59.433292 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b9451fa_96ca_42c8_888f_beba143e0850.slice/crio-3d9d90b81ea66b5e2092a87a83c210c4f9305ca882bea646f7108af5ec00cd50 WatchSource:0}: Error finding container 3d9d90b81ea66b5e2092a87a83c210c4f9305ca882bea646f7108af5ec00cd50: Status 404 returned error can't find the container with id 3d9d90b81ea66b5e2092a87a83c210c4f9305ca882bea646f7108af5ec00cd50 Nov 29 05:36:59 crc kubenswrapper[4594]: I1129 05:36:59.634556 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jzp72" event={"ID":"3b9451fa-96ca-42c8-888f-beba143e0850","Type":"ContainerStarted","Data":"3819ce12ae12d6956af77ad08ad4cc9ba448209e058532b3e8e168393304be11"} Nov 29 05:36:59 crc kubenswrapper[4594]: I1129 
05:36:59.634597 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jzp72" event={"ID":"3b9451fa-96ca-42c8-888f-beba143e0850","Type":"ContainerStarted","Data":"3d9d90b81ea66b5e2092a87a83c210c4f9305ca882bea646f7108af5ec00cd50"} Nov 29 05:37:00 crc kubenswrapper[4594]: I1129 05:37:00.639537 4594 generic.go:334] "Generic (PLEG): container finished" podID="3b9451fa-96ca-42c8-888f-beba143e0850" containerID="3819ce12ae12d6956af77ad08ad4cc9ba448209e058532b3e8e168393304be11" exitCode=0 Nov 29 05:37:00 crc kubenswrapper[4594]: I1129 05:37:00.639677 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jzp72" event={"ID":"3b9451fa-96ca-42c8-888f-beba143e0850","Type":"ContainerDied","Data":"3819ce12ae12d6956af77ad08ad4cc9ba448209e058532b3e8e168393304be11"} Nov 29 05:37:02 crc kubenswrapper[4594]: I1129 05:37:02.651059 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jzp72" event={"ID":"3b9451fa-96ca-42c8-888f-beba143e0850","Type":"ContainerStarted","Data":"0a606a25bfa54c2459baff0ff4feb2741833b9e17278039cedc92f713d141ce2"} Nov 29 05:37:03 crc kubenswrapper[4594]: I1129 05:37:03.657984 4594 generic.go:334] "Generic (PLEG): container finished" podID="3b9451fa-96ca-42c8-888f-beba143e0850" containerID="0a606a25bfa54c2459baff0ff4feb2741833b9e17278039cedc92f713d141ce2" exitCode=0 Nov 29 05:37:03 crc kubenswrapper[4594]: I1129 05:37:03.658054 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jzp72" event={"ID":"3b9451fa-96ca-42c8-888f-beba143e0850","Type":"ContainerDied","Data":"0a606a25bfa54c2459baff0ff4feb2741833b9e17278039cedc92f713d141ce2"} Nov 29 05:37:04 crc kubenswrapper[4594]: I1129 05:37:04.665379 4594 generic.go:334] "Generic 
(PLEG): container finished" podID="3b9451fa-96ca-42c8-888f-beba143e0850" containerID="e09006a2fb9a0c0b4ea491389bf67109e22fd63400d3a7feee762458d735c952" exitCode=0 Nov 29 05:37:04 crc kubenswrapper[4594]: I1129 05:37:04.665507 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jzp72" event={"ID":"3b9451fa-96ca-42c8-888f-beba143e0850","Type":"ContainerDied","Data":"e09006a2fb9a0c0b4ea491389bf67109e22fd63400d3a7feee762458d735c952"} Nov 29 05:37:05 crc kubenswrapper[4594]: I1129 05:37:05.851459 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jzp72" Nov 29 05:37:05 crc kubenswrapper[4594]: I1129 05:37:05.954656 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nndj9\" (UniqueName: \"kubernetes.io/projected/3b9451fa-96ca-42c8-888f-beba143e0850-kube-api-access-nndj9\") pod \"3b9451fa-96ca-42c8-888f-beba143e0850\" (UID: \"3b9451fa-96ca-42c8-888f-beba143e0850\") " Nov 29 05:37:05 crc kubenswrapper[4594]: I1129 05:37:05.955002 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3b9451fa-96ca-42c8-888f-beba143e0850-util\") pod \"3b9451fa-96ca-42c8-888f-beba143e0850\" (UID: \"3b9451fa-96ca-42c8-888f-beba143e0850\") " Nov 29 05:37:05 crc kubenswrapper[4594]: I1129 05:37:05.955070 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3b9451fa-96ca-42c8-888f-beba143e0850-bundle\") pod \"3b9451fa-96ca-42c8-888f-beba143e0850\" (UID: \"3b9451fa-96ca-42c8-888f-beba143e0850\") " Nov 29 05:37:05 crc kubenswrapper[4594]: I1129 05:37:05.957083 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/3b9451fa-96ca-42c8-888f-beba143e0850-bundle" (OuterVolumeSpecName: "bundle") pod "3b9451fa-96ca-42c8-888f-beba143e0850" (UID: "3b9451fa-96ca-42c8-888f-beba143e0850"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 05:37:05 crc kubenswrapper[4594]: I1129 05:37:05.960079 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b9451fa-96ca-42c8-888f-beba143e0850-kube-api-access-nndj9" (OuterVolumeSpecName: "kube-api-access-nndj9") pod "3b9451fa-96ca-42c8-888f-beba143e0850" (UID: "3b9451fa-96ca-42c8-888f-beba143e0850"). InnerVolumeSpecName "kube-api-access-nndj9". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:37:05 crc kubenswrapper[4594]: I1129 05:37:05.962432 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b9451fa-96ca-42c8-888f-beba143e0850-util" (OuterVolumeSpecName: "util") pod "3b9451fa-96ca-42c8-888f-beba143e0850" (UID: "3b9451fa-96ca-42c8-888f-beba143e0850"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 05:37:06 crc kubenswrapper[4594]: I1129 05:37:06.056701 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nndj9\" (UniqueName: \"kubernetes.io/projected/3b9451fa-96ca-42c8-888f-beba143e0850-kube-api-access-nndj9\") on node \"crc\" DevicePath \"\"" Nov 29 05:37:06 crc kubenswrapper[4594]: I1129 05:37:06.056724 4594 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3b9451fa-96ca-42c8-888f-beba143e0850-util\") on node \"crc\" DevicePath \"\"" Nov 29 05:37:06 crc kubenswrapper[4594]: I1129 05:37:06.056736 4594 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3b9451fa-96ca-42c8-888f-beba143e0850-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 05:37:06 crc kubenswrapper[4594]: I1129 05:37:06.675651 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jzp72" event={"ID":"3b9451fa-96ca-42c8-888f-beba143e0850","Type":"ContainerDied","Data":"3d9d90b81ea66b5e2092a87a83c210c4f9305ca882bea646f7108af5ec00cd50"} Nov 29 05:37:06 crc kubenswrapper[4594]: I1129 05:37:06.675695 4594 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3d9d90b81ea66b5e2092a87a83c210c4f9305ca882bea646f7108af5ec00cd50" Nov 29 05:37:06 crc kubenswrapper[4594]: I1129 05:37:06.675717 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jzp72" Nov 29 05:37:15 crc kubenswrapper[4594]: I1129 05:37:15.069016 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-lj6hq"] Nov 29 05:37:15 crc kubenswrapper[4594]: E1129 05:37:15.069695 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b9451fa-96ca-42c8-888f-beba143e0850" containerName="util" Nov 29 05:37:15 crc kubenswrapper[4594]: I1129 05:37:15.069708 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b9451fa-96ca-42c8-888f-beba143e0850" containerName="util" Nov 29 05:37:15 crc kubenswrapper[4594]: E1129 05:37:15.069729 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b9451fa-96ca-42c8-888f-beba143e0850" containerName="pull" Nov 29 05:37:15 crc kubenswrapper[4594]: I1129 05:37:15.069734 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b9451fa-96ca-42c8-888f-beba143e0850" containerName="pull" Nov 29 05:37:15 crc kubenswrapper[4594]: E1129 05:37:15.069740 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b9451fa-96ca-42c8-888f-beba143e0850" containerName="extract" Nov 29 05:37:15 crc kubenswrapper[4594]: I1129 05:37:15.069745 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b9451fa-96ca-42c8-888f-beba143e0850" containerName="extract" Nov 29 05:37:15 crc kubenswrapper[4594]: I1129 05:37:15.069848 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b9451fa-96ca-42c8-888f-beba143e0850" containerName="extract" Nov 29 05:37:15 crc kubenswrapper[4594]: I1129 05:37:15.070162 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-lj6hq" Nov 29 05:37:15 crc kubenswrapper[4594]: I1129 05:37:15.071979 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Nov 29 05:37:15 crc kubenswrapper[4594]: I1129 05:37:15.072072 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-7bgns" Nov 29 05:37:15 crc kubenswrapper[4594]: I1129 05:37:15.075571 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Nov 29 05:37:15 crc kubenswrapper[4594]: I1129 05:37:15.079489 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-lj6hq"] Nov 29 05:37:15 crc kubenswrapper[4594]: I1129 05:37:15.154702 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zb2r5\" (UniqueName: \"kubernetes.io/projected/862c89f5-7442-4e20-8677-ef780f71545d-kube-api-access-zb2r5\") pod \"obo-prometheus-operator-668cf9dfbb-lj6hq\" (UID: \"862c89f5-7442-4e20-8677-ef780f71545d\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-lj6hq" Nov 29 05:37:15 crc kubenswrapper[4594]: I1129 05:37:15.208284 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5bd94b577b-8kh5q"] Nov 29 05:37:15 crc kubenswrapper[4594]: I1129 05:37:15.230802 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5bd94b577b-2p9zv"] Nov 29 05:37:15 crc kubenswrapper[4594]: I1129 05:37:15.232434 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bd94b577b-8kh5q" Nov 29 05:37:15 crc kubenswrapper[4594]: I1129 05:37:15.232918 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bd94b577b-2p9zv" Nov 29 05:37:15 crc kubenswrapper[4594]: I1129 05:37:15.240516 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5bd94b577b-8kh5q"] Nov 29 05:37:15 crc kubenswrapper[4594]: I1129 05:37:15.242112 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-knpjm" Nov 29 05:37:15 crc kubenswrapper[4594]: I1129 05:37:15.242811 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Nov 29 05:37:15 crc kubenswrapper[4594]: I1129 05:37:15.247590 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5bd94b577b-2p9zv"] Nov 29 05:37:15 crc kubenswrapper[4594]: I1129 05:37:15.255779 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0283828f-9a3f-4c00-8409-a49231f3b953-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5bd94b577b-8kh5q\" (UID: \"0283828f-9a3f-4c00-8409-a49231f3b953\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bd94b577b-8kh5q" Nov 29 05:37:15 crc kubenswrapper[4594]: I1129 05:37:15.255816 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0283828f-9a3f-4c00-8409-a49231f3b953-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5bd94b577b-8kh5q\" (UID: 
\"0283828f-9a3f-4c00-8409-a49231f3b953\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bd94b577b-8kh5q" Nov 29 05:37:15 crc kubenswrapper[4594]: I1129 05:37:15.255901 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zb2r5\" (UniqueName: \"kubernetes.io/projected/862c89f5-7442-4e20-8677-ef780f71545d-kube-api-access-zb2r5\") pod \"obo-prometheus-operator-668cf9dfbb-lj6hq\" (UID: \"862c89f5-7442-4e20-8677-ef780f71545d\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-lj6hq" Nov 29 05:37:15 crc kubenswrapper[4594]: I1129 05:37:15.255935 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8ec2d68d-f88d-411c-9790-4fc800a02905-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5bd94b577b-2p9zv\" (UID: \"8ec2d68d-f88d-411c-9790-4fc800a02905\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bd94b577b-2p9zv" Nov 29 05:37:15 crc kubenswrapper[4594]: I1129 05:37:15.255971 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8ec2d68d-f88d-411c-9790-4fc800a02905-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5bd94b577b-2p9zv\" (UID: \"8ec2d68d-f88d-411c-9790-4fc800a02905\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bd94b577b-2p9zv" Nov 29 05:37:15 crc kubenswrapper[4594]: I1129 05:37:15.271797 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zb2r5\" (UniqueName: \"kubernetes.io/projected/862c89f5-7442-4e20-8677-ef780f71545d-kube-api-access-zb2r5\") pod \"obo-prometheus-operator-668cf9dfbb-lj6hq\" (UID: \"862c89f5-7442-4e20-8677-ef780f71545d\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-lj6hq" Nov 29 05:37:15 crc kubenswrapper[4594]: I1129 
05:37:15.356857 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0283828f-9a3f-4c00-8409-a49231f3b953-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5bd94b577b-8kh5q\" (UID: \"0283828f-9a3f-4c00-8409-a49231f3b953\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bd94b577b-8kh5q" Nov 29 05:37:15 crc kubenswrapper[4594]: I1129 05:37:15.357135 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0283828f-9a3f-4c00-8409-a49231f3b953-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5bd94b577b-8kh5q\" (UID: \"0283828f-9a3f-4c00-8409-a49231f3b953\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bd94b577b-8kh5q" Nov 29 05:37:15 crc kubenswrapper[4594]: I1129 05:37:15.357332 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8ec2d68d-f88d-411c-9790-4fc800a02905-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5bd94b577b-2p9zv\" (UID: \"8ec2d68d-f88d-411c-9790-4fc800a02905\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bd94b577b-2p9zv" Nov 29 05:37:15 crc kubenswrapper[4594]: I1129 05:37:15.357845 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8ec2d68d-f88d-411c-9790-4fc800a02905-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5bd94b577b-2p9zv\" (UID: \"8ec2d68d-f88d-411c-9790-4fc800a02905\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bd94b577b-2p9zv" Nov 29 05:37:15 crc kubenswrapper[4594]: I1129 05:37:15.360021 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/8ec2d68d-f88d-411c-9790-4fc800a02905-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5bd94b577b-2p9zv\" (UID: \"8ec2d68d-f88d-411c-9790-4fc800a02905\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bd94b577b-2p9zv" Nov 29 05:37:15 crc kubenswrapper[4594]: I1129 05:37:15.360024 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0283828f-9a3f-4c00-8409-a49231f3b953-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5bd94b577b-8kh5q\" (UID: \"0283828f-9a3f-4c00-8409-a49231f3b953\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bd94b577b-8kh5q" Nov 29 05:37:15 crc kubenswrapper[4594]: I1129 05:37:15.360199 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0283828f-9a3f-4c00-8409-a49231f3b953-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5bd94b577b-8kh5q\" (UID: \"0283828f-9a3f-4c00-8409-a49231f3b953\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bd94b577b-8kh5q" Nov 29 05:37:15 crc kubenswrapper[4594]: I1129 05:37:15.360587 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8ec2d68d-f88d-411c-9790-4fc800a02905-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5bd94b577b-2p9zv\" (UID: \"8ec2d68d-f88d-411c-9790-4fc800a02905\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bd94b577b-2p9zv" Nov 29 05:37:15 crc kubenswrapper[4594]: I1129 05:37:15.390607 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-lj6hq" Nov 29 05:37:15 crc kubenswrapper[4594]: I1129 05:37:15.402783 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-l25w2"] Nov 29 05:37:15 crc kubenswrapper[4594]: I1129 05:37:15.403400 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-l25w2" Nov 29 05:37:15 crc kubenswrapper[4594]: I1129 05:37:15.404726 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-wgjwg" Nov 29 05:37:15 crc kubenswrapper[4594]: I1129 05:37:15.404787 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Nov 29 05:37:15 crc kubenswrapper[4594]: I1129 05:37:15.414379 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-l25w2"] Nov 29 05:37:15 crc kubenswrapper[4594]: I1129 05:37:15.459840 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/57d47f4c-b8cb-4c20-9adb-2e9190e48f82-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-l25w2\" (UID: \"57d47f4c-b8cb-4c20-9adb-2e9190e48f82\") " pod="openshift-operators/observability-operator-d8bb48f5d-l25w2" Nov 29 05:37:15 crc kubenswrapper[4594]: I1129 05:37:15.460020 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hx52\" (UniqueName: \"kubernetes.io/projected/57d47f4c-b8cb-4c20-9adb-2e9190e48f82-kube-api-access-5hx52\") pod \"observability-operator-d8bb48f5d-l25w2\" (UID: \"57d47f4c-b8cb-4c20-9adb-2e9190e48f82\") " pod="openshift-operators/observability-operator-d8bb48f5d-l25w2" Nov 29 05:37:15 crc kubenswrapper[4594]: I1129 
05:37:15.509466 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5446b9c989-rt8w5"] Nov 29 05:37:15 crc kubenswrapper[4594]: I1129 05:37:15.510380 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-rt8w5" Nov 29 05:37:15 crc kubenswrapper[4594]: I1129 05:37:15.512416 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-97xhc" Nov 29 05:37:15 crc kubenswrapper[4594]: I1129 05:37:15.519990 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5446b9c989-rt8w5"] Nov 29 05:37:15 crc kubenswrapper[4594]: I1129 05:37:15.561647 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/ad950138-e417-48eb-a3e9-a5c575d4507f-openshift-service-ca\") pod \"perses-operator-5446b9c989-rt8w5\" (UID: \"ad950138-e417-48eb-a3e9-a5c575d4507f\") " pod="openshift-operators/perses-operator-5446b9c989-rt8w5" Nov 29 05:37:15 crc kubenswrapper[4594]: I1129 05:37:15.561751 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hx52\" (UniqueName: \"kubernetes.io/projected/57d47f4c-b8cb-4c20-9adb-2e9190e48f82-kube-api-access-5hx52\") pod \"observability-operator-d8bb48f5d-l25w2\" (UID: \"57d47f4c-b8cb-4c20-9adb-2e9190e48f82\") " pod="openshift-operators/observability-operator-d8bb48f5d-l25w2" Nov 29 05:37:15 crc kubenswrapper[4594]: I1129 05:37:15.561845 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/57d47f4c-b8cb-4c20-9adb-2e9190e48f82-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-l25w2\" (UID: \"57d47f4c-b8cb-4c20-9adb-2e9190e48f82\") " 
pod="openshift-operators/observability-operator-d8bb48f5d-l25w2" Nov 29 05:37:15 crc kubenswrapper[4594]: I1129 05:37:15.561882 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4jgf\" (UniqueName: \"kubernetes.io/projected/ad950138-e417-48eb-a3e9-a5c575d4507f-kube-api-access-r4jgf\") pod \"perses-operator-5446b9c989-rt8w5\" (UID: \"ad950138-e417-48eb-a3e9-a5c575d4507f\") " pod="openshift-operators/perses-operator-5446b9c989-rt8w5" Nov 29 05:37:15 crc kubenswrapper[4594]: I1129 05:37:15.565577 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bd94b577b-8kh5q" Nov 29 05:37:15 crc kubenswrapper[4594]: I1129 05:37:15.574577 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/57d47f4c-b8cb-4c20-9adb-2e9190e48f82-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-l25w2\" (UID: \"57d47f4c-b8cb-4c20-9adb-2e9190e48f82\") " pod="openshift-operators/observability-operator-d8bb48f5d-l25w2" Nov 29 05:37:15 crc kubenswrapper[4594]: I1129 05:37:15.574720 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bd94b577b-2p9zv" Nov 29 05:37:15 crc kubenswrapper[4594]: I1129 05:37:15.601494 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hx52\" (UniqueName: \"kubernetes.io/projected/57d47f4c-b8cb-4c20-9adb-2e9190e48f82-kube-api-access-5hx52\") pod \"observability-operator-d8bb48f5d-l25w2\" (UID: \"57d47f4c-b8cb-4c20-9adb-2e9190e48f82\") " pod="openshift-operators/observability-operator-d8bb48f5d-l25w2" Nov 29 05:37:15 crc kubenswrapper[4594]: I1129 05:37:15.658083 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-lj6hq"] Nov 29 05:37:15 crc kubenswrapper[4594]: I1129 05:37:15.663208 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/ad950138-e417-48eb-a3e9-a5c575d4507f-openshift-service-ca\") pod \"perses-operator-5446b9c989-rt8w5\" (UID: \"ad950138-e417-48eb-a3e9-a5c575d4507f\") " pod="openshift-operators/perses-operator-5446b9c989-rt8w5" Nov 29 05:37:15 crc kubenswrapper[4594]: I1129 05:37:15.663476 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4jgf\" (UniqueName: \"kubernetes.io/projected/ad950138-e417-48eb-a3e9-a5c575d4507f-kube-api-access-r4jgf\") pod \"perses-operator-5446b9c989-rt8w5\" (UID: \"ad950138-e417-48eb-a3e9-a5c575d4507f\") " pod="openshift-operators/perses-operator-5446b9c989-rt8w5" Nov 29 05:37:15 crc kubenswrapper[4594]: I1129 05:37:15.664536 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/ad950138-e417-48eb-a3e9-a5c575d4507f-openshift-service-ca\") pod \"perses-operator-5446b9c989-rt8w5\" (UID: \"ad950138-e417-48eb-a3e9-a5c575d4507f\") " pod="openshift-operators/perses-operator-5446b9c989-rt8w5" Nov 29 05:37:15 crc 
kubenswrapper[4594]: I1129 05:37:15.688064 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4jgf\" (UniqueName: \"kubernetes.io/projected/ad950138-e417-48eb-a3e9-a5c575d4507f-kube-api-access-r4jgf\") pod \"perses-operator-5446b9c989-rt8w5\" (UID: \"ad950138-e417-48eb-a3e9-a5c575d4507f\") " pod="openshift-operators/perses-operator-5446b9c989-rt8w5" Nov 29 05:37:15 crc kubenswrapper[4594]: I1129 05:37:15.722503 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-lj6hq" event={"ID":"862c89f5-7442-4e20-8677-ef780f71545d","Type":"ContainerStarted","Data":"66e38b8c56644855ed2db092f97271ad71fdc5c5546e1b7143047717975434f8"} Nov 29 05:37:15 crc kubenswrapper[4594]: I1129 05:37:15.740592 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-l25w2" Nov 29 05:37:15 crc kubenswrapper[4594]: I1129 05:37:15.841099 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-rt8w5" Nov 29 05:37:15 crc kubenswrapper[4594]: I1129 05:37:15.846836 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5bd94b577b-8kh5q"] Nov 29 05:37:15 crc kubenswrapper[4594]: I1129 05:37:15.969538 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-l25w2"] Nov 29 05:37:15 crc kubenswrapper[4594]: W1129 05:37:15.988355 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57d47f4c_b8cb_4c20_9adb_2e9190e48f82.slice/crio-f59b7e465f0f5ba5ce10341d871c79ad94c326533bae02cd288e79a078f792c2 WatchSource:0}: Error finding container f59b7e465f0f5ba5ce10341d871c79ad94c326533bae02cd288e79a078f792c2: Status 404 returned error can't find the container with id f59b7e465f0f5ba5ce10341d871c79ad94c326533bae02cd288e79a078f792c2 Nov 29 05:37:16 crc kubenswrapper[4594]: I1129 05:37:16.101897 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5bd94b577b-2p9zv"] Nov 29 05:37:16 crc kubenswrapper[4594]: W1129 05:37:16.105089 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ec2d68d_f88d_411c_9790_4fc800a02905.slice/crio-d14b7f80c7d22919d7ebdb0c07272b0a2ebc4ec69b49823fcd4a74d447318443 WatchSource:0}: Error finding container d14b7f80c7d22919d7ebdb0c07272b0a2ebc4ec69b49823fcd4a74d447318443: Status 404 returned error can't find the container with id d14b7f80c7d22919d7ebdb0c07272b0a2ebc4ec69b49823fcd4a74d447318443 Nov 29 05:37:16 crc kubenswrapper[4594]: I1129 05:37:16.113298 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5446b9c989-rt8w5"] Nov 29 05:37:16 crc kubenswrapper[4594]: W1129 05:37:16.118355 4594 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad950138_e417_48eb_a3e9_a5c575d4507f.slice/crio-85fdd3180cc179f81d094f239c2c2d0a5d49314b9895b8eda01c35722fa9005b WatchSource:0}: Error finding container 85fdd3180cc179f81d094f239c2c2d0a5d49314b9895b8eda01c35722fa9005b: Status 404 returned error can't find the container with id 85fdd3180cc179f81d094f239c2c2d0a5d49314b9895b8eda01c35722fa9005b Nov 29 05:37:16 crc kubenswrapper[4594]: I1129 05:37:16.729653 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bd94b577b-8kh5q" event={"ID":"0283828f-9a3f-4c00-8409-a49231f3b953","Type":"ContainerStarted","Data":"c712a90feea286c99e83dbb88b94557d6e3cb010b07ac88992b20ffa79ba1282"} Nov 29 05:37:16 crc kubenswrapper[4594]: I1129 05:37:16.730941 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bd94b577b-2p9zv" event={"ID":"8ec2d68d-f88d-411c-9790-4fc800a02905","Type":"ContainerStarted","Data":"d14b7f80c7d22919d7ebdb0c07272b0a2ebc4ec69b49823fcd4a74d447318443"} Nov 29 05:37:16 crc kubenswrapper[4594]: I1129 05:37:16.732077 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5446b9c989-rt8w5" event={"ID":"ad950138-e417-48eb-a3e9-a5c575d4507f","Type":"ContainerStarted","Data":"85fdd3180cc179f81d094f239c2c2d0a5d49314b9895b8eda01c35722fa9005b"} Nov 29 05:37:16 crc kubenswrapper[4594]: I1129 05:37:16.733018 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-d8bb48f5d-l25w2" event={"ID":"57d47f4c-b8cb-4c20-9adb-2e9190e48f82","Type":"ContainerStarted","Data":"f59b7e465f0f5ba5ce10341d871c79ad94c326533bae02cd288e79a078f792c2"} Nov 29 05:37:28 crc kubenswrapper[4594]: I1129 05:37:28.822885 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bd94b577b-2p9zv" event={"ID":"8ec2d68d-f88d-411c-9790-4fc800a02905","Type":"ContainerStarted","Data":"83ce19b4f8a38230c79b593ae41c59d63a2cd689d32a2e7986714e613abb4780"} Nov 29 05:37:28 crc kubenswrapper[4594]: I1129 05:37:28.826002 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-d8bb48f5d-l25w2" event={"ID":"57d47f4c-b8cb-4c20-9adb-2e9190e48f82","Type":"ContainerStarted","Data":"d45a39818b964e66129bf597145419eec48a10f56ed49dbb016923c1a1d917e1"} Nov 29 05:37:28 crc kubenswrapper[4594]: I1129 05:37:28.826311 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-d8bb48f5d-l25w2" Nov 29 05:37:28 crc kubenswrapper[4594]: I1129 05:37:28.828597 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5446b9c989-rt8w5" event={"ID":"ad950138-e417-48eb-a3e9-a5c575d4507f","Type":"ContainerStarted","Data":"12ddc05aaac7df91e1c59fb4c27ecd9e4e7329f69f66239722ffa5d54a61fce0"} Nov 29 05:37:28 crc kubenswrapper[4594]: I1129 05:37:28.828662 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-d8bb48f5d-l25w2" Nov 29 05:37:28 crc kubenswrapper[4594]: I1129 05:37:28.828726 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5446b9c989-rt8w5" Nov 29 05:37:28 crc kubenswrapper[4594]: I1129 05:37:28.830195 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-lj6hq" event={"ID":"862c89f5-7442-4e20-8677-ef780f71545d","Type":"ContainerStarted","Data":"7790e65737af1cc288adc3de61ba76bb5c97765aa81b2414609d5f9e3200f5bb"} Nov 29 05:37:28 crc kubenswrapper[4594]: I1129 05:37:28.832293 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bd94b577b-8kh5q" event={"ID":"0283828f-9a3f-4c00-8409-a49231f3b953","Type":"ContainerStarted","Data":"7af450e5695c201a85f7d815c9877e378663ce02543325df1baec45602ae1296"} Nov 29 05:37:28 crc kubenswrapper[4594]: I1129 05:37:28.840397 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bd94b577b-2p9zv" podStartSLOduration=1.913341811 podStartE2EDuration="13.840369274s" podCreationTimestamp="2025-11-29 05:37:15 +0000 UTC" firstStartedPulling="2025-11-29 05:37:16.10945302 +0000 UTC m=+560.349962240" lastFinishedPulling="2025-11-29 05:37:28.036480482 +0000 UTC m=+572.276989703" observedRunningTime="2025-11-29 05:37:28.836922523 +0000 UTC m=+573.077431744" watchObservedRunningTime="2025-11-29 05:37:28.840369274 +0000 UTC m=+573.080878494" Nov 29 05:37:28 crc kubenswrapper[4594]: I1129 05:37:28.858316 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bd94b577b-8kh5q" podStartSLOduration=1.7227058290000001 podStartE2EDuration="13.858302713s" podCreationTimestamp="2025-11-29 05:37:15 +0000 UTC" firstStartedPulling="2025-11-29 05:37:15.879156016 +0000 UTC m=+560.119665236" lastFinishedPulling="2025-11-29 05:37:28.0147529 +0000 UTC m=+572.255262120" observedRunningTime="2025-11-29 05:37:28.856245275 +0000 UTC m=+573.096754494" watchObservedRunningTime="2025-11-29 05:37:28.858302713 +0000 UTC m=+573.098811933" Nov 29 05:37:28 crc kubenswrapper[4594]: I1129 05:37:28.873449 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-lj6hq" podStartSLOduration=1.51871836 podStartE2EDuration="13.873431217s" podCreationTimestamp="2025-11-29 05:37:15 +0000 UTC" firstStartedPulling="2025-11-29 05:37:15.684278507 +0000 UTC m=+559.924787727" 
lastFinishedPulling="2025-11-29 05:37:28.038991364 +0000 UTC m=+572.279500584" observedRunningTime="2025-11-29 05:37:28.86943312 +0000 UTC m=+573.109942340" watchObservedRunningTime="2025-11-29 05:37:28.873431217 +0000 UTC m=+573.113940437" Nov 29 05:37:28 crc kubenswrapper[4594]: I1129 05:37:28.906621 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-d8bb48f5d-l25w2" podStartSLOduration=1.810083127 podStartE2EDuration="13.90660438s" podCreationTimestamp="2025-11-29 05:37:15 +0000 UTC" firstStartedPulling="2025-11-29 05:37:15.991940593 +0000 UTC m=+560.232449814" lastFinishedPulling="2025-11-29 05:37:28.088461848 +0000 UTC m=+572.328971067" observedRunningTime="2025-11-29 05:37:28.906288215 +0000 UTC m=+573.146797435" watchObservedRunningTime="2025-11-29 05:37:28.90660438 +0000 UTC m=+573.147113600" Nov 29 05:37:28 crc kubenswrapper[4594]: I1129 05:37:28.908607 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5446b9c989-rt8w5" podStartSLOduration=1.9685007410000002 podStartE2EDuration="13.908602055s" podCreationTimestamp="2025-11-29 05:37:15 +0000 UTC" firstStartedPulling="2025-11-29 05:37:16.120595911 +0000 UTC m=+560.361105131" lastFinishedPulling="2025-11-29 05:37:28.060697225 +0000 UTC m=+572.301206445" observedRunningTime="2025-11-29 05:37:28.891006642 +0000 UTC m=+573.131515863" watchObservedRunningTime="2025-11-29 05:37:28.908602055 +0000 UTC m=+573.149111276" Nov 29 05:37:35 crc kubenswrapper[4594]: I1129 05:37:35.849588 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5446b9c989-rt8w5" Nov 29 05:37:45 crc kubenswrapper[4594]: I1129 05:37:45.800942 4594 patch_prober.go:28] interesting pod/machine-config-daemon-ggz4n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 05:37:45 crc kubenswrapper[4594]: I1129 05:37:45.801542 4594 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 05:37:49 crc kubenswrapper[4594]: I1129 05:37:49.259479 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fsn6z4"] Nov 29 05:37:49 crc kubenswrapper[4594]: I1129 05:37:49.261284 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fsn6z4" Nov 29 05:37:49 crc kubenswrapper[4594]: I1129 05:37:49.263138 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Nov 29 05:37:49 crc kubenswrapper[4594]: I1129 05:37:49.268855 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fsn6z4"] Nov 29 05:37:49 crc kubenswrapper[4594]: I1129 05:37:49.407705 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/837b6915-2ae6-4f64-af4f-029c8d1012d3-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fsn6z4\" (UID: \"837b6915-2ae6-4f64-af4f-029c8d1012d3\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fsn6z4" Nov 29 05:37:49 crc kubenswrapper[4594]: I1129 05:37:49.407820 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/837b6915-2ae6-4f64-af4f-029c8d1012d3-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fsn6z4\" (UID: \"837b6915-2ae6-4f64-af4f-029c8d1012d3\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fsn6z4" Nov 29 05:37:49 crc kubenswrapper[4594]: I1129 05:37:49.407864 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stxj6\" (UniqueName: \"kubernetes.io/projected/837b6915-2ae6-4f64-af4f-029c8d1012d3-kube-api-access-stxj6\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fsn6z4\" (UID: \"837b6915-2ae6-4f64-af4f-029c8d1012d3\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fsn6z4" Nov 29 05:37:49 crc kubenswrapper[4594]: I1129 05:37:49.508768 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/837b6915-2ae6-4f64-af4f-029c8d1012d3-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fsn6z4\" (UID: \"837b6915-2ae6-4f64-af4f-029c8d1012d3\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fsn6z4" Nov 29 05:37:49 crc kubenswrapper[4594]: I1129 05:37:49.508822 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stxj6\" (UniqueName: \"kubernetes.io/projected/837b6915-2ae6-4f64-af4f-029c8d1012d3-kube-api-access-stxj6\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fsn6z4\" (UID: \"837b6915-2ae6-4f64-af4f-029c8d1012d3\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fsn6z4" Nov 29 05:37:49 crc kubenswrapper[4594]: I1129 05:37:49.508869 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/837b6915-2ae6-4f64-af4f-029c8d1012d3-bundle\") pod 
\"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fsn6z4\" (UID: \"837b6915-2ae6-4f64-af4f-029c8d1012d3\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fsn6z4" Nov 29 05:37:49 crc kubenswrapper[4594]: I1129 05:37:49.509205 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/837b6915-2ae6-4f64-af4f-029c8d1012d3-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fsn6z4\" (UID: \"837b6915-2ae6-4f64-af4f-029c8d1012d3\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fsn6z4" Nov 29 05:37:49 crc kubenswrapper[4594]: I1129 05:37:49.509250 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/837b6915-2ae6-4f64-af4f-029c8d1012d3-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fsn6z4\" (UID: \"837b6915-2ae6-4f64-af4f-029c8d1012d3\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fsn6z4" Nov 29 05:37:49 crc kubenswrapper[4594]: I1129 05:37:49.524959 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stxj6\" (UniqueName: \"kubernetes.io/projected/837b6915-2ae6-4f64-af4f-029c8d1012d3-kube-api-access-stxj6\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fsn6z4\" (UID: \"837b6915-2ae6-4f64-af4f-029c8d1012d3\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fsn6z4" Nov 29 05:37:49 crc kubenswrapper[4594]: I1129 05:37:49.575708 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fsn6z4" Nov 29 05:37:49 crc kubenswrapper[4594]: I1129 05:37:49.919892 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fsn6z4"] Nov 29 05:37:49 crc kubenswrapper[4594]: W1129 05:37:49.924962 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod837b6915_2ae6_4f64_af4f_029c8d1012d3.slice/crio-d8be4b4d56fd353ebeebd28544542386c1688ba8cdb6f24439c9d4b7854c219f WatchSource:0}: Error finding container d8be4b4d56fd353ebeebd28544542386c1688ba8cdb6f24439c9d4b7854c219f: Status 404 returned error can't find the container with id d8be4b4d56fd353ebeebd28544542386c1688ba8cdb6f24439c9d4b7854c219f Nov 29 05:37:49 crc kubenswrapper[4594]: I1129 05:37:49.937128 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fsn6z4" event={"ID":"837b6915-2ae6-4f64-af4f-029c8d1012d3","Type":"ContainerStarted","Data":"d8be4b4d56fd353ebeebd28544542386c1688ba8cdb6f24439c9d4b7854c219f"} Nov 29 05:37:50 crc kubenswrapper[4594]: I1129 05:37:50.947660 4594 generic.go:334] "Generic (PLEG): container finished" podID="837b6915-2ae6-4f64-af4f-029c8d1012d3" containerID="e902d2bb0e9adf7888dd38256a0b51eb4cf6313542c338258fc89d5aa812a003" exitCode=0 Nov 29 05:37:50 crc kubenswrapper[4594]: I1129 05:37:50.947701 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fsn6z4" event={"ID":"837b6915-2ae6-4f64-af4f-029c8d1012d3","Type":"ContainerDied","Data":"e902d2bb0e9adf7888dd38256a0b51eb4cf6313542c338258fc89d5aa812a003"} Nov 29 05:37:52 crc kubenswrapper[4594]: I1129 05:37:52.959040 4594 generic.go:334] "Generic (PLEG): container finished" 
podID="837b6915-2ae6-4f64-af4f-029c8d1012d3" containerID="3bba7de8d62091ca2986fbb8b11f566d5f7449fd97a57f2ddcc9e3c857581e32" exitCode=0 Nov 29 05:37:52 crc kubenswrapper[4594]: I1129 05:37:52.959134 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fsn6z4" event={"ID":"837b6915-2ae6-4f64-af4f-029c8d1012d3","Type":"ContainerDied","Data":"3bba7de8d62091ca2986fbb8b11f566d5f7449fd97a57f2ddcc9e3c857581e32"} Nov 29 05:37:53 crc kubenswrapper[4594]: I1129 05:37:53.967407 4594 generic.go:334] "Generic (PLEG): container finished" podID="837b6915-2ae6-4f64-af4f-029c8d1012d3" containerID="283bc130b3bef1f1ddc4ff721bf9b910f28064ed0d38b9ae02073144d7c0f87b" exitCode=0 Nov 29 05:37:53 crc kubenswrapper[4594]: I1129 05:37:53.967487 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fsn6z4" event={"ID":"837b6915-2ae6-4f64-af4f-029c8d1012d3","Type":"ContainerDied","Data":"283bc130b3bef1f1ddc4ff721bf9b910f28064ed0d38b9ae02073144d7c0f87b"} Nov 29 05:37:55 crc kubenswrapper[4594]: I1129 05:37:55.154496 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fsn6z4" Nov 29 05:37:55 crc kubenswrapper[4594]: I1129 05:37:55.276528 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/837b6915-2ae6-4f64-af4f-029c8d1012d3-util\") pod \"837b6915-2ae6-4f64-af4f-029c8d1012d3\" (UID: \"837b6915-2ae6-4f64-af4f-029c8d1012d3\") " Nov 29 05:37:55 crc kubenswrapper[4594]: I1129 05:37:55.277051 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-stxj6\" (UniqueName: \"kubernetes.io/projected/837b6915-2ae6-4f64-af4f-029c8d1012d3-kube-api-access-stxj6\") pod \"837b6915-2ae6-4f64-af4f-029c8d1012d3\" (UID: \"837b6915-2ae6-4f64-af4f-029c8d1012d3\") " Nov 29 05:37:55 crc kubenswrapper[4594]: I1129 05:37:55.277098 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/837b6915-2ae6-4f64-af4f-029c8d1012d3-bundle\") pod \"837b6915-2ae6-4f64-af4f-029c8d1012d3\" (UID: \"837b6915-2ae6-4f64-af4f-029c8d1012d3\") " Nov 29 05:37:55 crc kubenswrapper[4594]: I1129 05:37:55.277854 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/837b6915-2ae6-4f64-af4f-029c8d1012d3-bundle" (OuterVolumeSpecName: "bundle") pod "837b6915-2ae6-4f64-af4f-029c8d1012d3" (UID: "837b6915-2ae6-4f64-af4f-029c8d1012d3"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 05:37:55 crc kubenswrapper[4594]: I1129 05:37:55.283162 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/837b6915-2ae6-4f64-af4f-029c8d1012d3-kube-api-access-stxj6" (OuterVolumeSpecName: "kube-api-access-stxj6") pod "837b6915-2ae6-4f64-af4f-029c8d1012d3" (UID: "837b6915-2ae6-4f64-af4f-029c8d1012d3"). InnerVolumeSpecName "kube-api-access-stxj6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:37:55 crc kubenswrapper[4594]: I1129 05:37:55.287116 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/837b6915-2ae6-4f64-af4f-029c8d1012d3-util" (OuterVolumeSpecName: "util") pod "837b6915-2ae6-4f64-af4f-029c8d1012d3" (UID: "837b6915-2ae6-4f64-af4f-029c8d1012d3"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 05:37:55 crc kubenswrapper[4594]: I1129 05:37:55.378381 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-stxj6\" (UniqueName: \"kubernetes.io/projected/837b6915-2ae6-4f64-af4f-029c8d1012d3-kube-api-access-stxj6\") on node \"crc\" DevicePath \"\"" Nov 29 05:37:55 crc kubenswrapper[4594]: I1129 05:37:55.378503 4594 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/837b6915-2ae6-4f64-af4f-029c8d1012d3-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 05:37:55 crc kubenswrapper[4594]: I1129 05:37:55.378573 4594 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/837b6915-2ae6-4f64-af4f-029c8d1012d3-util\") on node \"crc\" DevicePath \"\"" Nov 29 05:37:55 crc kubenswrapper[4594]: I1129 05:37:55.982439 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fsn6z4" event={"ID":"837b6915-2ae6-4f64-af4f-029c8d1012d3","Type":"ContainerDied","Data":"d8be4b4d56fd353ebeebd28544542386c1688ba8cdb6f24439c9d4b7854c219f"} Nov 29 05:37:55 crc kubenswrapper[4594]: I1129 05:37:55.982486 4594 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d8be4b4d56fd353ebeebd28544542386c1688ba8cdb6f24439c9d4b7854c219f" Nov 29 05:37:55 crc kubenswrapper[4594]: I1129 05:37:55.982511 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fsn6z4" Nov 29 05:38:00 crc kubenswrapper[4594]: I1129 05:38:00.891097 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-gp4lk"] Nov 29 05:38:00 crc kubenswrapper[4594]: E1129 05:38:00.892577 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="837b6915-2ae6-4f64-af4f-029c8d1012d3" containerName="pull" Nov 29 05:38:00 crc kubenswrapper[4594]: I1129 05:38:00.892644 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="837b6915-2ae6-4f64-af4f-029c8d1012d3" containerName="pull" Nov 29 05:38:00 crc kubenswrapper[4594]: E1129 05:38:00.892707 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="837b6915-2ae6-4f64-af4f-029c8d1012d3" containerName="extract" Nov 29 05:38:00 crc kubenswrapper[4594]: I1129 05:38:00.892754 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="837b6915-2ae6-4f64-af4f-029c8d1012d3" containerName="extract" Nov 29 05:38:00 crc kubenswrapper[4594]: E1129 05:38:00.892809 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="837b6915-2ae6-4f64-af4f-029c8d1012d3" containerName="util" Nov 29 05:38:00 crc kubenswrapper[4594]: I1129 05:38:00.892856 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="837b6915-2ae6-4f64-af4f-029c8d1012d3" containerName="util" Nov 29 05:38:00 crc kubenswrapper[4594]: I1129 05:38:00.893000 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="837b6915-2ae6-4f64-af4f-029c8d1012d3" containerName="extract" Nov 29 05:38:00 crc kubenswrapper[4594]: I1129 05:38:00.893494 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-gp4lk" Nov 29 05:38:00 crc kubenswrapper[4594]: I1129 05:38:00.895480 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Nov 29 05:38:00 crc kubenswrapper[4594]: I1129 05:38:00.895495 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-t7bwm" Nov 29 05:38:00 crc kubenswrapper[4594]: I1129 05:38:00.896662 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Nov 29 05:38:00 crc kubenswrapper[4594]: I1129 05:38:00.901272 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-gp4lk"] Nov 29 05:38:01 crc kubenswrapper[4594]: I1129 05:38:01.046176 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddml8\" (UniqueName: \"kubernetes.io/projected/ae0a89d0-92c1-4884-a38f-34cf97da3de5-kube-api-access-ddml8\") pod \"nmstate-operator-5b5b58f5c8-gp4lk\" (UID: \"ae0a89d0-92c1-4884-a38f-34cf97da3de5\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-gp4lk" Nov 29 05:38:01 crc kubenswrapper[4594]: I1129 05:38:01.148101 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddml8\" (UniqueName: \"kubernetes.io/projected/ae0a89d0-92c1-4884-a38f-34cf97da3de5-kube-api-access-ddml8\") pod \"nmstate-operator-5b5b58f5c8-gp4lk\" (UID: \"ae0a89d0-92c1-4884-a38f-34cf97da3de5\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-gp4lk" Nov 29 05:38:01 crc kubenswrapper[4594]: I1129 05:38:01.166164 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddml8\" (UniqueName: \"kubernetes.io/projected/ae0a89d0-92c1-4884-a38f-34cf97da3de5-kube-api-access-ddml8\") pod \"nmstate-operator-5b5b58f5c8-gp4lk\" (UID: 
\"ae0a89d0-92c1-4884-a38f-34cf97da3de5\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-gp4lk" Nov 29 05:38:01 crc kubenswrapper[4594]: I1129 05:38:01.206964 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-gp4lk" Nov 29 05:38:01 crc kubenswrapper[4594]: I1129 05:38:01.411585 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-gp4lk"] Nov 29 05:38:02 crc kubenswrapper[4594]: I1129 05:38:02.041477 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-gp4lk" event={"ID":"ae0a89d0-92c1-4884-a38f-34cf97da3de5","Type":"ContainerStarted","Data":"bd5e34a03926a77e826a86ed311f3297c6fb4784d1dca15202eb59946b5083bf"} Nov 29 05:38:04 crc kubenswrapper[4594]: I1129 05:38:04.057762 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-gp4lk" event={"ID":"ae0a89d0-92c1-4884-a38f-34cf97da3de5","Type":"ContainerStarted","Data":"d507429be0f223229495f4b926dd22a9d5a64699d7accee94953ca2a0074abc7"} Nov 29 05:38:04 crc kubenswrapper[4594]: I1129 05:38:04.072269 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-gp4lk" podStartSLOduration=1.702736064 podStartE2EDuration="4.07223762s" podCreationTimestamp="2025-11-29 05:38:00 +0000 UTC" firstStartedPulling="2025-11-29 05:38:01.420741647 +0000 UTC m=+605.661250867" lastFinishedPulling="2025-11-29 05:38:03.790243203 +0000 UTC m=+608.030752423" observedRunningTime="2025-11-29 05:38:04.071127473 +0000 UTC m=+608.311636703" watchObservedRunningTime="2025-11-29 05:38:04.07223762 +0000 UTC m=+608.312746830" Nov 29 05:38:10 crc kubenswrapper[4594]: I1129 05:38:10.095064 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-lp45r"] Nov 29 05:38:10 crc kubenswrapper[4594]: I1129 05:38:10.096645 
4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-lp45r" Nov 29 05:38:10 crc kubenswrapper[4594]: I1129 05:38:10.098654 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-hczww" Nov 29 05:38:10 crc kubenswrapper[4594]: I1129 05:38:10.107588 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-7g6km"] Nov 29 05:38:10 crc kubenswrapper[4594]: I1129 05:38:10.108377 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-7g6km" Nov 29 05:38:10 crc kubenswrapper[4594]: I1129 05:38:10.110784 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Nov 29 05:38:10 crc kubenswrapper[4594]: I1129 05:38:10.128584 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-lp45r"] Nov 29 05:38:10 crc kubenswrapper[4594]: I1129 05:38:10.132411 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-szl2t"] Nov 29 05:38:10 crc kubenswrapper[4594]: I1129 05:38:10.133132 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-szl2t" Nov 29 05:38:10 crc kubenswrapper[4594]: I1129 05:38:10.141006 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-7g6km"] Nov 29 05:38:10 crc kubenswrapper[4594]: I1129 05:38:10.163613 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmdwc\" (UniqueName: \"kubernetes.io/projected/75ccbb22-479b-4415-aa0d-c00853a463ee-kube-api-access-pmdwc\") pod \"nmstate-webhook-5f6d4c5ccb-7g6km\" (UID: \"75ccbb22-479b-4415-aa0d-c00853a463ee\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-7g6km" Nov 29 05:38:10 crc kubenswrapper[4594]: I1129 05:38:10.163653 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/92357f02-ea29-48b6-b763-f6e1b8ca3457-ovs-socket\") pod \"nmstate-handler-szl2t\" (UID: \"92357f02-ea29-48b6-b763-f6e1b8ca3457\") " pod="openshift-nmstate/nmstate-handler-szl2t" Nov 29 05:38:10 crc kubenswrapper[4594]: I1129 05:38:10.163936 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dvg7\" (UniqueName: \"kubernetes.io/projected/92357f02-ea29-48b6-b763-f6e1b8ca3457-kube-api-access-7dvg7\") pod \"nmstate-handler-szl2t\" (UID: \"92357f02-ea29-48b6-b763-f6e1b8ca3457\") " pod="openshift-nmstate/nmstate-handler-szl2t" Nov 29 05:38:10 crc kubenswrapper[4594]: I1129 05:38:10.164038 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/75ccbb22-479b-4415-aa0d-c00853a463ee-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-7g6km\" (UID: \"75ccbb22-479b-4415-aa0d-c00853a463ee\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-7g6km" Nov 29 05:38:10 crc kubenswrapper[4594]: I1129 05:38:10.164155 4594 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/92357f02-ea29-48b6-b763-f6e1b8ca3457-dbus-socket\") pod \"nmstate-handler-szl2t\" (UID: \"92357f02-ea29-48b6-b763-f6e1b8ca3457\") " pod="openshift-nmstate/nmstate-handler-szl2t" Nov 29 05:38:10 crc kubenswrapper[4594]: I1129 05:38:10.164340 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bm6ss\" (UniqueName: \"kubernetes.io/projected/4eb3a18a-b943-4996-8977-8c442eca7e9e-kube-api-access-bm6ss\") pod \"nmstate-metrics-7f946cbc9-lp45r\" (UID: \"4eb3a18a-b943-4996-8977-8c442eca7e9e\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-lp45r" Nov 29 05:38:10 crc kubenswrapper[4594]: I1129 05:38:10.164466 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/92357f02-ea29-48b6-b763-f6e1b8ca3457-nmstate-lock\") pod \"nmstate-handler-szl2t\" (UID: \"92357f02-ea29-48b6-b763-f6e1b8ca3457\") " pod="openshift-nmstate/nmstate-handler-szl2t" Nov 29 05:38:10 crc kubenswrapper[4594]: I1129 05:38:10.193949 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-kj52s"] Nov 29 05:38:10 crc kubenswrapper[4594]: I1129 05:38:10.194593 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-kj52s" Nov 29 05:38:10 crc kubenswrapper[4594]: I1129 05:38:10.198221 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-9gwc8" Nov 29 05:38:10 crc kubenswrapper[4594]: I1129 05:38:10.198410 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Nov 29 05:38:10 crc kubenswrapper[4594]: I1129 05:38:10.198539 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Nov 29 05:38:10 crc kubenswrapper[4594]: I1129 05:38:10.202849 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-kj52s"] Nov 29 05:38:10 crc kubenswrapper[4594]: I1129 05:38:10.265138 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vp26n\" (UniqueName: \"kubernetes.io/projected/060d894f-3918-4f8c-8b70-33d7e18b316d-kube-api-access-vp26n\") pod \"nmstate-console-plugin-7fbb5f6569-kj52s\" (UID: \"060d894f-3918-4f8c-8b70-33d7e18b316d\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-kj52s" Nov 29 05:38:10 crc kubenswrapper[4594]: I1129 05:38:10.265216 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/75ccbb22-479b-4415-aa0d-c00853a463ee-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-7g6km\" (UID: \"75ccbb22-479b-4415-aa0d-c00853a463ee\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-7g6km" Nov 29 05:38:10 crc kubenswrapper[4594]: I1129 05:38:10.265238 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dvg7\" (UniqueName: \"kubernetes.io/projected/92357f02-ea29-48b6-b763-f6e1b8ca3457-kube-api-access-7dvg7\") pod \"nmstate-handler-szl2t\" (UID: \"92357f02-ea29-48b6-b763-f6e1b8ca3457\") " 
pod="openshift-nmstate/nmstate-handler-szl2t" Nov 29 05:38:10 crc kubenswrapper[4594]: I1129 05:38:10.265272 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/92357f02-ea29-48b6-b763-f6e1b8ca3457-dbus-socket\") pod \"nmstate-handler-szl2t\" (UID: \"92357f02-ea29-48b6-b763-f6e1b8ca3457\") " pod="openshift-nmstate/nmstate-handler-szl2t" Nov 29 05:38:10 crc kubenswrapper[4594]: I1129 05:38:10.265301 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/060d894f-3918-4f8c-8b70-33d7e18b316d-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-kj52s\" (UID: \"060d894f-3918-4f8c-8b70-33d7e18b316d\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-kj52s" Nov 29 05:38:10 crc kubenswrapper[4594]: I1129 05:38:10.265346 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bm6ss\" (UniqueName: \"kubernetes.io/projected/4eb3a18a-b943-4996-8977-8c442eca7e9e-kube-api-access-bm6ss\") pod \"nmstate-metrics-7f946cbc9-lp45r\" (UID: \"4eb3a18a-b943-4996-8977-8c442eca7e9e\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-lp45r" Nov 29 05:38:10 crc kubenswrapper[4594]: I1129 05:38:10.265369 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/060d894f-3918-4f8c-8b70-33d7e18b316d-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-kj52s\" (UID: \"060d894f-3918-4f8c-8b70-33d7e18b316d\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-kj52s" Nov 29 05:38:10 crc kubenswrapper[4594]: I1129 05:38:10.265394 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/92357f02-ea29-48b6-b763-f6e1b8ca3457-nmstate-lock\") pod \"nmstate-handler-szl2t\" 
(UID: \"92357f02-ea29-48b6-b763-f6e1b8ca3457\") " pod="openshift-nmstate/nmstate-handler-szl2t" Nov 29 05:38:10 crc kubenswrapper[4594]: I1129 05:38:10.265458 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmdwc\" (UniqueName: \"kubernetes.io/projected/75ccbb22-479b-4415-aa0d-c00853a463ee-kube-api-access-pmdwc\") pod \"nmstate-webhook-5f6d4c5ccb-7g6km\" (UID: \"75ccbb22-479b-4415-aa0d-c00853a463ee\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-7g6km" Nov 29 05:38:10 crc kubenswrapper[4594]: I1129 05:38:10.265477 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/92357f02-ea29-48b6-b763-f6e1b8ca3457-ovs-socket\") pod \"nmstate-handler-szl2t\" (UID: \"92357f02-ea29-48b6-b763-f6e1b8ca3457\") " pod="openshift-nmstate/nmstate-handler-szl2t" Nov 29 05:38:10 crc kubenswrapper[4594]: I1129 05:38:10.265531 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/92357f02-ea29-48b6-b763-f6e1b8ca3457-ovs-socket\") pod \"nmstate-handler-szl2t\" (UID: \"92357f02-ea29-48b6-b763-f6e1b8ca3457\") " pod="openshift-nmstate/nmstate-handler-szl2t" Nov 29 05:38:10 crc kubenswrapper[4594]: I1129 05:38:10.265569 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/92357f02-ea29-48b6-b763-f6e1b8ca3457-dbus-socket\") pod \"nmstate-handler-szl2t\" (UID: \"92357f02-ea29-48b6-b763-f6e1b8ca3457\") " pod="openshift-nmstate/nmstate-handler-szl2t" Nov 29 05:38:10 crc kubenswrapper[4594]: I1129 05:38:10.265606 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/92357f02-ea29-48b6-b763-f6e1b8ca3457-nmstate-lock\") pod \"nmstate-handler-szl2t\" (UID: \"92357f02-ea29-48b6-b763-f6e1b8ca3457\") " pod="openshift-nmstate/nmstate-handler-szl2t" 
Nov 29 05:38:10 crc kubenswrapper[4594]: I1129 05:38:10.270996 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/75ccbb22-479b-4415-aa0d-c00853a463ee-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-7g6km\" (UID: \"75ccbb22-479b-4415-aa0d-c00853a463ee\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-7g6km" Nov 29 05:38:10 crc kubenswrapper[4594]: I1129 05:38:10.280699 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bm6ss\" (UniqueName: \"kubernetes.io/projected/4eb3a18a-b943-4996-8977-8c442eca7e9e-kube-api-access-bm6ss\") pod \"nmstate-metrics-7f946cbc9-lp45r\" (UID: \"4eb3a18a-b943-4996-8977-8c442eca7e9e\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-lp45r" Nov 29 05:38:10 crc kubenswrapper[4594]: I1129 05:38:10.282716 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmdwc\" (UniqueName: \"kubernetes.io/projected/75ccbb22-479b-4415-aa0d-c00853a463ee-kube-api-access-pmdwc\") pod \"nmstate-webhook-5f6d4c5ccb-7g6km\" (UID: \"75ccbb22-479b-4415-aa0d-c00853a463ee\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-7g6km" Nov 29 05:38:10 crc kubenswrapper[4594]: I1129 05:38:10.282853 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dvg7\" (UniqueName: \"kubernetes.io/projected/92357f02-ea29-48b6-b763-f6e1b8ca3457-kube-api-access-7dvg7\") pod \"nmstate-handler-szl2t\" (UID: \"92357f02-ea29-48b6-b763-f6e1b8ca3457\") " pod="openshift-nmstate/nmstate-handler-szl2t" Nov 29 05:38:10 crc kubenswrapper[4594]: I1129 05:38:10.366939 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/060d894f-3918-4f8c-8b70-33d7e18b316d-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-kj52s\" (UID: \"060d894f-3918-4f8c-8b70-33d7e18b316d\") " 
pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-kj52s" Nov 29 05:38:10 crc kubenswrapper[4594]: I1129 05:38:10.367299 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vp26n\" (UniqueName: \"kubernetes.io/projected/060d894f-3918-4f8c-8b70-33d7e18b316d-kube-api-access-vp26n\") pod \"nmstate-console-plugin-7fbb5f6569-kj52s\" (UID: \"060d894f-3918-4f8c-8b70-33d7e18b316d\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-kj52s" Nov 29 05:38:10 crc kubenswrapper[4594]: I1129 05:38:10.367486 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/060d894f-3918-4f8c-8b70-33d7e18b316d-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-kj52s\" (UID: \"060d894f-3918-4f8c-8b70-33d7e18b316d\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-kj52s" Nov 29 05:38:10 crc kubenswrapper[4594]: E1129 05:38:10.367832 4594 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Nov 29 05:38:10 crc kubenswrapper[4594]: E1129 05:38:10.368014 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/060d894f-3918-4f8c-8b70-33d7e18b316d-plugin-serving-cert podName:060d894f-3918-4f8c-8b70-33d7e18b316d nodeName:}" failed. No retries permitted until 2025-11-29 05:38:10.867964515 +0000 UTC m=+615.108473734 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/060d894f-3918-4f8c-8b70-33d7e18b316d-plugin-serving-cert") pod "nmstate-console-plugin-7fbb5f6569-kj52s" (UID: "060d894f-3918-4f8c-8b70-33d7e18b316d") : secret "plugin-serving-cert" not found Nov 29 05:38:10 crc kubenswrapper[4594]: I1129 05:38:10.368921 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/060d894f-3918-4f8c-8b70-33d7e18b316d-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-kj52s\" (UID: \"060d894f-3918-4f8c-8b70-33d7e18b316d\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-kj52s" Nov 29 05:38:10 crc kubenswrapper[4594]: I1129 05:38:10.374338 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-5dc67c566d-z5z72"] Nov 29 05:38:10 crc kubenswrapper[4594]: I1129 05:38:10.375349 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5dc67c566d-z5z72" Nov 29 05:38:10 crc kubenswrapper[4594]: I1129 05:38:10.384534 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5dc67c566d-z5z72"] Nov 29 05:38:10 crc kubenswrapper[4594]: I1129 05:38:10.384775 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vp26n\" (UniqueName: \"kubernetes.io/projected/060d894f-3918-4f8c-8b70-33d7e18b316d-kube-api-access-vp26n\") pod \"nmstate-console-plugin-7fbb5f6569-kj52s\" (UID: \"060d894f-3918-4f8c-8b70-33d7e18b316d\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-kj52s" Nov 29 05:38:10 crc kubenswrapper[4594]: I1129 05:38:10.421862 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-lp45r" Nov 29 05:38:10 crc kubenswrapper[4594]: I1129 05:38:10.431796 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-7g6km" Nov 29 05:38:10 crc kubenswrapper[4594]: I1129 05:38:10.446703 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-szl2t" Nov 29 05:38:10 crc kubenswrapper[4594]: I1129 05:38:10.467975 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d3306e10-9321-423b-90c8-97715d2e3122-oauth-serving-cert\") pod \"console-5dc67c566d-z5z72\" (UID: \"d3306e10-9321-423b-90c8-97715d2e3122\") " pod="openshift-console/console-5dc67c566d-z5z72" Nov 29 05:38:10 crc kubenswrapper[4594]: I1129 05:38:10.468028 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d3306e10-9321-423b-90c8-97715d2e3122-console-oauth-config\") pod \"console-5dc67c566d-z5z72\" (UID: \"d3306e10-9321-423b-90c8-97715d2e3122\") " pod="openshift-console/console-5dc67c566d-z5z72" Nov 29 05:38:10 crc kubenswrapper[4594]: I1129 05:38:10.468050 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d3306e10-9321-423b-90c8-97715d2e3122-console-serving-cert\") pod \"console-5dc67c566d-z5z72\" (UID: \"d3306e10-9321-423b-90c8-97715d2e3122\") " pod="openshift-console/console-5dc67c566d-z5z72" Nov 29 05:38:10 crc kubenswrapper[4594]: I1129 05:38:10.468066 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d3306e10-9321-423b-90c8-97715d2e3122-service-ca\") pod \"console-5dc67c566d-z5z72\" (UID: \"d3306e10-9321-423b-90c8-97715d2e3122\") " pod="openshift-console/console-5dc67c566d-z5z72" Nov 29 05:38:10 crc kubenswrapper[4594]: I1129 05:38:10.468097 4594 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d3306e10-9321-423b-90c8-97715d2e3122-console-config\") pod \"console-5dc67c566d-z5z72\" (UID: \"d3306e10-9321-423b-90c8-97715d2e3122\") " pod="openshift-console/console-5dc67c566d-z5z72" Nov 29 05:38:10 crc kubenswrapper[4594]: I1129 05:38:10.468812 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d3306e10-9321-423b-90c8-97715d2e3122-trusted-ca-bundle\") pod \"console-5dc67c566d-z5z72\" (UID: \"d3306e10-9321-423b-90c8-97715d2e3122\") " pod="openshift-console/console-5dc67c566d-z5z72" Nov 29 05:38:10 crc kubenswrapper[4594]: I1129 05:38:10.468967 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4c642\" (UniqueName: \"kubernetes.io/projected/d3306e10-9321-423b-90c8-97715d2e3122-kube-api-access-4c642\") pod \"console-5dc67c566d-z5z72\" (UID: \"d3306e10-9321-423b-90c8-97715d2e3122\") " pod="openshift-console/console-5dc67c566d-z5z72" Nov 29 05:38:10 crc kubenswrapper[4594]: I1129 05:38:10.570231 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d3306e10-9321-423b-90c8-97715d2e3122-console-oauth-config\") pod \"console-5dc67c566d-z5z72\" (UID: \"d3306e10-9321-423b-90c8-97715d2e3122\") " pod="openshift-console/console-5dc67c566d-z5z72" Nov 29 05:38:10 crc kubenswrapper[4594]: I1129 05:38:10.570287 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d3306e10-9321-423b-90c8-97715d2e3122-console-serving-cert\") pod \"console-5dc67c566d-z5z72\" (UID: \"d3306e10-9321-423b-90c8-97715d2e3122\") " pod="openshift-console/console-5dc67c566d-z5z72" Nov 29 05:38:10 crc 
kubenswrapper[4594]: I1129 05:38:10.570306 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d3306e10-9321-423b-90c8-97715d2e3122-service-ca\") pod \"console-5dc67c566d-z5z72\" (UID: \"d3306e10-9321-423b-90c8-97715d2e3122\") " pod="openshift-console/console-5dc67c566d-z5z72" Nov 29 05:38:10 crc kubenswrapper[4594]: I1129 05:38:10.570337 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d3306e10-9321-423b-90c8-97715d2e3122-console-config\") pod \"console-5dc67c566d-z5z72\" (UID: \"d3306e10-9321-423b-90c8-97715d2e3122\") " pod="openshift-console/console-5dc67c566d-z5z72" Nov 29 05:38:10 crc kubenswrapper[4594]: I1129 05:38:10.570365 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d3306e10-9321-423b-90c8-97715d2e3122-trusted-ca-bundle\") pod \"console-5dc67c566d-z5z72\" (UID: \"d3306e10-9321-423b-90c8-97715d2e3122\") " pod="openshift-console/console-5dc67c566d-z5z72" Nov 29 05:38:10 crc kubenswrapper[4594]: I1129 05:38:10.570403 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4c642\" (UniqueName: \"kubernetes.io/projected/d3306e10-9321-423b-90c8-97715d2e3122-kube-api-access-4c642\") pod \"console-5dc67c566d-z5z72\" (UID: \"d3306e10-9321-423b-90c8-97715d2e3122\") " pod="openshift-console/console-5dc67c566d-z5z72" Nov 29 05:38:10 crc kubenswrapper[4594]: I1129 05:38:10.570437 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d3306e10-9321-423b-90c8-97715d2e3122-oauth-serving-cert\") pod \"console-5dc67c566d-z5z72\" (UID: \"d3306e10-9321-423b-90c8-97715d2e3122\") " pod="openshift-console/console-5dc67c566d-z5z72" Nov 29 05:38:10 crc kubenswrapper[4594]: I1129 
05:38:10.571227 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d3306e10-9321-423b-90c8-97715d2e3122-oauth-serving-cert\") pod \"console-5dc67c566d-z5z72\" (UID: \"d3306e10-9321-423b-90c8-97715d2e3122\") " pod="openshift-console/console-5dc67c566d-z5z72" Nov 29 05:38:10 crc kubenswrapper[4594]: I1129 05:38:10.571783 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d3306e10-9321-423b-90c8-97715d2e3122-service-ca\") pod \"console-5dc67c566d-z5z72\" (UID: \"d3306e10-9321-423b-90c8-97715d2e3122\") " pod="openshift-console/console-5dc67c566d-z5z72" Nov 29 05:38:10 crc kubenswrapper[4594]: I1129 05:38:10.571788 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d3306e10-9321-423b-90c8-97715d2e3122-console-config\") pod \"console-5dc67c566d-z5z72\" (UID: \"d3306e10-9321-423b-90c8-97715d2e3122\") " pod="openshift-console/console-5dc67c566d-z5z72" Nov 29 05:38:10 crc kubenswrapper[4594]: I1129 05:38:10.572957 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d3306e10-9321-423b-90c8-97715d2e3122-trusted-ca-bundle\") pod \"console-5dc67c566d-z5z72\" (UID: \"d3306e10-9321-423b-90c8-97715d2e3122\") " pod="openshift-console/console-5dc67c566d-z5z72" Nov 29 05:38:10 crc kubenswrapper[4594]: I1129 05:38:10.574793 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d3306e10-9321-423b-90c8-97715d2e3122-console-oauth-config\") pod \"console-5dc67c566d-z5z72\" (UID: \"d3306e10-9321-423b-90c8-97715d2e3122\") " pod="openshift-console/console-5dc67c566d-z5z72" Nov 29 05:38:10 crc kubenswrapper[4594]: I1129 05:38:10.574980 4594 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d3306e10-9321-423b-90c8-97715d2e3122-console-serving-cert\") pod \"console-5dc67c566d-z5z72\" (UID: \"d3306e10-9321-423b-90c8-97715d2e3122\") " pod="openshift-console/console-5dc67c566d-z5z72" Nov 29 05:38:10 crc kubenswrapper[4594]: I1129 05:38:10.589950 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4c642\" (UniqueName: \"kubernetes.io/projected/d3306e10-9321-423b-90c8-97715d2e3122-kube-api-access-4c642\") pod \"console-5dc67c566d-z5z72\" (UID: \"d3306e10-9321-423b-90c8-97715d2e3122\") " pod="openshift-console/console-5dc67c566d-z5z72" Nov 29 05:38:10 crc kubenswrapper[4594]: I1129 05:38:10.730829 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5dc67c566d-z5z72" Nov 29 05:38:10 crc kubenswrapper[4594]: I1129 05:38:10.753377 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-7g6km"] Nov 29 05:38:10 crc kubenswrapper[4594]: W1129 05:38:10.757564 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75ccbb22_479b_4415_aa0d_c00853a463ee.slice/crio-4ed751803a92d6981486a89c8e9822b7ca0c11b72c75f1f412e194bc8f7ad14a WatchSource:0}: Error finding container 4ed751803a92d6981486a89c8e9822b7ca0c11b72c75f1f412e194bc8f7ad14a: Status 404 returned error can't find the container with id 4ed751803a92d6981486a89c8e9822b7ca0c11b72c75f1f412e194bc8f7ad14a Nov 29 05:38:10 crc kubenswrapper[4594]: I1129 05:38:10.874188 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/060d894f-3918-4f8c-8b70-33d7e18b316d-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-kj52s\" (UID: \"060d894f-3918-4f8c-8b70-33d7e18b316d\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-kj52s" Nov 29 
05:38:10 crc kubenswrapper[4594]: I1129 05:38:10.880752 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/060d894f-3918-4f8c-8b70-33d7e18b316d-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-kj52s\" (UID: \"060d894f-3918-4f8c-8b70-33d7e18b316d\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-kj52s" Nov 29 05:38:10 crc kubenswrapper[4594]: I1129 05:38:10.887083 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5dc67c566d-z5z72"] Nov 29 05:38:10 crc kubenswrapper[4594]: I1129 05:38:10.915390 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-lp45r"] Nov 29 05:38:10 crc kubenswrapper[4594]: W1129 05:38:10.923194 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4eb3a18a_b943_4996_8977_8c442eca7e9e.slice/crio-c5b030d7d4b337ebb50bd8e37ea748f5f4bb863dc745eb062e707a55364a4538 WatchSource:0}: Error finding container c5b030d7d4b337ebb50bd8e37ea748f5f4bb863dc745eb062e707a55364a4538: Status 404 returned error can't find the container with id c5b030d7d4b337ebb50bd8e37ea748f5f4bb863dc745eb062e707a55364a4538 Nov 29 05:38:11 crc kubenswrapper[4594]: I1129 05:38:11.095038 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5dc67c566d-z5z72" event={"ID":"d3306e10-9321-423b-90c8-97715d2e3122","Type":"ContainerStarted","Data":"7dacbb43863a0907bf94c67a0754cca6cec782295117a7595b468133c7ebfa0a"} Nov 29 05:38:11 crc kubenswrapper[4594]: I1129 05:38:11.095328 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5dc67c566d-z5z72" event={"ID":"d3306e10-9321-423b-90c8-97715d2e3122","Type":"ContainerStarted","Data":"796598ee80bdeda9d7a48a390a556a5560560b836b58fd032aa0d8e54f39995a"} Nov 29 05:38:11 crc kubenswrapper[4594]: I1129 05:38:11.096471 4594 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-szl2t" event={"ID":"92357f02-ea29-48b6-b763-f6e1b8ca3457","Type":"ContainerStarted","Data":"3a3676c9549c57184549dc90534a8111338b28aa9cbdb13ccc80ad7abba36e7f"} Nov 29 05:38:11 crc kubenswrapper[4594]: I1129 05:38:11.097814 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-lp45r" event={"ID":"4eb3a18a-b943-4996-8977-8c442eca7e9e","Type":"ContainerStarted","Data":"c5b030d7d4b337ebb50bd8e37ea748f5f4bb863dc745eb062e707a55364a4538"} Nov 29 05:38:11 crc kubenswrapper[4594]: I1129 05:38:11.099246 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-7g6km" event={"ID":"75ccbb22-479b-4415-aa0d-c00853a463ee","Type":"ContainerStarted","Data":"4ed751803a92d6981486a89c8e9822b7ca0c11b72c75f1f412e194bc8f7ad14a"} Nov 29 05:38:11 crc kubenswrapper[4594]: I1129 05:38:11.109458 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-kj52s" Nov 29 05:38:11 crc kubenswrapper[4594]: I1129 05:38:11.110715 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5dc67c566d-z5z72" podStartSLOduration=1.110691684 podStartE2EDuration="1.110691684s" podCreationTimestamp="2025-11-29 05:38:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:38:11.108594351 +0000 UTC m=+615.349103581" watchObservedRunningTime="2025-11-29 05:38:11.110691684 +0000 UTC m=+615.351200905" Nov 29 05:38:11 crc kubenswrapper[4594]: I1129 05:38:11.293359 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-kj52s"] Nov 29 05:38:12 crc kubenswrapper[4594]: I1129 05:38:12.110498 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-kj52s" event={"ID":"060d894f-3918-4f8c-8b70-33d7e18b316d","Type":"ContainerStarted","Data":"0b157c771b65bec37022c72192d63c19d006b08cc723106927af0460f425d4ef"} Nov 29 05:38:14 crc kubenswrapper[4594]: I1129 05:38:14.142090 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-szl2t" event={"ID":"92357f02-ea29-48b6-b763-f6e1b8ca3457","Type":"ContainerStarted","Data":"818c1245aee23c32fcafcd4edcb77ec2ee89b308a7395350164878ebc621772d"} Nov 29 05:38:14 crc kubenswrapper[4594]: I1129 05:38:14.142680 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-szl2t" Nov 29 05:38:14 crc kubenswrapper[4594]: I1129 05:38:14.143885 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-lp45r" event={"ID":"4eb3a18a-b943-4996-8977-8c442eca7e9e","Type":"ContainerStarted","Data":"fda0e4127be60f8a8ade8be0d37fd6fff1919376dfbe1018c9d7aad8d6bd1e71"} 
Nov 29 05:38:14 crc kubenswrapper[4594]: I1129 05:38:14.145843 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-7g6km" event={"ID":"75ccbb22-479b-4415-aa0d-c00853a463ee","Type":"ContainerStarted","Data":"7e75858b75fccabb6c0f5abf5a414a662a44a2df5e796fa605523af203013128"} Nov 29 05:38:14 crc kubenswrapper[4594]: I1129 05:38:14.146537 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-7g6km" Nov 29 05:38:14 crc kubenswrapper[4594]: I1129 05:38:14.160453 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-szl2t" podStartSLOduration=1.27278144 podStartE2EDuration="4.160438709s" podCreationTimestamp="2025-11-29 05:38:10 +0000 UTC" firstStartedPulling="2025-11-29 05:38:10.556152927 +0000 UTC m=+614.796662147" lastFinishedPulling="2025-11-29 05:38:13.443810195 +0000 UTC m=+617.684319416" observedRunningTime="2025-11-29 05:38:14.156068581 +0000 UTC m=+618.396577801" watchObservedRunningTime="2025-11-29 05:38:14.160438709 +0000 UTC m=+618.400947929" Nov 29 05:38:14 crc kubenswrapper[4594]: I1129 05:38:14.176073 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-7g6km" podStartSLOduration=1.497450127 podStartE2EDuration="4.176059606s" podCreationTimestamp="2025-11-29 05:38:10 +0000 UTC" firstStartedPulling="2025-11-29 05:38:10.761059361 +0000 UTC m=+615.001568581" lastFinishedPulling="2025-11-29 05:38:13.439668839 +0000 UTC m=+617.680178060" observedRunningTime="2025-11-29 05:38:14.167096826 +0000 UTC m=+618.407606046" watchObservedRunningTime="2025-11-29 05:38:14.176059606 +0000 UTC m=+618.416568827" Nov 29 05:38:15 crc kubenswrapper[4594]: I1129 05:38:15.155198 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-kj52s" 
event={"ID":"060d894f-3918-4f8c-8b70-33d7e18b316d","Type":"ContainerStarted","Data":"a5b02e40d545c3219edf0bd762e54a1651b1080006e4de3d0a022194d55b2f30"} Nov 29 05:38:15 crc kubenswrapper[4594]: I1129 05:38:15.172248 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-kj52s" podStartSLOduration=2.194310755 podStartE2EDuration="5.1722327s" podCreationTimestamp="2025-11-29 05:38:10 +0000 UTC" firstStartedPulling="2025-11-29 05:38:11.299715959 +0000 UTC m=+615.540225179" lastFinishedPulling="2025-11-29 05:38:14.277637904 +0000 UTC m=+618.518147124" observedRunningTime="2025-11-29 05:38:15.170579402 +0000 UTC m=+619.411088622" watchObservedRunningTime="2025-11-29 05:38:15.1722327 +0000 UTC m=+619.412741920" Nov 29 05:38:15 crc kubenswrapper[4594]: I1129 05:38:15.800659 4594 patch_prober.go:28] interesting pod/machine-config-daemon-ggz4n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 05:38:15 crc kubenswrapper[4594]: I1129 05:38:15.801014 4594 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 05:38:16 crc kubenswrapper[4594]: I1129 05:38:16.162171 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-lp45r" event={"ID":"4eb3a18a-b943-4996-8977-8c442eca7e9e","Type":"ContainerStarted","Data":"602d4ececaeaeeb7774d69da505ed06862fb22b8ca9993ae758b9da756aaf3b2"} Nov 29 05:38:16 crc kubenswrapper[4594]: I1129 05:38:16.179137 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-nmstate/nmstate-metrics-7f946cbc9-lp45r" podStartSLOduration=1.299688267 podStartE2EDuration="6.179121761s" podCreationTimestamp="2025-11-29 05:38:10 +0000 UTC" firstStartedPulling="2025-11-29 05:38:10.926983214 +0000 UTC m=+615.167492434" lastFinishedPulling="2025-11-29 05:38:15.806416718 +0000 UTC m=+620.046925928" observedRunningTime="2025-11-29 05:38:16.176150143 +0000 UTC m=+620.416659363" watchObservedRunningTime="2025-11-29 05:38:16.179121761 +0000 UTC m=+620.419630981" Nov 29 05:38:20 crc kubenswrapper[4594]: I1129 05:38:20.466611 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-szl2t" Nov 29 05:38:20 crc kubenswrapper[4594]: I1129 05:38:20.731723 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-5dc67c566d-z5z72" Nov 29 05:38:20 crc kubenswrapper[4594]: I1129 05:38:20.732277 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5dc67c566d-z5z72" Nov 29 05:38:20 crc kubenswrapper[4594]: I1129 05:38:20.736636 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5dc67c566d-z5z72" Nov 29 05:38:21 crc kubenswrapper[4594]: I1129 05:38:21.195040 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5dc67c566d-z5z72" Nov 29 05:38:21 crc kubenswrapper[4594]: I1129 05:38:21.234068 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-9ctf8"] Nov 29 05:38:30 crc kubenswrapper[4594]: I1129 05:38:30.437004 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-7g6km" Nov 29 05:38:42 crc kubenswrapper[4594]: I1129 05:38:42.574426 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tbj4s"] Nov 29 
05:38:42 crc kubenswrapper[4594]: I1129 05:38:42.576219 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tbj4s" Nov 29 05:38:42 crc kubenswrapper[4594]: I1129 05:38:42.578098 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Nov 29 05:38:42 crc kubenswrapper[4594]: I1129 05:38:42.582972 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tbj4s"] Nov 29 05:38:42 crc kubenswrapper[4594]: I1129 05:38:42.696275 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7a1d137f-f3e9-4543-966c-f5cfe3b3360d-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tbj4s\" (UID: \"7a1d137f-f3e9-4543-966c-f5cfe3b3360d\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tbj4s" Nov 29 05:38:42 crc kubenswrapper[4594]: I1129 05:38:42.696712 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfsfv\" (UniqueName: \"kubernetes.io/projected/7a1d137f-f3e9-4543-966c-f5cfe3b3360d-kube-api-access-hfsfv\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tbj4s\" (UID: \"7a1d137f-f3e9-4543-966c-f5cfe3b3360d\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tbj4s" Nov 29 05:38:42 crc kubenswrapper[4594]: I1129 05:38:42.696774 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7a1d137f-f3e9-4543-966c-f5cfe3b3360d-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tbj4s\" (UID: \"7a1d137f-f3e9-4543-966c-f5cfe3b3360d\") " 
pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tbj4s" Nov 29 05:38:42 crc kubenswrapper[4594]: I1129 05:38:42.798601 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfsfv\" (UniqueName: \"kubernetes.io/projected/7a1d137f-f3e9-4543-966c-f5cfe3b3360d-kube-api-access-hfsfv\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tbj4s\" (UID: \"7a1d137f-f3e9-4543-966c-f5cfe3b3360d\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tbj4s" Nov 29 05:38:42 crc kubenswrapper[4594]: I1129 05:38:42.798742 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7a1d137f-f3e9-4543-966c-f5cfe3b3360d-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tbj4s\" (UID: \"7a1d137f-f3e9-4543-966c-f5cfe3b3360d\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tbj4s" Nov 29 05:38:42 crc kubenswrapper[4594]: I1129 05:38:42.798840 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7a1d137f-f3e9-4543-966c-f5cfe3b3360d-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tbj4s\" (UID: \"7a1d137f-f3e9-4543-966c-f5cfe3b3360d\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tbj4s" Nov 29 05:38:42 crc kubenswrapper[4594]: I1129 05:38:42.799353 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7a1d137f-f3e9-4543-966c-f5cfe3b3360d-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tbj4s\" (UID: \"7a1d137f-f3e9-4543-966c-f5cfe3b3360d\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tbj4s" Nov 29 05:38:42 crc kubenswrapper[4594]: I1129 05:38:42.799722 4594 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7a1d137f-f3e9-4543-966c-f5cfe3b3360d-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tbj4s\" (UID: \"7a1d137f-f3e9-4543-966c-f5cfe3b3360d\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tbj4s" Nov 29 05:38:42 crc kubenswrapper[4594]: I1129 05:38:42.818883 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfsfv\" (UniqueName: \"kubernetes.io/projected/7a1d137f-f3e9-4543-966c-f5cfe3b3360d-kube-api-access-hfsfv\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tbj4s\" (UID: \"7a1d137f-f3e9-4543-966c-f5cfe3b3360d\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tbj4s" Nov 29 05:38:42 crc kubenswrapper[4594]: I1129 05:38:42.890039 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tbj4s" Nov 29 05:38:43 crc kubenswrapper[4594]: I1129 05:38:43.287887 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tbj4s"] Nov 29 05:38:43 crc kubenswrapper[4594]: I1129 05:38:43.321993 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tbj4s" event={"ID":"7a1d137f-f3e9-4543-966c-f5cfe3b3360d","Type":"ContainerStarted","Data":"fbd28ae6c5a6e37c8dd0168b7fb2cee64e9e5f9ed1ff063f05432505937bb062"} Nov 29 05:38:44 crc kubenswrapper[4594]: I1129 05:38:44.327808 4594 generic.go:334] "Generic (PLEG): container finished" podID="7a1d137f-f3e9-4543-966c-f5cfe3b3360d" containerID="bb77ad032d2b5824dd2d4dd9155fa00c6c63a9ec1b39bc00c6c8a1b851c3d9ff" exitCode=0 Nov 29 05:38:44 crc kubenswrapper[4594]: I1129 05:38:44.327858 4594 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tbj4s" event={"ID":"7a1d137f-f3e9-4543-966c-f5cfe3b3360d","Type":"ContainerDied","Data":"bb77ad032d2b5824dd2d4dd9155fa00c6c63a9ec1b39bc00c6c8a1b851c3d9ff"} Nov 29 05:38:45 crc kubenswrapper[4594]: I1129 05:38:45.800413 4594 patch_prober.go:28] interesting pod/machine-config-daemon-ggz4n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 05:38:45 crc kubenswrapper[4594]: I1129 05:38:45.800742 4594 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 05:38:45 crc kubenswrapper[4594]: I1129 05:38:45.800793 4594 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" Nov 29 05:38:45 crc kubenswrapper[4594]: I1129 05:38:45.801370 4594 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7993c3c5c143399d1d6e2423a11b6f37b4521819ab268ad6d3a203f1a0f238a6"} pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 29 05:38:45 crc kubenswrapper[4594]: I1129 05:38:45.801435 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" containerName="machine-config-daemon" 
containerID="cri-o://7993c3c5c143399d1d6e2423a11b6f37b4521819ab268ad6d3a203f1a0f238a6" gracePeriod=600 Nov 29 05:38:46 crc kubenswrapper[4594]: I1129 05:38:46.270185 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-9ctf8" podUID="12b5360d-755a-4cb5-9ef3-0c00550e3913" containerName="console" containerID="cri-o://7a2b3e3c27d2cbe26cc0c1aa79a95cb164547951b18c33927ca0d9e0b8aad23c" gracePeriod=15 Nov 29 05:38:46 crc kubenswrapper[4594]: I1129 05:38:46.353839 4594 generic.go:334] "Generic (PLEG): container finished" podID="509844d4-3ee1-4059-84bb-6e90200f50c5" containerID="7993c3c5c143399d1d6e2423a11b6f37b4521819ab268ad6d3a203f1a0f238a6" exitCode=0 Nov 29 05:38:46 crc kubenswrapper[4594]: I1129 05:38:46.353916 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" event={"ID":"509844d4-3ee1-4059-84bb-6e90200f50c5","Type":"ContainerDied","Data":"7993c3c5c143399d1d6e2423a11b6f37b4521819ab268ad6d3a203f1a0f238a6"} Nov 29 05:38:46 crc kubenswrapper[4594]: I1129 05:38:46.353973 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" event={"ID":"509844d4-3ee1-4059-84bb-6e90200f50c5","Type":"ContainerStarted","Data":"3d6477f346180f42d9b7737089f66087fbd7adf1dc0feaf01168195b349570d2"} Nov 29 05:38:46 crc kubenswrapper[4594]: I1129 05:38:46.354000 4594 scope.go:117] "RemoveContainer" containerID="8ca701b866c00b1fe102c52d2dacfbb9215d913bbee855fff76b7d8b7e0bf21d" Nov 29 05:38:46 crc kubenswrapper[4594]: I1129 05:38:46.362873 4594 generic.go:334] "Generic (PLEG): container finished" podID="7a1d137f-f3e9-4543-966c-f5cfe3b3360d" containerID="6b3e2cc736c08337ff83d18bb2b710d86c5e87a4dd10333c16225a57192eead7" exitCode=0 Nov 29 05:38:46 crc kubenswrapper[4594]: I1129 05:38:46.362934 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tbj4s" event={"ID":"7a1d137f-f3e9-4543-966c-f5cfe3b3360d","Type":"ContainerDied","Data":"6b3e2cc736c08337ff83d18bb2b710d86c5e87a4dd10333c16225a57192eead7"} Nov 29 05:38:46 crc kubenswrapper[4594]: I1129 05:38:46.641758 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-9ctf8_12b5360d-755a-4cb5-9ef3-0c00550e3913/console/0.log" Nov 29 05:38:46 crc kubenswrapper[4594]: I1129 05:38:46.642069 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-9ctf8" Nov 29 05:38:46 crc kubenswrapper[4594]: I1129 05:38:46.750938 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/12b5360d-755a-4cb5-9ef3-0c00550e3913-console-config\") pod \"12b5360d-755a-4cb5-9ef3-0c00550e3913\" (UID: \"12b5360d-755a-4cb5-9ef3-0c00550e3913\") " Nov 29 05:38:46 crc kubenswrapper[4594]: I1129 05:38:46.751041 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/12b5360d-755a-4cb5-9ef3-0c00550e3913-console-serving-cert\") pod \"12b5360d-755a-4cb5-9ef3-0c00550e3913\" (UID: \"12b5360d-755a-4cb5-9ef3-0c00550e3913\") " Nov 29 05:38:46 crc kubenswrapper[4594]: I1129 05:38:46.751098 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/12b5360d-755a-4cb5-9ef3-0c00550e3913-console-oauth-config\") pod \"12b5360d-755a-4cb5-9ef3-0c00550e3913\" (UID: \"12b5360d-755a-4cb5-9ef3-0c00550e3913\") " Nov 29 05:38:46 crc kubenswrapper[4594]: I1129 05:38:46.751146 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/12b5360d-755a-4cb5-9ef3-0c00550e3913-oauth-serving-cert\") pod \"12b5360d-755a-4cb5-9ef3-0c00550e3913\" (UID: \"12b5360d-755a-4cb5-9ef3-0c00550e3913\") " Nov 29 05:38:46 crc kubenswrapper[4594]: I1129 05:38:46.751217 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/12b5360d-755a-4cb5-9ef3-0c00550e3913-service-ca\") pod \"12b5360d-755a-4cb5-9ef3-0c00550e3913\" (UID: \"12b5360d-755a-4cb5-9ef3-0c00550e3913\") " Nov 29 05:38:46 crc kubenswrapper[4594]: I1129 05:38:46.751313 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/12b5360d-755a-4cb5-9ef3-0c00550e3913-trusted-ca-bundle\") pod \"12b5360d-755a-4cb5-9ef3-0c00550e3913\" (UID: \"12b5360d-755a-4cb5-9ef3-0c00550e3913\") " Nov 29 05:38:46 crc kubenswrapper[4594]: I1129 05:38:46.751358 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lbpfn\" (UniqueName: \"kubernetes.io/projected/12b5360d-755a-4cb5-9ef3-0c00550e3913-kube-api-access-lbpfn\") pod \"12b5360d-755a-4cb5-9ef3-0c00550e3913\" (UID: \"12b5360d-755a-4cb5-9ef3-0c00550e3913\") " Nov 29 05:38:46 crc kubenswrapper[4594]: I1129 05:38:46.752087 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12b5360d-755a-4cb5-9ef3-0c00550e3913-service-ca" (OuterVolumeSpecName: "service-ca") pod "12b5360d-755a-4cb5-9ef3-0c00550e3913" (UID: "12b5360d-755a-4cb5-9ef3-0c00550e3913"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:38:46 crc kubenswrapper[4594]: I1129 05:38:46.752119 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12b5360d-755a-4cb5-9ef3-0c00550e3913-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "12b5360d-755a-4cb5-9ef3-0c00550e3913" (UID: "12b5360d-755a-4cb5-9ef3-0c00550e3913"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:38:46 crc kubenswrapper[4594]: I1129 05:38:46.752142 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12b5360d-755a-4cb5-9ef3-0c00550e3913-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "12b5360d-755a-4cb5-9ef3-0c00550e3913" (UID: "12b5360d-755a-4cb5-9ef3-0c00550e3913"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:38:46 crc kubenswrapper[4594]: I1129 05:38:46.752580 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12b5360d-755a-4cb5-9ef3-0c00550e3913-console-config" (OuterVolumeSpecName: "console-config") pod "12b5360d-755a-4cb5-9ef3-0c00550e3913" (UID: "12b5360d-755a-4cb5-9ef3-0c00550e3913"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:38:46 crc kubenswrapper[4594]: I1129 05:38:46.758198 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12b5360d-755a-4cb5-9ef3-0c00550e3913-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "12b5360d-755a-4cb5-9ef3-0c00550e3913" (UID: "12b5360d-755a-4cb5-9ef3-0c00550e3913"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:38:46 crc kubenswrapper[4594]: I1129 05:38:46.758193 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12b5360d-755a-4cb5-9ef3-0c00550e3913-kube-api-access-lbpfn" (OuterVolumeSpecName: "kube-api-access-lbpfn") pod "12b5360d-755a-4cb5-9ef3-0c00550e3913" (UID: "12b5360d-755a-4cb5-9ef3-0c00550e3913"). InnerVolumeSpecName "kube-api-access-lbpfn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:38:46 crc kubenswrapper[4594]: I1129 05:38:46.758652 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12b5360d-755a-4cb5-9ef3-0c00550e3913-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "12b5360d-755a-4cb5-9ef3-0c00550e3913" (UID: "12b5360d-755a-4cb5-9ef3-0c00550e3913"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:38:46 crc kubenswrapper[4594]: I1129 05:38:46.854053 4594 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/12b5360d-755a-4cb5-9ef3-0c00550e3913-console-config\") on node \"crc\" DevicePath \"\"" Nov 29 05:38:46 crc kubenswrapper[4594]: I1129 05:38:46.854085 4594 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/12b5360d-755a-4cb5-9ef3-0c00550e3913-console-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 29 05:38:46 crc kubenswrapper[4594]: I1129 05:38:46.854099 4594 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/12b5360d-755a-4cb5-9ef3-0c00550e3913-console-oauth-config\") on node \"crc\" DevicePath \"\"" Nov 29 05:38:46 crc kubenswrapper[4594]: I1129 05:38:46.854111 4594 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/12b5360d-755a-4cb5-9ef3-0c00550e3913-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 29 05:38:46 crc kubenswrapper[4594]: I1129 05:38:46.854121 4594 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/12b5360d-755a-4cb5-9ef3-0c00550e3913-service-ca\") on node \"crc\" DevicePath \"\"" Nov 29 05:38:46 crc kubenswrapper[4594]: I1129 05:38:46.854131 4594 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/12b5360d-755a-4cb5-9ef3-0c00550e3913-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 05:38:46 crc kubenswrapper[4594]: I1129 05:38:46.854144 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lbpfn\" (UniqueName: \"kubernetes.io/projected/12b5360d-755a-4cb5-9ef3-0c00550e3913-kube-api-access-lbpfn\") on node \"crc\" DevicePath \"\"" Nov 29 05:38:47 crc kubenswrapper[4594]: I1129 05:38:47.373484 4594 generic.go:334] "Generic (PLEG): container finished" podID="7a1d137f-f3e9-4543-966c-f5cfe3b3360d" containerID="d1c995a28876b7b077cd932dfccec6ed2cabd2770b2b29f09521664f6daad297" exitCode=0 Nov 29 05:38:47 crc kubenswrapper[4594]: I1129 05:38:47.373582 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tbj4s" event={"ID":"7a1d137f-f3e9-4543-966c-f5cfe3b3360d","Type":"ContainerDied","Data":"d1c995a28876b7b077cd932dfccec6ed2cabd2770b2b29f09521664f6daad297"} Nov 29 05:38:47 crc kubenswrapper[4594]: I1129 05:38:47.374859 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-9ctf8_12b5360d-755a-4cb5-9ef3-0c00550e3913/console/0.log" Nov 29 05:38:47 crc kubenswrapper[4594]: I1129 05:38:47.374903 4594 generic.go:334] "Generic (PLEG): container finished" podID="12b5360d-755a-4cb5-9ef3-0c00550e3913" 
containerID="7a2b3e3c27d2cbe26cc0c1aa79a95cb164547951b18c33927ca0d9e0b8aad23c" exitCode=2 Nov 29 05:38:47 crc kubenswrapper[4594]: I1129 05:38:47.374932 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-9ctf8" event={"ID":"12b5360d-755a-4cb5-9ef3-0c00550e3913","Type":"ContainerDied","Data":"7a2b3e3c27d2cbe26cc0c1aa79a95cb164547951b18c33927ca0d9e0b8aad23c"} Nov 29 05:38:47 crc kubenswrapper[4594]: I1129 05:38:47.374944 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-9ctf8" Nov 29 05:38:47 crc kubenswrapper[4594]: I1129 05:38:47.374950 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-9ctf8" event={"ID":"12b5360d-755a-4cb5-9ef3-0c00550e3913","Type":"ContainerDied","Data":"774a9707d2d257f0518eff54b02d61e96f981ec1554a9e832dfc528e026992af"} Nov 29 05:38:47 crc kubenswrapper[4594]: I1129 05:38:47.374971 4594 scope.go:117] "RemoveContainer" containerID="7a2b3e3c27d2cbe26cc0c1aa79a95cb164547951b18c33927ca0d9e0b8aad23c" Nov 29 05:38:47 crc kubenswrapper[4594]: I1129 05:38:47.393621 4594 scope.go:117] "RemoveContainer" containerID="7a2b3e3c27d2cbe26cc0c1aa79a95cb164547951b18c33927ca0d9e0b8aad23c" Nov 29 05:38:47 crc kubenswrapper[4594]: E1129 05:38:47.394552 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a2b3e3c27d2cbe26cc0c1aa79a95cb164547951b18c33927ca0d9e0b8aad23c\": container with ID starting with 7a2b3e3c27d2cbe26cc0c1aa79a95cb164547951b18c33927ca0d9e0b8aad23c not found: ID does not exist" containerID="7a2b3e3c27d2cbe26cc0c1aa79a95cb164547951b18c33927ca0d9e0b8aad23c" Nov 29 05:38:47 crc kubenswrapper[4594]: I1129 05:38:47.394585 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a2b3e3c27d2cbe26cc0c1aa79a95cb164547951b18c33927ca0d9e0b8aad23c"} err="failed to get container 
status \"7a2b3e3c27d2cbe26cc0c1aa79a95cb164547951b18c33927ca0d9e0b8aad23c\": rpc error: code = NotFound desc = could not find container \"7a2b3e3c27d2cbe26cc0c1aa79a95cb164547951b18c33927ca0d9e0b8aad23c\": container with ID starting with 7a2b3e3c27d2cbe26cc0c1aa79a95cb164547951b18c33927ca0d9e0b8aad23c not found: ID does not exist" Nov 29 05:38:47 crc kubenswrapper[4594]: I1129 05:38:47.404891 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-9ctf8"] Nov 29 05:38:47 crc kubenswrapper[4594]: I1129 05:38:47.407640 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-9ctf8"] Nov 29 05:38:48 crc kubenswrapper[4594]: I1129 05:38:48.090649 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12b5360d-755a-4cb5-9ef3-0c00550e3913" path="/var/lib/kubelet/pods/12b5360d-755a-4cb5-9ef3-0c00550e3913/volumes" Nov 29 05:38:48 crc kubenswrapper[4594]: I1129 05:38:48.601969 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tbj4s" Nov 29 05:38:48 crc kubenswrapper[4594]: I1129 05:38:48.776584 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7a1d137f-f3e9-4543-966c-f5cfe3b3360d-util\") pod \"7a1d137f-f3e9-4543-966c-f5cfe3b3360d\" (UID: \"7a1d137f-f3e9-4543-966c-f5cfe3b3360d\") " Nov 29 05:38:48 crc kubenswrapper[4594]: I1129 05:38:48.776680 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hfsfv\" (UniqueName: \"kubernetes.io/projected/7a1d137f-f3e9-4543-966c-f5cfe3b3360d-kube-api-access-hfsfv\") pod \"7a1d137f-f3e9-4543-966c-f5cfe3b3360d\" (UID: \"7a1d137f-f3e9-4543-966c-f5cfe3b3360d\") " Nov 29 05:38:48 crc kubenswrapper[4594]: I1129 05:38:48.776726 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7a1d137f-f3e9-4543-966c-f5cfe3b3360d-bundle\") pod \"7a1d137f-f3e9-4543-966c-f5cfe3b3360d\" (UID: \"7a1d137f-f3e9-4543-966c-f5cfe3b3360d\") " Nov 29 05:38:48 crc kubenswrapper[4594]: I1129 05:38:48.777805 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a1d137f-f3e9-4543-966c-f5cfe3b3360d-bundle" (OuterVolumeSpecName: "bundle") pod "7a1d137f-f3e9-4543-966c-f5cfe3b3360d" (UID: "7a1d137f-f3e9-4543-966c-f5cfe3b3360d"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 05:38:48 crc kubenswrapper[4594]: I1129 05:38:48.784124 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a1d137f-f3e9-4543-966c-f5cfe3b3360d-kube-api-access-hfsfv" (OuterVolumeSpecName: "kube-api-access-hfsfv") pod "7a1d137f-f3e9-4543-966c-f5cfe3b3360d" (UID: "7a1d137f-f3e9-4543-966c-f5cfe3b3360d"). InnerVolumeSpecName "kube-api-access-hfsfv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:38:48 crc kubenswrapper[4594]: I1129 05:38:48.878167 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hfsfv\" (UniqueName: \"kubernetes.io/projected/7a1d137f-f3e9-4543-966c-f5cfe3b3360d-kube-api-access-hfsfv\") on node \"crc\" DevicePath \"\"" Nov 29 05:38:48 crc kubenswrapper[4594]: I1129 05:38:48.878195 4594 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7a1d137f-f3e9-4543-966c-f5cfe3b3360d-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 05:38:48 crc kubenswrapper[4594]: I1129 05:38:48.930659 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a1d137f-f3e9-4543-966c-f5cfe3b3360d-util" (OuterVolumeSpecName: "util") pod "7a1d137f-f3e9-4543-966c-f5cfe3b3360d" (UID: "7a1d137f-f3e9-4543-966c-f5cfe3b3360d"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 05:38:48 crc kubenswrapper[4594]: I1129 05:38:48.979387 4594 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7a1d137f-f3e9-4543-966c-f5cfe3b3360d-util\") on node \"crc\" DevicePath \"\"" Nov 29 05:38:49 crc kubenswrapper[4594]: I1129 05:38:49.390547 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tbj4s" event={"ID":"7a1d137f-f3e9-4543-966c-f5cfe3b3360d","Type":"ContainerDied","Data":"fbd28ae6c5a6e37c8dd0168b7fb2cee64e9e5f9ed1ff063f05432505937bb062"} Nov 29 05:38:49 crc kubenswrapper[4594]: I1129 05:38:49.390605 4594 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fbd28ae6c5a6e37c8dd0168b7fb2cee64e9e5f9ed1ff063f05432505937bb062" Nov 29 05:38:49 crc kubenswrapper[4594]: I1129 05:38:49.390618 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tbj4s" Nov 29 05:38:59 crc kubenswrapper[4594]: I1129 05:38:59.462954 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-79b995c45-klk7s"] Nov 29 05:38:59 crc kubenswrapper[4594]: E1129 05:38:59.463745 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a1d137f-f3e9-4543-966c-f5cfe3b3360d" containerName="util" Nov 29 05:38:59 crc kubenswrapper[4594]: I1129 05:38:59.463760 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a1d137f-f3e9-4543-966c-f5cfe3b3360d" containerName="util" Nov 29 05:38:59 crc kubenswrapper[4594]: E1129 05:38:59.463777 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a1d137f-f3e9-4543-966c-f5cfe3b3360d" containerName="extract" Nov 29 05:38:59 crc kubenswrapper[4594]: I1129 05:38:59.463783 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a1d137f-f3e9-4543-966c-f5cfe3b3360d" containerName="extract" Nov 29 05:38:59 crc kubenswrapper[4594]: E1129 05:38:59.463795 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a1d137f-f3e9-4543-966c-f5cfe3b3360d" containerName="pull" Nov 29 05:38:59 crc kubenswrapper[4594]: I1129 05:38:59.463800 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a1d137f-f3e9-4543-966c-f5cfe3b3360d" containerName="pull" Nov 29 05:38:59 crc kubenswrapper[4594]: E1129 05:38:59.463812 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12b5360d-755a-4cb5-9ef3-0c00550e3913" containerName="console" Nov 29 05:38:59 crc kubenswrapper[4594]: I1129 05:38:59.463817 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="12b5360d-755a-4cb5-9ef3-0c00550e3913" containerName="console" Nov 29 05:38:59 crc kubenswrapper[4594]: I1129 05:38:59.463918 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a1d137f-f3e9-4543-966c-f5cfe3b3360d" 
containerName="extract" Nov 29 05:38:59 crc kubenswrapper[4594]: I1129 05:38:59.463931 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="12b5360d-755a-4cb5-9ef3-0c00550e3913" containerName="console" Nov 29 05:38:59 crc kubenswrapper[4594]: I1129 05:38:59.464363 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-79b995c45-klk7s" Nov 29 05:38:59 crc kubenswrapper[4594]: I1129 05:38:59.466305 4594 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Nov 29 05:38:59 crc kubenswrapper[4594]: I1129 05:38:59.466873 4594 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Nov 29 05:38:59 crc kubenswrapper[4594]: I1129 05:38:59.466979 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Nov 29 05:38:59 crc kubenswrapper[4594]: I1129 05:38:59.467448 4594 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-d9tsx" Nov 29 05:38:59 crc kubenswrapper[4594]: I1129 05:38:59.467717 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Nov 29 05:38:59 crc kubenswrapper[4594]: I1129 05:38:59.483462 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-79b995c45-klk7s"] Nov 29 05:38:59 crc kubenswrapper[4594]: I1129 05:38:59.522537 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a1369887-80c2-44ef-b566-f30184ea9607-apiservice-cert\") pod \"metallb-operator-controller-manager-79b995c45-klk7s\" (UID: \"a1369887-80c2-44ef-b566-f30184ea9607\") " pod="metallb-system/metallb-operator-controller-manager-79b995c45-klk7s" Nov 29 
05:38:59 crc kubenswrapper[4594]: I1129 05:38:59.522589 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a1369887-80c2-44ef-b566-f30184ea9607-webhook-cert\") pod \"metallb-operator-controller-manager-79b995c45-klk7s\" (UID: \"a1369887-80c2-44ef-b566-f30184ea9607\") " pod="metallb-system/metallb-operator-controller-manager-79b995c45-klk7s" Nov 29 05:38:59 crc kubenswrapper[4594]: I1129 05:38:59.522622 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4ljm\" (UniqueName: \"kubernetes.io/projected/a1369887-80c2-44ef-b566-f30184ea9607-kube-api-access-j4ljm\") pod \"metallb-operator-controller-manager-79b995c45-klk7s\" (UID: \"a1369887-80c2-44ef-b566-f30184ea9607\") " pod="metallb-system/metallb-operator-controller-manager-79b995c45-klk7s" Nov 29 05:38:59 crc kubenswrapper[4594]: I1129 05:38:59.624032 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a1369887-80c2-44ef-b566-f30184ea9607-apiservice-cert\") pod \"metallb-operator-controller-manager-79b995c45-klk7s\" (UID: \"a1369887-80c2-44ef-b566-f30184ea9607\") " pod="metallb-system/metallb-operator-controller-manager-79b995c45-klk7s" Nov 29 05:38:59 crc kubenswrapper[4594]: I1129 05:38:59.624244 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a1369887-80c2-44ef-b566-f30184ea9607-webhook-cert\") pod \"metallb-operator-controller-manager-79b995c45-klk7s\" (UID: \"a1369887-80c2-44ef-b566-f30184ea9607\") " pod="metallb-system/metallb-operator-controller-manager-79b995c45-klk7s" Nov 29 05:38:59 crc kubenswrapper[4594]: I1129 05:38:59.624363 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4ljm\" (UniqueName: 
\"kubernetes.io/projected/a1369887-80c2-44ef-b566-f30184ea9607-kube-api-access-j4ljm\") pod \"metallb-operator-controller-manager-79b995c45-klk7s\" (UID: \"a1369887-80c2-44ef-b566-f30184ea9607\") " pod="metallb-system/metallb-operator-controller-manager-79b995c45-klk7s" Nov 29 05:38:59 crc kubenswrapper[4594]: I1129 05:38:59.632217 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a1369887-80c2-44ef-b566-f30184ea9607-apiservice-cert\") pod \"metallb-operator-controller-manager-79b995c45-klk7s\" (UID: \"a1369887-80c2-44ef-b566-f30184ea9607\") " pod="metallb-system/metallb-operator-controller-manager-79b995c45-klk7s" Nov 29 05:38:59 crc kubenswrapper[4594]: I1129 05:38:59.641736 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a1369887-80c2-44ef-b566-f30184ea9607-webhook-cert\") pod \"metallb-operator-controller-manager-79b995c45-klk7s\" (UID: \"a1369887-80c2-44ef-b566-f30184ea9607\") " pod="metallb-system/metallb-operator-controller-manager-79b995c45-klk7s" Nov 29 05:38:59 crc kubenswrapper[4594]: I1129 05:38:59.648919 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4ljm\" (UniqueName: \"kubernetes.io/projected/a1369887-80c2-44ef-b566-f30184ea9607-kube-api-access-j4ljm\") pod \"metallb-operator-controller-manager-79b995c45-klk7s\" (UID: \"a1369887-80c2-44ef-b566-f30184ea9607\") " pod="metallb-system/metallb-operator-controller-manager-79b995c45-klk7s" Nov 29 05:38:59 crc kubenswrapper[4594]: I1129 05:38:59.783048 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-79b995c45-klk7s" Nov 29 05:38:59 crc kubenswrapper[4594]: I1129 05:38:59.826241 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-7667cbc88-mqfqp"] Nov 29 05:38:59 crc kubenswrapper[4594]: I1129 05:38:59.827243 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7667cbc88-mqfqp" Nov 29 05:38:59 crc kubenswrapper[4594]: I1129 05:38:59.832742 4594 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Nov 29 05:38:59 crc kubenswrapper[4594]: I1129 05:38:59.837814 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7667cbc88-mqfqp"] Nov 29 05:38:59 crc kubenswrapper[4594]: I1129 05:38:59.839849 4594 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-ctnzc" Nov 29 05:38:59 crc kubenswrapper[4594]: I1129 05:38:59.840185 4594 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Nov 29 05:38:59 crc kubenswrapper[4594]: I1129 05:38:59.927758 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ce5a6997-d8a6-489f-9bcf-d77879d7ad46-webhook-cert\") pod \"metallb-operator-webhook-server-7667cbc88-mqfqp\" (UID: \"ce5a6997-d8a6-489f-9bcf-d77879d7ad46\") " pod="metallb-system/metallb-operator-webhook-server-7667cbc88-mqfqp" Nov 29 05:38:59 crc kubenswrapper[4594]: I1129 05:38:59.927808 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xs5h2\" (UniqueName: \"kubernetes.io/projected/ce5a6997-d8a6-489f-9bcf-d77879d7ad46-kube-api-access-xs5h2\") pod 
\"metallb-operator-webhook-server-7667cbc88-mqfqp\" (UID: \"ce5a6997-d8a6-489f-9bcf-d77879d7ad46\") " pod="metallb-system/metallb-operator-webhook-server-7667cbc88-mqfqp" Nov 29 05:38:59 crc kubenswrapper[4594]: I1129 05:38:59.927848 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ce5a6997-d8a6-489f-9bcf-d77879d7ad46-apiservice-cert\") pod \"metallb-operator-webhook-server-7667cbc88-mqfqp\" (UID: \"ce5a6997-d8a6-489f-9bcf-d77879d7ad46\") " pod="metallb-system/metallb-operator-webhook-server-7667cbc88-mqfqp" Nov 29 05:38:59 crc kubenswrapper[4594]: I1129 05:38:59.992854 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-79b995c45-klk7s"] Nov 29 05:39:00 crc kubenswrapper[4594]: W1129 05:39:00.002970 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1369887_80c2_44ef_b566_f30184ea9607.slice/crio-711e57c984941216fb95f66519f3afc31329ca9a95a04d2de58971b8133c4cb5 WatchSource:0}: Error finding container 711e57c984941216fb95f66519f3afc31329ca9a95a04d2de58971b8133c4cb5: Status 404 returned error can't find the container with id 711e57c984941216fb95f66519f3afc31329ca9a95a04d2de58971b8133c4cb5 Nov 29 05:39:00 crc kubenswrapper[4594]: I1129 05:39:00.029566 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ce5a6997-d8a6-489f-9bcf-d77879d7ad46-webhook-cert\") pod \"metallb-operator-webhook-server-7667cbc88-mqfqp\" (UID: \"ce5a6997-d8a6-489f-9bcf-d77879d7ad46\") " pod="metallb-system/metallb-operator-webhook-server-7667cbc88-mqfqp" Nov 29 05:39:00 crc kubenswrapper[4594]: I1129 05:39:00.029628 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xs5h2\" (UniqueName: 
\"kubernetes.io/projected/ce5a6997-d8a6-489f-9bcf-d77879d7ad46-kube-api-access-xs5h2\") pod \"metallb-operator-webhook-server-7667cbc88-mqfqp\" (UID: \"ce5a6997-d8a6-489f-9bcf-d77879d7ad46\") " pod="metallb-system/metallb-operator-webhook-server-7667cbc88-mqfqp"
Nov 29 05:39:00 crc kubenswrapper[4594]: I1129 05:39:00.029695 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ce5a6997-d8a6-489f-9bcf-d77879d7ad46-apiservice-cert\") pod \"metallb-operator-webhook-server-7667cbc88-mqfqp\" (UID: \"ce5a6997-d8a6-489f-9bcf-d77879d7ad46\") " pod="metallb-system/metallb-operator-webhook-server-7667cbc88-mqfqp"
Nov 29 05:39:00 crc kubenswrapper[4594]: I1129 05:39:00.035337 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ce5a6997-d8a6-489f-9bcf-d77879d7ad46-apiservice-cert\") pod \"metallb-operator-webhook-server-7667cbc88-mqfqp\" (UID: \"ce5a6997-d8a6-489f-9bcf-d77879d7ad46\") " pod="metallb-system/metallb-operator-webhook-server-7667cbc88-mqfqp"
Nov 29 05:39:00 crc kubenswrapper[4594]: I1129 05:39:00.035428 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ce5a6997-d8a6-489f-9bcf-d77879d7ad46-webhook-cert\") pod \"metallb-operator-webhook-server-7667cbc88-mqfqp\" (UID: \"ce5a6997-d8a6-489f-9bcf-d77879d7ad46\") " pod="metallb-system/metallb-operator-webhook-server-7667cbc88-mqfqp"
Nov 29 05:39:00 crc kubenswrapper[4594]: I1129 05:39:00.046740 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xs5h2\" (UniqueName: \"kubernetes.io/projected/ce5a6997-d8a6-489f-9bcf-d77879d7ad46-kube-api-access-xs5h2\") pod \"metallb-operator-webhook-server-7667cbc88-mqfqp\" (UID: \"ce5a6997-d8a6-489f-9bcf-d77879d7ad46\") " pod="metallb-system/metallb-operator-webhook-server-7667cbc88-mqfqp"
Nov 29 05:39:00 crc kubenswrapper[4594]: I1129 05:39:00.141875 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7667cbc88-mqfqp"
Nov 29 05:39:00 crc kubenswrapper[4594]: I1129 05:39:00.452575 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-79b995c45-klk7s" event={"ID":"a1369887-80c2-44ef-b566-f30184ea9607","Type":"ContainerStarted","Data":"711e57c984941216fb95f66519f3afc31329ca9a95a04d2de58971b8133c4cb5"}
Nov 29 05:39:00 crc kubenswrapper[4594]: I1129 05:39:00.552439 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7667cbc88-mqfqp"]
Nov 29 05:39:00 crc kubenswrapper[4594]: W1129 05:39:00.560286 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podce5a6997_d8a6_489f_9bcf_d77879d7ad46.slice/crio-296eb37f624e6b74e0e1dcdce74f41cf2a8da9e1e39651c1521c9ef75fc97041 WatchSource:0}: Error finding container 296eb37f624e6b74e0e1dcdce74f41cf2a8da9e1e39651c1521c9ef75fc97041: Status 404 returned error can't find the container with id 296eb37f624e6b74e0e1dcdce74f41cf2a8da9e1e39651c1521c9ef75fc97041
Nov 29 05:39:01 crc kubenswrapper[4594]: I1129 05:39:01.459442 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7667cbc88-mqfqp" event={"ID":"ce5a6997-d8a6-489f-9bcf-d77879d7ad46","Type":"ContainerStarted","Data":"296eb37f624e6b74e0e1dcdce74f41cf2a8da9e1e39651c1521c9ef75fc97041"}
Nov 29 05:39:04 crc kubenswrapper[4594]: I1129 05:39:04.479361 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-79b995c45-klk7s" event={"ID":"a1369887-80c2-44ef-b566-f30184ea9607","Type":"ContainerStarted","Data":"0216025af2c4793335468457cb7c530462f2e17357f65c196994451e154a3356"}
Nov 29 05:39:04 crc kubenswrapper[4594]: I1129 05:39:04.480491 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-79b995c45-klk7s"
Nov 29 05:39:06 crc kubenswrapper[4594]: I1129 05:39:06.107002 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-79b995c45-klk7s" podStartSLOduration=3.259537255 podStartE2EDuration="7.106984632s" podCreationTimestamp="2025-11-29 05:38:59 +0000 UTC" firstStartedPulling="2025-11-29 05:39:00.005924031 +0000 UTC m=+664.246433251" lastFinishedPulling="2025-11-29 05:39:03.853371409 +0000 UTC m=+668.093880628" observedRunningTime="2025-11-29 05:39:04.497422915 +0000 UTC m=+668.737932146" watchObservedRunningTime="2025-11-29 05:39:06.106984632 +0000 UTC m=+670.347493852"
Nov 29 05:39:06 crc kubenswrapper[4594]: I1129 05:39:06.497891 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7667cbc88-mqfqp" event={"ID":"ce5a6997-d8a6-489f-9bcf-d77879d7ad46","Type":"ContainerStarted","Data":"33de48012ef3cec20bd4b1df4f67df661f8ac1dcc5621a669f5e0f18f7e8ac8b"}
Nov 29 05:39:06 crc kubenswrapper[4594]: I1129 05:39:06.498053 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-7667cbc88-mqfqp"
Nov 29 05:39:06 crc kubenswrapper[4594]: I1129 05:39:06.510914 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-7667cbc88-mqfqp" podStartSLOduration=2.580322666 podStartE2EDuration="7.510902473s" podCreationTimestamp="2025-11-29 05:38:59 +0000 UTC" firstStartedPulling="2025-11-29 05:39:00.56375164 +0000 UTC m=+664.804260860" lastFinishedPulling="2025-11-29 05:39:05.494331448 +0000 UTC m=+669.734840667" observedRunningTime="2025-11-29 05:39:06.509733205 +0000 UTC m=+670.750242425" watchObservedRunningTime="2025-11-29 05:39:06.510902473 +0000 UTC m=+670.751411693"
Nov 29 05:39:20 crc kubenswrapper[4594]: I1129 05:39:20.151952 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-7667cbc88-mqfqp"
Nov 29 05:39:39 crc kubenswrapper[4594]: I1129 05:39:39.785557 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-79b995c45-klk7s"
Nov 29 05:39:40 crc kubenswrapper[4594]: I1129 05:39:40.315561 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-cmsgj"]
Nov 29 05:39:40 crc kubenswrapper[4594]: I1129 05:39:40.318205 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-cmsgj"
Nov 29 05:39:40 crc kubenswrapper[4594]: I1129 05:39:40.319832 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-9bwd7"]
Nov 29 05:39:40 crc kubenswrapper[4594]: I1129 05:39:40.320420 4594 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-nj74w"
Nov 29 05:39:40 crc kubenswrapper[4594]: I1129 05:39:40.320419 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup"
Nov 29 05:39:40 crc kubenswrapper[4594]: I1129 05:39:40.320505 4594 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret"
Nov 29 05:39:40 crc kubenswrapper[4594]: I1129 05:39:40.321066 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-9bwd7"
Nov 29 05:39:40 crc kubenswrapper[4594]: I1129 05:39:40.324149 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-9bwd7"]
Nov 29 05:39:40 crc kubenswrapper[4594]: I1129 05:39:40.326170 4594 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert"
Nov 29 05:39:40 crc kubenswrapper[4594]: I1129 05:39:40.377893 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/dd301093-7d62-4edf-8811-4f7529bba358-reloader\") pod \"frr-k8s-cmsgj\" (UID: \"dd301093-7d62-4edf-8811-4f7529bba358\") " pod="metallb-system/frr-k8s-cmsgj"
Nov 29 05:39:40 crc kubenswrapper[4594]: I1129 05:39:40.377968 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/dd301093-7d62-4edf-8811-4f7529bba358-frr-startup\") pod \"frr-k8s-cmsgj\" (UID: \"dd301093-7d62-4edf-8811-4f7529bba358\") " pod="metallb-system/frr-k8s-cmsgj"
Nov 29 05:39:40 crc kubenswrapper[4594]: I1129 05:39:40.377991 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26zhz\" (UniqueName: \"kubernetes.io/projected/69e15172-74bf-4295-a7a9-a7843b1da728-kube-api-access-26zhz\") pod \"frr-k8s-webhook-server-7fcb986d4-9bwd7\" (UID: \"69e15172-74bf-4295-a7a9-a7843b1da728\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-9bwd7"
Nov 29 05:39:40 crc kubenswrapper[4594]: I1129 05:39:40.378021 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/dd301093-7d62-4edf-8811-4f7529bba358-metrics\") pod \"frr-k8s-cmsgj\" (UID: \"dd301093-7d62-4edf-8811-4f7529bba358\") " pod="metallb-system/frr-k8s-cmsgj"
Nov 29 05:39:40 crc kubenswrapper[4594]: I1129 05:39:40.378041 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mj7ll\" (UniqueName: \"kubernetes.io/projected/dd301093-7d62-4edf-8811-4f7529bba358-kube-api-access-mj7ll\") pod \"frr-k8s-cmsgj\" (UID: \"dd301093-7d62-4edf-8811-4f7529bba358\") " pod="metallb-system/frr-k8s-cmsgj"
Nov 29 05:39:40 crc kubenswrapper[4594]: I1129 05:39:40.378059 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/dd301093-7d62-4edf-8811-4f7529bba358-frr-conf\") pod \"frr-k8s-cmsgj\" (UID: \"dd301093-7d62-4edf-8811-4f7529bba358\") " pod="metallb-system/frr-k8s-cmsgj"
Nov 29 05:39:40 crc kubenswrapper[4594]: I1129 05:39:40.378076 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/dd301093-7d62-4edf-8811-4f7529bba358-frr-sockets\") pod \"frr-k8s-cmsgj\" (UID: \"dd301093-7d62-4edf-8811-4f7529bba358\") " pod="metallb-system/frr-k8s-cmsgj"
Nov 29 05:39:40 crc kubenswrapper[4594]: I1129 05:39:40.378100 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/69e15172-74bf-4295-a7a9-a7843b1da728-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-9bwd7\" (UID: \"69e15172-74bf-4295-a7a9-a7843b1da728\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-9bwd7"
Nov 29 05:39:40 crc kubenswrapper[4594]: I1129 05:39:40.378123 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dd301093-7d62-4edf-8811-4f7529bba358-metrics-certs\") pod \"frr-k8s-cmsgj\" (UID: \"dd301093-7d62-4edf-8811-4f7529bba358\") " pod="metallb-system/frr-k8s-cmsgj"
Nov 29 05:39:40 crc kubenswrapper[4594]: I1129 05:39:40.378076 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-5fh4t"]
Nov 29 05:39:40 crc kubenswrapper[4594]: I1129 05:39:40.379043 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-5fh4t"
Nov 29 05:39:40 crc kubenswrapper[4594]: I1129 05:39:40.381515 4594 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist"
Nov 29 05:39:40 crc kubenswrapper[4594]: I1129 05:39:40.381582 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2"
Nov 29 05:39:40 crc kubenswrapper[4594]: I1129 05:39:40.381676 4594 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret"
Nov 29 05:39:40 crc kubenswrapper[4594]: I1129 05:39:40.389589 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-f8648f98b-44rgs"]
Nov 29 05:39:40 crc kubenswrapper[4594]: I1129 05:39:40.390585 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-f8648f98b-44rgs"
Nov 29 05:39:40 crc kubenswrapper[4594]: I1129 05:39:40.390587 4594 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-n7hmk"
Nov 29 05:39:40 crc kubenswrapper[4594]: I1129 05:39:40.395077 4594 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret"
Nov 29 05:39:40 crc kubenswrapper[4594]: I1129 05:39:40.403123 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-44rgs"]
Nov 29 05:39:40 crc kubenswrapper[4594]: I1129 05:39:40.479527 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/dd301093-7d62-4edf-8811-4f7529bba358-frr-startup\") pod \"frr-k8s-cmsgj\" (UID: \"dd301093-7d62-4edf-8811-4f7529bba358\") " pod="metallb-system/frr-k8s-cmsgj"
Nov 29 05:39:40 crc kubenswrapper[4594]: I1129 05:39:40.479572 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26zhz\" (UniqueName: \"kubernetes.io/projected/69e15172-74bf-4295-a7a9-a7843b1da728-kube-api-access-26zhz\") pod \"frr-k8s-webhook-server-7fcb986d4-9bwd7\" (UID: \"69e15172-74bf-4295-a7a9-a7843b1da728\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-9bwd7"
Nov 29 05:39:40 crc kubenswrapper[4594]: I1129 05:39:40.479604 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/dd301093-7d62-4edf-8811-4f7529bba358-metrics\") pod \"frr-k8s-cmsgj\" (UID: \"dd301093-7d62-4edf-8811-4f7529bba358\") " pod="metallb-system/frr-k8s-cmsgj"
Nov 29 05:39:40 crc kubenswrapper[4594]: I1129 05:39:40.479996 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/dd301093-7d62-4edf-8811-4f7529bba358-metrics\") pod \"frr-k8s-cmsgj\" (UID: \"dd301093-7d62-4edf-8811-4f7529bba358\") " pod="metallb-system/frr-k8s-cmsgj"
Nov 29 05:39:40 crc kubenswrapper[4594]: I1129 05:39:40.480216 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mj7ll\" (UniqueName: \"kubernetes.io/projected/dd301093-7d62-4edf-8811-4f7529bba358-kube-api-access-mj7ll\") pod \"frr-k8s-cmsgj\" (UID: \"dd301093-7d62-4edf-8811-4f7529bba358\") " pod="metallb-system/frr-k8s-cmsgj"
Nov 29 05:39:40 crc kubenswrapper[4594]: I1129 05:39:40.480559 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/dd301093-7d62-4edf-8811-4f7529bba358-frr-conf\") pod \"frr-k8s-cmsgj\" (UID: \"dd301093-7d62-4edf-8811-4f7529bba358\") " pod="metallb-system/frr-k8s-cmsgj"
Nov 29 05:39:40 crc kubenswrapper[4594]: I1129 05:39:40.480617 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/dd301093-7d62-4edf-8811-4f7529bba358-frr-sockets\") pod \"frr-k8s-cmsgj\" (UID: \"dd301093-7d62-4edf-8811-4f7529bba358\") " pod="metallb-system/frr-k8s-cmsgj"
Nov 29 05:39:40 crc kubenswrapper[4594]: I1129 05:39:40.480673 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/69e15172-74bf-4295-a7a9-a7843b1da728-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-9bwd7\" (UID: \"69e15172-74bf-4295-a7a9-a7843b1da728\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-9bwd7"
Nov 29 05:39:40 crc kubenswrapper[4594]: I1129 05:39:40.480793 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/dd301093-7d62-4edf-8811-4f7529bba358-frr-startup\") pod \"frr-k8s-cmsgj\" (UID: \"dd301093-7d62-4edf-8811-4f7529bba358\") " pod="metallb-system/frr-k8s-cmsgj"
Nov 29 05:39:40 crc kubenswrapper[4594]: I1129 05:39:40.480858 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dd301093-7d62-4edf-8811-4f7529bba358-metrics-certs\") pod \"frr-k8s-cmsgj\" (UID: \"dd301093-7d62-4edf-8811-4f7529bba358\") " pod="metallb-system/frr-k8s-cmsgj"
Nov 29 05:39:40 crc kubenswrapper[4594]: I1129 05:39:40.481177 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/dd301093-7d62-4edf-8811-4f7529bba358-frr-sockets\") pod \"frr-k8s-cmsgj\" (UID: \"dd301093-7d62-4edf-8811-4f7529bba358\") " pod="metallb-system/frr-k8s-cmsgj"
Nov 29 05:39:40 crc kubenswrapper[4594]: I1129 05:39:40.481170 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/dd301093-7d62-4edf-8811-4f7529bba358-frr-conf\") pod \"frr-k8s-cmsgj\" (UID: \"dd301093-7d62-4edf-8811-4f7529bba358\") " pod="metallb-system/frr-k8s-cmsgj"
Nov 29 05:39:40 crc kubenswrapper[4594]: I1129 05:39:40.481301 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/dd301093-7d62-4edf-8811-4f7529bba358-reloader\") pod \"frr-k8s-cmsgj\" (UID: \"dd301093-7d62-4edf-8811-4f7529bba358\") " pod="metallb-system/frr-k8s-cmsgj"
Nov 29 05:39:40 crc kubenswrapper[4594]: I1129 05:39:40.481670 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/dd301093-7d62-4edf-8811-4f7529bba358-reloader\") pod \"frr-k8s-cmsgj\" (UID: \"dd301093-7d62-4edf-8811-4f7529bba358\") " pod="metallb-system/frr-k8s-cmsgj"
Nov 29 05:39:40 crc kubenswrapper[4594]: I1129 05:39:40.487566 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/69e15172-74bf-4295-a7a9-a7843b1da728-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-9bwd7\" (UID: \"69e15172-74bf-4295-a7a9-a7843b1da728\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-9bwd7"
Nov 29 05:39:40 crc kubenswrapper[4594]: I1129 05:39:40.487699 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dd301093-7d62-4edf-8811-4f7529bba358-metrics-certs\") pod \"frr-k8s-cmsgj\" (UID: \"dd301093-7d62-4edf-8811-4f7529bba358\") " pod="metallb-system/frr-k8s-cmsgj"
Nov 29 05:39:40 crc kubenswrapper[4594]: I1129 05:39:40.493076 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26zhz\" (UniqueName: \"kubernetes.io/projected/69e15172-74bf-4295-a7a9-a7843b1da728-kube-api-access-26zhz\") pod \"frr-k8s-webhook-server-7fcb986d4-9bwd7\" (UID: \"69e15172-74bf-4295-a7a9-a7843b1da728\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-9bwd7"
Nov 29 05:39:40 crc kubenswrapper[4594]: I1129 05:39:40.497737 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mj7ll\" (UniqueName: \"kubernetes.io/projected/dd301093-7d62-4edf-8811-4f7529bba358-kube-api-access-mj7ll\") pod \"frr-k8s-cmsgj\" (UID: \"dd301093-7d62-4edf-8811-4f7529bba358\") " pod="metallb-system/frr-k8s-cmsgj"
Nov 29 05:39:40 crc kubenswrapper[4594]: I1129 05:39:40.582133 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/a2f948fa-edac-4ac6-9ffb-e5ee886f8164-memberlist\") pod \"speaker-5fh4t\" (UID: \"a2f948fa-edac-4ac6-9ffb-e5ee886f8164\") " pod="metallb-system/speaker-5fh4t"
Nov 29 05:39:40 crc kubenswrapper[4594]: I1129 05:39:40.582187 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c67ed7c7-a9cb-4068-80af-9356fd171e31-metrics-certs\") pod \"controller-f8648f98b-44rgs\" (UID: \"c67ed7c7-a9cb-4068-80af-9356fd171e31\") " pod="metallb-system/controller-f8648f98b-44rgs"
Nov 29 05:39:40 crc kubenswrapper[4594]: I1129 05:39:40.582346 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a2f948fa-edac-4ac6-9ffb-e5ee886f8164-metrics-certs\") pod \"speaker-5fh4t\" (UID: \"a2f948fa-edac-4ac6-9ffb-e5ee886f8164\") " pod="metallb-system/speaker-5fh4t"
Nov 29 05:39:40 crc kubenswrapper[4594]: I1129 05:39:40.582534 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c67ed7c7-a9cb-4068-80af-9356fd171e31-cert\") pod \"controller-f8648f98b-44rgs\" (UID: \"c67ed7c7-a9cb-4068-80af-9356fd171e31\") " pod="metallb-system/controller-f8648f98b-44rgs"
Nov 29 05:39:40 crc kubenswrapper[4594]: I1129 05:39:40.582590 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqkzp\" (UniqueName: \"kubernetes.io/projected/a2f948fa-edac-4ac6-9ffb-e5ee886f8164-kube-api-access-bqkzp\") pod \"speaker-5fh4t\" (UID: \"a2f948fa-edac-4ac6-9ffb-e5ee886f8164\") " pod="metallb-system/speaker-5fh4t"
Nov 29 05:39:40 crc kubenswrapper[4594]: I1129 05:39:40.582640 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvgmv\" (UniqueName: \"kubernetes.io/projected/c67ed7c7-a9cb-4068-80af-9356fd171e31-kube-api-access-dvgmv\") pod \"controller-f8648f98b-44rgs\" (UID: \"c67ed7c7-a9cb-4068-80af-9356fd171e31\") " pod="metallb-system/controller-f8648f98b-44rgs"
Nov 29 05:39:40 crc kubenswrapper[4594]: I1129 05:39:40.582656 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/a2f948fa-edac-4ac6-9ffb-e5ee886f8164-metallb-excludel2\") pod \"speaker-5fh4t\" (UID: \"a2f948fa-edac-4ac6-9ffb-e5ee886f8164\") " pod="metallb-system/speaker-5fh4t"
Nov 29 05:39:40 crc kubenswrapper[4594]: I1129 05:39:40.633742 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-cmsgj"
Nov 29 05:39:40 crc kubenswrapper[4594]: I1129 05:39:40.638999 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-9bwd7"
Nov 29 05:39:40 crc kubenswrapper[4594]: I1129 05:39:40.684150 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a2f948fa-edac-4ac6-9ffb-e5ee886f8164-metrics-certs\") pod \"speaker-5fh4t\" (UID: \"a2f948fa-edac-4ac6-9ffb-e5ee886f8164\") " pod="metallb-system/speaker-5fh4t"
Nov 29 05:39:40 crc kubenswrapper[4594]: I1129 05:39:40.684293 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c67ed7c7-a9cb-4068-80af-9356fd171e31-cert\") pod \"controller-f8648f98b-44rgs\" (UID: \"c67ed7c7-a9cb-4068-80af-9356fd171e31\") " pod="metallb-system/controller-f8648f98b-44rgs"
Nov 29 05:39:40 crc kubenswrapper[4594]: I1129 05:39:40.684341 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqkzp\" (UniqueName: \"kubernetes.io/projected/a2f948fa-edac-4ac6-9ffb-e5ee886f8164-kube-api-access-bqkzp\") pod \"speaker-5fh4t\" (UID: \"a2f948fa-edac-4ac6-9ffb-e5ee886f8164\") " pod="metallb-system/speaker-5fh4t"
Nov 29 05:39:40 crc kubenswrapper[4594]: I1129 05:39:40.684417 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvgmv\" (UniqueName: \"kubernetes.io/projected/c67ed7c7-a9cb-4068-80af-9356fd171e31-kube-api-access-dvgmv\") pod \"controller-f8648f98b-44rgs\" (UID: \"c67ed7c7-a9cb-4068-80af-9356fd171e31\") " pod="metallb-system/controller-f8648f98b-44rgs"
Nov 29 05:39:40 crc kubenswrapper[4594]: I1129 05:39:40.684445 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/a2f948fa-edac-4ac6-9ffb-e5ee886f8164-metallb-excludel2\") pod \"speaker-5fh4t\" (UID: \"a2f948fa-edac-4ac6-9ffb-e5ee886f8164\") " pod="metallb-system/speaker-5fh4t"
Nov 29 05:39:40 crc kubenswrapper[4594]: I1129 05:39:40.684507 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/a2f948fa-edac-4ac6-9ffb-e5ee886f8164-memberlist\") pod \"speaker-5fh4t\" (UID: \"a2f948fa-edac-4ac6-9ffb-e5ee886f8164\") " pod="metallb-system/speaker-5fh4t"
Nov 29 05:39:40 crc kubenswrapper[4594]: I1129 05:39:40.684530 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c67ed7c7-a9cb-4068-80af-9356fd171e31-metrics-certs\") pod \"controller-f8648f98b-44rgs\" (UID: \"c67ed7c7-a9cb-4068-80af-9356fd171e31\") " pod="metallb-system/controller-f8648f98b-44rgs"
Nov 29 05:39:40 crc kubenswrapper[4594]: E1129 05:39:40.685474 4594 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Nov 29 05:39:40 crc kubenswrapper[4594]: E1129 05:39:40.685568 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a2f948fa-edac-4ac6-9ffb-e5ee886f8164-memberlist podName:a2f948fa-edac-4ac6-9ffb-e5ee886f8164 nodeName:}" failed. No retries permitted until 2025-11-29 05:39:41.185547078 +0000 UTC m=+705.426056299 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/a2f948fa-edac-4ac6-9ffb-e5ee886f8164-memberlist") pod "speaker-5fh4t" (UID: "a2f948fa-edac-4ac6-9ffb-e5ee886f8164") : secret "metallb-memberlist" not found
Nov 29 05:39:40 crc kubenswrapper[4594]: I1129 05:39:40.685991 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/a2f948fa-edac-4ac6-9ffb-e5ee886f8164-metallb-excludel2\") pod \"speaker-5fh4t\" (UID: \"a2f948fa-edac-4ac6-9ffb-e5ee886f8164\") " pod="metallb-system/speaker-5fh4t"
Nov 29 05:39:40 crc kubenswrapper[4594]: I1129 05:39:40.689290 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a2f948fa-edac-4ac6-9ffb-e5ee886f8164-metrics-certs\") pod \"speaker-5fh4t\" (UID: \"a2f948fa-edac-4ac6-9ffb-e5ee886f8164\") " pod="metallb-system/speaker-5fh4t"
Nov 29 05:39:40 crc kubenswrapper[4594]: I1129 05:39:40.692209 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c67ed7c7-a9cb-4068-80af-9356fd171e31-metrics-certs\") pod \"controller-f8648f98b-44rgs\" (UID: \"c67ed7c7-a9cb-4068-80af-9356fd171e31\") " pod="metallb-system/controller-f8648f98b-44rgs"
Nov 29 05:39:40 crc kubenswrapper[4594]: I1129 05:39:40.695509 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c67ed7c7-a9cb-4068-80af-9356fd171e31-cert\") pod \"controller-f8648f98b-44rgs\" (UID: \"c67ed7c7-a9cb-4068-80af-9356fd171e31\") " pod="metallb-system/controller-f8648f98b-44rgs"
Nov 29 05:39:40 crc kubenswrapper[4594]: I1129 05:39:40.701461 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvgmv\" (UniqueName: \"kubernetes.io/projected/c67ed7c7-a9cb-4068-80af-9356fd171e31-kube-api-access-dvgmv\") pod \"controller-f8648f98b-44rgs\" (UID: \"c67ed7c7-a9cb-4068-80af-9356fd171e31\") " pod="metallb-system/controller-f8648f98b-44rgs"
Nov 29 05:39:40 crc kubenswrapper[4594]: I1129 05:39:40.702160 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqkzp\" (UniqueName: \"kubernetes.io/projected/a2f948fa-edac-4ac6-9ffb-e5ee886f8164-kube-api-access-bqkzp\") pod \"speaker-5fh4t\" (UID: \"a2f948fa-edac-4ac6-9ffb-e5ee886f8164\") " pod="metallb-system/speaker-5fh4t"
Nov 29 05:39:41 crc kubenswrapper[4594]: I1129 05:39:41.002486 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-f8648f98b-44rgs"
Nov 29 05:39:41 crc kubenswrapper[4594]: I1129 05:39:41.050508 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-9bwd7"]
Nov 29 05:39:41 crc kubenswrapper[4594]: W1129 05:39:41.056247 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod69e15172_74bf_4295_a7a9_a7843b1da728.slice/crio-bff6971167ac3de94696f8b543eb9b0a8562b54ddaa4467cfbd0c61d452633c1 WatchSource:0}: Error finding container bff6971167ac3de94696f8b543eb9b0a8562b54ddaa4467cfbd0c61d452633c1: Status 404 returned error can't find the container with id bff6971167ac3de94696f8b543eb9b0a8562b54ddaa4467cfbd0c61d452633c1
Nov 29 05:39:41 crc kubenswrapper[4594]: I1129 05:39:41.191781 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/a2f948fa-edac-4ac6-9ffb-e5ee886f8164-memberlist\") pod \"speaker-5fh4t\" (UID: \"a2f948fa-edac-4ac6-9ffb-e5ee886f8164\") " pod="metallb-system/speaker-5fh4t"
Nov 29 05:39:41 crc kubenswrapper[4594]: E1129 05:39:41.192086 4594 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Nov 29 05:39:41 crc kubenswrapper[4594]: E1129 05:39:41.192164 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a2f948fa-edac-4ac6-9ffb-e5ee886f8164-memberlist podName:a2f948fa-edac-4ac6-9ffb-e5ee886f8164 nodeName:}" failed. No retries permitted until 2025-11-29 05:39:42.192139092 +0000 UTC m=+706.432648313 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/a2f948fa-edac-4ac6-9ffb-e5ee886f8164-memberlist") pod "speaker-5fh4t" (UID: "a2f948fa-edac-4ac6-9ffb-e5ee886f8164") : secret "metallb-memberlist" not found
Nov 29 05:39:41 crc kubenswrapper[4594]: I1129 05:39:41.372357 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-44rgs"]
Nov 29 05:39:41 crc kubenswrapper[4594]: W1129 05:39:41.377518 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc67ed7c7_a9cb_4068_80af_9356fd171e31.slice/crio-ba7bde9f9a961e1b17958bb340e5fdfd1d291a315f8d78d177dd9f8793c90fb5 WatchSource:0}: Error finding container ba7bde9f9a961e1b17958bb340e5fdfd1d291a315f8d78d177dd9f8793c90fb5: Status 404 returned error can't find the container with id ba7bde9f9a961e1b17958bb340e5fdfd1d291a315f8d78d177dd9f8793c90fb5
Nov 29 05:39:41 crc kubenswrapper[4594]: I1129 05:39:41.702635 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-9bwd7" event={"ID":"69e15172-74bf-4295-a7a9-a7843b1da728","Type":"ContainerStarted","Data":"bff6971167ac3de94696f8b543eb9b0a8562b54ddaa4467cfbd0c61d452633c1"}
Nov 29 05:39:41 crc kubenswrapper[4594]: I1129 05:39:41.704491 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-44rgs" event={"ID":"c67ed7c7-a9cb-4068-80af-9356fd171e31","Type":"ContainerStarted","Data":"1e6296e275a2fc67048728acb8b7ee1f729d1c492b1050f24fed075c30cbc655"}
Nov 29 05:39:41 crc kubenswrapper[4594]: I1129 05:39:41.704520 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-44rgs" event={"ID":"c67ed7c7-a9cb-4068-80af-9356fd171e31","Type":"ContainerStarted","Data":"9b3a43a48de759a9317b827254eff02aa816fd45202bb0b520daaf9d797ea5c3"}
Nov 29 05:39:41 crc kubenswrapper[4594]: I1129 05:39:41.704533 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-44rgs" event={"ID":"c67ed7c7-a9cb-4068-80af-9356fd171e31","Type":"ContainerStarted","Data":"ba7bde9f9a961e1b17958bb340e5fdfd1d291a315f8d78d177dd9f8793c90fb5"}
Nov 29 05:39:41 crc kubenswrapper[4594]: I1129 05:39:41.704568 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-f8648f98b-44rgs"
Nov 29 05:39:41 crc kubenswrapper[4594]: I1129 05:39:41.705497 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cmsgj" event={"ID":"dd301093-7d62-4edf-8811-4f7529bba358","Type":"ContainerStarted","Data":"51e1fdf174d2d6c44fe09ebeaaf0b8a3617e33263bb9a224b302b948922a0d8e"}
Nov 29 05:39:41 crc kubenswrapper[4594]: I1129 05:39:41.721096 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-f8648f98b-44rgs" podStartSLOduration=1.7210781480000001 podStartE2EDuration="1.721078148s" podCreationTimestamp="2025-11-29 05:39:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:39:41.718472117 +0000 UTC m=+705.958981337" watchObservedRunningTime="2025-11-29 05:39:41.721078148 +0000 UTC m=+705.961587368"
Nov 29 05:39:42 crc kubenswrapper[4594]: I1129 05:39:42.205922 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/a2f948fa-edac-4ac6-9ffb-e5ee886f8164-memberlist\") pod \"speaker-5fh4t\" (UID: \"a2f948fa-edac-4ac6-9ffb-e5ee886f8164\") " pod="metallb-system/speaker-5fh4t"
Nov 29 05:39:42 crc kubenswrapper[4594]: I1129 05:39:42.218620 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/a2f948fa-edac-4ac6-9ffb-e5ee886f8164-memberlist\") pod \"speaker-5fh4t\" (UID: \"a2f948fa-edac-4ac6-9ffb-e5ee886f8164\") " pod="metallb-system/speaker-5fh4t"
Nov 29 05:39:42 crc kubenswrapper[4594]: I1129 05:39:42.491809 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-5fh4t"
Nov 29 05:39:42 crc kubenswrapper[4594]: I1129 05:39:42.715797 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-5fh4t" event={"ID":"a2f948fa-edac-4ac6-9ffb-e5ee886f8164","Type":"ContainerStarted","Data":"5f133554cd47b37a4de571352483b12b8050ecdd0f48f8b9302091ea3ecfea6e"}
Nov 29 05:39:42 crc kubenswrapper[4594]: I1129 05:39:42.716076 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-5fh4t" event={"ID":"a2f948fa-edac-4ac6-9ffb-e5ee886f8164","Type":"ContainerStarted","Data":"7d1371a482750c0c25ee218f1e15c32c7afbfd540956e0c7b92ba0f89fca71b0"}
Nov 29 05:39:43 crc kubenswrapper[4594]: I1129 05:39:43.726466 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-5fh4t" event={"ID":"a2f948fa-edac-4ac6-9ffb-e5ee886f8164","Type":"ContainerStarted","Data":"c071ca0a2f50c2d623721237699d9169818a8d1d88f11e5f3671f01765109a7c"}
Nov 29 05:39:43 crc kubenswrapper[4594]: I1129 05:39:43.727780 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-5fh4t"
Nov 29 05:39:43 crc kubenswrapper[4594]: I1129 05:39:43.745289 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-5fh4t" podStartSLOduration=3.745186032 podStartE2EDuration="3.745186032s" podCreationTimestamp="2025-11-29 05:39:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:39:43.742947492 +0000 UTC m=+707.983456713" watchObservedRunningTime="2025-11-29 05:39:43.745186032 +0000 UTC m=+707.985695252"
Nov 29 05:39:47 crc kubenswrapper[4594]: I1129 05:39:47.753457 4594 generic.go:334] "Generic (PLEG): container finished" podID="dd301093-7d62-4edf-8811-4f7529bba358" containerID="d614a34b0714110de0ada00726fc2f9ae48b0957621207a863d9d1e4c60237ec" exitCode=0
Nov 29 05:39:47 crc kubenswrapper[4594]: I1129 05:39:47.753575 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cmsgj" event={"ID":"dd301093-7d62-4edf-8811-4f7529bba358","Type":"ContainerDied","Data":"d614a34b0714110de0ada00726fc2f9ae48b0957621207a863d9d1e4c60237ec"}
Nov 29 05:39:47 crc kubenswrapper[4594]: I1129 05:39:47.755329 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-9bwd7" event={"ID":"69e15172-74bf-4295-a7a9-a7843b1da728","Type":"ContainerStarted","Data":"a7d045cf4e566f9be17bfc920ca69ae0cb8436aed5cb0117ba10e03c846fab84"}
Nov 29 05:39:47 crc kubenswrapper[4594]: I1129 05:39:47.755695 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-9bwd7"
Nov 29 05:39:48 crc kubenswrapper[4594]: I1129 05:39:48.762675 4594 generic.go:334] "Generic (PLEG): container finished" podID="dd301093-7d62-4edf-8811-4f7529bba358" containerID="560dabe5bfc3b0e91df5902cb73245638c1505a3cde3759a089d71a3deeee361" exitCode=0
Nov 29 05:39:48 crc kubenswrapper[4594]: I1129 05:39:48.762773 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cmsgj" event={"ID":"dd301093-7d62-4edf-8811-4f7529bba358","Type":"ContainerDied","Data":"560dabe5bfc3b0e91df5902cb73245638c1505a3cde3759a089d71a3deeee361"}
Nov 29 05:39:48 crc kubenswrapper[4594]: I1129 05:39:48.780476 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-9bwd7" podStartSLOduration=2.45506841 podStartE2EDuration="8.780461955s" podCreationTimestamp="2025-11-29 05:39:40 +0000 UTC" firstStartedPulling="2025-11-29 05:39:41.058117583 +0000 UTC m=+705.298626803" lastFinishedPulling="2025-11-29 05:39:47.383511128 +0000 UTC m=+711.624020348" observedRunningTime="2025-11-29 05:39:47.788382683 +0000 UTC m=+712.028891903" watchObservedRunningTime="2025-11-29 05:39:48.780461955 +0000 UTC m=+713.020971175"
Nov 29 05:39:49 crc kubenswrapper[4594]: I1129 05:39:49.770035 4594 generic.go:334] "Generic (PLEG): container finished" podID="dd301093-7d62-4edf-8811-4f7529bba358" containerID="b43237bce4b76c3d6307a24638a091657a80afe6415cd3bcae6ae88c80ab92b1" exitCode=0
Nov 29 05:39:49 crc kubenswrapper[4594]: I1129 05:39:49.770401 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cmsgj" event={"ID":"dd301093-7d62-4edf-8811-4f7529bba358","Type":"ContainerDied","Data":"b43237bce4b76c3d6307a24638a091657a80afe6415cd3bcae6ae88c80ab92b1"}
Nov 29 05:39:50 crc kubenswrapper[4594]: I1129 05:39:50.780398 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cmsgj" event={"ID":"dd301093-7d62-4edf-8811-4f7529bba358","Type":"ContainerStarted","Data":"4695355e56b15095b755623c9f950d69772a2349e7c6b4d72992a0fda4d2dee7"}
Nov 29 05:39:50 crc kubenswrapper[4594]: I1129 05:39:50.780441 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cmsgj" event={"ID":"dd301093-7d62-4edf-8811-4f7529bba358","Type":"ContainerStarted","Data":"750c0f379edae8bec8288d5cf3a33ab8c887421e8415fe4509ebbfda2539cc22"}
Nov 29 05:39:50 crc kubenswrapper[4594]: I1129 05:39:50.780451 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cmsgj" event={"ID":"dd301093-7d62-4edf-8811-4f7529bba358","Type":"ContainerStarted","Data":"271b56ecca60d1c637340cc814a6b3ddffb7a28986d043b3a57f3ae6f6229488"}
Nov 29 05:39:50 crc kubenswrapper[4594]: I1129 05:39:50.780460 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod"
pod="metallb-system/frr-k8s-cmsgj" event={"ID":"dd301093-7d62-4edf-8811-4f7529bba358","Type":"ContainerStarted","Data":"09767d353c6c635ce2acc0355c02be7e279454ac0ba273c8e506d454174ee1f1"} Nov 29 05:39:50 crc kubenswrapper[4594]: I1129 05:39:50.780469 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cmsgj" event={"ID":"dd301093-7d62-4edf-8811-4f7529bba358","Type":"ContainerStarted","Data":"259aa5559504080368b16197d3de0f3c79947974ddb3f1cd13d21cab8a7bf4ad"} Nov 29 05:39:50 crc kubenswrapper[4594]: I1129 05:39:50.780477 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cmsgj" event={"ID":"dd301093-7d62-4edf-8811-4f7529bba358","Type":"ContainerStarted","Data":"be21e31fd0cf67d3a95c16cc7fc5075f6db74754cd942c3f122b413c1feae1e5"} Nov 29 05:39:50 crc kubenswrapper[4594]: I1129 05:39:50.780563 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-cmsgj" Nov 29 05:39:50 crc kubenswrapper[4594]: I1129 05:39:50.802456 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-cmsgj" podStartSLOduration=4.188860433 podStartE2EDuration="10.802443721s" podCreationTimestamp="2025-11-29 05:39:40 +0000 UTC" firstStartedPulling="2025-11-29 05:39:40.774591591 +0000 UTC m=+705.015100800" lastFinishedPulling="2025-11-29 05:39:47.388174868 +0000 UTC m=+711.628684088" observedRunningTime="2025-11-29 05:39:50.798654526 +0000 UTC m=+715.039163746" watchObservedRunningTime="2025-11-29 05:39:50.802443721 +0000 UTC m=+715.042952931" Nov 29 05:39:51 crc kubenswrapper[4594]: I1129 05:39:51.008004 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-f8648f98b-44rgs" Nov 29 05:39:52 crc kubenswrapper[4594]: I1129 05:39:52.494313 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-5fh4t" Nov 29 05:39:54 crc kubenswrapper[4594]: I1129 05:39:54.551506 
4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-bb8bs"] Nov 29 05:39:54 crc kubenswrapper[4594]: I1129 05:39:54.552411 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-bb8bs" Nov 29 05:39:54 crc kubenswrapper[4594]: I1129 05:39:54.553714 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Nov 29 05:39:54 crc kubenswrapper[4594]: I1129 05:39:54.554383 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-5468l" Nov 29 05:39:54 crc kubenswrapper[4594]: I1129 05:39:54.557957 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Nov 29 05:39:54 crc kubenswrapper[4594]: I1129 05:39:54.562046 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-bb8bs"] Nov 29 05:39:54 crc kubenswrapper[4594]: I1129 05:39:54.570244 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbsxq\" (UniqueName: \"kubernetes.io/projected/441f4451-11b1-4c3d-9a5a-d4e8987cc230-kube-api-access-xbsxq\") pod \"openstack-operator-index-bb8bs\" (UID: \"441f4451-11b1-4c3d-9a5a-d4e8987cc230\") " pod="openstack-operators/openstack-operator-index-bb8bs" Nov 29 05:39:54 crc kubenswrapper[4594]: I1129 05:39:54.671155 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbsxq\" (UniqueName: \"kubernetes.io/projected/441f4451-11b1-4c3d-9a5a-d4e8987cc230-kube-api-access-xbsxq\") pod \"openstack-operator-index-bb8bs\" (UID: \"441f4451-11b1-4c3d-9a5a-d4e8987cc230\") " pod="openstack-operators/openstack-operator-index-bb8bs" Nov 29 05:39:54 crc kubenswrapper[4594]: I1129 05:39:54.686970 4594 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-xbsxq\" (UniqueName: \"kubernetes.io/projected/441f4451-11b1-4c3d-9a5a-d4e8987cc230-kube-api-access-xbsxq\") pod \"openstack-operator-index-bb8bs\" (UID: \"441f4451-11b1-4c3d-9a5a-d4e8987cc230\") " pod="openstack-operators/openstack-operator-index-bb8bs" Nov 29 05:39:54 crc kubenswrapper[4594]: I1129 05:39:54.866078 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-bb8bs" Nov 29 05:39:55 crc kubenswrapper[4594]: I1129 05:39:55.210242 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-bb8bs"] Nov 29 05:39:55 crc kubenswrapper[4594]: I1129 05:39:55.634608 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-cmsgj" Nov 29 05:39:55 crc kubenswrapper[4594]: I1129 05:39:55.662618 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-cmsgj" Nov 29 05:39:55 crc kubenswrapper[4594]: I1129 05:39:55.806228 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-bb8bs" event={"ID":"441f4451-11b1-4c3d-9a5a-d4e8987cc230","Type":"ContainerStarted","Data":"c129ae87ad0949d3fa7bab53211d463d8b7166e86cdcc5c0649520374a50c8d0"} Nov 29 05:39:57 crc kubenswrapper[4594]: I1129 05:39:57.817643 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-bb8bs" event={"ID":"441f4451-11b1-4c3d-9a5a-d4e8987cc230","Type":"ContainerStarted","Data":"b9cf02cff1d4fffc2cf836865bfde9701553d578a792320b59d7c22fe663126b"} Nov 29 05:39:57 crc kubenswrapper[4594]: I1129 05:39:57.830411 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-bb8bs" podStartSLOduration=2.020175835 podStartE2EDuration="3.830387409s" podCreationTimestamp="2025-11-29 05:39:54 +0000 UTC" 
firstStartedPulling="2025-11-29 05:39:55.216528448 +0000 UTC m=+719.457037668" lastFinishedPulling="2025-11-29 05:39:57.026740022 +0000 UTC m=+721.267249242" observedRunningTime="2025-11-29 05:39:57.82747936 +0000 UTC m=+722.067988581" watchObservedRunningTime="2025-11-29 05:39:57.830387409 +0000 UTC m=+722.070896628" Nov 29 05:39:57 crc kubenswrapper[4594]: I1129 05:39:57.937800 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-bb8bs"] Nov 29 05:39:58 crc kubenswrapper[4594]: I1129 05:39:58.542898 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-zmbtt"] Nov 29 05:39:58 crc kubenswrapper[4594]: I1129 05:39:58.543617 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-zmbtt" Nov 29 05:39:58 crc kubenswrapper[4594]: I1129 05:39:58.550186 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-zmbtt"] Nov 29 05:39:58 crc kubenswrapper[4594]: I1129 05:39:58.615047 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnhgh\" (UniqueName: \"kubernetes.io/projected/9d8ef423-1563-4fda-92d3-dcbd15f10b13-kube-api-access-nnhgh\") pod \"openstack-operator-index-zmbtt\" (UID: \"9d8ef423-1563-4fda-92d3-dcbd15f10b13\") " pod="openstack-operators/openstack-operator-index-zmbtt" Nov 29 05:39:58 crc kubenswrapper[4594]: I1129 05:39:58.716328 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnhgh\" (UniqueName: \"kubernetes.io/projected/9d8ef423-1563-4fda-92d3-dcbd15f10b13-kube-api-access-nnhgh\") pod \"openstack-operator-index-zmbtt\" (UID: \"9d8ef423-1563-4fda-92d3-dcbd15f10b13\") " pod="openstack-operators/openstack-operator-index-zmbtt" Nov 29 05:39:58 crc kubenswrapper[4594]: I1129 05:39:58.731295 4594 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-nnhgh\" (UniqueName: \"kubernetes.io/projected/9d8ef423-1563-4fda-92d3-dcbd15f10b13-kube-api-access-nnhgh\") pod \"openstack-operator-index-zmbtt\" (UID: \"9d8ef423-1563-4fda-92d3-dcbd15f10b13\") " pod="openstack-operators/openstack-operator-index-zmbtt" Nov 29 05:39:58 crc kubenswrapper[4594]: I1129 05:39:58.858662 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-zmbtt" Nov 29 05:39:59 crc kubenswrapper[4594]: I1129 05:39:59.198749 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-zmbtt"] Nov 29 05:39:59 crc kubenswrapper[4594]: W1129 05:39:59.202299 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d8ef423_1563_4fda_92d3_dcbd15f10b13.slice/crio-4744ff8f3767f5e7a537d3a8faa9a86e29f857a2adeb7f7821f048a2a269162f WatchSource:0}: Error finding container 4744ff8f3767f5e7a537d3a8faa9a86e29f857a2adeb7f7821f048a2a269162f: Status 404 returned error can't find the container with id 4744ff8f3767f5e7a537d3a8faa9a86e29f857a2adeb7f7821f048a2a269162f Nov 29 05:39:59 crc kubenswrapper[4594]: I1129 05:39:59.829160 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-zmbtt" event={"ID":"9d8ef423-1563-4fda-92d3-dcbd15f10b13","Type":"ContainerStarted","Data":"4744ff8f3767f5e7a537d3a8faa9a86e29f857a2adeb7f7821f048a2a269162f"} Nov 29 05:39:59 crc kubenswrapper[4594]: I1129 05:39:59.829293 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-bb8bs" podUID="441f4451-11b1-4c3d-9a5a-d4e8987cc230" containerName="registry-server" containerID="cri-o://b9cf02cff1d4fffc2cf836865bfde9701553d578a792320b59d7c22fe663126b" gracePeriod=2 Nov 29 05:40:00 crc kubenswrapper[4594]: I1129 05:40:00.140728 4594 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-bb8bs" Nov 29 05:40:00 crc kubenswrapper[4594]: I1129 05:40:00.236810 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbsxq\" (UniqueName: \"kubernetes.io/projected/441f4451-11b1-4c3d-9a5a-d4e8987cc230-kube-api-access-xbsxq\") pod \"441f4451-11b1-4c3d-9a5a-d4e8987cc230\" (UID: \"441f4451-11b1-4c3d-9a5a-d4e8987cc230\") " Nov 29 05:40:00 crc kubenswrapper[4594]: I1129 05:40:00.241159 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/441f4451-11b1-4c3d-9a5a-d4e8987cc230-kube-api-access-xbsxq" (OuterVolumeSpecName: "kube-api-access-xbsxq") pod "441f4451-11b1-4c3d-9a5a-d4e8987cc230" (UID: "441f4451-11b1-4c3d-9a5a-d4e8987cc230"). InnerVolumeSpecName "kube-api-access-xbsxq". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:40:00 crc kubenswrapper[4594]: I1129 05:40:00.339361 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbsxq\" (UniqueName: \"kubernetes.io/projected/441f4451-11b1-4c3d-9a5a-d4e8987cc230-kube-api-access-xbsxq\") on node \"crc\" DevicePath \"\"" Nov 29 05:40:00 crc kubenswrapper[4594]: I1129 05:40:00.635855 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-cmsgj" Nov 29 05:40:00 crc kubenswrapper[4594]: I1129 05:40:00.643854 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-9bwd7" Nov 29 05:40:00 crc kubenswrapper[4594]: I1129 05:40:00.836423 4594 generic.go:334] "Generic (PLEG): container finished" podID="441f4451-11b1-4c3d-9a5a-d4e8987cc230" containerID="b9cf02cff1d4fffc2cf836865bfde9701553d578a792320b59d7c22fe663126b" exitCode=0 Nov 29 05:40:00 crc kubenswrapper[4594]: I1129 05:40:00.836464 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-bb8bs" Nov 29 05:40:00 crc kubenswrapper[4594]: I1129 05:40:00.836482 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-bb8bs" event={"ID":"441f4451-11b1-4c3d-9a5a-d4e8987cc230","Type":"ContainerDied","Data":"b9cf02cff1d4fffc2cf836865bfde9701553d578a792320b59d7c22fe663126b"} Nov 29 05:40:00 crc kubenswrapper[4594]: I1129 05:40:00.836537 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-bb8bs" event={"ID":"441f4451-11b1-4c3d-9a5a-d4e8987cc230","Type":"ContainerDied","Data":"c129ae87ad0949d3fa7bab53211d463d8b7166e86cdcc5c0649520374a50c8d0"} Nov 29 05:40:00 crc kubenswrapper[4594]: I1129 05:40:00.836558 4594 scope.go:117] "RemoveContainer" containerID="b9cf02cff1d4fffc2cf836865bfde9701553d578a792320b59d7c22fe663126b" Nov 29 05:40:00 crc kubenswrapper[4594]: I1129 05:40:00.838089 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-zmbtt" event={"ID":"9d8ef423-1563-4fda-92d3-dcbd15f10b13","Type":"ContainerStarted","Data":"c0f1d802366f6abb1f18eb5a5091f65d95936797f8f64868e5daebe8d9f6fe53"} Nov 29 05:40:00 crc kubenswrapper[4594]: I1129 05:40:00.848481 4594 scope.go:117] "RemoveContainer" containerID="b9cf02cff1d4fffc2cf836865bfde9701553d578a792320b59d7c22fe663126b" Nov 29 05:40:00 crc kubenswrapper[4594]: E1129 05:40:00.848742 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9cf02cff1d4fffc2cf836865bfde9701553d578a792320b59d7c22fe663126b\": container with ID starting with b9cf02cff1d4fffc2cf836865bfde9701553d578a792320b59d7c22fe663126b not found: ID does not exist" containerID="b9cf02cff1d4fffc2cf836865bfde9701553d578a792320b59d7c22fe663126b" Nov 29 05:40:00 crc kubenswrapper[4594]: I1129 05:40:00.848793 4594 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"b9cf02cff1d4fffc2cf836865bfde9701553d578a792320b59d7c22fe663126b"} err="failed to get container status \"b9cf02cff1d4fffc2cf836865bfde9701553d578a792320b59d7c22fe663126b\": rpc error: code = NotFound desc = could not find container \"b9cf02cff1d4fffc2cf836865bfde9701553d578a792320b59d7c22fe663126b\": container with ID starting with b9cf02cff1d4fffc2cf836865bfde9701553d578a792320b59d7c22fe663126b not found: ID does not exist" Nov 29 05:40:00 crc kubenswrapper[4594]: I1129 05:40:00.850587 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-zmbtt" podStartSLOduration=2.074802586 podStartE2EDuration="2.850578877s" podCreationTimestamp="2025-11-29 05:39:58 +0000 UTC" firstStartedPulling="2025-11-29 05:39:59.204051895 +0000 UTC m=+723.444561115" lastFinishedPulling="2025-11-29 05:39:59.979828187 +0000 UTC m=+724.220337406" observedRunningTime="2025-11-29 05:40:00.850037478 +0000 UTC m=+725.090546698" watchObservedRunningTime="2025-11-29 05:40:00.850578877 +0000 UTC m=+725.091088096" Nov 29 05:40:00 crc kubenswrapper[4594]: I1129 05:40:00.860740 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-bb8bs"] Nov 29 05:40:00 crc kubenswrapper[4594]: I1129 05:40:00.864179 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-bb8bs"] Nov 29 05:40:02 crc kubenswrapper[4594]: I1129 05:40:02.089350 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="441f4451-11b1-4c3d-9a5a-d4e8987cc230" path="/var/lib/kubelet/pods/441f4451-11b1-4c3d-9a5a-d4e8987cc230/volumes" Nov 29 05:40:08 crc kubenswrapper[4594]: I1129 05:40:08.858851 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-zmbtt" Nov 29 05:40:08 crc kubenswrapper[4594]: I1129 05:40:08.859074 4594 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-zmbtt" Nov 29 05:40:08 crc kubenswrapper[4594]: I1129 05:40:08.880187 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-zmbtt" Nov 29 05:40:08 crc kubenswrapper[4594]: I1129 05:40:08.900487 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-zmbtt" Nov 29 05:40:22 crc kubenswrapper[4594]: I1129 05:40:22.373958 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/34453e0ae93d07abc4f6e497f8998de77c1bdd8f20510be6b58912cf3b75mmm"] Nov 29 05:40:22 crc kubenswrapper[4594]: E1129 05:40:22.374616 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="441f4451-11b1-4c3d-9a5a-d4e8987cc230" containerName="registry-server" Nov 29 05:40:22 crc kubenswrapper[4594]: I1129 05:40:22.374630 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="441f4451-11b1-4c3d-9a5a-d4e8987cc230" containerName="registry-server" Nov 29 05:40:22 crc kubenswrapper[4594]: I1129 05:40:22.374731 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="441f4451-11b1-4c3d-9a5a-d4e8987cc230" containerName="registry-server" Nov 29 05:40:22 crc kubenswrapper[4594]: I1129 05:40:22.386646 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/34453e0ae93d07abc4f6e497f8998de77c1bdd8f20510be6b58912cf3b75mmm" Nov 29 05:40:22 crc kubenswrapper[4594]: I1129 05:40:22.389976 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-vksqd" Nov 29 05:40:22 crc kubenswrapper[4594]: I1129 05:40:22.390720 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/34453e0ae93d07abc4f6e497f8998de77c1bdd8f20510be6b58912cf3b75mmm"] Nov 29 05:40:22 crc kubenswrapper[4594]: I1129 05:40:22.397750 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/605f6125-2bfd-43a1-b01b-4ea4f391b981-bundle\") pod \"34453e0ae93d07abc4f6e497f8998de77c1bdd8f20510be6b58912cf3b75mmm\" (UID: \"605f6125-2bfd-43a1-b01b-4ea4f391b981\") " pod="openstack-operators/34453e0ae93d07abc4f6e497f8998de77c1bdd8f20510be6b58912cf3b75mmm" Nov 29 05:40:22 crc kubenswrapper[4594]: I1129 05:40:22.397789 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/605f6125-2bfd-43a1-b01b-4ea4f391b981-util\") pod \"34453e0ae93d07abc4f6e497f8998de77c1bdd8f20510be6b58912cf3b75mmm\" (UID: \"605f6125-2bfd-43a1-b01b-4ea4f391b981\") " pod="openstack-operators/34453e0ae93d07abc4f6e497f8998de77c1bdd8f20510be6b58912cf3b75mmm" Nov 29 05:40:22 crc kubenswrapper[4594]: I1129 05:40:22.397829 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cq22g\" (UniqueName: \"kubernetes.io/projected/605f6125-2bfd-43a1-b01b-4ea4f391b981-kube-api-access-cq22g\") pod \"34453e0ae93d07abc4f6e497f8998de77c1bdd8f20510be6b58912cf3b75mmm\" (UID: \"605f6125-2bfd-43a1-b01b-4ea4f391b981\") " pod="openstack-operators/34453e0ae93d07abc4f6e497f8998de77c1bdd8f20510be6b58912cf3b75mmm" Nov 29 05:40:22 crc kubenswrapper[4594]: I1129 
05:40:22.498445 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cq22g\" (UniqueName: \"kubernetes.io/projected/605f6125-2bfd-43a1-b01b-4ea4f391b981-kube-api-access-cq22g\") pod \"34453e0ae93d07abc4f6e497f8998de77c1bdd8f20510be6b58912cf3b75mmm\" (UID: \"605f6125-2bfd-43a1-b01b-4ea4f391b981\") " pod="openstack-operators/34453e0ae93d07abc4f6e497f8998de77c1bdd8f20510be6b58912cf3b75mmm" Nov 29 05:40:22 crc kubenswrapper[4594]: I1129 05:40:22.498700 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/605f6125-2bfd-43a1-b01b-4ea4f391b981-bundle\") pod \"34453e0ae93d07abc4f6e497f8998de77c1bdd8f20510be6b58912cf3b75mmm\" (UID: \"605f6125-2bfd-43a1-b01b-4ea4f391b981\") " pod="openstack-operators/34453e0ae93d07abc4f6e497f8998de77c1bdd8f20510be6b58912cf3b75mmm" Nov 29 05:40:22 crc kubenswrapper[4594]: I1129 05:40:22.498736 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/605f6125-2bfd-43a1-b01b-4ea4f391b981-util\") pod \"34453e0ae93d07abc4f6e497f8998de77c1bdd8f20510be6b58912cf3b75mmm\" (UID: \"605f6125-2bfd-43a1-b01b-4ea4f391b981\") " pod="openstack-operators/34453e0ae93d07abc4f6e497f8998de77c1bdd8f20510be6b58912cf3b75mmm" Nov 29 05:40:22 crc kubenswrapper[4594]: I1129 05:40:22.499077 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/605f6125-2bfd-43a1-b01b-4ea4f391b981-bundle\") pod \"34453e0ae93d07abc4f6e497f8998de77c1bdd8f20510be6b58912cf3b75mmm\" (UID: \"605f6125-2bfd-43a1-b01b-4ea4f391b981\") " pod="openstack-operators/34453e0ae93d07abc4f6e497f8998de77c1bdd8f20510be6b58912cf3b75mmm" Nov 29 05:40:22 crc kubenswrapper[4594]: I1129 05:40:22.499099 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/605f6125-2bfd-43a1-b01b-4ea4f391b981-util\") pod \"34453e0ae93d07abc4f6e497f8998de77c1bdd8f20510be6b58912cf3b75mmm\" (UID: \"605f6125-2bfd-43a1-b01b-4ea4f391b981\") " pod="openstack-operators/34453e0ae93d07abc4f6e497f8998de77c1bdd8f20510be6b58912cf3b75mmm" Nov 29 05:40:22 crc kubenswrapper[4594]: I1129 05:40:22.513787 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cq22g\" (UniqueName: \"kubernetes.io/projected/605f6125-2bfd-43a1-b01b-4ea4f391b981-kube-api-access-cq22g\") pod \"34453e0ae93d07abc4f6e497f8998de77c1bdd8f20510be6b58912cf3b75mmm\" (UID: \"605f6125-2bfd-43a1-b01b-4ea4f391b981\") " pod="openstack-operators/34453e0ae93d07abc4f6e497f8998de77c1bdd8f20510be6b58912cf3b75mmm" Nov 29 05:40:22 crc kubenswrapper[4594]: I1129 05:40:22.701246 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/34453e0ae93d07abc4f6e497f8998de77c1bdd8f20510be6b58912cf3b75mmm" Nov 29 05:40:23 crc kubenswrapper[4594]: I1129 05:40:23.054105 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/34453e0ae93d07abc4f6e497f8998de77c1bdd8f20510be6b58912cf3b75mmm"] Nov 29 05:40:23 crc kubenswrapper[4594]: I1129 05:40:23.954539 4594 generic.go:334] "Generic (PLEG): container finished" podID="605f6125-2bfd-43a1-b01b-4ea4f391b981" containerID="dadcce475dd0c1516f937fc4333df938862ff8eb1692d717a9f6bd4df826a3e2" exitCode=0 Nov 29 05:40:23 crc kubenswrapper[4594]: I1129 05:40:23.954608 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/34453e0ae93d07abc4f6e497f8998de77c1bdd8f20510be6b58912cf3b75mmm" event={"ID":"605f6125-2bfd-43a1-b01b-4ea4f391b981","Type":"ContainerDied","Data":"dadcce475dd0c1516f937fc4333df938862ff8eb1692d717a9f6bd4df826a3e2"} Nov 29 05:40:23 crc kubenswrapper[4594]: I1129 05:40:23.954761 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/34453e0ae93d07abc4f6e497f8998de77c1bdd8f20510be6b58912cf3b75mmm" event={"ID":"605f6125-2bfd-43a1-b01b-4ea4f391b981","Type":"ContainerStarted","Data":"5a0d9364e5dbbdec30fffdd158070e901fc571bfb4b0ef6080efa47108bf49a5"} Nov 29 05:40:25 crc kubenswrapper[4594]: I1129 05:40:25.968298 4594 generic.go:334] "Generic (PLEG): container finished" podID="605f6125-2bfd-43a1-b01b-4ea4f391b981" containerID="15fd663d615f286550931fb9c29b5d81ee72a831f095795c7bd600fb9ad0aba5" exitCode=0 Nov 29 05:40:25 crc kubenswrapper[4594]: I1129 05:40:25.968666 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/34453e0ae93d07abc4f6e497f8998de77c1bdd8f20510be6b58912cf3b75mmm" event={"ID":"605f6125-2bfd-43a1-b01b-4ea4f391b981","Type":"ContainerDied","Data":"15fd663d615f286550931fb9c29b5d81ee72a831f095795c7bd600fb9ad0aba5"} Nov 29 05:40:26 crc kubenswrapper[4594]: I1129 05:40:26.977110 4594 generic.go:334] "Generic (PLEG): container finished" podID="605f6125-2bfd-43a1-b01b-4ea4f391b981" containerID="a44c0e2e579396b3a70b07381dc6107ff4a2e8abf48ba66e4898dc14e9559532" exitCode=0 Nov 29 05:40:26 crc kubenswrapper[4594]: I1129 05:40:26.977162 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/34453e0ae93d07abc4f6e497f8998de77c1bdd8f20510be6b58912cf3b75mmm" event={"ID":"605f6125-2bfd-43a1-b01b-4ea4f391b981","Type":"ContainerDied","Data":"a44c0e2e579396b3a70b07381dc6107ff4a2e8abf48ba66e4898dc14e9559532"} Nov 29 05:40:28 crc kubenswrapper[4594]: I1129 05:40:28.176360 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/34453e0ae93d07abc4f6e497f8998de77c1bdd8f20510be6b58912cf3b75mmm" Nov 29 05:40:28 crc kubenswrapper[4594]: I1129 05:40:28.366354 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/605f6125-2bfd-43a1-b01b-4ea4f391b981-bundle\") pod \"605f6125-2bfd-43a1-b01b-4ea4f391b981\" (UID: \"605f6125-2bfd-43a1-b01b-4ea4f391b981\") " Nov 29 05:40:28 crc kubenswrapper[4594]: I1129 05:40:28.366457 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/605f6125-2bfd-43a1-b01b-4ea4f391b981-util\") pod \"605f6125-2bfd-43a1-b01b-4ea4f391b981\" (UID: \"605f6125-2bfd-43a1-b01b-4ea4f391b981\") " Nov 29 05:40:28 crc kubenswrapper[4594]: I1129 05:40:28.366487 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cq22g\" (UniqueName: \"kubernetes.io/projected/605f6125-2bfd-43a1-b01b-4ea4f391b981-kube-api-access-cq22g\") pod \"605f6125-2bfd-43a1-b01b-4ea4f391b981\" (UID: \"605f6125-2bfd-43a1-b01b-4ea4f391b981\") " Nov 29 05:40:28 crc kubenswrapper[4594]: I1129 05:40:28.367161 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/605f6125-2bfd-43a1-b01b-4ea4f391b981-bundle" (OuterVolumeSpecName: "bundle") pod "605f6125-2bfd-43a1-b01b-4ea4f391b981" (UID: "605f6125-2bfd-43a1-b01b-4ea4f391b981"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 05:40:28 crc kubenswrapper[4594]: I1129 05:40:28.370943 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/605f6125-2bfd-43a1-b01b-4ea4f391b981-kube-api-access-cq22g" (OuterVolumeSpecName: "kube-api-access-cq22g") pod "605f6125-2bfd-43a1-b01b-4ea4f391b981" (UID: "605f6125-2bfd-43a1-b01b-4ea4f391b981"). InnerVolumeSpecName "kube-api-access-cq22g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:40:28 crc kubenswrapper[4594]: I1129 05:40:28.376450 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/605f6125-2bfd-43a1-b01b-4ea4f391b981-util" (OuterVolumeSpecName: "util") pod "605f6125-2bfd-43a1-b01b-4ea4f391b981" (UID: "605f6125-2bfd-43a1-b01b-4ea4f391b981"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 05:40:28 crc kubenswrapper[4594]: I1129 05:40:28.467921 4594 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/605f6125-2bfd-43a1-b01b-4ea4f391b981-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 05:40:28 crc kubenswrapper[4594]: I1129 05:40:28.467948 4594 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/605f6125-2bfd-43a1-b01b-4ea4f391b981-util\") on node \"crc\" DevicePath \"\"" Nov 29 05:40:28 crc kubenswrapper[4594]: I1129 05:40:28.467956 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cq22g\" (UniqueName: \"kubernetes.io/projected/605f6125-2bfd-43a1-b01b-4ea4f391b981-kube-api-access-cq22g\") on node \"crc\" DevicePath \"\"" Nov 29 05:40:28 crc kubenswrapper[4594]: I1129 05:40:28.988965 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/34453e0ae93d07abc4f6e497f8998de77c1bdd8f20510be6b58912cf3b75mmm" event={"ID":"605f6125-2bfd-43a1-b01b-4ea4f391b981","Type":"ContainerDied","Data":"5a0d9364e5dbbdec30fffdd158070e901fc571bfb4b0ef6080efa47108bf49a5"} Nov 29 05:40:28 crc kubenswrapper[4594]: I1129 05:40:28.989005 4594 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a0d9364e5dbbdec30fffdd158070e901fc571bfb4b0ef6080efa47108bf49a5" Nov 29 05:40:28 crc kubenswrapper[4594]: I1129 05:40:28.989005 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/34453e0ae93d07abc4f6e497f8998de77c1bdd8f20510be6b58912cf3b75mmm" Nov 29 05:40:32 crc kubenswrapper[4594]: I1129 05:40:32.748019 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-6ddddd9d6f-7q4j2"] Nov 29 05:40:32 crc kubenswrapper[4594]: E1129 05:40:32.748459 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="605f6125-2bfd-43a1-b01b-4ea4f391b981" containerName="util" Nov 29 05:40:32 crc kubenswrapper[4594]: I1129 05:40:32.748472 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="605f6125-2bfd-43a1-b01b-4ea4f391b981" containerName="util" Nov 29 05:40:32 crc kubenswrapper[4594]: E1129 05:40:32.748490 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="605f6125-2bfd-43a1-b01b-4ea4f391b981" containerName="extract" Nov 29 05:40:32 crc kubenswrapper[4594]: I1129 05:40:32.748495 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="605f6125-2bfd-43a1-b01b-4ea4f391b981" containerName="extract" Nov 29 05:40:32 crc kubenswrapper[4594]: E1129 05:40:32.748506 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="605f6125-2bfd-43a1-b01b-4ea4f391b981" containerName="pull" Nov 29 05:40:32 crc kubenswrapper[4594]: I1129 05:40:32.748511 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="605f6125-2bfd-43a1-b01b-4ea4f391b981" containerName="pull" Nov 29 05:40:32 crc kubenswrapper[4594]: I1129 05:40:32.748615 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="605f6125-2bfd-43a1-b01b-4ea4f391b981" containerName="extract" Nov 29 05:40:32 crc kubenswrapper[4594]: I1129 05:40:32.749007 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-6ddddd9d6f-7q4j2" Nov 29 05:40:32 crc kubenswrapper[4594]: I1129 05:40:32.750374 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-4v4bk" Nov 29 05:40:32 crc kubenswrapper[4594]: I1129 05:40:32.767934 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-6ddddd9d6f-7q4j2"] Nov 29 05:40:32 crc kubenswrapper[4594]: I1129 05:40:32.817820 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42kc2\" (UniqueName: \"kubernetes.io/projected/880b9d6a-5dc6-448b-a63c-b098fcc54023-kube-api-access-42kc2\") pod \"openstack-operator-controller-operator-6ddddd9d6f-7q4j2\" (UID: \"880b9d6a-5dc6-448b-a63c-b098fcc54023\") " pod="openstack-operators/openstack-operator-controller-operator-6ddddd9d6f-7q4j2" Nov 29 05:40:32 crc kubenswrapper[4594]: I1129 05:40:32.918955 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42kc2\" (UniqueName: \"kubernetes.io/projected/880b9d6a-5dc6-448b-a63c-b098fcc54023-kube-api-access-42kc2\") pod \"openstack-operator-controller-operator-6ddddd9d6f-7q4j2\" (UID: \"880b9d6a-5dc6-448b-a63c-b098fcc54023\") " pod="openstack-operators/openstack-operator-controller-operator-6ddddd9d6f-7q4j2" Nov 29 05:40:32 crc kubenswrapper[4594]: I1129 05:40:32.936035 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42kc2\" (UniqueName: \"kubernetes.io/projected/880b9d6a-5dc6-448b-a63c-b098fcc54023-kube-api-access-42kc2\") pod \"openstack-operator-controller-operator-6ddddd9d6f-7q4j2\" (UID: \"880b9d6a-5dc6-448b-a63c-b098fcc54023\") " pod="openstack-operators/openstack-operator-controller-operator-6ddddd9d6f-7q4j2" Nov 29 05:40:33 crc kubenswrapper[4594]: I1129 05:40:33.062692 4594 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-6ddddd9d6f-7q4j2" Nov 29 05:40:33 crc kubenswrapper[4594]: I1129 05:40:33.434547 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-6ddddd9d6f-7q4j2"] Nov 29 05:40:33 crc kubenswrapper[4594]: I1129 05:40:33.574809 4594 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Nov 29 05:40:34 crc kubenswrapper[4594]: I1129 05:40:34.015672 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-6ddddd9d6f-7q4j2" event={"ID":"880b9d6a-5dc6-448b-a63c-b098fcc54023","Type":"ContainerStarted","Data":"f5dc323beee8f267d1b106d720fe99def0a3445100e6399ff362a3b0987458fb"} Nov 29 05:40:39 crc kubenswrapper[4594]: I1129 05:40:39.046975 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-6ddddd9d6f-7q4j2" event={"ID":"880b9d6a-5dc6-448b-a63c-b098fcc54023","Type":"ContainerStarted","Data":"6da28948021ba27cd50e2dc37d30c4bd1593cf4aa0115abf3fff7391867dda86"} Nov 29 05:40:39 crc kubenswrapper[4594]: I1129 05:40:39.047611 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-6ddddd9d6f-7q4j2" Nov 29 05:40:39 crc kubenswrapper[4594]: I1129 05:40:39.076766 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-6ddddd9d6f-7q4j2" podStartSLOduration=2.527083642 podStartE2EDuration="7.076751464s" podCreationTimestamp="2025-11-29 05:40:32 +0000 UTC" firstStartedPulling="2025-11-29 05:40:33.437031566 +0000 UTC m=+757.677540786" lastFinishedPulling="2025-11-29 05:40:37.986699388 +0000 UTC m=+762.227208608" observedRunningTime="2025-11-29 
05:40:39.072018345 +0000 UTC m=+763.312527565" watchObservedRunningTime="2025-11-29 05:40:39.076751464 +0000 UTC m=+763.317260685" Nov 29 05:40:43 crc kubenswrapper[4594]: I1129 05:40:43.066658 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-6ddddd9d6f-7q4j2" Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.521438 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-jc4r5"] Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.522860 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-jc4r5" Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.524219 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-ztzbz" Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.530241 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-jc4r5"] Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.541871 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-xlmc8"] Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.542961 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-xlmc8" Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.545028 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-9tx9m" Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.552281 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-g6nl9"] Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.553237 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-g6nl9" Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.554586 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-ddmhl" Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.558691 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-xlmc8"] Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.561801 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-g6nl9"] Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.571079 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-668d9c48b9-s254f"] Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.572008 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-s254f" Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.577053 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-s6hgt" Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.585015 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-mcfvv"] Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.586012 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-mcfvv" Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.587885 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-tmp4n" Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.590706 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mppjv\" (UniqueName: \"kubernetes.io/projected/e564c6c7-7145-411f-b48f-d8e2594c34a5-kube-api-access-mppjv\") pod \"glance-operator-controller-manager-668d9c48b9-s254f\" (UID: \"e564c6c7-7145-411f-b48f-d8e2594c34a5\") " pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-s254f" Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.590763 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kt98s\" (UniqueName: \"kubernetes.io/projected/f50eb91c-7f6c-4d0f-b32a-c9ead1766b9c-kube-api-access-kt98s\") pod \"cinder-operator-controller-manager-859b6ccc6-xlmc8\" (UID: \"f50eb91c-7f6c-4d0f-b32a-c9ead1766b9c\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-xlmc8" Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.590832 4594 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ckwq\" (UniqueName: \"kubernetes.io/projected/95648e23-46bf-4160-9527-7ad1c84f9883-kube-api-access-5ckwq\") pod \"designate-operator-controller-manager-78b4bc895b-g6nl9\" (UID: \"95648e23-46bf-4160-9527-7ad1c84f9883\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-g6nl9" Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.590863 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xg4wl\" (UniqueName: \"kubernetes.io/projected/8c96e02a-d3bd-4904-8ade-baecb4c3a280-kube-api-access-xg4wl\") pod \"barbican-operator-controller-manager-7d9dfd778-jc4r5\" (UID: \"8c96e02a-d3bd-4904-8ade-baecb4c3a280\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-jc4r5" Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.599799 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-668d9c48b9-s254f"] Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.606214 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-mcfvv"] Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.612727 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-z9q2v"] Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.613772 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-z9q2v" Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.616075 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-gvwtj" Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.618290 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-6c7gr"] Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.619276 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-6c7gr" Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.622449 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-pwhks" Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.622538 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.624466 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-z9q2v"] Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.634004 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-6c7gr"] Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.637298 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-qgzzn"] Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.638124 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-qgzzn" Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.639933 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-sb6hc" Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.642364 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-546d4bdf48-ppdt9"] Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.643580 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-ppdt9" Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.645654 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-mcnxq" Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.653622 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-qgzzn"] Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.692935 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kt98s\" (UniqueName: \"kubernetes.io/projected/f50eb91c-7f6c-4d0f-b32a-c9ead1766b9c-kube-api-access-kt98s\") pod \"cinder-operator-controller-manager-859b6ccc6-xlmc8\" (UID: \"f50eb91c-7f6c-4d0f-b32a-c9ead1766b9c\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-xlmc8" Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.692986 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0ee70d2e-b283-468b-8bd8-016a120b5ae8-cert\") pod \"infra-operator-controller-manager-57548d458d-6c7gr\" (UID: \"0ee70d2e-b283-468b-8bd8-016a120b5ae8\") " 
pod="openstack-operators/infra-operator-controller-manager-57548d458d-6c7gr" Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.693034 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlbgx\" (UniqueName: \"kubernetes.io/projected/09d8a0a7-cc55-4654-8e59-a769c806eecf-kube-api-access-qlbgx\") pod \"horizon-operator-controller-manager-68c6d99b8f-z9q2v\" (UID: \"09d8a0a7-cc55-4654-8e59-a769c806eecf\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-z9q2v" Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.693056 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8nnx\" (UniqueName: \"kubernetes.io/projected/01841a60-a638-4a78-84d4-01ad474bf2fb-kube-api-access-g8nnx\") pod \"heat-operator-controller-manager-5f64f6f8bb-mcfvv\" (UID: \"01841a60-a638-4a78-84d4-01ad474bf2fb\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-mcfvv" Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.693091 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjgxt\" (UniqueName: \"kubernetes.io/projected/0ee70d2e-b283-468b-8bd8-016a120b5ae8-kube-api-access-gjgxt\") pod \"infra-operator-controller-manager-57548d458d-6c7gr\" (UID: \"0ee70d2e-b283-468b-8bd8-016a120b5ae8\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-6c7gr" Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.693141 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ckwq\" (UniqueName: \"kubernetes.io/projected/95648e23-46bf-4160-9527-7ad1c84f9883-kube-api-access-5ckwq\") pod \"designate-operator-controller-manager-78b4bc895b-g6nl9\" (UID: \"95648e23-46bf-4160-9527-7ad1c84f9883\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-g6nl9" Nov 29 05:41:14 crc 
kubenswrapper[4594]: I1129 05:41:14.693174 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69knt\" (UniqueName: \"kubernetes.io/projected/fbdf482e-3aa0-4f5c-a698-949ad6cb6992-kube-api-access-69knt\") pod \"keystone-operator-controller-manager-546d4bdf48-ppdt9\" (UID: \"fbdf482e-3aa0-4f5c-a698-949ad6cb6992\") " pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-ppdt9" Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.693204 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xg4wl\" (UniqueName: \"kubernetes.io/projected/8c96e02a-d3bd-4904-8ade-baecb4c3a280-kube-api-access-xg4wl\") pod \"barbican-operator-controller-manager-7d9dfd778-jc4r5\" (UID: \"8c96e02a-d3bd-4904-8ade-baecb4c3a280\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-jc4r5" Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.693244 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sd9s\" (UniqueName: \"kubernetes.io/projected/fb42365c-18e1-4456-ae16-be77a16f102c-kube-api-access-9sd9s\") pod \"ironic-operator-controller-manager-6c548fd776-qgzzn\" (UID: \"fb42365c-18e1-4456-ae16-be77a16f102c\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-qgzzn" Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.693369 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mppjv\" (UniqueName: \"kubernetes.io/projected/e564c6c7-7145-411f-b48f-d8e2594c34a5-kube-api-access-mppjv\") pod \"glance-operator-controller-manager-668d9c48b9-s254f\" (UID: \"e564c6c7-7145-411f-b48f-d8e2594c34a5\") " pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-s254f" Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.695150 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/keystone-operator-controller-manager-546d4bdf48-ppdt9"] Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.712769 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-6546668bfd-5bng6"] Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.714546 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-5bng6" Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.723749 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-7vnjb" Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.726011 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xg4wl\" (UniqueName: \"kubernetes.io/projected/8c96e02a-d3bd-4904-8ade-baecb4c3a280-kube-api-access-xg4wl\") pod \"barbican-operator-controller-manager-7d9dfd778-jc4r5\" (UID: \"8c96e02a-d3bd-4904-8ade-baecb4c3a280\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-jc4r5" Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.726609 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mppjv\" (UniqueName: \"kubernetes.io/projected/e564c6c7-7145-411f-b48f-d8e2594c34a5-kube-api-access-mppjv\") pod \"glance-operator-controller-manager-668d9c48b9-s254f\" (UID: \"e564c6c7-7145-411f-b48f-d8e2594c34a5\") " pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-s254f" Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.727229 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-48tfk"] Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.727592 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kt98s\" (UniqueName: 
\"kubernetes.io/projected/f50eb91c-7f6c-4d0f-b32a-c9ead1766b9c-kube-api-access-kt98s\") pod \"cinder-operator-controller-manager-859b6ccc6-xlmc8\" (UID: \"f50eb91c-7f6c-4d0f-b32a-c9ead1766b9c\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-xlmc8" Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.734449 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-48tfk" Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.738467 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-rkk7r" Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.745534 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ckwq\" (UniqueName: \"kubernetes.io/projected/95648e23-46bf-4160-9527-7ad1c84f9883-kube-api-access-5ckwq\") pod \"designate-operator-controller-manager-78b4bc895b-g6nl9\" (UID: \"95648e23-46bf-4160-9527-7ad1c84f9883\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-g6nl9" Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.754157 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-6546668bfd-5bng6"] Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.761020 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-d84hx"] Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.762985 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-d84hx" Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.765499 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-m77qn" Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.771811 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-d84hx"] Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.780317 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-48tfk"] Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.782276 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-7fmfc"] Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.783781 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-7fmfc" Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.785778 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-5qq59" Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.791811 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-7fmfc"] Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.796609 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9sd9s\" (UniqueName: \"kubernetes.io/projected/fb42365c-18e1-4456-ae16-be77a16f102c-kube-api-access-9sd9s\") pod \"ironic-operator-controller-manager-6c548fd776-qgzzn\" (UID: \"fb42365c-18e1-4456-ae16-be77a16f102c\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-qgzzn" Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.796839 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lqdj\" (UniqueName: \"kubernetes.io/projected/7d8912e8-8f81-4ea6-94c2-7e56c7726e58-kube-api-access-5lqdj\") pod \"nova-operator-controller-manager-697bc559fc-7fmfc\" (UID: \"7d8912e8-8f81-4ea6-94c2-7e56c7726e58\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-7fmfc" Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.797004 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tw2kz\" (UniqueName: \"kubernetes.io/projected/a1b0453c-c84d-45ea-90be-7f01a831f987-kube-api-access-tw2kz\") pod \"mariadb-operator-controller-manager-56bbcc9d85-48tfk\" (UID: \"a1b0453c-c84d-45ea-90be-7f01a831f987\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-48tfk" Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.797107 4594 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0ee70d2e-b283-468b-8bd8-016a120b5ae8-cert\") pod \"infra-operator-controller-manager-57548d458d-6c7gr\" (UID: \"0ee70d2e-b283-468b-8bd8-016a120b5ae8\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-6c7gr" Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.797185 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlbgx\" (UniqueName: \"kubernetes.io/projected/09d8a0a7-cc55-4654-8e59-a769c806eecf-kube-api-access-qlbgx\") pod \"horizon-operator-controller-manager-68c6d99b8f-z9q2v\" (UID: \"09d8a0a7-cc55-4654-8e59-a769c806eecf\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-z9q2v" Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.797272 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8nnx\" (UniqueName: \"kubernetes.io/projected/01841a60-a638-4a78-84d4-01ad474bf2fb-kube-api-access-g8nnx\") pod \"heat-operator-controller-manager-5f64f6f8bb-mcfvv\" (UID: \"01841a60-a638-4a78-84d4-01ad474bf2fb\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-mcfvv" Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.797368 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mls8w\" (UniqueName: \"kubernetes.io/projected/2de2c23f-39bd-4a9d-9965-7fe280b61707-kube-api-access-mls8w\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-d84hx\" (UID: \"2de2c23f-39bd-4a9d-9965-7fe280b61707\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-d84hx" Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.797459 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjgxt\" (UniqueName: 
\"kubernetes.io/projected/0ee70d2e-b283-468b-8bd8-016a120b5ae8-kube-api-access-gjgxt\") pod \"infra-operator-controller-manager-57548d458d-6c7gr\" (UID: \"0ee70d2e-b283-468b-8bd8-016a120b5ae8\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-6c7gr" Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.797534 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vgzf\" (UniqueName: \"kubernetes.io/projected/61fc53be-18c3-48bb-9a4a-2557df78afc7-kube-api-access-2vgzf\") pod \"manila-operator-controller-manager-6546668bfd-5bng6\" (UID: \"61fc53be-18c3-48bb-9a4a-2557df78afc7\") " pod="openstack-operators/manila-operator-controller-manager-6546668bfd-5bng6" Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.797608 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69knt\" (UniqueName: \"kubernetes.io/projected/fbdf482e-3aa0-4f5c-a698-949ad6cb6992-kube-api-access-69knt\") pod \"keystone-operator-controller-manager-546d4bdf48-ppdt9\" (UID: \"fbdf482e-3aa0-4f5c-a698-949ad6cb6992\") " pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-ppdt9" Nov 29 05:41:14 crc kubenswrapper[4594]: E1129 05:41:14.797976 4594 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Nov 29 05:41:14 crc kubenswrapper[4594]: E1129 05:41:14.798123 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0ee70d2e-b283-468b-8bd8-016a120b5ae8-cert podName:0ee70d2e-b283-468b-8bd8-016a120b5ae8 nodeName:}" failed. No retries permitted until 2025-11-29 05:41:15.298069719 +0000 UTC m=+799.538578939 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0ee70d2e-b283-468b-8bd8-016a120b5ae8-cert") pod "infra-operator-controller-manager-57548d458d-6c7gr" (UID: "0ee70d2e-b283-468b-8bd8-016a120b5ae8") : secret "infra-operator-webhook-server-cert" not found Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.818707 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-245g8"] Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.821069 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-245g8" Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.824706 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-jqfxg" Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.826345 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-245g8"] Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.830751 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69knt\" (UniqueName: \"kubernetes.io/projected/fbdf482e-3aa0-4f5c-a698-949ad6cb6992-kube-api-access-69knt\") pod \"keystone-operator-controller-manager-546d4bdf48-ppdt9\" (UID: \"fbdf482e-3aa0-4f5c-a698-949ad6cb6992\") " pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-ppdt9" Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.830897 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8nnx\" (UniqueName: \"kubernetes.io/projected/01841a60-a638-4a78-84d4-01ad474bf2fb-kube-api-access-g8nnx\") pod \"heat-operator-controller-manager-5f64f6f8bb-mcfvv\" (UID: \"01841a60-a638-4a78-84d4-01ad474bf2fb\") " 
pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-mcfvv" Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.833895 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjgxt\" (UniqueName: \"kubernetes.io/projected/0ee70d2e-b283-468b-8bd8-016a120b5ae8-kube-api-access-gjgxt\") pod \"infra-operator-controller-manager-57548d458d-6c7gr\" (UID: \"0ee70d2e-b283-468b-8bd8-016a120b5ae8\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-6c7gr" Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.836894 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-jc4r5" Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.834447 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sd9s\" (UniqueName: \"kubernetes.io/projected/fb42365c-18e1-4456-ae16-be77a16f102c-kube-api-access-9sd9s\") pod \"ironic-operator-controller-manager-6c548fd776-qgzzn\" (UID: \"fb42365c-18e1-4456-ae16-be77a16f102c\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-qgzzn" Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.841153 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6698bcb446zrvcn"] Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.842464 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6698bcb446zrvcn" Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.843705 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlbgx\" (UniqueName: \"kubernetes.io/projected/09d8a0a7-cc55-4654-8e59-a769c806eecf-kube-api-access-qlbgx\") pod \"horizon-operator-controller-manager-68c6d99b8f-z9q2v\" (UID: \"09d8a0a7-cc55-4654-8e59-a769c806eecf\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-z9q2v" Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.846392 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-pclb6"] Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.847657 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-pclb6" Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.850476 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.850529 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-54zsr" Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.853683 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-xlmc8" Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.853910 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-glknz" Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.856374 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-pclb6"] Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.861861 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6698bcb446zrvcn"] Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.863468 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-g6nl9" Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.873690 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-7n6mf"] Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.874832 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-7n6mf" Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.877474 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-9kds4" Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.886327 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-7n6mf"] Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.886812 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-s254f" Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.895268 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-dd8x6"] Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.896404 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-dd8x6" Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.898052 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-rwksj" Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.901137 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2pql\" (UniqueName: \"kubernetes.io/projected/f9ab855a-f938-4ad6-941a-52f4e5b7d4b2-kube-api-access-h2pql\") pod \"octavia-operator-controller-manager-998648c74-245g8\" (UID: \"f9ab855a-f938-4ad6-941a-52f4e5b7d4b2\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-245g8" Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.901198 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lqdj\" (UniqueName: \"kubernetes.io/projected/7d8912e8-8f81-4ea6-94c2-7e56c7726e58-kube-api-access-5lqdj\") pod \"nova-operator-controller-manager-697bc559fc-7fmfc\" (UID: \"7d8912e8-8f81-4ea6-94c2-7e56c7726e58\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-7fmfc" Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.901373 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-mcfvv" Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.904436 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzxrl\" (UniqueName: \"kubernetes.io/projected/82d8e084-bca0-43f0-9d6f-63df84cd28a6-kube-api-access-zzxrl\") pod \"openstack-baremetal-operator-controller-manager-6698bcb446zrvcn\" (UID: \"82d8e084-bca0-43f0-9d6f-63df84cd28a6\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6698bcb446zrvcn" Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.904521 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/82d8e084-bca0-43f0-9d6f-63df84cd28a6-cert\") pod \"openstack-baremetal-operator-controller-manager-6698bcb446zrvcn\" (UID: \"82d8e084-bca0-43f0-9d6f-63df84cd28a6\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6698bcb446zrvcn" Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.904598 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kn46c\" (UniqueName: \"kubernetes.io/projected/5cde207a-7c2e-46d9-809a-72b8749560a6-kube-api-access-kn46c\") pod \"placement-operator-controller-manager-78f8948974-7n6mf\" (UID: \"5cde207a-7c2e-46d9-809a-72b8749560a6\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-7n6mf" Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.904631 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tw2kz\" (UniqueName: \"kubernetes.io/projected/a1b0453c-c84d-45ea-90be-7f01a831f987-kube-api-access-tw2kz\") pod \"mariadb-operator-controller-manager-56bbcc9d85-48tfk\" (UID: \"a1b0453c-c84d-45ea-90be-7f01a831f987\") " 
pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-48tfk" Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.904690 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mls8w\" (UniqueName: \"kubernetes.io/projected/2de2c23f-39bd-4a9d-9965-7fe280b61707-kube-api-access-mls8w\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-d84hx\" (UID: \"2de2c23f-39bd-4a9d-9965-7fe280b61707\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-d84hx" Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.904719 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzbtw\" (UniqueName: \"kubernetes.io/projected/460651bc-3f62-4bc6-ab53-e791ea16993e-kube-api-access-tzbtw\") pod \"ovn-operator-controller-manager-b6456fdb6-pclb6\" (UID: \"460651bc-3f62-4bc6-ab53-e791ea16993e\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-pclb6" Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.904757 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vgzf\" (UniqueName: \"kubernetes.io/projected/61fc53be-18c3-48bb-9a4a-2557df78afc7-kube-api-access-2vgzf\") pod \"manila-operator-controller-manager-6546668bfd-5bng6\" (UID: \"61fc53be-18c3-48bb-9a4a-2557df78afc7\") " pod="openstack-operators/manila-operator-controller-manager-6546668bfd-5bng6" Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.913862 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-dd8x6"] Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.925006 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vgzf\" (UniqueName: \"kubernetes.io/projected/61fc53be-18c3-48bb-9a4a-2557df78afc7-kube-api-access-2vgzf\") pod \"manila-operator-controller-manager-6546668bfd-5bng6\" (UID: 
\"61fc53be-18c3-48bb-9a4a-2557df78afc7\") " pod="openstack-operators/manila-operator-controller-manager-6546668bfd-5bng6" Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.930869 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mls8w\" (UniqueName: \"kubernetes.io/projected/2de2c23f-39bd-4a9d-9965-7fe280b61707-kube-api-access-mls8w\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-d84hx\" (UID: \"2de2c23f-39bd-4a9d-9965-7fe280b61707\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-d84hx" Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.931017 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-z9q2v" Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.934355 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lqdj\" (UniqueName: \"kubernetes.io/projected/7d8912e8-8f81-4ea6-94c2-7e56c7726e58-kube-api-access-5lqdj\") pod \"nova-operator-controller-manager-697bc559fc-7fmfc\" (UID: \"7d8912e8-8f81-4ea6-94c2-7e56c7726e58\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-7fmfc" Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.936942 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tw2kz\" (UniqueName: \"kubernetes.io/projected/a1b0453c-c84d-45ea-90be-7f01a831f987-kube-api-access-tw2kz\") pod \"mariadb-operator-controller-manager-56bbcc9d85-48tfk\" (UID: \"a1b0453c-c84d-45ea-90be-7f01a831f987\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-48tfk" Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.942690 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-zccgx"] Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.943788 4594 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-zccgx" Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.946201 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-zccgx"] Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.948660 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-5xvnl" Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.954458 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-qgzzn" Nov 29 05:41:14 crc kubenswrapper[4594]: I1129 05:41:14.967733 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-ppdt9" Nov 29 05:41:15 crc kubenswrapper[4594]: I1129 05:41:15.013506 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzbtw\" (UniqueName: \"kubernetes.io/projected/460651bc-3f62-4bc6-ab53-e791ea16993e-kube-api-access-tzbtw\") pod \"ovn-operator-controller-manager-b6456fdb6-pclb6\" (UID: \"460651bc-3f62-4bc6-ab53-e791ea16993e\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-pclb6" Nov 29 05:41:15 crc kubenswrapper[4594]: I1129 05:41:15.013559 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4q5w\" (UniqueName: \"kubernetes.io/projected/d8cfbdeb-cd3d-425e-a96c-d9a565c840c3-kube-api-access-m4q5w\") pod \"swift-operator-controller-manager-5f8c65bbfc-dd8x6\" (UID: \"d8cfbdeb-cd3d-425e-a96c-d9a565c840c3\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-dd8x6" Nov 29 05:41:15 crc kubenswrapper[4594]: I1129 05:41:15.013617 4594 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hg4df\" (UniqueName: \"kubernetes.io/projected/b818ec3c-b3d3-4c0b-b2d0-dc75c1e0a4c0-kube-api-access-hg4df\") pod \"telemetry-operator-controller-manager-76cc84c6bb-zccgx\" (UID: \"b818ec3c-b3d3-4c0b-b2d0-dc75c1e0a4c0\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-zccgx" Nov 29 05:41:15 crc kubenswrapper[4594]: I1129 05:41:15.013652 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2pql\" (UniqueName: \"kubernetes.io/projected/f9ab855a-f938-4ad6-941a-52f4e5b7d4b2-kube-api-access-h2pql\") pod \"octavia-operator-controller-manager-998648c74-245g8\" (UID: \"f9ab855a-f938-4ad6-941a-52f4e5b7d4b2\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-245g8" Nov 29 05:41:15 crc kubenswrapper[4594]: I1129 05:41:15.013680 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzxrl\" (UniqueName: \"kubernetes.io/projected/82d8e084-bca0-43f0-9d6f-63df84cd28a6-kube-api-access-zzxrl\") pod \"openstack-baremetal-operator-controller-manager-6698bcb446zrvcn\" (UID: \"82d8e084-bca0-43f0-9d6f-63df84cd28a6\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6698bcb446zrvcn" Nov 29 05:41:15 crc kubenswrapper[4594]: I1129 05:41:15.013704 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/82d8e084-bca0-43f0-9d6f-63df84cd28a6-cert\") pod \"openstack-baremetal-operator-controller-manager-6698bcb446zrvcn\" (UID: \"82d8e084-bca0-43f0-9d6f-63df84cd28a6\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6698bcb446zrvcn" Nov 29 05:41:15 crc kubenswrapper[4594]: I1129 05:41:15.013773 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kn46c\" (UniqueName: 
\"kubernetes.io/projected/5cde207a-7c2e-46d9-809a-72b8749560a6-kube-api-access-kn46c\") pod \"placement-operator-controller-manager-78f8948974-7n6mf\" (UID: \"5cde207a-7c2e-46d9-809a-72b8749560a6\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-7n6mf" Nov 29 05:41:15 crc kubenswrapper[4594]: E1129 05:41:15.014727 4594 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 29 05:41:15 crc kubenswrapper[4594]: E1129 05:41:15.014784 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/82d8e084-bca0-43f0-9d6f-63df84cd28a6-cert podName:82d8e084-bca0-43f0-9d6f-63df84cd28a6 nodeName:}" failed. No retries permitted until 2025-11-29 05:41:15.514770425 +0000 UTC m=+799.755279645 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/82d8e084-bca0-43f0-9d6f-63df84cd28a6-cert") pod "openstack-baremetal-operator-controller-manager-6698bcb446zrvcn" (UID: "82d8e084-bca0-43f0-9d6f-63df84cd28a6") : secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 29 05:41:15 crc kubenswrapper[4594]: I1129 05:41:15.033824 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kn46c\" (UniqueName: \"kubernetes.io/projected/5cde207a-7c2e-46d9-809a-72b8749560a6-kube-api-access-kn46c\") pod \"placement-operator-controller-manager-78f8948974-7n6mf\" (UID: \"5cde207a-7c2e-46d9-809a-72b8749560a6\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-7n6mf" Nov 29 05:41:15 crc kubenswrapper[4594]: I1129 05:41:15.036268 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-r28h2"] Nov 29 05:41:15 crc kubenswrapper[4594]: I1129 05:41:15.040234 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-r28h2" Nov 29 05:41:15 crc kubenswrapper[4594]: I1129 05:41:15.051786 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-s8b5h" Nov 29 05:41:15 crc kubenswrapper[4594]: I1129 05:41:15.060506 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-r28h2"] Nov 29 05:41:15 crc kubenswrapper[4594]: I1129 05:41:15.081748 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-5bng6" Nov 29 05:41:15 crc kubenswrapper[4594]: I1129 05:41:15.082868 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzbtw\" (UniqueName: \"kubernetes.io/projected/460651bc-3f62-4bc6-ab53-e791ea16993e-kube-api-access-tzbtw\") pod \"ovn-operator-controller-manager-b6456fdb6-pclb6\" (UID: \"460651bc-3f62-4bc6-ab53-e791ea16993e\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-pclb6" Nov 29 05:41:15 crc kubenswrapper[4594]: I1129 05:41:15.082995 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2pql\" (UniqueName: \"kubernetes.io/projected/f9ab855a-f938-4ad6-941a-52f4e5b7d4b2-kube-api-access-h2pql\") pod \"octavia-operator-controller-manager-998648c74-245g8\" (UID: \"f9ab855a-f938-4ad6-941a-52f4e5b7d4b2\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-245g8" Nov 29 05:41:15 crc kubenswrapper[4594]: I1129 05:41:15.084782 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzxrl\" (UniqueName: \"kubernetes.io/projected/82d8e084-bca0-43f0-9d6f-63df84cd28a6-kube-api-access-zzxrl\") pod \"openstack-baremetal-operator-controller-manager-6698bcb446zrvcn\" (UID: \"82d8e084-bca0-43f0-9d6f-63df84cd28a6\") " 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-6698bcb446zrvcn" Nov 29 05:41:15 crc kubenswrapper[4594]: I1129 05:41:15.095021 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-48tfk" Nov 29 05:41:15 crc kubenswrapper[4594]: I1129 05:41:15.119390 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4q5w\" (UniqueName: \"kubernetes.io/projected/d8cfbdeb-cd3d-425e-a96c-d9a565c840c3-kube-api-access-m4q5w\") pod \"swift-operator-controller-manager-5f8c65bbfc-dd8x6\" (UID: \"d8cfbdeb-cd3d-425e-a96c-d9a565c840c3\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-dd8x6" Nov 29 05:41:15 crc kubenswrapper[4594]: I1129 05:41:15.120346 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hg4df\" (UniqueName: \"kubernetes.io/projected/b818ec3c-b3d3-4c0b-b2d0-dc75c1e0a4c0-kube-api-access-hg4df\") pod \"telemetry-operator-controller-manager-76cc84c6bb-zccgx\" (UID: \"b818ec3c-b3d3-4c0b-b2d0-dc75c1e0a4c0\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-zccgx" Nov 29 05:41:15 crc kubenswrapper[4594]: I1129 05:41:15.120463 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kp7x5\" (UniqueName: \"kubernetes.io/projected/9e1ee1b6-8684-4fcb-a26c-fd85a950abcc-kube-api-access-kp7x5\") pod \"test-operator-controller-manager-5854674fcc-r28h2\" (UID: \"9e1ee1b6-8684-4fcb-a26c-fd85a950abcc\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-r28h2" Nov 29 05:41:15 crc kubenswrapper[4594]: I1129 05:41:15.119938 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-d84hx" Nov 29 05:41:15 crc kubenswrapper[4594]: I1129 05:41:15.134673 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-6gxvv"] Nov 29 05:41:15 crc kubenswrapper[4594]: I1129 05:41:15.139741 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-6gxvv" Nov 29 05:41:15 crc kubenswrapper[4594]: I1129 05:41:15.140236 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hg4df\" (UniqueName: \"kubernetes.io/projected/b818ec3c-b3d3-4c0b-b2d0-dc75c1e0a4c0-kube-api-access-hg4df\") pod \"telemetry-operator-controller-manager-76cc84c6bb-zccgx\" (UID: \"b818ec3c-b3d3-4c0b-b2d0-dc75c1e0a4c0\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-zccgx" Nov 29 05:41:15 crc kubenswrapper[4594]: I1129 05:41:15.142086 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4q5w\" (UniqueName: \"kubernetes.io/projected/d8cfbdeb-cd3d-425e-a96c-d9a565c840c3-kube-api-access-m4q5w\") pod \"swift-operator-controller-manager-5f8c65bbfc-dd8x6\" (UID: \"d8cfbdeb-cd3d-425e-a96c-d9a565c840c3\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-dd8x6" Nov 29 05:41:15 crc kubenswrapper[4594]: I1129 05:41:15.142274 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-qjt9w" Nov 29 05:41:15 crc kubenswrapper[4594]: I1129 05:41:15.173229 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-6gxvv"] Nov 29 05:41:15 crc kubenswrapper[4594]: I1129 05:41:15.189431 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-7fmfc" Nov 29 05:41:15 crc kubenswrapper[4594]: I1129 05:41:15.205493 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-245g8" Nov 29 05:41:15 crc kubenswrapper[4594]: I1129 05:41:15.223139 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sxb4\" (UniqueName: \"kubernetes.io/projected/8a5e35f1-00df-4307-b197-f7800c641af7-kube-api-access-5sxb4\") pod \"watcher-operator-controller-manager-769dc69bc-6gxvv\" (UID: \"8a5e35f1-00df-4307-b197-f7800c641af7\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-6gxvv" Nov 29 05:41:15 crc kubenswrapper[4594]: I1129 05:41:15.223229 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kp7x5\" (UniqueName: \"kubernetes.io/projected/9e1ee1b6-8684-4fcb-a26c-fd85a950abcc-kube-api-access-kp7x5\") pod \"test-operator-controller-manager-5854674fcc-r28h2\" (UID: \"9e1ee1b6-8684-4fcb-a26c-fd85a950abcc\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-r28h2" Nov 29 05:41:15 crc kubenswrapper[4594]: I1129 05:41:15.243768 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kp7x5\" (UniqueName: \"kubernetes.io/projected/9e1ee1b6-8684-4fcb-a26c-fd85a950abcc-kube-api-access-kp7x5\") pod \"test-operator-controller-manager-5854674fcc-r28h2\" (UID: \"9e1ee1b6-8684-4fcb-a26c-fd85a950abcc\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-r28h2" Nov 29 05:41:15 crc kubenswrapper[4594]: I1129 05:41:15.244100 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-pclb6" Nov 29 05:41:15 crc kubenswrapper[4594]: I1129 05:41:15.252345 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-656fd97d56-fcmzw"] Nov 29 05:41:15 crc kubenswrapper[4594]: I1129 05:41:15.253627 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-656fd97d56-fcmzw" Nov 29 05:41:15 crc kubenswrapper[4594]: I1129 05:41:15.255242 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-7n6mf" Nov 29 05:41:15 crc kubenswrapper[4594]: I1129 05:41:15.259083 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-656fd97d56-fcmzw"] Nov 29 05:41:15 crc kubenswrapper[4594]: I1129 05:41:15.259773 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Nov 29 05:41:15 crc kubenswrapper[4594]: I1129 05:41:15.259806 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-zjrzd" Nov 29 05:41:15 crc kubenswrapper[4594]: I1129 05:41:15.261019 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Nov 29 05:41:15 crc kubenswrapper[4594]: I1129 05:41:15.262834 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-dd8x6" Nov 29 05:41:15 crc kubenswrapper[4594]: I1129 05:41:15.273023 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-zccgx" Nov 29 05:41:15 crc kubenswrapper[4594]: I1129 05:41:15.333111 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0ee70d2e-b283-468b-8bd8-016a120b5ae8-cert\") pod \"infra-operator-controller-manager-57548d458d-6c7gr\" (UID: \"0ee70d2e-b283-468b-8bd8-016a120b5ae8\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-6c7gr" Nov 29 05:41:15 crc kubenswrapper[4594]: I1129 05:41:15.333189 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5sxb4\" (UniqueName: \"kubernetes.io/projected/8a5e35f1-00df-4307-b197-f7800c641af7-kube-api-access-5sxb4\") pod \"watcher-operator-controller-manager-769dc69bc-6gxvv\" (UID: \"8a5e35f1-00df-4307-b197-f7800c641af7\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-6gxvv" Nov 29 05:41:15 crc kubenswrapper[4594]: E1129 05:41:15.333444 4594 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Nov 29 05:41:15 crc kubenswrapper[4594]: E1129 05:41:15.333526 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0ee70d2e-b283-468b-8bd8-016a120b5ae8-cert podName:0ee70d2e-b283-468b-8bd8-016a120b5ae8 nodeName:}" failed. No retries permitted until 2025-11-29 05:41:16.333510434 +0000 UTC m=+800.574019653 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0ee70d2e-b283-468b-8bd8-016a120b5ae8-cert") pod "infra-operator-controller-manager-57548d458d-6c7gr" (UID: "0ee70d2e-b283-468b-8bd8-016a120b5ae8") : secret "infra-operator-webhook-server-cert" not found Nov 29 05:41:15 crc kubenswrapper[4594]: I1129 05:41:15.334734 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-jc4r5"] Nov 29 05:41:15 crc kubenswrapper[4594]: I1129 05:41:15.344664 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4mk5r"] Nov 29 05:41:15 crc kubenswrapper[4594]: I1129 05:41:15.345608 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4mk5r" Nov 29 05:41:15 crc kubenswrapper[4594]: I1129 05:41:15.354872 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-46rnj" Nov 29 05:41:15 crc kubenswrapper[4594]: I1129 05:41:15.360183 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sxb4\" (UniqueName: \"kubernetes.io/projected/8a5e35f1-00df-4307-b197-f7800c641af7-kube-api-access-5sxb4\") pod \"watcher-operator-controller-manager-769dc69bc-6gxvv\" (UID: \"8a5e35f1-00df-4307-b197-f7800c641af7\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-6gxvv" Nov 29 05:41:15 crc kubenswrapper[4594]: I1129 05:41:15.366472 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4mk5r"] Nov 29 05:41:15 crc kubenswrapper[4594]: W1129 05:41:15.387146 4594 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c96e02a_d3bd_4904_8ade_baecb4c3a280.slice/crio-483c1c21d15312a9bf8e947445db458d401212ced5165cfb9d76ec2865f859a3 WatchSource:0}: Error finding container 483c1c21d15312a9bf8e947445db458d401212ced5165cfb9d76ec2865f859a3: Status 404 returned error can't find the container with id 483c1c21d15312a9bf8e947445db458d401212ced5165cfb9d76ec2865f859a3 Nov 29 05:41:15 crc kubenswrapper[4594]: I1129 05:41:15.417877 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-r28h2" Nov 29 05:41:15 crc kubenswrapper[4594]: I1129 05:41:15.434444 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/505c6e79-1776-4995-a6b5-5888f75c141c-metrics-certs\") pod \"openstack-operator-controller-manager-656fd97d56-fcmzw\" (UID: \"505c6e79-1776-4995-a6b5-5888f75c141c\") " pod="openstack-operators/openstack-operator-controller-manager-656fd97d56-fcmzw" Nov 29 05:41:15 crc kubenswrapper[4594]: I1129 05:41:15.434950 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/505c6e79-1776-4995-a6b5-5888f75c141c-webhook-certs\") pod \"openstack-operator-controller-manager-656fd97d56-fcmzw\" (UID: \"505c6e79-1776-4995-a6b5-5888f75c141c\") " pod="openstack-operators/openstack-operator-controller-manager-656fd97d56-fcmzw" Nov 29 05:41:15 crc kubenswrapper[4594]: I1129 05:41:15.434980 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44b8f\" (UniqueName: \"kubernetes.io/projected/505c6e79-1776-4995-a6b5-5888f75c141c-kube-api-access-44b8f\") pod \"openstack-operator-controller-manager-656fd97d56-fcmzw\" (UID: \"505c6e79-1776-4995-a6b5-5888f75c141c\") " 
pod="openstack-operators/openstack-operator-controller-manager-656fd97d56-fcmzw" Nov 29 05:41:15 crc kubenswrapper[4594]: I1129 05:41:15.465067 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-6gxvv" Nov 29 05:41:15 crc kubenswrapper[4594]: I1129 05:41:15.505098 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-g6nl9"] Nov 29 05:41:15 crc kubenswrapper[4594]: I1129 05:41:15.519742 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-xlmc8"] Nov 29 05:41:15 crc kubenswrapper[4594]: W1129 05:41:15.527285 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf50eb91c_7f6c_4d0f_b32a_c9ead1766b9c.slice/crio-01828c26c733c930d1768edac5320f18702dd2bd6bbbc95ea9bc078b111ca55d WatchSource:0}: Error finding container 01828c26c733c930d1768edac5320f18702dd2bd6bbbc95ea9bc078b111ca55d: Status 404 returned error can't find the container with id 01828c26c733c930d1768edac5320f18702dd2bd6bbbc95ea9bc078b111ca55d Nov 29 05:41:15 crc kubenswrapper[4594]: I1129 05:41:15.536422 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44b8f\" (UniqueName: \"kubernetes.io/projected/505c6e79-1776-4995-a6b5-5888f75c141c-kube-api-access-44b8f\") pod \"openstack-operator-controller-manager-656fd97d56-fcmzw\" (UID: \"505c6e79-1776-4995-a6b5-5888f75c141c\") " pod="openstack-operators/openstack-operator-controller-manager-656fd97d56-fcmzw" Nov 29 05:41:15 crc kubenswrapper[4594]: I1129 05:41:15.536498 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/505c6e79-1776-4995-a6b5-5888f75c141c-metrics-certs\") pod 
\"openstack-operator-controller-manager-656fd97d56-fcmzw\" (UID: \"505c6e79-1776-4995-a6b5-5888f75c141c\") " pod="openstack-operators/openstack-operator-controller-manager-656fd97d56-fcmzw" Nov 29 05:41:15 crc kubenswrapper[4594]: I1129 05:41:15.536621 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzrjx\" (UniqueName: \"kubernetes.io/projected/3fdff03c-acc7-4274-bb06-83abd0f7b432-kube-api-access-fzrjx\") pod \"rabbitmq-cluster-operator-manager-668c99d594-4mk5r\" (UID: \"3fdff03c-acc7-4274-bb06-83abd0f7b432\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4mk5r" Nov 29 05:41:15 crc kubenswrapper[4594]: I1129 05:41:15.536678 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/82d8e084-bca0-43f0-9d6f-63df84cd28a6-cert\") pod \"openstack-baremetal-operator-controller-manager-6698bcb446zrvcn\" (UID: \"82d8e084-bca0-43f0-9d6f-63df84cd28a6\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6698bcb446zrvcn" Nov 29 05:41:15 crc kubenswrapper[4594]: I1129 05:41:15.536765 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/505c6e79-1776-4995-a6b5-5888f75c141c-webhook-certs\") pod \"openstack-operator-controller-manager-656fd97d56-fcmzw\" (UID: \"505c6e79-1776-4995-a6b5-5888f75c141c\") " pod="openstack-operators/openstack-operator-controller-manager-656fd97d56-fcmzw" Nov 29 05:41:15 crc kubenswrapper[4594]: E1129 05:41:15.536908 4594 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Nov 29 05:41:15 crc kubenswrapper[4594]: E1129 05:41:15.536930 4594 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Nov 29 05:41:15 crc kubenswrapper[4594]: E1129 05:41:15.536964 4594 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/505c6e79-1776-4995-a6b5-5888f75c141c-webhook-certs podName:505c6e79-1776-4995-a6b5-5888f75c141c nodeName:}" failed. No retries permitted until 2025-11-29 05:41:16.036945016 +0000 UTC m=+800.277454227 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/505c6e79-1776-4995-a6b5-5888f75c141c-webhook-certs") pod "openstack-operator-controller-manager-656fd97d56-fcmzw" (UID: "505c6e79-1776-4995-a6b5-5888f75c141c") : secret "webhook-server-cert" not found Nov 29 05:41:15 crc kubenswrapper[4594]: E1129 05:41:15.536983 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/505c6e79-1776-4995-a6b5-5888f75c141c-metrics-certs podName:505c6e79-1776-4995-a6b5-5888f75c141c nodeName:}" failed. No retries permitted until 2025-11-29 05:41:16.03697337 +0000 UTC m=+800.277482590 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/505c6e79-1776-4995-a6b5-5888f75c141c-metrics-certs") pod "openstack-operator-controller-manager-656fd97d56-fcmzw" (UID: "505c6e79-1776-4995-a6b5-5888f75c141c") : secret "metrics-server-cert" not found Nov 29 05:41:15 crc kubenswrapper[4594]: E1129 05:41:15.537044 4594 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 29 05:41:15 crc kubenswrapper[4594]: E1129 05:41:15.537078 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/82d8e084-bca0-43f0-9d6f-63df84cd28a6-cert podName:82d8e084-bca0-43f0-9d6f-63df84cd28a6 nodeName:}" failed. No retries permitted until 2025-11-29 05:41:16.537070242 +0000 UTC m=+800.777579462 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/82d8e084-bca0-43f0-9d6f-63df84cd28a6-cert") pod "openstack-baremetal-operator-controller-manager-6698bcb446zrvcn" (UID: "82d8e084-bca0-43f0-9d6f-63df84cd28a6") : secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 29 05:41:15 crc kubenswrapper[4594]: I1129 05:41:15.559836 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44b8f\" (UniqueName: \"kubernetes.io/projected/505c6e79-1776-4995-a6b5-5888f75c141c-kube-api-access-44b8f\") pod \"openstack-operator-controller-manager-656fd97d56-fcmzw\" (UID: \"505c6e79-1776-4995-a6b5-5888f75c141c\") " pod="openstack-operators/openstack-operator-controller-manager-656fd97d56-fcmzw" Nov 29 05:41:15 crc kubenswrapper[4594]: I1129 05:41:15.638421 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzrjx\" (UniqueName: \"kubernetes.io/projected/3fdff03c-acc7-4274-bb06-83abd0f7b432-kube-api-access-fzrjx\") pod \"rabbitmq-cluster-operator-manager-668c99d594-4mk5r\" (UID: \"3fdff03c-acc7-4274-bb06-83abd0f7b432\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4mk5r" Nov 29 05:41:15 crc kubenswrapper[4594]: I1129 05:41:15.652926 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzrjx\" (UniqueName: \"kubernetes.io/projected/3fdff03c-acc7-4274-bb06-83abd0f7b432-kube-api-access-fzrjx\") pod \"rabbitmq-cluster-operator-manager-668c99d594-4mk5r\" (UID: \"3fdff03c-acc7-4274-bb06-83abd0f7b432\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4mk5r" Nov 29 05:41:15 crc kubenswrapper[4594]: I1129 05:41:15.664699 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-668d9c48b9-s254f"] Nov 29 05:41:15 crc kubenswrapper[4594]: I1129 05:41:15.686013 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4mk5r" Nov 29 05:41:15 crc kubenswrapper[4594]: I1129 05:41:15.714943 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-6546668bfd-5bng6"] Nov 29 05:41:15 crc kubenswrapper[4594]: I1129 05:41:15.734263 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-mcfvv"] Nov 29 05:41:15 crc kubenswrapper[4594]: I1129 05:41:15.744119 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-z9q2v"] Nov 29 05:41:15 crc kubenswrapper[4594]: I1129 05:41:15.749387 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-546d4bdf48-ppdt9"] Nov 29 05:41:15 crc kubenswrapper[4594]: I1129 05:41:15.752583 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-qgzzn"] Nov 29 05:41:15 crc kubenswrapper[4594]: W1129 05:41:15.776407 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfbdf482e_3aa0_4f5c_a698_949ad6cb6992.slice/crio-cb6a4739824128db9a5d3e34526727509f745b55b47af2a4e4b037ad718fc17f WatchSource:0}: Error finding container cb6a4739824128db9a5d3e34526727509f745b55b47af2a4e4b037ad718fc17f: Status 404 returned error can't find the container with id cb6a4739824128db9a5d3e34526727509f745b55b47af2a4e4b037ad718fc17f Nov 29 05:41:15 crc kubenswrapper[4594]: I1129 05:41:15.800745 4594 patch_prober.go:28] interesting pod/machine-config-daemon-ggz4n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 05:41:15 
crc kubenswrapper[4594]: I1129 05:41:15.800809 4594 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 05:41:15 crc kubenswrapper[4594]: I1129 05:41:15.898515 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-48tfk"] Nov 29 05:41:15 crc kubenswrapper[4594]: W1129 05:41:15.905891 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1b0453c_c84d_45ea_90be_7f01a831f987.slice/crio-88be5eac69b24eff4aa140bfa67bee35c82109c0bef62230c5a1ae477bc5350b WatchSource:0}: Error finding container 88be5eac69b24eff4aa140bfa67bee35c82109c0bef62230c5a1ae477bc5350b: Status 404 returned error can't find the container with id 88be5eac69b24eff4aa140bfa67bee35c82109c0bef62230c5a1ae477bc5350b Nov 29 05:41:15 crc kubenswrapper[4594]: I1129 05:41:15.910704 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-d84hx"] Nov 29 05:41:15 crc kubenswrapper[4594]: W1129 05:41:15.911759 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2de2c23f_39bd_4a9d_9965_7fe280b61707.slice/crio-1f708c64a609e835e18eaa571a2ed20c3ce04e7962d9a44c75b1694c151df29b WatchSource:0}: Error finding container 1f708c64a609e835e18eaa571a2ed20c3ce04e7962d9a44c75b1694c151df29b: Status 404 returned error can't find the container with id 1f708c64a609e835e18eaa571a2ed20c3ce04e7962d9a44c75b1694c151df29b Nov 29 05:41:15 crc kubenswrapper[4594]: W1129 05:41:15.913530 4594 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9ab855a_f938_4ad6_941a_52f4e5b7d4b2.slice/crio-7be10d1e108233c6008990fa07fcf4078eacda9a1c3ca6cd0dd4ca36fb31c0df WatchSource:0}: Error finding container 7be10d1e108233c6008990fa07fcf4078eacda9a1c3ca6cd0dd4ca36fb31c0df: Status 404 returned error can't find the container with id 7be10d1e108233c6008990fa07fcf4078eacda9a1c3ca6cd0dd4ca36fb31c0df Nov 29 05:41:15 crc kubenswrapper[4594]: I1129 05:41:15.919538 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-245g8"] Nov 29 05:41:16 crc kubenswrapper[4594]: I1129 05:41:16.002993 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-dd8x6"] Nov 29 05:41:16 crc kubenswrapper[4594]: I1129 05:41:16.013032 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-pclb6"] Nov 29 05:41:16 crc kubenswrapper[4594]: E1129 05:41:16.016848 4594 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tzbtw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-b6456fdb6-pclb6_openstack-operators(460651bc-3f62-4bc6-ab53-e791ea16993e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 29 05:41:16 crc kubenswrapper[4594]: E1129 05:41:16.019559 4594 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true 
--v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tzbtw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-b6456fdb6-pclb6_openstack-operators(460651bc-3f62-4bc6-ab53-e791ea16993e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 29 05:41:16 crc kubenswrapper[4594]: E1129 05:41:16.020672 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-pclb6" podUID="460651bc-3f62-4bc6-ab53-e791ea16993e" Nov 29 05:41:16 crc kubenswrapper[4594]: W1129 05:41:16.023610 4594 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb818ec3c_b3d3_4c0b_b2d0_dc75c1e0a4c0.slice/crio-8565b34aebdab173e49e87788f00369b53f1195aa11739ad2b067f583ba82c8b WatchSource:0}: Error finding container 8565b34aebdab173e49e87788f00369b53f1195aa11739ad2b067f583ba82c8b: Status 404 returned error can't find the container with id 8565b34aebdab173e49e87788f00369b53f1195aa11739ad2b067f583ba82c8b Nov 29 05:41:16 crc kubenswrapper[4594]: I1129 05:41:16.027802 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-zccgx"] Nov 29 05:41:16 crc kubenswrapper[4594]: W1129 05:41:16.029995 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a5e35f1_00df_4307_b197_f7800c641af7.slice/crio-6716d4a6e48f9c7bd02fc8a7374890cb0ad72ae6edec364928692fe4c119fdd2 WatchSource:0}: Error finding container 6716d4a6e48f9c7bd02fc8a7374890cb0ad72ae6edec364928692fe4c119fdd2: Status 404 returned error can't find the container with id 6716d4a6e48f9c7bd02fc8a7374890cb0ad72ae6edec364928692fe4c119fdd2 Nov 29 05:41:16 crc kubenswrapper[4594]: E1129 05:41:16.031235 4594 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} 
BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hg4df,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-76cc84c6bb-zccgx_openstack-operators(b818ec3c-b3d3-4c0b-b2d0-dc75c1e0a4c0): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 29 05:41:16 crc kubenswrapper[4594]: E1129 05:41:16.033644 4594 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hg4df,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-76cc84c6bb-zccgx_openstack-operators(b818ec3c-b3d3-4c0b-b2d0-dc75c1e0a4c0): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 29 05:41:16 crc kubenswrapper[4594]: E1129 05:41:16.034737 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-zccgx" 
podUID="b818ec3c-b3d3-4c0b-b2d0-dc75c1e0a4c0" Nov 29 05:41:16 crc kubenswrapper[4594]: W1129 05:41:16.035764 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d8912e8_8f81_4ea6_94c2_7e56c7726e58.slice/crio-bedcf98b231a4760b1f80b8657f05e718a5d3a2847f70387774e825956bda068 WatchSource:0}: Error finding container bedcf98b231a4760b1f80b8657f05e718a5d3a2847f70387774e825956bda068: Status 404 returned error can't find the container with id bedcf98b231a4760b1f80b8657f05e718a5d3a2847f70387774e825956bda068 Nov 29 05:41:16 crc kubenswrapper[4594]: E1129 05:41:16.036524 4594 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5sxb4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-769dc69bc-6gxvv_openstack-operators(8a5e35f1-00df-4307-b197-f7800c641af7): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 29 05:41:16 crc kubenswrapper[4594]: E1129 05:41:16.039386 4594 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5sxb4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-769dc69bc-6gxvv_openstack-operators(8a5e35f1-00df-4307-b197-f7800c641af7): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 29 05:41:16 crc kubenswrapper[4594]: E1129 05:41:16.041724 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-6gxvv" podUID="8a5e35f1-00df-4307-b197-f7800c641af7" Nov 29 05:41:16 crc kubenswrapper[4594]: E1129 05:41:16.043103 4594 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5lqdj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-697bc559fc-7fmfc_openstack-operators(7d8912e8-8f81-4ea6-94c2-7e56c7726e58): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 29 05:41:16 crc kubenswrapper[4594]: E1129 05:41:16.046733 4594 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5lqdj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-697bc559fc-7fmfc_openstack-operators(7d8912e8-8f81-4ea6-94c2-7e56c7726e58): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 29 05:41:16 crc kubenswrapper[4594]: I1129 05:41:16.047961 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/505c6e79-1776-4995-a6b5-5888f75c141c-webhook-certs\") pod \"openstack-operator-controller-manager-656fd97d56-fcmzw\" (UID: \"505c6e79-1776-4995-a6b5-5888f75c141c\") " pod="openstack-operators/openstack-operator-controller-manager-656fd97d56-fcmzw" Nov 29 05:41:16 crc kubenswrapper[4594]: I1129 05:41:16.048062 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/505c6e79-1776-4995-a6b5-5888f75c141c-metrics-certs\") pod \"openstack-operator-controller-manager-656fd97d56-fcmzw\" (UID: \"505c6e79-1776-4995-a6b5-5888f75c141c\") " pod="openstack-operators/openstack-operator-controller-manager-656fd97d56-fcmzw" Nov 29 05:41:16 crc kubenswrapper[4594]: E1129 
05:41:16.048200 4594 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Nov 29 05:41:16 crc kubenswrapper[4594]: E1129 05:41:16.048317 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/505c6e79-1776-4995-a6b5-5888f75c141c-webhook-certs podName:505c6e79-1776-4995-a6b5-5888f75c141c nodeName:}" failed. No retries permitted until 2025-11-29 05:41:17.048239441 +0000 UTC m=+801.288748660 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/505c6e79-1776-4995-a6b5-5888f75c141c-webhook-certs") pod "openstack-operator-controller-manager-656fd97d56-fcmzw" (UID: "505c6e79-1776-4995-a6b5-5888f75c141c") : secret "webhook-server-cert" not found Nov 29 05:41:16 crc kubenswrapper[4594]: E1129 05:41:16.048409 4594 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Nov 29 05:41:16 crc kubenswrapper[4594]: E1129 05:41:16.048456 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/505c6e79-1776-4995-a6b5-5888f75c141c-metrics-certs podName:505c6e79-1776-4995-a6b5-5888f75c141c nodeName:}" failed. No retries permitted until 2025-11-29 05:41:17.048444276 +0000 UTC m=+801.288953496 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/505c6e79-1776-4995-a6b5-5888f75c141c-metrics-certs") pod "openstack-operator-controller-manager-656fd97d56-fcmzw" (UID: "505c6e79-1776-4995-a6b5-5888f75c141c") : secret "metrics-server-cert" not found Nov 29 05:41:16 crc kubenswrapper[4594]: E1129 05:41:16.049552 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-7fmfc" podUID="7d8912e8-8f81-4ea6-94c2-7e56c7726e58" Nov 29 05:41:16 crc kubenswrapper[4594]: I1129 05:41:16.055861 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-6gxvv"] Nov 29 05:41:16 crc kubenswrapper[4594]: W1129 05:41:16.061117 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5cde207a_7c2e_46d9_809a_72b8749560a6.slice/crio-73536a12281f1b03c70117be29ff495b76e20a339825534ed853e180a2ba5015 WatchSource:0}: Error finding container 73536a12281f1b03c70117be29ff495b76e20a339825534ed853e180a2ba5015: Status 404 returned error can't find the container with id 73536a12281f1b03c70117be29ff495b76e20a339825534ed853e180a2ba5015 Nov 29 05:41:16 crc kubenswrapper[4594]: W1129 05:41:16.063390 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e1ee1b6_8684_4fcb_a26c_fd85a950abcc.slice/crio-ed658fa9e68790faeb918352755d8f0bd501527c5e91a9e8109cd36b91c6a75a WatchSource:0}: Error finding container ed658fa9e68790faeb918352755d8f0bd501527c5e91a9e8109cd36b91c6a75a: Status 404 returned error can't find the container with id ed658fa9e68790faeb918352755d8f0bd501527c5e91a9e8109cd36b91c6a75a 
Nov 29 05:41:16 crc kubenswrapper[4594]: E1129 05:41:16.065777 4594 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kp7x5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-r28h2_openstack-operators(9e1ee1b6-8684-4fcb-a26c-fd85a950abcc): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 29 05:41:16 crc kubenswrapper[4594]: I1129 05:41:16.066873 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-7fmfc"] Nov 29 05:41:16 crc kubenswrapper[4594]: E1129 05:41:16.068998 4594 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} 
BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kn46c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-78f8948974-7n6mf_openstack-operators(5cde207a-7c2e-46d9-809a-72b8749560a6): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 29 05:41:16 crc kubenswrapper[4594]: E1129 05:41:16.069277 4594 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kp7x5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-r28h2_openstack-operators(9e1ee1b6-8684-4fcb-a26c-fd85a950abcc): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 29 05:41:16 crc kubenswrapper[4594]: E1129 05:41:16.070388 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-r28h2" 
podUID="9e1ee1b6-8684-4fcb-a26c-fd85a950abcc" Nov 29 05:41:16 crc kubenswrapper[4594]: E1129 05:41:16.071636 4594 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kn46c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-78f8948974-7n6mf_openstack-operators(5cde207a-7c2e-46d9-809a-72b8749560a6): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 29 05:41:16 crc kubenswrapper[4594]: E1129 05:41:16.072858 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to 
\"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/placement-operator-controller-manager-78f8948974-7n6mf" podUID="5cde207a-7c2e-46d9-809a-72b8749560a6" Nov 29 05:41:16 crc kubenswrapper[4594]: I1129 05:41:16.074982 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-r28h2"] Nov 29 05:41:16 crc kubenswrapper[4594]: I1129 05:41:16.078926 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-7n6mf"] Nov 29 05:41:16 crc kubenswrapper[4594]: W1129 05:41:16.176651 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3fdff03c_acc7_4274_bb06_83abd0f7b432.slice/crio-0548539105fa825fc09e078ddcd4c30693184b51641a396f8ef667a7f315590d WatchSource:0}: Error finding container 0548539105fa825fc09e078ddcd4c30693184b51641a396f8ef667a7f315590d: Status 404 returned error can't find the container with id 0548539105fa825fc09e078ddcd4c30693184b51641a396f8ef667a7f315590d Nov 29 05:41:16 crc kubenswrapper[4594]: E1129 05:41:16.179276 4594 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fzrjx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-4mk5r_openstack-operators(3fdff03c-acc7-4274-bb06-83abd0f7b432): 
ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 29 05:41:16 crc kubenswrapper[4594]: I1129 05:41:16.179428 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4mk5r"] Nov 29 05:41:16 crc kubenswrapper[4594]: E1129 05:41:16.180785 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4mk5r" podUID="3fdff03c-acc7-4274-bb06-83abd0f7b432" Nov 29 05:41:16 crc kubenswrapper[4594]: I1129 05:41:16.271561 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-6gxvv" event={"ID":"8a5e35f1-00df-4307-b197-f7800c641af7","Type":"ContainerStarted","Data":"6716d4a6e48f9c7bd02fc8a7374890cb0ad72ae6edec364928692fe4c119fdd2"} Nov 29 05:41:16 crc kubenswrapper[4594]: E1129 05:41:16.273940 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-6gxvv" podUID="8a5e35f1-00df-4307-b197-f7800c641af7" Nov 29 05:41:16 crc kubenswrapper[4594]: I1129 05:41:16.274503 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-qgzzn" event={"ID":"fb42365c-18e1-4456-ae16-be77a16f102c","Type":"ContainerStarted","Data":"51a05168709b21009f7db7cc8e0b8d124976f15411ecf8f8d5cebea19186c74e"} Nov 29 05:41:16 crc kubenswrapper[4594]: I1129 05:41:16.275768 4594 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-z9q2v" event={"ID":"09d8a0a7-cc55-4654-8e59-a769c806eecf","Type":"ContainerStarted","Data":"46cd10b6d2ebb1aef46c6429eecc8da490a8eb97724594dacf2ef3e1156751b4"} Nov 29 05:41:16 crc kubenswrapper[4594]: I1129 05:41:16.276847 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-xlmc8" event={"ID":"f50eb91c-7f6c-4d0f-b32a-c9ead1766b9c","Type":"ContainerStarted","Data":"01828c26c733c930d1768edac5320f18702dd2bd6bbbc95ea9bc078b111ca55d"} Nov 29 05:41:16 crc kubenswrapper[4594]: I1129 05:41:16.277921 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-48tfk" event={"ID":"a1b0453c-c84d-45ea-90be-7f01a831f987","Type":"ContainerStarted","Data":"88be5eac69b24eff4aa140bfa67bee35c82109c0bef62230c5a1ae477bc5350b"} Nov 29 05:41:16 crc kubenswrapper[4594]: I1129 05:41:16.279162 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-zccgx" event={"ID":"b818ec3c-b3d3-4c0b-b2d0-dc75c1e0a4c0","Type":"ContainerStarted","Data":"8565b34aebdab173e49e87788f00369b53f1195aa11739ad2b067f583ba82c8b"} Nov 29 05:41:16 crc kubenswrapper[4594]: I1129 05:41:16.280775 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-r28h2" event={"ID":"9e1ee1b6-8684-4fcb-a26c-fd85a950abcc","Type":"ContainerStarted","Data":"ed658fa9e68790faeb918352755d8f0bd501527c5e91a9e8109cd36b91c6a75a"} Nov 29 05:41:16 crc kubenswrapper[4594]: E1129 05:41:16.280835 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-zccgx" podUID="b818ec3c-b3d3-4c0b-b2d0-dc75c1e0a4c0" Nov 29 05:41:16 crc kubenswrapper[4594]: I1129 05:41:16.282107 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-g6nl9" event={"ID":"95648e23-46bf-4160-9527-7ad1c84f9883","Type":"ContainerStarted","Data":"378086c35b2cddd6b0d3085b3a7222cf164c43ccd77db60dcd51eb6f20e0c075"} Nov 29 05:41:16 crc kubenswrapper[4594]: E1129 05:41:16.282678 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-r28h2" podUID="9e1ee1b6-8684-4fcb-a26c-fd85a950abcc" Nov 29 05:41:16 crc kubenswrapper[4594]: I1129 05:41:16.283541 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-7n6mf" event={"ID":"5cde207a-7c2e-46d9-809a-72b8749560a6","Type":"ContainerStarted","Data":"73536a12281f1b03c70117be29ff495b76e20a339825534ed853e180a2ba5015"} Nov 29 05:41:16 crc kubenswrapper[4594]: I1129 05:41:16.286063 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-245g8" 
event={"ID":"f9ab855a-f938-4ad6-941a-52f4e5b7d4b2","Type":"ContainerStarted","Data":"7be10d1e108233c6008990fa07fcf4078eacda9a1c3ca6cd0dd4ca36fb31c0df"} Nov 29 05:41:16 crc kubenswrapper[4594]: E1129 05:41:16.287467 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/placement-operator-controller-manager-78f8948974-7n6mf" podUID="5cde207a-7c2e-46d9-809a-72b8749560a6" Nov 29 05:41:16 crc kubenswrapper[4594]: I1129 05:41:16.289767 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-d84hx" event={"ID":"2de2c23f-39bd-4a9d-9965-7fe280b61707","Type":"ContainerStarted","Data":"1f708c64a609e835e18eaa571a2ed20c3ce04e7962d9a44c75b1694c151df29b"} Nov 29 05:41:16 crc kubenswrapper[4594]: I1129 05:41:16.290658 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-pclb6" event={"ID":"460651bc-3f62-4bc6-ab53-e791ea16993e","Type":"ContainerStarted","Data":"875f0161e4e1d97dec560a91d7e40fa5486b11ba43ff1b1f7d3e44b61c87c32e"} Nov 29 05:41:16 crc kubenswrapper[4594]: E1129 05:41:16.292451 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-pclb6" podUID="460651bc-3f62-4bc6-ab53-e791ea16993e" Nov 29 05:41:16 crc kubenswrapper[4594]: I1129 05:41:16.292672 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-jc4r5" event={"ID":"8c96e02a-d3bd-4904-8ade-baecb4c3a280","Type":"ContainerStarted","Data":"483c1c21d15312a9bf8e947445db458d401212ced5165cfb9d76ec2865f859a3"} Nov 29 05:41:16 crc kubenswrapper[4594]: I1129 05:41:16.294270 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4mk5r" event={"ID":"3fdff03c-acc7-4274-bb06-83abd0f7b432","Type":"ContainerStarted","Data":"0548539105fa825fc09e078ddcd4c30693184b51641a396f8ef667a7f315590d"} Nov 29 05:41:16 crc kubenswrapper[4594]: I1129 05:41:16.295512 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-mcfvv" event={"ID":"01841a60-a638-4a78-84d4-01ad474bf2fb","Type":"ContainerStarted","Data":"bbe7adbc008052f26faca274b224fa79243b66706721cb2fb9f2ecb4af1b9e83"} Nov 29 05:41:16 crc kubenswrapper[4594]: E1129 05:41:16.295753 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4mk5r" podUID="3fdff03c-acc7-4274-bb06-83abd0f7b432" Nov 29 05:41:16 crc kubenswrapper[4594]: I1129 05:41:16.297309 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-5bng6" 
event={"ID":"61fc53be-18c3-48bb-9a4a-2557df78afc7","Type":"ContainerStarted","Data":"bc00ba26afc3811f4cd8066d35b2faa0cc2eecf262ea5d615d47fe2bcba6d011"} Nov 29 05:41:16 crc kubenswrapper[4594]: I1129 05:41:16.300794 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-ppdt9" event={"ID":"fbdf482e-3aa0-4f5c-a698-949ad6cb6992","Type":"ContainerStarted","Data":"cb6a4739824128db9a5d3e34526727509f745b55b47af2a4e4b037ad718fc17f"} Nov 29 05:41:16 crc kubenswrapper[4594]: I1129 05:41:16.303684 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-7fmfc" event={"ID":"7d8912e8-8f81-4ea6-94c2-7e56c7726e58","Type":"ContainerStarted","Data":"bedcf98b231a4760b1f80b8657f05e718a5d3a2847f70387774e825956bda068"} Nov 29 05:41:16 crc kubenswrapper[4594]: I1129 05:41:16.304936 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-s254f" event={"ID":"e564c6c7-7145-411f-b48f-d8e2594c34a5","Type":"ContainerStarted","Data":"0b6f61f3a45cb8e201f7ff5df2556faf79a25632805cc9c836fb6047e42c8104"} Nov 29 05:41:16 crc kubenswrapper[4594]: E1129 05:41:16.306067 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-7fmfc" podUID="7d8912e8-8f81-4ea6-94c2-7e56c7726e58" Nov 29 05:41:16 crc kubenswrapper[4594]: I1129 05:41:16.306188 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-dd8x6" event={"ID":"d8cfbdeb-cd3d-425e-a96c-d9a565c840c3","Type":"ContainerStarted","Data":"79c0145a9bd5bbfb2c96389e5e559b838b3c211b4daec9f9fcb48f51a2d9d43b"} Nov 29 05:41:16 crc kubenswrapper[4594]: I1129 05:41:16.353315 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0ee70d2e-b283-468b-8bd8-016a120b5ae8-cert\") pod \"infra-operator-controller-manager-57548d458d-6c7gr\" (UID: \"0ee70d2e-b283-468b-8bd8-016a120b5ae8\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-6c7gr" Nov 29 05:41:16 crc kubenswrapper[4594]: E1129 05:41:16.353548 4594 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Nov 29 05:41:16 crc kubenswrapper[4594]: E1129 05:41:16.353685 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0ee70d2e-b283-468b-8bd8-016a120b5ae8-cert podName:0ee70d2e-b283-468b-8bd8-016a120b5ae8 nodeName:}" failed. No retries permitted until 2025-11-29 05:41:18.353666596 +0000 UTC m=+802.594175817 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0ee70d2e-b283-468b-8bd8-016a120b5ae8-cert") pod "infra-operator-controller-manager-57548d458d-6c7gr" (UID: "0ee70d2e-b283-468b-8bd8-016a120b5ae8") : secret "infra-operator-webhook-server-cert" not found
Nov 29 05:41:16 crc kubenswrapper[4594]: I1129 05:41:16.563341 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/82d8e084-bca0-43f0-9d6f-63df84cd28a6-cert\") pod \"openstack-baremetal-operator-controller-manager-6698bcb446zrvcn\" (UID: \"82d8e084-bca0-43f0-9d6f-63df84cd28a6\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6698bcb446zrvcn"
Nov 29 05:41:16 crc kubenswrapper[4594]: E1129 05:41:16.563562 4594 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Nov 29 05:41:16 crc kubenswrapper[4594]: E1129 05:41:16.563627 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/82d8e084-bca0-43f0-9d6f-63df84cd28a6-cert podName:82d8e084-bca0-43f0-9d6f-63df84cd28a6 nodeName:}" failed. No retries permitted until 2025-11-29 05:41:18.563609727 +0000 UTC m=+802.804118958 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/82d8e084-bca0-43f0-9d6f-63df84cd28a6-cert") pod "openstack-baremetal-operator-controller-manager-6698bcb446zrvcn" (UID: "82d8e084-bca0-43f0-9d6f-63df84cd28a6") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Nov 29 05:41:17 crc kubenswrapper[4594]: I1129 05:41:17.074650 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/505c6e79-1776-4995-a6b5-5888f75c141c-webhook-certs\") pod \"openstack-operator-controller-manager-656fd97d56-fcmzw\" (UID: \"505c6e79-1776-4995-a6b5-5888f75c141c\") " pod="openstack-operators/openstack-operator-controller-manager-656fd97d56-fcmzw"
Nov 29 05:41:17 crc kubenswrapper[4594]: I1129 05:41:17.074969 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/505c6e79-1776-4995-a6b5-5888f75c141c-metrics-certs\") pod \"openstack-operator-controller-manager-656fd97d56-fcmzw\" (UID: \"505c6e79-1776-4995-a6b5-5888f75c141c\") " pod="openstack-operators/openstack-operator-controller-manager-656fd97d56-fcmzw"
Nov 29 05:41:17 crc kubenswrapper[4594]: E1129 05:41:17.075105 4594 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Nov 29 05:41:17 crc kubenswrapper[4594]: E1129 05:41:17.075153 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/505c6e79-1776-4995-a6b5-5888f75c141c-metrics-certs podName:505c6e79-1776-4995-a6b5-5888f75c141c nodeName:}" failed. No retries permitted until 2025-11-29 05:41:19.075137751 +0000 UTC m=+803.315646971 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/505c6e79-1776-4995-a6b5-5888f75c141c-metrics-certs") pod "openstack-operator-controller-manager-656fd97d56-fcmzw" (UID: "505c6e79-1776-4995-a6b5-5888f75c141c") : secret "metrics-server-cert" not found
Nov 29 05:41:17 crc kubenswrapper[4594]: E1129 05:41:17.075381 4594 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Nov 29 05:41:17 crc kubenswrapper[4594]: E1129 05:41:17.075585 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/505c6e79-1776-4995-a6b5-5888f75c141c-webhook-certs podName:505c6e79-1776-4995-a6b5-5888f75c141c nodeName:}" failed. No retries permitted until 2025-11-29 05:41:19.075480094 +0000 UTC m=+803.315989314 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/505c6e79-1776-4995-a6b5-5888f75c141c-webhook-certs") pod "openstack-operator-controller-manager-656fd97d56-fcmzw" (UID: "505c6e79-1776-4995-a6b5-5888f75c141c") : secret "webhook-server-cert" not found
Nov 29 05:41:17 crc kubenswrapper[4594]: E1129 05:41:17.322350 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-r28h2" podUID="9e1ee1b6-8684-4fcb-a26c-fd85a950abcc"
Nov 29 05:41:17 crc kubenswrapper[4594]: E1129 05:41:17.322369 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4mk5r" podUID="3fdff03c-acc7-4274-bb06-83abd0f7b432"
Nov 29 05:41:17 crc kubenswrapper[4594]: E1129 05:41:17.322384 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-7fmfc" podUID="7d8912e8-8f81-4ea6-94c2-7e56c7726e58"
Nov 29 05:41:17 crc kubenswrapper[4594]: E1129 05:41:17.322905 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/placement-operator-controller-manager-78f8948974-7n6mf" podUID="5cde207a-7c2e-46d9-809a-72b8749560a6"
Nov 29 05:41:17 crc kubenswrapper[4594]: E1129 05:41:17.323720 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-pclb6" podUID="460651bc-3f62-4bc6-ab53-e791ea16993e"
Nov 29 05:41:17 crc kubenswrapper[4594]: E1129 05:41:17.323726 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-zccgx" podUID="b818ec3c-b3d3-4c0b-b2d0-dc75c1e0a4c0"
Nov 29 05:41:17 crc kubenswrapper[4594]: E1129 05:41:17.323826 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-6gxvv" podUID="8a5e35f1-00df-4307-b197-f7800c641af7"
Nov 29 05:41:18 crc kubenswrapper[4594]: I1129 05:41:18.394025 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0ee70d2e-b283-468b-8bd8-016a120b5ae8-cert\") pod \"infra-operator-controller-manager-57548d458d-6c7gr\" (UID: \"0ee70d2e-b283-468b-8bd8-016a120b5ae8\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-6c7gr"
Nov 29 05:41:18 crc kubenswrapper[4594]: E1129 05:41:18.394180 4594 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Nov 29 05:41:18 crc kubenswrapper[4594]: E1129 05:41:18.394225 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0ee70d2e-b283-468b-8bd8-016a120b5ae8-cert podName:0ee70d2e-b283-468b-8bd8-016a120b5ae8 nodeName:}" failed. No retries permitted until 2025-11-29 05:41:22.394210689 +0000 UTC m=+806.634719909 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0ee70d2e-b283-468b-8bd8-016a120b5ae8-cert") pod "infra-operator-controller-manager-57548d458d-6c7gr" (UID: "0ee70d2e-b283-468b-8bd8-016a120b5ae8") : secret "infra-operator-webhook-server-cert" not found
Nov 29 05:41:18 crc kubenswrapper[4594]: I1129 05:41:18.596848 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/82d8e084-bca0-43f0-9d6f-63df84cd28a6-cert\") pod \"openstack-baremetal-operator-controller-manager-6698bcb446zrvcn\" (UID: \"82d8e084-bca0-43f0-9d6f-63df84cd28a6\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6698bcb446zrvcn"
Nov 29 05:41:18 crc kubenswrapper[4594]: E1129 05:41:18.597016 4594 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Nov 29 05:41:18 crc kubenswrapper[4594]: E1129 05:41:18.597089 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/82d8e084-bca0-43f0-9d6f-63df84cd28a6-cert podName:82d8e084-bca0-43f0-9d6f-63df84cd28a6 nodeName:}" failed. No retries permitted until 2025-11-29 05:41:22.597070592 +0000 UTC m=+806.837579812 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/82d8e084-bca0-43f0-9d6f-63df84cd28a6-cert") pod "openstack-baremetal-operator-controller-manager-6698bcb446zrvcn" (UID: "82d8e084-bca0-43f0-9d6f-63df84cd28a6") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Nov 29 05:41:19 crc kubenswrapper[4594]: I1129 05:41:19.106211 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/505c6e79-1776-4995-a6b5-5888f75c141c-webhook-certs\") pod \"openstack-operator-controller-manager-656fd97d56-fcmzw\" (UID: \"505c6e79-1776-4995-a6b5-5888f75c141c\") " pod="openstack-operators/openstack-operator-controller-manager-656fd97d56-fcmzw"
Nov 29 05:41:19 crc kubenswrapper[4594]: I1129 05:41:19.106369 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/505c6e79-1776-4995-a6b5-5888f75c141c-metrics-certs\") pod \"openstack-operator-controller-manager-656fd97d56-fcmzw\" (UID: \"505c6e79-1776-4995-a6b5-5888f75c141c\") " pod="openstack-operators/openstack-operator-controller-manager-656fd97d56-fcmzw"
Nov 29 05:41:19 crc kubenswrapper[4594]: E1129 05:41:19.106459 4594 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Nov 29 05:41:19 crc kubenswrapper[4594]: E1129 05:41:19.106530 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/505c6e79-1776-4995-a6b5-5888f75c141c-metrics-certs podName:505c6e79-1776-4995-a6b5-5888f75c141c nodeName:}" failed. No retries permitted until 2025-11-29 05:41:23.10650615 +0000 UTC m=+807.347015370 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/505c6e79-1776-4995-a6b5-5888f75c141c-metrics-certs") pod "openstack-operator-controller-manager-656fd97d56-fcmzw" (UID: "505c6e79-1776-4995-a6b5-5888f75c141c") : secret "metrics-server-cert" not found
Nov 29 05:41:19 crc kubenswrapper[4594]: E1129 05:41:19.107047 4594 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Nov 29 05:41:19 crc kubenswrapper[4594]: E1129 05:41:19.107101 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/505c6e79-1776-4995-a6b5-5888f75c141c-webhook-certs podName:505c6e79-1776-4995-a6b5-5888f75c141c nodeName:}" failed. No retries permitted until 2025-11-29 05:41:23.107092512 +0000 UTC m=+807.347601732 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/505c6e79-1776-4995-a6b5-5888f75c141c-webhook-certs") pod "openstack-operator-controller-manager-656fd97d56-fcmzw" (UID: "505c6e79-1776-4995-a6b5-5888f75c141c") : secret "webhook-server-cert" not found
Nov 29 05:41:22 crc kubenswrapper[4594]: I1129 05:41:22.457064 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0ee70d2e-b283-468b-8bd8-016a120b5ae8-cert\") pod \"infra-operator-controller-manager-57548d458d-6c7gr\" (UID: \"0ee70d2e-b283-468b-8bd8-016a120b5ae8\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-6c7gr"
Nov 29 05:41:22 crc kubenswrapper[4594]: E1129 05:41:22.457674 4594 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Nov 29 05:41:22 crc kubenswrapper[4594]: E1129 05:41:22.457722 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0ee70d2e-b283-468b-8bd8-016a120b5ae8-cert podName:0ee70d2e-b283-468b-8bd8-016a120b5ae8 nodeName:}" failed. No retries permitted until 2025-11-29 05:41:30.457709279 +0000 UTC m=+814.698218489 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0ee70d2e-b283-468b-8bd8-016a120b5ae8-cert") pod "infra-operator-controller-manager-57548d458d-6c7gr" (UID: "0ee70d2e-b283-468b-8bd8-016a120b5ae8") : secret "infra-operator-webhook-server-cert" not found
Nov 29 05:41:22 crc kubenswrapper[4594]: I1129 05:41:22.660473 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/82d8e084-bca0-43f0-9d6f-63df84cd28a6-cert\") pod \"openstack-baremetal-operator-controller-manager-6698bcb446zrvcn\" (UID: \"82d8e084-bca0-43f0-9d6f-63df84cd28a6\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6698bcb446zrvcn"
Nov 29 05:41:22 crc kubenswrapper[4594]: E1129 05:41:22.660711 4594 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Nov 29 05:41:22 crc kubenswrapper[4594]: E1129 05:41:22.660823 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/82d8e084-bca0-43f0-9d6f-63df84cd28a6-cert podName:82d8e084-bca0-43f0-9d6f-63df84cd28a6 nodeName:}" failed. No retries permitted until 2025-11-29 05:41:30.660794726 +0000 UTC m=+814.901303946 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/82d8e084-bca0-43f0-9d6f-63df84cd28a6-cert") pod "openstack-baremetal-operator-controller-manager-6698bcb446zrvcn" (UID: "82d8e084-bca0-43f0-9d6f-63df84cd28a6") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Nov 29 05:41:23 crc kubenswrapper[4594]: I1129 05:41:23.169577 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/505c6e79-1776-4995-a6b5-5888f75c141c-webhook-certs\") pod \"openstack-operator-controller-manager-656fd97d56-fcmzw\" (UID: \"505c6e79-1776-4995-a6b5-5888f75c141c\") " pod="openstack-operators/openstack-operator-controller-manager-656fd97d56-fcmzw"
Nov 29 05:41:23 crc kubenswrapper[4594]: I1129 05:41:23.169866 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/505c6e79-1776-4995-a6b5-5888f75c141c-metrics-certs\") pod \"openstack-operator-controller-manager-656fd97d56-fcmzw\" (UID: \"505c6e79-1776-4995-a6b5-5888f75c141c\") " pod="openstack-operators/openstack-operator-controller-manager-656fd97d56-fcmzw"
Nov 29 05:41:23 crc kubenswrapper[4594]: E1129 05:41:23.170034 4594 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Nov 29 05:41:23 crc kubenswrapper[4594]: E1129 05:41:23.170094 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/505c6e79-1776-4995-a6b5-5888f75c141c-metrics-certs podName:505c6e79-1776-4995-a6b5-5888f75c141c nodeName:}" failed. No retries permitted until 2025-11-29 05:41:31.170080773 +0000 UTC m=+815.410589992 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/505c6e79-1776-4995-a6b5-5888f75c141c-metrics-certs") pod "openstack-operator-controller-manager-656fd97d56-fcmzw" (UID: "505c6e79-1776-4995-a6b5-5888f75c141c") : secret "metrics-server-cert" not found
Nov 29 05:41:23 crc kubenswrapper[4594]: E1129 05:41:23.170584 4594 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Nov 29 05:41:23 crc kubenswrapper[4594]: E1129 05:41:23.170668 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/505c6e79-1776-4995-a6b5-5888f75c141c-webhook-certs podName:505c6e79-1776-4995-a6b5-5888f75c141c nodeName:}" failed. No retries permitted until 2025-11-29 05:41:31.170652397 +0000 UTC m=+815.411161618 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/505c6e79-1776-4995-a6b5-5888f75c141c-webhook-certs") pod "openstack-operator-controller-manager-656fd97d56-fcmzw" (UID: "505c6e79-1776-4995-a6b5-5888f75c141c") : secret "webhook-server-cert" not found
Nov 29 05:41:26 crc kubenswrapper[4594]: I1129 05:41:26.559418 4594 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Nov 29 05:41:26 crc kubenswrapper[4594]: E1129 05:41:26.689438 4594 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-m4q5w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-5f8c65bbfc-dd8x6_openstack-operators(d8cfbdeb-cd3d-425e-a96c-d9a565c840c3): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Nov 29 05:41:26 crc kubenswrapper[4594]: E1129 05:41:26.690624 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-dd8x6" podUID="d8cfbdeb-cd3d-425e-a96c-d9a565c840c3"
Nov 29 05:41:26 crc kubenswrapper[4594]: E1129 05:41:26.697640 4594 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-g8nnx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-5f64f6f8bb-mcfvv_openstack-operators(01841a60-a638-4a78-84d4-01ad474bf2fb): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Nov 29 05:41:26 crc kubenswrapper[4594]: E1129 05:41:26.697803 4594 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2vgzf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-6546668bfd-5bng6_openstack-operators(61fc53be-18c3-48bb-9a4a-2557df78afc7): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Nov 29 05:41:26 crc kubenswrapper[4594]: E1129 05:41:26.698749 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-mcfvv" podUID="01841a60-a638-4a78-84d4-01ad474bf2fb"
Nov 29 05:41:26 crc kubenswrapper[4594]: E1129 05:41:26.698883 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-5bng6" podUID="61fc53be-18c3-48bb-9a4a-2557df78afc7"
Nov 29 05:41:27 crc kubenswrapper[4594]: I1129 05:41:27.421063 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-48tfk" event={"ID":"a1b0453c-c84d-45ea-90be-7f01a831f987","Type":"ContainerStarted","Data":"ca76bf8f763eb1cc5347d2a66e4838d83223c85f5bc78c331e19970800a86415"}
Nov 29 05:41:27 crc kubenswrapper[4594]: I1129 05:41:27.432802 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-jc4r5" event={"ID":"8c96e02a-d3bd-4904-8ade-baecb4c3a280","Type":"ContainerStarted","Data":"4de616f3b32eb77bfc5a513df770be0c87ada455aa0937627716ef5c534c2fc4"}
Nov 29 05:41:27 crc kubenswrapper[4594]: I1129 05:41:27.472833 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-mcfvv" event={"ID":"01841a60-a638-4a78-84d4-01ad474bf2fb","Type":"ContainerStarted","Data":"a4cce28c2495ac98720683db032b0f83bccdff815fc29cac9a2b893cd9e55cc2"}
Nov 29 05:41:27 crc kubenswrapper[4594]: I1129 05:41:27.473285 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-mcfvv"
Nov 29 05:41:27 crc kubenswrapper[4594]: E1129 05:41:27.476314 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-mcfvv" podUID="01841a60-a638-4a78-84d4-01ad474bf2fb"
Nov 29 05:41:27 crc kubenswrapper[4594]: I1129 05:41:27.485021 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-s254f" event={"ID":"e564c6c7-7145-411f-b48f-d8e2594c34a5","Type":"ContainerStarted","Data":"0f82297c5993fe895873a5395c748bba1814bc1471e0aed6c6c35cd192913b0c"}
Nov 29 05:41:27 crc kubenswrapper[4594]: I1129 05:41:27.486364 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-245g8" event={"ID":"f9ab855a-f938-4ad6-941a-52f4e5b7d4b2","Type":"ContainerStarted","Data":"c1d50937b675056ad043b557a3e285216155e2f55d8b255ac616954510806428"}
Nov 29 05:41:27 crc kubenswrapper[4594]: I1129 05:41:27.487831 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-d84hx" event={"ID":"2de2c23f-39bd-4a9d-9965-7fe280b61707","Type":"ContainerStarted","Data":"d19881d1af3fb0fc1ed8e532a21a2f7f06643bc1675f24d0c3ef7012f52ac77d"}
Nov 29 05:41:27 crc kubenswrapper[4594]: I1129 05:41:27.490179 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-xlmc8" event={"ID":"f50eb91c-7f6c-4d0f-b32a-c9ead1766b9c","Type":"ContainerStarted","Data":"fcaa45dab0bc8813f160706dba9e5b583d2a09c8fdf27783bcc9e19ae4ed9fb3"}
Nov 29 05:41:27 crc kubenswrapper[4594]: I1129 05:41:27.491271 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-5bng6" event={"ID":"61fc53be-18c3-48bb-9a4a-2557df78afc7","Type":"ContainerStarted","Data":"5c55c0292868fa463fa7e08e04a6c2c24eef0eb25a7af04435093eb17a95868e"}
Nov 29 05:41:27 crc kubenswrapper[4594]: I1129 05:41:27.491421 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-5bng6"
Nov 29 05:41:27 crc kubenswrapper[4594]: E1129 05:41:27.493927 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-5bng6" podUID="61fc53be-18c3-48bb-9a4a-2557df78afc7"
Nov 29 05:41:27 crc kubenswrapper[4594]: I1129 05:41:27.494937 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-ppdt9" event={"ID":"fbdf482e-3aa0-4f5c-a698-949ad6cb6992","Type":"ContainerStarted","Data":"586f99acc883917aab38a8919a2a8e4aaf231d1c3a7c1d4e1a8f0599737fd90d"}
Nov 29 05:41:27 crc kubenswrapper[4594]: I1129 05:41:27.496897 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-qgzzn" event={"ID":"fb42365c-18e1-4456-ae16-be77a16f102c","Type":"ContainerStarted","Data":"34650e6db7268bc4d34c88e5f5b25fd6e95b85c459c11ee0842662dcc1cd8f4f"}
Nov 29 05:41:27 crc kubenswrapper[4594]: I1129 05:41:27.499703 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-dd8x6" event={"ID":"d8cfbdeb-cd3d-425e-a96c-d9a565c840c3","Type":"ContainerStarted","Data":"4763d9dead026ea25ad49649dba78fe03666516db0bdf88dcb880064221dfb8b"}
Nov 29 05:41:27 crc kubenswrapper[4594]: I1129 05:41:27.499950 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-dd8x6"
Nov 29 05:41:27 crc kubenswrapper[4594]: E1129 05:41:27.500982 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-dd8x6" podUID="d8cfbdeb-cd3d-425e-a96c-d9a565c840c3"
Nov 29 05:41:27 crc kubenswrapper[4594]: I1129 05:41:27.502620 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-z9q2v" event={"ID":"09d8a0a7-cc55-4654-8e59-a769c806eecf","Type":"ContainerStarted","Data":"b599fdf5281a8aea98dab44eb5e29986da64c5860215cda1a9e2f225bd26a9d3"}
Nov 29 05:41:27 crc kubenswrapper[4594]: I1129 05:41:27.507148 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-g6nl9" event={"ID":"95648e23-46bf-4160-9527-7ad1c84f9883","Type":"ContainerStarted","Data":"60b0e030cea85fe9330579bdfab4aad1908f74c82ab61d8b7c146a39969f3743"}
Nov 29 05:41:28 crc kubenswrapper[4594]: E1129 05:41:28.535461 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-mcfvv" podUID="01841a60-a638-4a78-84d4-01ad474bf2fb"
Nov 29 05:41:28 crc kubenswrapper[4594]: E1129 05:41:28.535461 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-dd8x6" podUID="d8cfbdeb-cd3d-425e-a96c-d9a565c840c3"
Nov 29 05:41:28 crc kubenswrapper[4594]: E1129 05:41:28.535503 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-5bng6" podUID="61fc53be-18c3-48bb-9a4a-2557df78afc7"
Nov 29 05:41:30 crc kubenswrapper[4594]: I1129 05:41:30.541046 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0ee70d2e-b283-468b-8bd8-016a120b5ae8-cert\") pod \"infra-operator-controller-manager-57548d458d-6c7gr\" (UID: \"0ee70d2e-b283-468b-8bd8-016a120b5ae8\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-6c7gr"
Nov 29 05:41:30 crc kubenswrapper[4594]: E1129 05:41:30.541248 4594 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Nov 29 05:41:30 crc kubenswrapper[4594]: E1129 05:41:30.541813 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0ee70d2e-b283-468b-8bd8-016a120b5ae8-cert podName:0ee70d2e-b283-468b-8bd8-016a120b5ae8 nodeName:}" failed. No retries permitted until 2025-11-29 05:41:46.541790948 +0000 UTC m=+830.782300168 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0ee70d2e-b283-468b-8bd8-016a120b5ae8-cert") pod "infra-operator-controller-manager-57548d458d-6c7gr" (UID: "0ee70d2e-b283-468b-8bd8-016a120b5ae8") : secret "infra-operator-webhook-server-cert" not found
Nov 29 05:41:30 crc kubenswrapper[4594]: I1129 05:41:30.553824 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-g6nl9" event={"ID":"95648e23-46bf-4160-9527-7ad1c84f9883","Type":"ContainerStarted","Data":"8b9dcd82609565b339553547dd6fad61e45e692d99d28b1658c0b6fad11d4165"}
Nov 29 05:41:30 crc kubenswrapper[4594]: I1129 05:41:30.554744 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-g6nl9"
Nov 29 05:41:30 crc kubenswrapper[4594]: I1129 05:41:30.567940 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-jc4r5" event={"ID":"8c96e02a-d3bd-4904-8ade-baecb4c3a280","Type":"ContainerStarted","Data":"9de7f770f53980c483e34e7656f354a61af001f61e39aafd188922a8780010c8"}
Nov 29 05:41:30 crc kubenswrapper[4594]: I1129 05:41:30.567997 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-jc4r5"
Nov 29 05:41:30 crc kubenswrapper[4594]: I1129 05:41:30.579876 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-g6nl9" podStartSLOduration=2.552765167 podStartE2EDuration="16.579854458s" podCreationTimestamp="2025-11-29 05:41:14 +0000 UTC" firstStartedPulling="2025-11-29 05:41:15.511887021 +0000 UTC m=+799.752396231" lastFinishedPulling="2025-11-29 05:41:29.538976302 +0000 UTC m=+813.779485522" observedRunningTime="2025-11-29 05:41:30.570101068 +0000 UTC m=+814.810610289" watchObservedRunningTime="2025-11-29 05:41:30.579854458 +0000 UTC m=+814.820363678"
Nov 29 05:41:30 crc kubenswrapper[4594]: I1129 05:41:30.583148 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-ppdt9" event={"ID":"fbdf482e-3aa0-4f5c-a698-949ad6cb6992","Type":"ContainerStarted","Data":"f5a60f3b735ae15900d758e60bcbcf3a32ee9188b9fb7c11f516c6886e86a19e"}
Nov 29 05:41:30 crc kubenswrapper[4594]: I1129 05:41:30.583404 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-ppdt9"
Nov 29 05:41:30 crc kubenswrapper[4594]: I1129 05:41:30.592278 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-qgzzn" event={"ID":"fb42365c-18e1-4456-ae16-be77a16f102c","Type":"ContainerStarted","Data":"258c2ff5dbd90673da6e0ec38efaa16d77f7c2adac97e00772b609089deb0b53"}
Nov 29 05:41:30 crc kubenswrapper[4594]: I1129 05:41:30.592442 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-jc4r5" podStartSLOduration=2.4447375239999998 podStartE2EDuration="16.592426806s" podCreationTimestamp="2025-11-29 05:41:14 +0000 UTC" firstStartedPulling="2025-11-29 05:41:15.409740998 +0000 UTC m=+799.650250218" lastFinishedPulling="2025-11-29 05:41:29.55743028 +0000
UTC m=+813.797939500" observedRunningTime="2025-11-29 05:41:30.585997779 +0000 UTC m=+814.826507019" watchObservedRunningTime="2025-11-29 05:41:30.592426806 +0000 UTC m=+814.832936027" Nov 29 05:41:30 crc kubenswrapper[4594]: I1129 05:41:30.592530 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-qgzzn" Nov 29 05:41:30 crc kubenswrapper[4594]: I1129 05:41:30.596599 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-s254f" event={"ID":"e564c6c7-7145-411f-b48f-d8e2594c34a5","Type":"ContainerStarted","Data":"e2cd727a387efa497a7f7dd66f0cca21dc2922e7d592b65e28c33999a79fb01c"} Nov 29 05:41:30 crc kubenswrapper[4594]: I1129 05:41:30.596689 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-s254f" Nov 29 05:41:30 crc kubenswrapper[4594]: I1129 05:41:30.613335 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-ppdt9" podStartSLOduration=2.691578333 podStartE2EDuration="16.613306837s" podCreationTimestamp="2025-11-29 05:41:14 +0000 UTC" firstStartedPulling="2025-11-29 05:41:15.780878459 +0000 UTC m=+800.021387679" lastFinishedPulling="2025-11-29 05:41:29.702606962 +0000 UTC m=+813.943116183" observedRunningTime="2025-11-29 05:41:30.60205975 +0000 UTC m=+814.842568970" watchObservedRunningTime="2025-11-29 05:41:30.613306837 +0000 UTC m=+814.853816057" Nov 29 05:41:30 crc kubenswrapper[4594]: I1129 05:41:30.615182 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-245g8" event={"ID":"f9ab855a-f938-4ad6-941a-52f4e5b7d4b2","Type":"ContainerStarted","Data":"bfbaea8d8f9a346ed77401e202c7f541f246ecc6b3efb1fc3a544531e9cd12e9"} Nov 29 05:41:30 crc kubenswrapper[4594]: 
I1129 05:41:30.615478 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-998648c74-245g8" Nov 29 05:41:30 crc kubenswrapper[4594]: I1129 05:41:30.624611 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-xlmc8" event={"ID":"f50eb91c-7f6c-4d0f-b32a-c9ead1766b9c","Type":"ContainerStarted","Data":"a00e891de5863ce3311ebec907fa25639696902b5325e8441df17b0ad59d3f0e"} Nov 29 05:41:30 crc kubenswrapper[4594]: I1129 05:41:30.624672 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-xlmc8" Nov 29 05:41:30 crc kubenswrapper[4594]: I1129 05:41:30.626453 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-s254f" podStartSLOduration=2.700284373 podStartE2EDuration="16.626440121s" podCreationTimestamp="2025-11-29 05:41:14 +0000 UTC" firstStartedPulling="2025-11-29 05:41:15.672199442 +0000 UTC m=+799.912708662" lastFinishedPulling="2025-11-29 05:41:29.59835519 +0000 UTC m=+813.838864410" observedRunningTime="2025-11-29 05:41:30.617607534 +0000 UTC m=+814.858116754" watchObservedRunningTime="2025-11-29 05:41:30.626440121 +0000 UTC m=+814.866949342" Nov 29 05:41:30 crc kubenswrapper[4594]: I1129 05:41:30.628721 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-48tfk" event={"ID":"a1b0453c-c84d-45ea-90be-7f01a831f987","Type":"ContainerStarted","Data":"48b5915db2338ea5fdf42d518e33b4e399e775778887ac625e9c25c3754c43b5"} Nov 29 05:41:30 crc kubenswrapper[4594]: I1129 05:41:30.629234 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-48tfk" Nov 29 05:41:30 crc kubenswrapper[4594]: I1129 
05:41:30.640879 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-qgzzn" podStartSLOduration=2.871572103 podStartE2EDuration="16.640865586s" podCreationTimestamp="2025-11-29 05:41:14 +0000 UTC" firstStartedPulling="2025-11-29 05:41:15.782989278 +0000 UTC m=+800.023498498" lastFinishedPulling="2025-11-29 05:41:29.552282761 +0000 UTC m=+813.792791981" observedRunningTime="2025-11-29 05:41:30.634365915 +0000 UTC m=+814.874875145" watchObservedRunningTime="2025-11-29 05:41:30.640865586 +0000 UTC m=+814.881374806" Nov 29 05:41:30 crc kubenswrapper[4594]: I1129 05:41:30.660969 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-998648c74-245g8" podStartSLOduration=3.04725892 podStartE2EDuration="16.660952264s" podCreationTimestamp="2025-11-29 05:41:14 +0000 UTC" firstStartedPulling="2025-11-29 05:41:15.923062413 +0000 UTC m=+800.163571633" lastFinishedPulling="2025-11-29 05:41:29.536755756 +0000 UTC m=+813.777264977" observedRunningTime="2025-11-29 05:41:30.659351183 +0000 UTC m=+814.899860423" watchObservedRunningTime="2025-11-29 05:41:30.660952264 +0000 UTC m=+814.901461484" Nov 29 05:41:30 crc kubenswrapper[4594]: I1129 05:41:30.697157 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-48tfk" podStartSLOduration=2.968637862 podStartE2EDuration="16.697142481s" podCreationTimestamp="2025-11-29 05:41:14 +0000 UTC" firstStartedPulling="2025-11-29 05:41:15.91655653 +0000 UTC m=+800.157065751" lastFinishedPulling="2025-11-29 05:41:29.64506115 +0000 UTC m=+813.885570370" observedRunningTime="2025-11-29 05:41:30.681489209 +0000 UTC m=+814.921998430" watchObservedRunningTime="2025-11-29 05:41:30.697142481 +0000 UTC m=+814.937651701" Nov 29 05:41:30 crc kubenswrapper[4594]: I1129 05:41:30.697672 4594 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-xlmc8" podStartSLOduration=2.525909192 podStartE2EDuration="16.697668931s" podCreationTimestamp="2025-11-29 05:41:14 +0000 UTC" firstStartedPulling="2025-11-29 05:41:15.5308202 +0000 UTC m=+799.771329421" lastFinishedPulling="2025-11-29 05:41:29.702579941 +0000 UTC m=+813.943089160" observedRunningTime="2025-11-29 05:41:30.692531431 +0000 UTC m=+814.933040651" watchObservedRunningTime="2025-11-29 05:41:30.697668931 +0000 UTC m=+814.938178151" Nov 29 05:41:30 crc kubenswrapper[4594]: I1129 05:41:30.749834 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/82d8e084-bca0-43f0-9d6f-63df84cd28a6-cert\") pod \"openstack-baremetal-operator-controller-manager-6698bcb446zrvcn\" (UID: \"82d8e084-bca0-43f0-9d6f-63df84cd28a6\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6698bcb446zrvcn" Nov 29 05:41:30 crc kubenswrapper[4594]: E1129 05:41:30.750012 4594 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 29 05:41:30 crc kubenswrapper[4594]: E1129 05:41:30.750082 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/82d8e084-bca0-43f0-9d6f-63df84cd28a6-cert podName:82d8e084-bca0-43f0-9d6f-63df84cd28a6 nodeName:}" failed. No retries permitted until 2025-11-29 05:41:46.750066543 +0000 UTC m=+830.990575763 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/82d8e084-bca0-43f0-9d6f-63df84cd28a6-cert") pod "openstack-baremetal-operator-controller-manager-6698bcb446zrvcn" (UID: "82d8e084-bca0-43f0-9d6f-63df84cd28a6") : secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 29 05:41:31 crc kubenswrapper[4594]: I1129 05:41:31.258210 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/505c6e79-1776-4995-a6b5-5888f75c141c-webhook-certs\") pod \"openstack-operator-controller-manager-656fd97d56-fcmzw\" (UID: \"505c6e79-1776-4995-a6b5-5888f75c141c\") " pod="openstack-operators/openstack-operator-controller-manager-656fd97d56-fcmzw" Nov 29 05:41:31 crc kubenswrapper[4594]: I1129 05:41:31.258332 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/505c6e79-1776-4995-a6b5-5888f75c141c-metrics-certs\") pod \"openstack-operator-controller-manager-656fd97d56-fcmzw\" (UID: \"505c6e79-1776-4995-a6b5-5888f75c141c\") " pod="openstack-operators/openstack-operator-controller-manager-656fd97d56-fcmzw" Nov 29 05:41:31 crc kubenswrapper[4594]: E1129 05:41:31.258420 4594 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Nov 29 05:41:31 crc kubenswrapper[4594]: E1129 05:41:31.258508 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/505c6e79-1776-4995-a6b5-5888f75c141c-webhook-certs podName:505c6e79-1776-4995-a6b5-5888f75c141c nodeName:}" failed. No retries permitted until 2025-11-29 05:41:47.258490569 +0000 UTC m=+831.498999799 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/505c6e79-1776-4995-a6b5-5888f75c141c-webhook-certs") pod "openstack-operator-controller-manager-656fd97d56-fcmzw" (UID: "505c6e79-1776-4995-a6b5-5888f75c141c") : secret "webhook-server-cert" not found Nov 29 05:41:31 crc kubenswrapper[4594]: I1129 05:41:31.269681 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/505c6e79-1776-4995-a6b5-5888f75c141c-metrics-certs\") pod \"openstack-operator-controller-manager-656fd97d56-fcmzw\" (UID: \"505c6e79-1776-4995-a6b5-5888f75c141c\") " pod="openstack-operators/openstack-operator-controller-manager-656fd97d56-fcmzw" Nov 29 05:41:31 crc kubenswrapper[4594]: I1129 05:41:31.642366 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-s254f" Nov 29 05:41:31 crc kubenswrapper[4594]: I1129 05:41:31.642719 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-qgzzn" Nov 29 05:41:31 crc kubenswrapper[4594]: I1129 05:41:31.644196 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-48tfk" Nov 29 05:41:31 crc kubenswrapper[4594]: I1129 05:41:31.645073 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-g6nl9" Nov 29 05:41:31 crc kubenswrapper[4594]: I1129 05:41:31.647591 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-ppdt9" Nov 29 05:41:31 crc kubenswrapper[4594]: I1129 05:41:31.647631 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/octavia-operator-controller-manager-998648c74-245g8" Nov 29 05:41:31 crc kubenswrapper[4594]: I1129 05:41:31.647653 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-jc4r5" Nov 29 05:41:31 crc kubenswrapper[4594]: I1129 05:41:31.647679 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-xlmc8" Nov 29 05:41:34 crc kubenswrapper[4594]: I1129 05:41:34.905137 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-mcfvv" Nov 29 05:41:35 crc kubenswrapper[4594]: I1129 05:41:35.087615 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-5bng6" Nov 29 05:41:35 crc kubenswrapper[4594]: I1129 05:41:35.266569 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-dd8x6" Nov 29 05:41:35 crc kubenswrapper[4594]: I1129 05:41:35.666962 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4mk5r" event={"ID":"3fdff03c-acc7-4274-bb06-83abd0f7b432","Type":"ContainerStarted","Data":"1b4173f33516c64eeee8a62046ac9e8042a7d8e6629524ea681521f629fdccbb"} Nov 29 05:41:35 crc kubenswrapper[4594]: I1129 05:41:35.670057 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-mcfvv" event={"ID":"01841a60-a638-4a78-84d4-01ad474bf2fb","Type":"ContainerStarted","Data":"b1fd6feea87e76fb87e689bd5f435b5fc91e1414b047038c05284397dc1df34e"} Nov 29 05:41:35 crc kubenswrapper[4594]: I1129 05:41:35.672780 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-dd8x6" event={"ID":"d8cfbdeb-cd3d-425e-a96c-d9a565c840c3","Type":"ContainerStarted","Data":"666e449201676b64730abd11fadcd502cd2a8f5f33b9ac4e3193b01192ff597f"} Nov 29 05:41:35 crc kubenswrapper[4594]: I1129 05:41:35.675601 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-d84hx" event={"ID":"2de2c23f-39bd-4a9d-9965-7fe280b61707","Type":"ContainerStarted","Data":"cc9fb6343b1e3e9c6c080070220c5a218a75ffde16b61c23213fef4aea1d16f0"} Nov 29 05:41:35 crc kubenswrapper[4594]: I1129 05:41:35.675775 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-d84hx" Nov 29 05:41:35 crc kubenswrapper[4594]: I1129 05:41:35.679034 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-d84hx" Nov 29 05:41:35 crc kubenswrapper[4594]: I1129 05:41:35.684770 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4mk5r" podStartSLOduration=2.2330659170000002 podStartE2EDuration="20.684754007s" podCreationTimestamp="2025-11-29 05:41:15 +0000 UTC" firstStartedPulling="2025-11-29 05:41:16.179139288 +0000 UTC m=+800.419648507" lastFinishedPulling="2025-11-29 05:41:34.630827377 +0000 UTC m=+818.871336597" observedRunningTime="2025-11-29 05:41:35.682687571 +0000 UTC m=+819.923196792" watchObservedRunningTime="2025-11-29 05:41:35.684754007 +0000 UTC m=+819.925263218" Nov 29 05:41:35 crc kubenswrapper[4594]: I1129 05:41:35.686882 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-pclb6" 
event={"ID":"460651bc-3f62-4bc6-ab53-e791ea16993e","Type":"ContainerStarted","Data":"746a047722c18a8463088d4bb14183f58e682f9524fc8d9988d8875211e4a3ef"} Nov 29 05:41:35 crc kubenswrapper[4594]: I1129 05:41:35.686918 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-pclb6" event={"ID":"460651bc-3f62-4bc6-ab53-e791ea16993e","Type":"ContainerStarted","Data":"8df8d79f97a1c9b388a3e0e1a80ce426965329079e4c36942583c4421b32e593"} Nov 29 05:41:35 crc kubenswrapper[4594]: I1129 05:41:35.687354 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-pclb6" Nov 29 05:41:35 crc kubenswrapper[4594]: I1129 05:41:35.689856 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-7fmfc" event={"ID":"7d8912e8-8f81-4ea6-94c2-7e56c7726e58","Type":"ContainerStarted","Data":"ce360aa7dd205747e43b17adbaa2dc87dc36077a469ef776cbc20d7e2bba6773"} Nov 29 05:41:35 crc kubenswrapper[4594]: I1129 05:41:35.689885 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-7fmfc" event={"ID":"7d8912e8-8f81-4ea6-94c2-7e56c7726e58","Type":"ContainerStarted","Data":"5da87c72345e56e0fec90d18e164ca1ae594bdf878fdc5d783c17dd914b4f741"} Nov 29 05:41:35 crc kubenswrapper[4594]: I1129 05:41:35.690150 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-7fmfc" Nov 29 05:41:35 crc kubenswrapper[4594]: I1129 05:41:35.692425 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-5bng6" event={"ID":"61fc53be-18c3-48bb-9a4a-2557df78afc7","Type":"ContainerStarted","Data":"e3fbe3b2639e434de6cf5165811cde5f6363719d9423ba39ea6500f15b25aa3d"} Nov 29 05:41:35 crc kubenswrapper[4594]: I1129 
05:41:35.696332 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-z9q2v" event={"ID":"09d8a0a7-cc55-4654-8e59-a769c806eecf","Type":"ContainerStarted","Data":"4f1debc98ab2eb965e65f0fff6eb3ee8c0b1b1b28b0e3f03eb1e52d9fcf108a9"} Nov 29 05:41:35 crc kubenswrapper[4594]: I1129 05:41:35.697100 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-z9q2v" Nov 29 05:41:35 crc kubenswrapper[4594]: I1129 05:41:35.700855 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-dd8x6" podStartSLOduration=11.433128733 podStartE2EDuration="21.700842358s" podCreationTimestamp="2025-11-29 05:41:14 +0000 UTC" firstStartedPulling="2025-11-29 05:41:16.016345761 +0000 UTC m=+800.256854971" lastFinishedPulling="2025-11-29 05:41:26.284059376 +0000 UTC m=+810.524568596" observedRunningTime="2025-11-29 05:41:35.692968162 +0000 UTC m=+819.933477382" watchObservedRunningTime="2025-11-29 05:41:35.700842358 +0000 UTC m=+819.941351578" Nov 29 05:41:35 crc kubenswrapper[4594]: I1129 05:41:35.701124 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-z9q2v" Nov 29 05:41:35 crc kubenswrapper[4594]: I1129 05:41:35.708373 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-r28h2" event={"ID":"9e1ee1b6-8684-4fcb-a26c-fd85a950abcc","Type":"ContainerStarted","Data":"d33327e5d66e5dd0f597d374bad60618f049d846a950673adf792fe9d53f6d99"} Nov 29 05:41:35 crc kubenswrapper[4594]: I1129 05:41:35.708422 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-r28h2" 
event={"ID":"9e1ee1b6-8684-4fcb-a26c-fd85a950abcc","Type":"ContainerStarted","Data":"9e32e1352bd4ee6cb9a240217b4db34c7fc73904a818ce2af27d3960188f79e8"} Nov 29 05:41:35 crc kubenswrapper[4594]: I1129 05:41:35.708679 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5854674fcc-r28h2" Nov 29 05:41:35 crc kubenswrapper[4594]: I1129 05:41:35.708683 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-mcfvv" podStartSLOduration=11.116961592 podStartE2EDuration="21.70867209s" podCreationTimestamp="2025-11-29 05:41:14 +0000 UTC" firstStartedPulling="2025-11-29 05:41:15.73245035 +0000 UTC m=+799.972959570" lastFinishedPulling="2025-11-29 05:41:26.324160848 +0000 UTC m=+810.564670068" observedRunningTime="2025-11-29 05:41:35.704585907 +0000 UTC m=+819.945095127" watchObservedRunningTime="2025-11-29 05:41:35.70867209 +0000 UTC m=+819.949181310" Nov 29 05:41:35 crc kubenswrapper[4594]: I1129 05:41:35.723187 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-7n6mf" event={"ID":"5cde207a-7c2e-46d9-809a-72b8749560a6","Type":"ContainerStarted","Data":"e55210dd61a44cdcb696c0265f296af43f0b34df459bbf22eb4aa62fc8749efc"} Nov 29 05:41:35 crc kubenswrapper[4594]: I1129 05:41:35.723217 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-7n6mf" event={"ID":"5cde207a-7c2e-46d9-809a-72b8749560a6","Type":"ContainerStarted","Data":"7582c87843a0ceab391d2ec8e09aa5d84d63323dc1ec54822337e344ac053cc4"} Nov 29 05:41:35 crc kubenswrapper[4594]: I1129 05:41:35.723418 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-78f8948974-7n6mf" Nov 29 05:41:35 crc kubenswrapper[4594]: I1129 05:41:35.733704 4594 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-d84hx" podStartSLOduration=3.036150954 podStartE2EDuration="21.733693166s" podCreationTimestamp="2025-11-29 05:41:14 +0000 UTC" firstStartedPulling="2025-11-29 05:41:15.916431645 +0000 UTC m=+800.156940865" lastFinishedPulling="2025-11-29 05:41:34.613973857 +0000 UTC m=+818.854483077" observedRunningTime="2025-11-29 05:41:35.732570696 +0000 UTC m=+819.973079915" watchObservedRunningTime="2025-11-29 05:41:35.733693166 +0000 UTC m=+819.974202386" Nov 29 05:41:35 crc kubenswrapper[4594]: I1129 05:41:35.755798 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-z9q2v" podStartSLOduration=2.898387632 podStartE2EDuration="21.7557876s" podCreationTimestamp="2025-11-29 05:41:14 +0000 UTC" firstStartedPulling="2025-11-29 05:41:15.773419834 +0000 UTC m=+800.013929055" lastFinishedPulling="2025-11-29 05:41:34.630819812 +0000 UTC m=+818.871329023" observedRunningTime="2025-11-29 05:41:35.750487254 +0000 UTC m=+819.990996474" watchObservedRunningTime="2025-11-29 05:41:35.7557876 +0000 UTC m=+819.996296820" Nov 29 05:41:35 crc kubenswrapper[4594]: I1129 05:41:35.780805 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-pclb6" podStartSLOduration=3.144166333 podStartE2EDuration="21.780784681s" podCreationTimestamp="2025-11-29 05:41:14 +0000 UTC" firstStartedPulling="2025-11-29 05:41:16.016697132 +0000 UTC m=+800.257206352" lastFinishedPulling="2025-11-29 05:41:34.65331548 +0000 UTC m=+818.893824700" observedRunningTime="2025-11-29 05:41:35.777910558 +0000 UTC m=+820.018419778" watchObservedRunningTime="2025-11-29 05:41:35.780784681 +0000 UTC m=+820.021293891" Nov 29 05:41:35 crc kubenswrapper[4594]: I1129 05:41:35.798809 4594 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-5bng6" podStartSLOduration=11.233489396 podStartE2EDuration="21.798790367s" podCreationTimestamp="2025-11-29 05:41:14 +0000 UTC" firstStartedPulling="2025-11-29 05:41:15.719866108 +0000 UTC m=+799.960375328" lastFinishedPulling="2025-11-29 05:41:26.285167089 +0000 UTC m=+810.525676299" observedRunningTime="2025-11-29 05:41:35.794709183 +0000 UTC m=+820.035218404" watchObservedRunningTime="2025-11-29 05:41:35.798790367 +0000 UTC m=+820.039299586" Nov 29 05:41:35 crc kubenswrapper[4594]: I1129 05:41:35.820559 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-7fmfc" podStartSLOduration=3.231577609 podStartE2EDuration="21.820542577s" podCreationTimestamp="2025-11-29 05:41:14 +0000 UTC" firstStartedPulling="2025-11-29 05:41:16.041864454 +0000 UTC m=+800.282373673" lastFinishedPulling="2025-11-29 05:41:34.63082942 +0000 UTC m=+818.871338641" observedRunningTime="2025-11-29 05:41:35.81860316 +0000 UTC m=+820.059112381" watchObservedRunningTime="2025-11-29 05:41:35.820542577 +0000 UTC m=+820.061051797" Nov 29 05:41:35 crc kubenswrapper[4594]: I1129 05:41:35.848411 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-78f8948974-7n6mf" podStartSLOduration=3.269187898 podStartE2EDuration="21.848388936s" podCreationTimestamp="2025-11-29 05:41:14 +0000 UTC" firstStartedPulling="2025-11-29 05:41:16.068843662 +0000 UTC m=+800.309352882" lastFinishedPulling="2025-11-29 05:41:34.6480447 +0000 UTC m=+818.888553920" observedRunningTime="2025-11-29 05:41:35.847443467 +0000 UTC m=+820.087952688" watchObservedRunningTime="2025-11-29 05:41:35.848388936 +0000 UTC m=+820.088898156" Nov 29 05:41:35 crc kubenswrapper[4594]: I1129 05:41:35.848809 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/test-operator-controller-manager-5854674fcc-r28h2" podStartSLOduration=2.266776484 podStartE2EDuration="20.848803475s" podCreationTimestamp="2025-11-29 05:41:15 +0000 UTC" firstStartedPulling="2025-11-29 05:41:16.065643004 +0000 UTC m=+800.306152214" lastFinishedPulling="2025-11-29 05:41:34.647669985 +0000 UTC m=+818.888179205" observedRunningTime="2025-11-29 05:41:35.83586154 +0000 UTC m=+820.076370761" watchObservedRunningTime="2025-11-29 05:41:35.848803475 +0000 UTC m=+820.089312695" Nov 29 05:41:37 crc kubenswrapper[4594]: I1129 05:41:37.736821 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-zccgx" event={"ID":"b818ec3c-b3d3-4c0b-b2d0-dc75c1e0a4c0","Type":"ContainerStarted","Data":"90319280c28bc661cd8801e1c51262c05e7cd5acc41b3083c869c1e5296a6408"} Nov 29 05:41:37 crc kubenswrapper[4594]: I1129 05:41:37.737082 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-zccgx" event={"ID":"b818ec3c-b3d3-4c0b-b2d0-dc75c1e0a4c0","Type":"ContainerStarted","Data":"760d11b4a7c953d8f1e3579fe36a0fef5310b1fec5466c555bfaf291e28f1721"} Nov 29 05:41:37 crc kubenswrapper[4594]: I1129 05:41:37.737268 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-zccgx" Nov 29 05:41:37 crc kubenswrapper[4594]: I1129 05:41:37.738554 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-6gxvv" event={"ID":"8a5e35f1-00df-4307-b197-f7800c641af7","Type":"ContainerStarted","Data":"65b1e16d9d800b053bd9c66e83e953d50ad6f67f252a3162e461c25f76dbe789"} Nov 29 05:41:37 crc kubenswrapper[4594]: I1129 05:41:37.738586 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-6gxvv" 
event={"ID":"8a5e35f1-00df-4307-b197-f7800c641af7","Type":"ContainerStarted","Data":"ebc77f05a1234b3dd775407371b56e4d0294eb5b746cfe2a248b14333e2e4557"} Nov 29 05:41:37 crc kubenswrapper[4594]: I1129 05:41:37.765453 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-zccgx" podStartSLOduration=2.626028082 podStartE2EDuration="23.765438445s" podCreationTimestamp="2025-11-29 05:41:14 +0000 UTC" firstStartedPulling="2025-11-29 05:41:16.03111372 +0000 UTC m=+800.271622939" lastFinishedPulling="2025-11-29 05:41:37.170524082 +0000 UTC m=+821.411033302" observedRunningTime="2025-11-29 05:41:37.764212329 +0000 UTC m=+822.004721549" watchObservedRunningTime="2025-11-29 05:41:37.765438445 +0000 UTC m=+822.005947664" Nov 29 05:41:37 crc kubenswrapper[4594]: I1129 05:41:37.785696 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-6gxvv" podStartSLOduration=1.650039349 podStartE2EDuration="22.785674844s" podCreationTimestamp="2025-11-29 05:41:15 +0000 UTC" firstStartedPulling="2025-11-29 05:41:16.035775044 +0000 UTC m=+800.276284263" lastFinishedPulling="2025-11-29 05:41:37.171410538 +0000 UTC m=+821.411919758" observedRunningTime="2025-11-29 05:41:37.781664102 +0000 UTC m=+822.022173323" watchObservedRunningTime="2025-11-29 05:41:37.785674844 +0000 UTC m=+822.026184064" Nov 29 05:41:45 crc kubenswrapper[4594]: I1129 05:41:45.191999 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-7fmfc" Nov 29 05:41:45 crc kubenswrapper[4594]: I1129 05:41:45.247453 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-pclb6" Nov 29 05:41:45 crc kubenswrapper[4594]: I1129 05:41:45.257774 4594 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-78f8948974-7n6mf" Nov 29 05:41:45 crc kubenswrapper[4594]: I1129 05:41:45.280120 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-zccgx" Nov 29 05:41:45 crc kubenswrapper[4594]: I1129 05:41:45.420101 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5854674fcc-r28h2" Nov 29 05:41:45 crc kubenswrapper[4594]: I1129 05:41:45.466424 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-6gxvv" Nov 29 05:41:45 crc kubenswrapper[4594]: I1129 05:41:45.483518 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-6gxvv" Nov 29 05:41:45 crc kubenswrapper[4594]: I1129 05:41:45.800066 4594 patch_prober.go:28] interesting pod/machine-config-daemon-ggz4n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 05:41:45 crc kubenswrapper[4594]: I1129 05:41:45.800124 4594 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 05:41:46 crc kubenswrapper[4594]: I1129 05:41:46.590514 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0ee70d2e-b283-468b-8bd8-016a120b5ae8-cert\") pod 
\"infra-operator-controller-manager-57548d458d-6c7gr\" (UID: \"0ee70d2e-b283-468b-8bd8-016a120b5ae8\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-6c7gr" Nov 29 05:41:46 crc kubenswrapper[4594]: I1129 05:41:46.595249 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0ee70d2e-b283-468b-8bd8-016a120b5ae8-cert\") pod \"infra-operator-controller-manager-57548d458d-6c7gr\" (UID: \"0ee70d2e-b283-468b-8bd8-016a120b5ae8\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-6c7gr" Nov 29 05:41:46 crc kubenswrapper[4594]: I1129 05:41:46.738644 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-6c7gr" Nov 29 05:41:46 crc kubenswrapper[4594]: I1129 05:41:46.792862 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/82d8e084-bca0-43f0-9d6f-63df84cd28a6-cert\") pod \"openstack-baremetal-operator-controller-manager-6698bcb446zrvcn\" (UID: \"82d8e084-bca0-43f0-9d6f-63df84cd28a6\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6698bcb446zrvcn" Nov 29 05:41:46 crc kubenswrapper[4594]: I1129 05:41:46.798405 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/82d8e084-bca0-43f0-9d6f-63df84cd28a6-cert\") pod \"openstack-baremetal-operator-controller-manager-6698bcb446zrvcn\" (UID: \"82d8e084-bca0-43f0-9d6f-63df84cd28a6\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6698bcb446zrvcn" Nov 29 05:41:47 crc kubenswrapper[4594]: I1129 05:41:47.020305 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6698bcb446zrvcn" Nov 29 05:41:47 crc kubenswrapper[4594]: I1129 05:41:47.094312 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-6c7gr"] Nov 29 05:41:47 crc kubenswrapper[4594]: I1129 05:41:47.300248 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/505c6e79-1776-4995-a6b5-5888f75c141c-webhook-certs\") pod \"openstack-operator-controller-manager-656fd97d56-fcmzw\" (UID: \"505c6e79-1776-4995-a6b5-5888f75c141c\") " pod="openstack-operators/openstack-operator-controller-manager-656fd97d56-fcmzw" Nov 29 05:41:47 crc kubenswrapper[4594]: I1129 05:41:47.303641 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/505c6e79-1776-4995-a6b5-5888f75c141c-webhook-certs\") pod \"openstack-operator-controller-manager-656fd97d56-fcmzw\" (UID: \"505c6e79-1776-4995-a6b5-5888f75c141c\") " pod="openstack-operators/openstack-operator-controller-manager-656fd97d56-fcmzw" Nov 29 05:41:47 crc kubenswrapper[4594]: I1129 05:41:47.384940 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-656fd97d56-fcmzw" Nov 29 05:41:47 crc kubenswrapper[4594]: I1129 05:41:47.389447 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6698bcb446zrvcn"] Nov 29 05:41:47 crc kubenswrapper[4594]: W1129 05:41:47.397604 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82d8e084_bca0_43f0_9d6f_63df84cd28a6.slice/crio-273128ba7009ce22329a650684f14bd17831754a18b0d1ef047b18365ff542ee WatchSource:0}: Error finding container 273128ba7009ce22329a650684f14bd17831754a18b0d1ef047b18365ff542ee: Status 404 returned error can't find the container with id 273128ba7009ce22329a650684f14bd17831754a18b0d1ef047b18365ff542ee Nov 29 05:41:47 crc kubenswrapper[4594]: I1129 05:41:47.751617 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-656fd97d56-fcmzw"] Nov 29 05:41:47 crc kubenswrapper[4594]: W1129 05:41:47.756014 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod505c6e79_1776_4995_a6b5_5888f75c141c.slice/crio-b88eb91fbffbab3806bf27df80c9903b3516e0d1d2151aab116e4b5e50feb0ef WatchSource:0}: Error finding container b88eb91fbffbab3806bf27df80c9903b3516e0d1d2151aab116e4b5e50feb0ef: Status 404 returned error can't find the container with id b88eb91fbffbab3806bf27df80c9903b3516e0d1d2151aab116e4b5e50feb0ef Nov 29 05:41:47 crc kubenswrapper[4594]: I1129 05:41:47.815776 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-6c7gr" event={"ID":"0ee70d2e-b283-468b-8bd8-016a120b5ae8","Type":"ContainerStarted","Data":"1bf2c97ccb559f6ad4817d1e46b8bed0453c36d43d1c79b17e89058b1c631bf1"} Nov 29 05:41:47 crc kubenswrapper[4594]: I1129 
05:41:47.817282 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-656fd97d56-fcmzw" event={"ID":"505c6e79-1776-4995-a6b5-5888f75c141c","Type":"ContainerStarted","Data":"b88eb91fbffbab3806bf27df80c9903b3516e0d1d2151aab116e4b5e50feb0ef"} Nov 29 05:41:47 crc kubenswrapper[4594]: I1129 05:41:47.819003 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6698bcb446zrvcn" event={"ID":"82d8e084-bca0-43f0-9d6f-63df84cd28a6","Type":"ContainerStarted","Data":"273128ba7009ce22329a650684f14bd17831754a18b0d1ef047b18365ff542ee"} Nov 29 05:41:49 crc kubenswrapper[4594]: I1129 05:41:49.838046 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-656fd97d56-fcmzw" event={"ID":"505c6e79-1776-4995-a6b5-5888f75c141c","Type":"ContainerStarted","Data":"8e0e1eb87c714148048ca22d5b5c077d3b051d42116fd537c2705525f92ed171"} Nov 29 05:41:49 crc kubenswrapper[4594]: I1129 05:41:49.838458 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-656fd97d56-fcmzw" Nov 29 05:41:49 crc kubenswrapper[4594]: I1129 05:41:49.863613 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-656fd97d56-fcmzw" podStartSLOduration=34.863595944 podStartE2EDuration="34.863595944s" podCreationTimestamp="2025-11-29 05:41:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:41:49.861513539 +0000 UTC m=+834.102022759" watchObservedRunningTime="2025-11-29 05:41:49.863595944 +0000 UTC m=+834.104105154" Nov 29 05:41:50 crc kubenswrapper[4594]: I1129 05:41:50.850692 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/infra-operator-controller-manager-57548d458d-6c7gr" event={"ID":"0ee70d2e-b283-468b-8bd8-016a120b5ae8","Type":"ContainerStarted","Data":"dcf3d19e0a57bdc3d308f84fb2266d1625749499b2e7e94c53d65054e651c34a"} Nov 29 05:41:50 crc kubenswrapper[4594]: I1129 05:41:50.851063 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-6c7gr" event={"ID":"0ee70d2e-b283-468b-8bd8-016a120b5ae8","Type":"ContainerStarted","Data":"5335a0367f7d46a10e75830c85a60fe3aa1c814c34094aa7819ff88b03cad53c"} Nov 29 05:41:50 crc kubenswrapper[4594]: I1129 05:41:50.872145 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-57548d458d-6c7gr" podStartSLOduration=33.610610595 podStartE2EDuration="36.872130836s" podCreationTimestamp="2025-11-29 05:41:14 +0000 UTC" firstStartedPulling="2025-11-29 05:41:47.107533411 +0000 UTC m=+831.348042632" lastFinishedPulling="2025-11-29 05:41:50.369053653 +0000 UTC m=+834.609562873" observedRunningTime="2025-11-29 05:41:50.866099456 +0000 UTC m=+835.106608676" watchObservedRunningTime="2025-11-29 05:41:50.872130836 +0000 UTC m=+835.112640046" Nov 29 05:41:51 crc kubenswrapper[4594]: I1129 05:41:51.859214 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6698bcb446zrvcn" event={"ID":"82d8e084-bca0-43f0-9d6f-63df84cd28a6","Type":"ContainerStarted","Data":"b3b4881340ff0f357df2f0fb5633cd5a1f62749a0b5ac9e992e52331784472e4"} Nov 29 05:41:51 crc kubenswrapper[4594]: I1129 05:41:51.859830 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6698bcb446zrvcn" event={"ID":"82d8e084-bca0-43f0-9d6f-63df84cd28a6","Type":"ContainerStarted","Data":"391642fb8afc1287ba77911fca305f0b90f7a2fc116a43e3dc06cc0a6c7c3c4f"} Nov 29 05:41:51 crc kubenswrapper[4594]: I1129 
05:41:51.859891 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-57548d458d-6c7gr" Nov 29 05:41:51 crc kubenswrapper[4594]: I1129 05:41:51.859906 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6698bcb446zrvcn" Nov 29 05:41:51 crc kubenswrapper[4594]: I1129 05:41:51.882432 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6698bcb446zrvcn" podStartSLOduration=34.24877039 podStartE2EDuration="37.882414877s" podCreationTimestamp="2025-11-29 05:41:14 +0000 UTC" firstStartedPulling="2025-11-29 05:41:47.401066064 +0000 UTC m=+831.641575283" lastFinishedPulling="2025-11-29 05:41:51.03471055 +0000 UTC m=+835.275219770" observedRunningTime="2025-11-29 05:41:51.881141312 +0000 UTC m=+836.121650532" watchObservedRunningTime="2025-11-29 05:41:51.882414877 +0000 UTC m=+836.122924097" Nov 29 05:41:56 crc kubenswrapper[4594]: I1129 05:41:56.743886 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-57548d458d-6c7gr" Nov 29 05:41:57 crc kubenswrapper[4594]: I1129 05:41:57.025245 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6698bcb446zrvcn" Nov 29 05:41:57 crc kubenswrapper[4594]: I1129 05:41:57.390376 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-656fd97d56-fcmzw" Nov 29 05:42:12 crc kubenswrapper[4594]: I1129 05:42:12.542643 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8468885bfc-h4dvz"] Nov 29 05:42:12 crc kubenswrapper[4594]: I1129 05:42:12.544195 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8468885bfc-h4dvz" Nov 29 05:42:12 crc kubenswrapper[4594]: I1129 05:42:12.549068 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-8jnpm" Nov 29 05:42:12 crc kubenswrapper[4594]: I1129 05:42:12.549085 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Nov 29 05:42:12 crc kubenswrapper[4594]: I1129 05:42:12.549603 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Nov 29 05:42:12 crc kubenswrapper[4594]: I1129 05:42:12.550632 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Nov 29 05:42:12 crc kubenswrapper[4594]: I1129 05:42:12.603858 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8468885bfc-h4dvz"] Nov 29 05:42:12 crc kubenswrapper[4594]: I1129 05:42:12.610474 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-545d49fd5c-7c8l9"] Nov 29 05:42:12 crc kubenswrapper[4594]: I1129 05:42:12.611843 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-545d49fd5c-7c8l9" Nov 29 05:42:12 crc kubenswrapper[4594]: I1129 05:42:12.617169 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Nov 29 05:42:12 crc kubenswrapper[4594]: I1129 05:42:12.626548 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-545d49fd5c-7c8l9"] Nov 29 05:42:12 crc kubenswrapper[4594]: I1129 05:42:12.732628 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7t5dc\" (UniqueName: \"kubernetes.io/projected/fb896c84-cf45-4033-b81c-456b230c2334-kube-api-access-7t5dc\") pod \"dnsmasq-dns-8468885bfc-h4dvz\" (UID: \"fb896c84-cf45-4033-b81c-456b230c2334\") " pod="openstack/dnsmasq-dns-8468885bfc-h4dvz" Nov 29 05:42:12 crc kubenswrapper[4594]: I1129 05:42:12.732795 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb896c84-cf45-4033-b81c-456b230c2334-config\") pod \"dnsmasq-dns-8468885bfc-h4dvz\" (UID: \"fb896c84-cf45-4033-b81c-456b230c2334\") " pod="openstack/dnsmasq-dns-8468885bfc-h4dvz" Nov 29 05:42:12 crc kubenswrapper[4594]: I1129 05:42:12.732828 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/887a7316-5262-4580-afea-bfeafc172e70-dns-svc\") pod \"dnsmasq-dns-545d49fd5c-7c8l9\" (UID: \"887a7316-5262-4580-afea-bfeafc172e70\") " pod="openstack/dnsmasq-dns-545d49fd5c-7c8l9" Nov 29 05:42:12 crc kubenswrapper[4594]: I1129 05:42:12.733046 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nzr8\" (UniqueName: \"kubernetes.io/projected/887a7316-5262-4580-afea-bfeafc172e70-kube-api-access-9nzr8\") pod \"dnsmasq-dns-545d49fd5c-7c8l9\" (UID: \"887a7316-5262-4580-afea-bfeafc172e70\") " 
pod="openstack/dnsmasq-dns-545d49fd5c-7c8l9" Nov 29 05:42:12 crc kubenswrapper[4594]: I1129 05:42:12.733208 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/887a7316-5262-4580-afea-bfeafc172e70-config\") pod \"dnsmasq-dns-545d49fd5c-7c8l9\" (UID: \"887a7316-5262-4580-afea-bfeafc172e70\") " pod="openstack/dnsmasq-dns-545d49fd5c-7c8l9" Nov 29 05:42:12 crc kubenswrapper[4594]: I1129 05:42:12.834537 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nzr8\" (UniqueName: \"kubernetes.io/projected/887a7316-5262-4580-afea-bfeafc172e70-kube-api-access-9nzr8\") pod \"dnsmasq-dns-545d49fd5c-7c8l9\" (UID: \"887a7316-5262-4580-afea-bfeafc172e70\") " pod="openstack/dnsmasq-dns-545d49fd5c-7c8l9" Nov 29 05:42:12 crc kubenswrapper[4594]: I1129 05:42:12.834598 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/887a7316-5262-4580-afea-bfeafc172e70-config\") pod \"dnsmasq-dns-545d49fd5c-7c8l9\" (UID: \"887a7316-5262-4580-afea-bfeafc172e70\") " pod="openstack/dnsmasq-dns-545d49fd5c-7c8l9" Nov 29 05:42:12 crc kubenswrapper[4594]: I1129 05:42:12.834670 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7t5dc\" (UniqueName: \"kubernetes.io/projected/fb896c84-cf45-4033-b81c-456b230c2334-kube-api-access-7t5dc\") pod \"dnsmasq-dns-8468885bfc-h4dvz\" (UID: \"fb896c84-cf45-4033-b81c-456b230c2334\") " pod="openstack/dnsmasq-dns-8468885bfc-h4dvz" Nov 29 05:42:12 crc kubenswrapper[4594]: I1129 05:42:12.834709 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb896c84-cf45-4033-b81c-456b230c2334-config\") pod \"dnsmasq-dns-8468885bfc-h4dvz\" (UID: \"fb896c84-cf45-4033-b81c-456b230c2334\") " pod="openstack/dnsmasq-dns-8468885bfc-h4dvz" Nov 29 
05:42:12 crc kubenswrapper[4594]: I1129 05:42:12.834734 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/887a7316-5262-4580-afea-bfeafc172e70-dns-svc\") pod \"dnsmasq-dns-545d49fd5c-7c8l9\" (UID: \"887a7316-5262-4580-afea-bfeafc172e70\") " pod="openstack/dnsmasq-dns-545d49fd5c-7c8l9" Nov 29 05:42:12 crc kubenswrapper[4594]: I1129 05:42:12.835599 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb896c84-cf45-4033-b81c-456b230c2334-config\") pod \"dnsmasq-dns-8468885bfc-h4dvz\" (UID: \"fb896c84-cf45-4033-b81c-456b230c2334\") " pod="openstack/dnsmasq-dns-8468885bfc-h4dvz" Nov 29 05:42:12 crc kubenswrapper[4594]: I1129 05:42:12.835613 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/887a7316-5262-4580-afea-bfeafc172e70-dns-svc\") pod \"dnsmasq-dns-545d49fd5c-7c8l9\" (UID: \"887a7316-5262-4580-afea-bfeafc172e70\") " pod="openstack/dnsmasq-dns-545d49fd5c-7c8l9" Nov 29 05:42:12 crc kubenswrapper[4594]: I1129 05:42:12.835616 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/887a7316-5262-4580-afea-bfeafc172e70-config\") pod \"dnsmasq-dns-545d49fd5c-7c8l9\" (UID: \"887a7316-5262-4580-afea-bfeafc172e70\") " pod="openstack/dnsmasq-dns-545d49fd5c-7c8l9" Nov 29 05:42:12 crc kubenswrapper[4594]: I1129 05:42:12.852839 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7t5dc\" (UniqueName: \"kubernetes.io/projected/fb896c84-cf45-4033-b81c-456b230c2334-kube-api-access-7t5dc\") pod \"dnsmasq-dns-8468885bfc-h4dvz\" (UID: \"fb896c84-cf45-4033-b81c-456b230c2334\") " pod="openstack/dnsmasq-dns-8468885bfc-h4dvz" Nov 29 05:42:12 crc kubenswrapper[4594]: I1129 05:42:12.853643 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-9nzr8\" (UniqueName: \"kubernetes.io/projected/887a7316-5262-4580-afea-bfeafc172e70-kube-api-access-9nzr8\") pod \"dnsmasq-dns-545d49fd5c-7c8l9\" (UID: \"887a7316-5262-4580-afea-bfeafc172e70\") " pod="openstack/dnsmasq-dns-545d49fd5c-7c8l9" Nov 29 05:42:12 crc kubenswrapper[4594]: I1129 05:42:12.863487 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8468885bfc-h4dvz" Nov 29 05:42:12 crc kubenswrapper[4594]: I1129 05:42:12.953739 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-545d49fd5c-7c8l9" Nov 29 05:42:13 crc kubenswrapper[4594]: I1129 05:42:13.252662 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8468885bfc-h4dvz"] Nov 29 05:42:13 crc kubenswrapper[4594]: W1129 05:42:13.253990 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfb896c84_cf45_4033_b81c_456b230c2334.slice/crio-46b06bf2829752512a4ec61f26d52426a52573082941e6da594c97ee276d9f88 WatchSource:0}: Error finding container 46b06bf2829752512a4ec61f26d52426a52573082941e6da594c97ee276d9f88: Status 404 returned error can't find the container with id 46b06bf2829752512a4ec61f26d52426a52573082941e6da594c97ee276d9f88 Nov 29 05:42:13 crc kubenswrapper[4594]: I1129 05:42:13.353857 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-545d49fd5c-7c8l9"] Nov 29 05:42:14 crc kubenswrapper[4594]: I1129 05:42:14.005630 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-545d49fd5c-7c8l9" event={"ID":"887a7316-5262-4580-afea-bfeafc172e70","Type":"ContainerStarted","Data":"a18978850745007e59c6422483d800ecd80381d890bc4e4c1a3e7e5328d086a3"} Nov 29 05:42:14 crc kubenswrapper[4594]: I1129 05:42:14.007067 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8468885bfc-h4dvz" 
event={"ID":"fb896c84-cf45-4033-b81c-456b230c2334","Type":"ContainerStarted","Data":"46b06bf2829752512a4ec61f26d52426a52573082941e6da594c97ee276d9f88"} Nov 29 05:42:15 crc kubenswrapper[4594]: I1129 05:42:15.800004 4594 patch_prober.go:28] interesting pod/machine-config-daemon-ggz4n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 05:42:15 crc kubenswrapper[4594]: I1129 05:42:15.800412 4594 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 05:42:15 crc kubenswrapper[4594]: I1129 05:42:15.800498 4594 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" Nov 29 05:42:15 crc kubenswrapper[4594]: I1129 05:42:15.801390 4594 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3d6477f346180f42d9b7737089f66087fbd7adf1dc0feaf01168195b349570d2"} pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 29 05:42:15 crc kubenswrapper[4594]: I1129 05:42:15.801490 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" containerName="machine-config-daemon" containerID="cri-o://3d6477f346180f42d9b7737089f66087fbd7adf1dc0feaf01168195b349570d2" gracePeriod=600 Nov 29 05:42:16 crc kubenswrapper[4594]: I1129 05:42:16.026714 
4594 generic.go:334] "Generic (PLEG): container finished" podID="509844d4-3ee1-4059-84bb-6e90200f50c5" containerID="3d6477f346180f42d9b7737089f66087fbd7adf1dc0feaf01168195b349570d2" exitCode=0 Nov 29 05:42:16 crc kubenswrapper[4594]: I1129 05:42:16.026779 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" event={"ID":"509844d4-3ee1-4059-84bb-6e90200f50c5","Type":"ContainerDied","Data":"3d6477f346180f42d9b7737089f66087fbd7adf1dc0feaf01168195b349570d2"} Nov 29 05:42:16 crc kubenswrapper[4594]: I1129 05:42:16.026857 4594 scope.go:117] "RemoveContainer" containerID="7993c3c5c143399d1d6e2423a11b6f37b4521819ab268ad6d3a203f1a0f238a6" Nov 29 05:42:16 crc kubenswrapper[4594]: I1129 05:42:16.741126 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-545d49fd5c-7c8l9"] Nov 29 05:42:16 crc kubenswrapper[4594]: I1129 05:42:16.781789 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b9b4959cc-rkqq4"] Nov 29 05:42:16 crc kubenswrapper[4594]: I1129 05:42:16.784016 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b9b4959cc-rkqq4" Nov 29 05:42:16 crc kubenswrapper[4594]: I1129 05:42:16.793032 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b9b4959cc-rkqq4"] Nov 29 05:42:16 crc kubenswrapper[4594]: I1129 05:42:16.804233 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80f43bd8-bee7-4801-a4c7-fd10bec175a2-config\") pod \"dnsmasq-dns-b9b4959cc-rkqq4\" (UID: \"80f43bd8-bee7-4801-a4c7-fd10bec175a2\") " pod="openstack/dnsmasq-dns-b9b4959cc-rkqq4" Nov 29 05:42:16 crc kubenswrapper[4594]: I1129 05:42:16.804313 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/80f43bd8-bee7-4801-a4c7-fd10bec175a2-dns-svc\") pod \"dnsmasq-dns-b9b4959cc-rkqq4\" (UID: \"80f43bd8-bee7-4801-a4c7-fd10bec175a2\") " pod="openstack/dnsmasq-dns-b9b4959cc-rkqq4" Nov 29 05:42:16 crc kubenswrapper[4594]: I1129 05:42:16.804340 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmfr2\" (UniqueName: \"kubernetes.io/projected/80f43bd8-bee7-4801-a4c7-fd10bec175a2-kube-api-access-nmfr2\") pod \"dnsmasq-dns-b9b4959cc-rkqq4\" (UID: \"80f43bd8-bee7-4801-a4c7-fd10bec175a2\") " pod="openstack/dnsmasq-dns-b9b4959cc-rkqq4" Nov 29 05:42:16 crc kubenswrapper[4594]: I1129 05:42:16.905677 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/80f43bd8-bee7-4801-a4c7-fd10bec175a2-dns-svc\") pod \"dnsmasq-dns-b9b4959cc-rkqq4\" (UID: \"80f43bd8-bee7-4801-a4c7-fd10bec175a2\") " pod="openstack/dnsmasq-dns-b9b4959cc-rkqq4" Nov 29 05:42:16 crc kubenswrapper[4594]: I1129 05:42:16.905729 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmfr2\" (UniqueName: 
\"kubernetes.io/projected/80f43bd8-bee7-4801-a4c7-fd10bec175a2-kube-api-access-nmfr2\") pod \"dnsmasq-dns-b9b4959cc-rkqq4\" (UID: \"80f43bd8-bee7-4801-a4c7-fd10bec175a2\") " pod="openstack/dnsmasq-dns-b9b4959cc-rkqq4" Nov 29 05:42:16 crc kubenswrapper[4594]: I1129 05:42:16.905903 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80f43bd8-bee7-4801-a4c7-fd10bec175a2-config\") pod \"dnsmasq-dns-b9b4959cc-rkqq4\" (UID: \"80f43bd8-bee7-4801-a4c7-fd10bec175a2\") " pod="openstack/dnsmasq-dns-b9b4959cc-rkqq4" Nov 29 05:42:16 crc kubenswrapper[4594]: I1129 05:42:16.906759 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80f43bd8-bee7-4801-a4c7-fd10bec175a2-config\") pod \"dnsmasq-dns-b9b4959cc-rkqq4\" (UID: \"80f43bd8-bee7-4801-a4c7-fd10bec175a2\") " pod="openstack/dnsmasq-dns-b9b4959cc-rkqq4" Nov 29 05:42:16 crc kubenswrapper[4594]: I1129 05:42:16.907029 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/80f43bd8-bee7-4801-a4c7-fd10bec175a2-dns-svc\") pod \"dnsmasq-dns-b9b4959cc-rkqq4\" (UID: \"80f43bd8-bee7-4801-a4c7-fd10bec175a2\") " pod="openstack/dnsmasq-dns-b9b4959cc-rkqq4" Nov 29 05:42:16 crc kubenswrapper[4594]: I1129 05:42:16.962489 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmfr2\" (UniqueName: \"kubernetes.io/projected/80f43bd8-bee7-4801-a4c7-fd10bec175a2-kube-api-access-nmfr2\") pod \"dnsmasq-dns-b9b4959cc-rkqq4\" (UID: \"80f43bd8-bee7-4801-a4c7-fd10bec175a2\") " pod="openstack/dnsmasq-dns-b9b4959cc-rkqq4" Nov 29 05:42:16 crc kubenswrapper[4594]: I1129 05:42:16.998789 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8468885bfc-h4dvz"] Nov 29 05:42:17 crc kubenswrapper[4594]: I1129 05:42:17.026862 4594 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-86b8f4ff9-lspz6"] Nov 29 05:42:17 crc kubenswrapper[4594]: I1129 05:42:17.028246 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86b8f4ff9-lspz6" Nov 29 05:42:17 crc kubenswrapper[4594]: I1129 05:42:17.034147 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86b8f4ff9-lspz6"] Nov 29 05:42:17 crc kubenswrapper[4594]: I1129 05:42:17.054457 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" event={"ID":"509844d4-3ee1-4059-84bb-6e90200f50c5","Type":"ContainerStarted","Data":"f65325fd45322d2eca6d68603b2fa4746238184f283f293edabcb7b19e64c595"} Nov 29 05:42:17 crc kubenswrapper[4594]: I1129 05:42:17.110213 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2662fc9-873c-4a27-b335-a1c9ad1e8be1-config\") pod \"dnsmasq-dns-86b8f4ff9-lspz6\" (UID: \"c2662fc9-873c-4a27-b335-a1c9ad1e8be1\") " pod="openstack/dnsmasq-dns-86b8f4ff9-lspz6" Nov 29 05:42:17 crc kubenswrapper[4594]: I1129 05:42:17.110317 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c2662fc9-873c-4a27-b335-a1c9ad1e8be1-dns-svc\") pod \"dnsmasq-dns-86b8f4ff9-lspz6\" (UID: \"c2662fc9-873c-4a27-b335-a1c9ad1e8be1\") " pod="openstack/dnsmasq-dns-86b8f4ff9-lspz6" Nov 29 05:42:17 crc kubenswrapper[4594]: I1129 05:42:17.110353 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrvn9\" (UniqueName: \"kubernetes.io/projected/c2662fc9-873c-4a27-b335-a1c9ad1e8be1-kube-api-access-lrvn9\") pod \"dnsmasq-dns-86b8f4ff9-lspz6\" (UID: \"c2662fc9-873c-4a27-b335-a1c9ad1e8be1\") " pod="openstack/dnsmasq-dns-86b8f4ff9-lspz6" Nov 29 05:42:17 crc kubenswrapper[4594]: I1129 05:42:17.119397 
4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b9b4959cc-rkqq4"
Nov 29 05:42:17 crc kubenswrapper[4594]: I1129 05:42:17.212461 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2662fc9-873c-4a27-b335-a1c9ad1e8be1-config\") pod \"dnsmasq-dns-86b8f4ff9-lspz6\" (UID: \"c2662fc9-873c-4a27-b335-a1c9ad1e8be1\") " pod="openstack/dnsmasq-dns-86b8f4ff9-lspz6"
Nov 29 05:42:17 crc kubenswrapper[4594]: I1129 05:42:17.212511 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c2662fc9-873c-4a27-b335-a1c9ad1e8be1-dns-svc\") pod \"dnsmasq-dns-86b8f4ff9-lspz6\" (UID: \"c2662fc9-873c-4a27-b335-a1c9ad1e8be1\") " pod="openstack/dnsmasq-dns-86b8f4ff9-lspz6"
Nov 29 05:42:17 crc kubenswrapper[4594]: I1129 05:42:17.212535 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrvn9\" (UniqueName: \"kubernetes.io/projected/c2662fc9-873c-4a27-b335-a1c9ad1e8be1-kube-api-access-lrvn9\") pod \"dnsmasq-dns-86b8f4ff9-lspz6\" (UID: \"c2662fc9-873c-4a27-b335-a1c9ad1e8be1\") " pod="openstack/dnsmasq-dns-86b8f4ff9-lspz6"
Nov 29 05:42:17 crc kubenswrapper[4594]: I1129 05:42:17.214136 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2662fc9-873c-4a27-b335-a1c9ad1e8be1-config\") pod \"dnsmasq-dns-86b8f4ff9-lspz6\" (UID: \"c2662fc9-873c-4a27-b335-a1c9ad1e8be1\") " pod="openstack/dnsmasq-dns-86b8f4ff9-lspz6"
Nov 29 05:42:17 crc kubenswrapper[4594]: I1129 05:42:17.214235 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c2662fc9-873c-4a27-b335-a1c9ad1e8be1-dns-svc\") pod \"dnsmasq-dns-86b8f4ff9-lspz6\" (UID: \"c2662fc9-873c-4a27-b335-a1c9ad1e8be1\") " pod="openstack/dnsmasq-dns-86b8f4ff9-lspz6"
Nov 29 05:42:17 crc kubenswrapper[4594]: I1129 05:42:17.246021 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrvn9\" (UniqueName: \"kubernetes.io/projected/c2662fc9-873c-4a27-b335-a1c9ad1e8be1-kube-api-access-lrvn9\") pod \"dnsmasq-dns-86b8f4ff9-lspz6\" (UID: \"c2662fc9-873c-4a27-b335-a1c9ad1e8be1\") " pod="openstack/dnsmasq-dns-86b8f4ff9-lspz6"
Nov 29 05:42:17 crc kubenswrapper[4594]: I1129 05:42:17.256413 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86b8f4ff9-lspz6"]
Nov 29 05:42:17 crc kubenswrapper[4594]: I1129 05:42:17.267580 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86b8f4ff9-lspz6"
Nov 29 05:42:17 crc kubenswrapper[4594]: I1129 05:42:17.284269 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5449989c59-lxq8g"]
Nov 29 05:42:17 crc kubenswrapper[4594]: I1129 05:42:17.298546 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5449989c59-lxq8g"
Nov 29 05:42:17 crc kubenswrapper[4594]: I1129 05:42:17.301945 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5449989c59-lxq8g"]
Nov 29 05:42:17 crc kubenswrapper[4594]: I1129 05:42:17.313747 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nzrw\" (UniqueName: \"kubernetes.io/projected/a27c925c-13c4-44c2-9ca6-c4ca0c7e2e6c-kube-api-access-9nzrw\") pod \"dnsmasq-dns-5449989c59-lxq8g\" (UID: \"a27c925c-13c4-44c2-9ca6-c4ca0c7e2e6c\") " pod="openstack/dnsmasq-dns-5449989c59-lxq8g"
Nov 29 05:42:17 crc kubenswrapper[4594]: I1129 05:42:17.313828 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a27c925c-13c4-44c2-9ca6-c4ca0c7e2e6c-config\") pod \"dnsmasq-dns-5449989c59-lxq8g\" (UID: \"a27c925c-13c4-44c2-9ca6-c4ca0c7e2e6c\") " pod="openstack/dnsmasq-dns-5449989c59-lxq8g"
Nov 29 05:42:17 crc kubenswrapper[4594]: I1129 05:42:17.313896 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a27c925c-13c4-44c2-9ca6-c4ca0c7e2e6c-dns-svc\") pod \"dnsmasq-dns-5449989c59-lxq8g\" (UID: \"a27c925c-13c4-44c2-9ca6-c4ca0c7e2e6c\") " pod="openstack/dnsmasq-dns-5449989c59-lxq8g"
Nov 29 05:42:17 crc kubenswrapper[4594]: I1129 05:42:17.415023 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nzrw\" (UniqueName: \"kubernetes.io/projected/a27c925c-13c4-44c2-9ca6-c4ca0c7e2e6c-kube-api-access-9nzrw\") pod \"dnsmasq-dns-5449989c59-lxq8g\" (UID: \"a27c925c-13c4-44c2-9ca6-c4ca0c7e2e6c\") " pod="openstack/dnsmasq-dns-5449989c59-lxq8g"
Nov 29 05:42:17 crc kubenswrapper[4594]: I1129 05:42:17.418402 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a27c925c-13c4-44c2-9ca6-c4ca0c7e2e6c-config\") pod \"dnsmasq-dns-5449989c59-lxq8g\" (UID: \"a27c925c-13c4-44c2-9ca6-c4ca0c7e2e6c\") " pod="openstack/dnsmasq-dns-5449989c59-lxq8g"
Nov 29 05:42:17 crc kubenswrapper[4594]: I1129 05:42:17.418504 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a27c925c-13c4-44c2-9ca6-c4ca0c7e2e6c-dns-svc\") pod \"dnsmasq-dns-5449989c59-lxq8g\" (UID: \"a27c925c-13c4-44c2-9ca6-c4ca0c7e2e6c\") " pod="openstack/dnsmasq-dns-5449989c59-lxq8g"
Nov 29 05:42:17 crc kubenswrapper[4594]: I1129 05:42:17.419448 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a27c925c-13c4-44c2-9ca6-c4ca0c7e2e6c-dns-svc\") pod \"dnsmasq-dns-5449989c59-lxq8g\" (UID: \"a27c925c-13c4-44c2-9ca6-c4ca0c7e2e6c\") " pod="openstack/dnsmasq-dns-5449989c59-lxq8g"
Nov 29 05:42:17 crc kubenswrapper[4594]: I1129 05:42:17.424774 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a27c925c-13c4-44c2-9ca6-c4ca0c7e2e6c-config\") pod \"dnsmasq-dns-5449989c59-lxq8g\" (UID: \"a27c925c-13c4-44c2-9ca6-c4ca0c7e2e6c\") " pod="openstack/dnsmasq-dns-5449989c59-lxq8g"
Nov 29 05:42:17 crc kubenswrapper[4594]: I1129 05:42:17.437002 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nzrw\" (UniqueName: \"kubernetes.io/projected/a27c925c-13c4-44c2-9ca6-c4ca0c7e2e6c-kube-api-access-9nzrw\") pod \"dnsmasq-dns-5449989c59-lxq8g\" (UID: \"a27c925c-13c4-44c2-9ca6-c4ca0c7e2e6c\") " pod="openstack/dnsmasq-dns-5449989c59-lxq8g"
Nov 29 05:42:17 crc kubenswrapper[4594]: I1129 05:42:17.623521 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5449989c59-lxq8g"
Nov 29 05:42:17 crc kubenswrapper[4594]: I1129 05:42:17.806793 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b9b4959cc-rkqq4"]
Nov 29 05:42:17 crc kubenswrapper[4594]: I1129 05:42:17.890498 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Nov 29 05:42:17 crc kubenswrapper[4594]: I1129 05:42:17.891814 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Nov 29 05:42:17 crc kubenswrapper[4594]: I1129 05:42:17.894236 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc"
Nov 29 05:42:17 crc kubenswrapper[4594]: I1129 05:42:17.894547 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Nov 29 05:42:17 crc kubenswrapper[4594]: I1129 05:42:17.898605 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Nov 29 05:42:17 crc kubenswrapper[4594]: I1129 05:42:17.898831 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Nov 29 05:42:17 crc kubenswrapper[4594]: I1129 05:42:17.900642 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data"
Nov 29 05:42:17 crc kubenswrapper[4594]: I1129 05:42:17.900872 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-sxw2s"
Nov 29 05:42:17 crc kubenswrapper[4594]: I1129 05:42:17.907349 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
Nov 29 05:42:17 crc kubenswrapper[4594]: I1129 05:42:17.911337 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86b8f4ff9-lspz6"]
Nov 29 05:42:17 crc kubenswrapper[4594]: I1129 05:42:17.924276 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Nov 29 05:42:17 crc kubenswrapper[4594]: I1129 05:42:17.929124 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1ff736d8-8719-402e-95c9-1d790c1dff5e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"1ff736d8-8719-402e-95c9-1d790c1dff5e\") " pod="openstack/rabbitmq-server-0"
Nov 29 05:42:17 crc kubenswrapper[4594]: I1129 05:42:17.929170 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1ff736d8-8719-402e-95c9-1d790c1dff5e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"1ff736d8-8719-402e-95c9-1d790c1dff5e\") " pod="openstack/rabbitmq-server-0"
Nov 29 05:42:17 crc kubenswrapper[4594]: I1129 05:42:17.929223 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbbsd\" (UniqueName: \"kubernetes.io/projected/1ff736d8-8719-402e-95c9-1d790c1dff5e-kube-api-access-wbbsd\") pod \"rabbitmq-server-0\" (UID: \"1ff736d8-8719-402e-95c9-1d790c1dff5e\") " pod="openstack/rabbitmq-server-0"
Nov 29 05:42:17 crc kubenswrapper[4594]: I1129 05:42:17.929242 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1ff736d8-8719-402e-95c9-1d790c1dff5e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"1ff736d8-8719-402e-95c9-1d790c1dff5e\") " pod="openstack/rabbitmq-server-0"
Nov 29 05:42:17 crc kubenswrapper[4594]: I1129 05:42:17.929290 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"1ff736d8-8719-402e-95c9-1d790c1dff5e\") " pod="openstack/rabbitmq-server-0"
Nov 29 05:42:17 crc kubenswrapper[4594]: I1129 05:42:17.929342 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1ff736d8-8719-402e-95c9-1d790c1dff5e-config-data\") pod \"rabbitmq-server-0\" (UID: \"1ff736d8-8719-402e-95c9-1d790c1dff5e\") " pod="openstack/rabbitmq-server-0"
Nov 29 05:42:17 crc kubenswrapper[4594]: I1129 05:42:17.929383 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1ff736d8-8719-402e-95c9-1d790c1dff5e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"1ff736d8-8719-402e-95c9-1d790c1dff5e\") " pod="openstack/rabbitmq-server-0"
Nov 29 05:42:17 crc kubenswrapper[4594]: I1129 05:42:17.929406 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1ff736d8-8719-402e-95c9-1d790c1dff5e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"1ff736d8-8719-402e-95c9-1d790c1dff5e\") " pod="openstack/rabbitmq-server-0"
Nov 29 05:42:17 crc kubenswrapper[4594]: I1129 05:42:17.929424 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1ff736d8-8719-402e-95c9-1d790c1dff5e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"1ff736d8-8719-402e-95c9-1d790c1dff5e\") " pod="openstack/rabbitmq-server-0"
Nov 29 05:42:17 crc kubenswrapper[4594]: I1129 05:42:17.929469 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1ff736d8-8719-402e-95c9-1d790c1dff5e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"1ff736d8-8719-402e-95c9-1d790c1dff5e\") " pod="openstack/rabbitmq-server-0"
Nov 29 05:42:17 crc kubenswrapper[4594]: I1129 05:42:17.929513 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1ff736d8-8719-402e-95c9-1d790c1dff5e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"1ff736d8-8719-402e-95c9-1d790c1dff5e\") " pod="openstack/rabbitmq-server-0"
Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.030342 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"1ff736d8-8719-402e-95c9-1d790c1dff5e\") " pod="openstack/rabbitmq-server-0"
Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.030410 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1ff736d8-8719-402e-95c9-1d790c1dff5e-config-data\") pod \"rabbitmq-server-0\" (UID: \"1ff736d8-8719-402e-95c9-1d790c1dff5e\") " pod="openstack/rabbitmq-server-0"
Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.030465 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1ff736d8-8719-402e-95c9-1d790c1dff5e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"1ff736d8-8719-402e-95c9-1d790c1dff5e\") " pod="openstack/rabbitmq-server-0"
Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.030500 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1ff736d8-8719-402e-95c9-1d790c1dff5e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"1ff736d8-8719-402e-95c9-1d790c1dff5e\") " pod="openstack/rabbitmq-server-0"
Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.030524 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1ff736d8-8719-402e-95c9-1d790c1dff5e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"1ff736d8-8719-402e-95c9-1d790c1dff5e\") " pod="openstack/rabbitmq-server-0"
Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.030559 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1ff736d8-8719-402e-95c9-1d790c1dff5e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"1ff736d8-8719-402e-95c9-1d790c1dff5e\") " pod="openstack/rabbitmq-server-0"
Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.030603 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1ff736d8-8719-402e-95c9-1d790c1dff5e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"1ff736d8-8719-402e-95c9-1d790c1dff5e\") " pod="openstack/rabbitmq-server-0"
Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.030634 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1ff736d8-8719-402e-95c9-1d790c1dff5e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"1ff736d8-8719-402e-95c9-1d790c1dff5e\") " pod="openstack/rabbitmq-server-0"
Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.030649 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1ff736d8-8719-402e-95c9-1d790c1dff5e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"1ff736d8-8719-402e-95c9-1d790c1dff5e\") " pod="openstack/rabbitmq-server-0"
Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.030680 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbbsd\" (UniqueName: \"kubernetes.io/projected/1ff736d8-8719-402e-95c9-1d790c1dff5e-kube-api-access-wbbsd\") pod \"rabbitmq-server-0\" (UID: \"1ff736d8-8719-402e-95c9-1d790c1dff5e\") " pod="openstack/rabbitmq-server-0"
Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.030702 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1ff736d8-8719-402e-95c9-1d790c1dff5e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"1ff736d8-8719-402e-95c9-1d790c1dff5e\") " pod="openstack/rabbitmq-server-0"
Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.030739 4594 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"1ff736d8-8719-402e-95c9-1d790c1dff5e\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-server-0"
Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.031846 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1ff736d8-8719-402e-95c9-1d790c1dff5e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"1ff736d8-8719-402e-95c9-1d790c1dff5e\") " pod="openstack/rabbitmq-server-0"
Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.032139 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1ff736d8-8719-402e-95c9-1d790c1dff5e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"1ff736d8-8719-402e-95c9-1d790c1dff5e\") " pod="openstack/rabbitmq-server-0"
Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.032188 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1ff736d8-8719-402e-95c9-1d790c1dff5e-config-data\") pod \"rabbitmq-server-0\" (UID: \"1ff736d8-8719-402e-95c9-1d790c1dff5e\") " pod="openstack/rabbitmq-server-0"
Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.032243 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1ff736d8-8719-402e-95c9-1d790c1dff5e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"1ff736d8-8719-402e-95c9-1d790c1dff5e\") " pod="openstack/rabbitmq-server-0"
Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.032372 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1ff736d8-8719-402e-95c9-1d790c1dff5e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"1ff736d8-8719-402e-95c9-1d790c1dff5e\") " pod="openstack/rabbitmq-server-0"
Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.036441 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1ff736d8-8719-402e-95c9-1d790c1dff5e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"1ff736d8-8719-402e-95c9-1d790c1dff5e\") " pod="openstack/rabbitmq-server-0"
Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.037277 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1ff736d8-8719-402e-95c9-1d790c1dff5e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"1ff736d8-8719-402e-95c9-1d790c1dff5e\") " pod="openstack/rabbitmq-server-0"
Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.038818 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1ff736d8-8719-402e-95c9-1d790c1dff5e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"1ff736d8-8719-402e-95c9-1d790c1dff5e\") " pod="openstack/rabbitmq-server-0"
Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.039682 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1ff736d8-8719-402e-95c9-1d790c1dff5e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"1ff736d8-8719-402e-95c9-1d790c1dff5e\") " pod="openstack/rabbitmq-server-0"
Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.047853 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"1ff736d8-8719-402e-95c9-1d790c1dff5e\") " pod="openstack/rabbitmq-server-0"
Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.048614 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbbsd\" (UniqueName: \"kubernetes.io/projected/1ff736d8-8719-402e-95c9-1d790c1dff5e-kube-api-access-wbbsd\") pod \"rabbitmq-server-0\" (UID: \"1ff736d8-8719-402e-95c9-1d790c1dff5e\") " pod="openstack/rabbitmq-server-0"
Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.064279 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86b8f4ff9-lspz6" event={"ID":"c2662fc9-873c-4a27-b335-a1c9ad1e8be1","Type":"ContainerStarted","Data":"ef0439ce2ac95e1240ded9eca93f9bfe0fe0808e42018a12d645f8fc199948b3"}
Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.065777 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b9b4959cc-rkqq4" event={"ID":"80f43bd8-bee7-4801-a4c7-fd10bec175a2","Type":"ContainerStarted","Data":"05cab3c3ed5a193f065295ab810be9812266fc55ad00719a6c07852d91ca6e9f"}
Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.133670 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.134854 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.137828 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data"
Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.138063 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie"
Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.138179 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user"
Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.140512 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf"
Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.140675 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf"
Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.140800 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc"
Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.142014 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-nlhpq"
Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.157641 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.177659 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5449989c59-lxq8g"]
Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.230016 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.334624 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/47b6950d-0d97-486d-aaec-2a9eaaf74027-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"47b6950d-0d97-486d-aaec-2a9eaaf74027\") " pod="openstack/rabbitmq-cell1-server-0"
Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.334740 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/47b6950d-0d97-486d-aaec-2a9eaaf74027-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"47b6950d-0d97-486d-aaec-2a9eaaf74027\") " pod="openstack/rabbitmq-cell1-server-0"
Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.334768 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/47b6950d-0d97-486d-aaec-2a9eaaf74027-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"47b6950d-0d97-486d-aaec-2a9eaaf74027\") " pod="openstack/rabbitmq-cell1-server-0"
Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.334831 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/47b6950d-0d97-486d-aaec-2a9eaaf74027-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"47b6950d-0d97-486d-aaec-2a9eaaf74027\") " pod="openstack/rabbitmq-cell1-server-0"
Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.334909 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/47b6950d-0d97-486d-aaec-2a9eaaf74027-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"47b6950d-0d97-486d-aaec-2a9eaaf74027\") " pod="openstack/rabbitmq-cell1-server-0"
Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.334952 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ck52\" (UniqueName: \"kubernetes.io/projected/47b6950d-0d97-486d-aaec-2a9eaaf74027-kube-api-access-6ck52\") pod \"rabbitmq-cell1-server-0\" (UID: \"47b6950d-0d97-486d-aaec-2a9eaaf74027\") " pod="openstack/rabbitmq-cell1-server-0"
Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.335034 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"47b6950d-0d97-486d-aaec-2a9eaaf74027\") " pod="openstack/rabbitmq-cell1-server-0"
Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.335074 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/47b6950d-0d97-486d-aaec-2a9eaaf74027-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"47b6950d-0d97-486d-aaec-2a9eaaf74027\") " pod="openstack/rabbitmq-cell1-server-0"
Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.335101 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/47b6950d-0d97-486d-aaec-2a9eaaf74027-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"47b6950d-0d97-486d-aaec-2a9eaaf74027\") " pod="openstack/rabbitmq-cell1-server-0"
Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.335124 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/47b6950d-0d97-486d-aaec-2a9eaaf74027-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"47b6950d-0d97-486d-aaec-2a9eaaf74027\") " pod="openstack/rabbitmq-cell1-server-0"
Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.335238 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/47b6950d-0d97-486d-aaec-2a9eaaf74027-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"47b6950d-0d97-486d-aaec-2a9eaaf74027\") " pod="openstack/rabbitmq-cell1-server-0"
Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.440167 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/47b6950d-0d97-486d-aaec-2a9eaaf74027-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"47b6950d-0d97-486d-aaec-2a9eaaf74027\") " pod="openstack/rabbitmq-cell1-server-0"
Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.440325 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ck52\" (UniqueName: \"kubernetes.io/projected/47b6950d-0d97-486d-aaec-2a9eaaf74027-kube-api-access-6ck52\") pod \"rabbitmq-cell1-server-0\" (UID: \"47b6950d-0d97-486d-aaec-2a9eaaf74027\") " pod="openstack/rabbitmq-cell1-server-0"
Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.440417 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"47b6950d-0d97-486d-aaec-2a9eaaf74027\") " pod="openstack/rabbitmq-cell1-server-0"
Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.440483 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/47b6950d-0d97-486d-aaec-2a9eaaf74027-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"47b6950d-0d97-486d-aaec-2a9eaaf74027\") " pod="openstack/rabbitmq-cell1-server-0"
Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.440557 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/47b6950d-0d97-486d-aaec-2a9eaaf74027-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"47b6950d-0d97-486d-aaec-2a9eaaf74027\") " pod="openstack/rabbitmq-cell1-server-0"
Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.440606 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/47b6950d-0d97-486d-aaec-2a9eaaf74027-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"47b6950d-0d97-486d-aaec-2a9eaaf74027\") " pod="openstack/rabbitmq-cell1-server-0"
Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.440726 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/47b6950d-0d97-486d-aaec-2a9eaaf74027-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"47b6950d-0d97-486d-aaec-2a9eaaf74027\") " pod="openstack/rabbitmq-cell1-server-0"
Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.440853 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/47b6950d-0d97-486d-aaec-2a9eaaf74027-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"47b6950d-0d97-486d-aaec-2a9eaaf74027\") " pod="openstack/rabbitmq-cell1-server-0"
Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.440984 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/47b6950d-0d97-486d-aaec-2a9eaaf74027-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"47b6950d-0d97-486d-aaec-2a9eaaf74027\") " pod="openstack/rabbitmq-cell1-server-0"
Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.441051 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/47b6950d-0d97-486d-aaec-2a9eaaf74027-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"47b6950d-0d97-486d-aaec-2a9eaaf74027\") " pod="openstack/rabbitmq-cell1-server-0"
Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.441173 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/47b6950d-0d97-486d-aaec-2a9eaaf74027-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"47b6950d-0d97-486d-aaec-2a9eaaf74027\") " pod="openstack/rabbitmq-cell1-server-0"
Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.442919 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/47b6950d-0d97-486d-aaec-2a9eaaf74027-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"47b6950d-0d97-486d-aaec-2a9eaaf74027\") " pod="openstack/rabbitmq-cell1-server-0"
Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.443138 4594 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"47b6950d-0d97-486d-aaec-2a9eaaf74027\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/rabbitmq-cell1-server-0"
Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.444008 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/47b6950d-0d97-486d-aaec-2a9eaaf74027-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"47b6950d-0d97-486d-aaec-2a9eaaf74027\") " pod="openstack/rabbitmq-cell1-server-0"
Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.445105 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-notifications-server-0"]
Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.445858 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/47b6950d-0d97-486d-aaec-2a9eaaf74027-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"47b6950d-0d97-486d-aaec-2a9eaaf74027\") " pod="openstack/rabbitmq-cell1-server-0"
Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.446474 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/47b6950d-0d97-486d-aaec-2a9eaaf74027-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"47b6950d-0d97-486d-aaec-2a9eaaf74027\") " pod="openstack/rabbitmq-cell1-server-0"
Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.447276 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/47b6950d-0d97-486d-aaec-2a9eaaf74027-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"47b6950d-0d97-486d-aaec-2a9eaaf74027\") " pod="openstack/rabbitmq-cell1-server-0"
Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.449612 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/47b6950d-0d97-486d-aaec-2a9eaaf74027-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"47b6950d-0d97-486d-aaec-2a9eaaf74027\") " pod="openstack/rabbitmq-cell1-server-0"
Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.452420 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-notifications-server-0"
Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.457103 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/47b6950d-0d97-486d-aaec-2a9eaaf74027-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"47b6950d-0d97-486d-aaec-2a9eaaf74027\") " pod="openstack/rabbitmq-cell1-server-0"
Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.457360 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-notifications-server-dockercfg-j7rlw"
Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.459041 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ck52\" (UniqueName: \"kubernetes.io/projected/47b6950d-0d97-486d-aaec-2a9eaaf74027-kube-api-access-6ck52\") pod \"rabbitmq-cell1-server-0\" (UID: \"47b6950d-0d97-486d-aaec-2a9eaaf74027\") " pod="openstack/rabbitmq-cell1-server-0"
Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.459136 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-notifications-config-data"
Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.459295 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-notifications-default-user"
Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.459558 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-notifications-erlang-cookie"
Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.459647 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-notifications-svc"
Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.459593 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-notifications-server-conf"
Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.459992 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-notifications-plugins-conf"
Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.469387 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/47b6950d-0d97-486d-aaec-2a9eaaf74027-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"47b6950d-0d97-486d-aaec-2a9eaaf74027\") " pod="openstack/rabbitmq-cell1-server-0"
Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.474389 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/47b6950d-0d97-486d-aaec-2a9eaaf74027-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"47b6950d-0d97-486d-aaec-2a9eaaf74027\") " pod="openstack/rabbitmq-cell1-server-0"
Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.486000 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-notifications-server-0"]
Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.486950 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"47b6950d-0d97-486d-aaec-2a9eaaf74027\") " pod="openstack/rabbitmq-cell1-server-0"
Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.628044 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Nov 29 05:42:18 crc kubenswrapper[4594]: W1129 05:42:18.638246 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ff736d8_8719_402e_95c9_1d790c1dff5e.slice/crio-8ebccee727cb9f1612da3d8b110e643f95d4ae3119cfd567247515f73536dc30 WatchSource:0}: Error finding container 8ebccee727cb9f1612da3d8b110e643f95d4ae3119cfd567247515f73536dc30: Status 404 returned error can't find the container with id
8ebccee727cb9f1612da3d8b110e643f95d4ae3119cfd567247515f73536dc30 Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.645437 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9667e68c-f715-4663-bddb-53c53d3a593d-pod-info\") pod \"rabbitmq-notifications-server-0\" (UID: \"9667e68c-f715-4663-bddb-53c53d3a593d\") " pod="openstack/rabbitmq-notifications-server-0" Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.645551 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9667e68c-f715-4663-bddb-53c53d3a593d-rabbitmq-tls\") pod \"rabbitmq-notifications-server-0\" (UID: \"9667e68c-f715-4663-bddb-53c53d3a593d\") " pod="openstack/rabbitmq-notifications-server-0" Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.645587 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9667e68c-f715-4663-bddb-53c53d3a593d-erlang-cookie-secret\") pod \"rabbitmq-notifications-server-0\" (UID: \"9667e68c-f715-4663-bddb-53c53d3a593d\") " pod="openstack/rabbitmq-notifications-server-0" Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.645699 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9667e68c-f715-4663-bddb-53c53d3a593d-server-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"9667e68c-f715-4663-bddb-53c53d3a593d\") " pod="openstack/rabbitmq-notifications-server-0" Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.645823 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod 
\"rabbitmq-notifications-server-0\" (UID: \"9667e68c-f715-4663-bddb-53c53d3a593d\") " pod="openstack/rabbitmq-notifications-server-0" Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.645906 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9667e68c-f715-4663-bddb-53c53d3a593d-plugins-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"9667e68c-f715-4663-bddb-53c53d3a593d\") " pod="openstack/rabbitmq-notifications-server-0" Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.645965 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9667e68c-f715-4663-bddb-53c53d3a593d-rabbitmq-erlang-cookie\") pod \"rabbitmq-notifications-server-0\" (UID: \"9667e68c-f715-4663-bddb-53c53d3a593d\") " pod="openstack/rabbitmq-notifications-server-0" Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.646048 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9667e68c-f715-4663-bddb-53c53d3a593d-config-data\") pod \"rabbitmq-notifications-server-0\" (UID: \"9667e68c-f715-4663-bddb-53c53d3a593d\") " pod="openstack/rabbitmq-notifications-server-0" Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.646104 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9667e68c-f715-4663-bddb-53c53d3a593d-rabbitmq-confd\") pod \"rabbitmq-notifications-server-0\" (UID: \"9667e68c-f715-4663-bddb-53c53d3a593d\") " pod="openstack/rabbitmq-notifications-server-0" Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.646265 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/9667e68c-f715-4663-bddb-53c53d3a593d-rabbitmq-plugins\") pod \"rabbitmq-notifications-server-0\" (UID: \"9667e68c-f715-4663-bddb-53c53d3a593d\") " pod="openstack/rabbitmq-notifications-server-0" Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.646320 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2dv6\" (UniqueName: \"kubernetes.io/projected/9667e68c-f715-4663-bddb-53c53d3a593d-kube-api-access-l2dv6\") pod \"rabbitmq-notifications-server-0\" (UID: \"9667e68c-f715-4663-bddb-53c53d3a593d\") " pod="openstack/rabbitmq-notifications-server-0" Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.747935 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-notifications-server-0\" (UID: \"9667e68c-f715-4663-bddb-53c53d3a593d\") " pod="openstack/rabbitmq-notifications-server-0" Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.747997 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9667e68c-f715-4663-bddb-53c53d3a593d-plugins-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"9667e68c-f715-4663-bddb-53c53d3a593d\") " pod="openstack/rabbitmq-notifications-server-0" Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.748030 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9667e68c-f715-4663-bddb-53c53d3a593d-rabbitmq-erlang-cookie\") pod \"rabbitmq-notifications-server-0\" (UID: \"9667e68c-f715-4663-bddb-53c53d3a593d\") " pod="openstack/rabbitmq-notifications-server-0" Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.748059 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/9667e68c-f715-4663-bddb-53c53d3a593d-config-data\") pod \"rabbitmq-notifications-server-0\" (UID: \"9667e68c-f715-4663-bddb-53c53d3a593d\") " pod="openstack/rabbitmq-notifications-server-0" Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.748090 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9667e68c-f715-4663-bddb-53c53d3a593d-rabbitmq-confd\") pod \"rabbitmq-notifications-server-0\" (UID: \"9667e68c-f715-4663-bddb-53c53d3a593d\") " pod="openstack/rabbitmq-notifications-server-0" Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.748137 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9667e68c-f715-4663-bddb-53c53d3a593d-rabbitmq-plugins\") pod \"rabbitmq-notifications-server-0\" (UID: \"9667e68c-f715-4663-bddb-53c53d3a593d\") " pod="openstack/rabbitmq-notifications-server-0" Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.748162 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2dv6\" (UniqueName: \"kubernetes.io/projected/9667e68c-f715-4663-bddb-53c53d3a593d-kube-api-access-l2dv6\") pod \"rabbitmq-notifications-server-0\" (UID: \"9667e68c-f715-4663-bddb-53c53d3a593d\") " pod="openstack/rabbitmq-notifications-server-0" Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.748201 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9667e68c-f715-4663-bddb-53c53d3a593d-pod-info\") pod \"rabbitmq-notifications-server-0\" (UID: \"9667e68c-f715-4663-bddb-53c53d3a593d\") " pod="openstack/rabbitmq-notifications-server-0" Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.748235 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/9667e68c-f715-4663-bddb-53c53d3a593d-rabbitmq-tls\") pod \"rabbitmq-notifications-server-0\" (UID: \"9667e68c-f715-4663-bddb-53c53d3a593d\") " pod="openstack/rabbitmq-notifications-server-0" Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.748277 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9667e68c-f715-4663-bddb-53c53d3a593d-erlang-cookie-secret\") pod \"rabbitmq-notifications-server-0\" (UID: \"9667e68c-f715-4663-bddb-53c53d3a593d\") " pod="openstack/rabbitmq-notifications-server-0" Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.748300 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9667e68c-f715-4663-bddb-53c53d3a593d-server-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"9667e68c-f715-4663-bddb-53c53d3a593d\") " pod="openstack/rabbitmq-notifications-server-0" Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.749440 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9667e68c-f715-4663-bddb-53c53d3a593d-server-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"9667e68c-f715-4663-bddb-53c53d3a593d\") " pod="openstack/rabbitmq-notifications-server-0" Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.749682 4594 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-notifications-server-0\" (UID: \"9667e68c-f715-4663-bddb-53c53d3a593d\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/rabbitmq-notifications-server-0" Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.750187 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/9667e68c-f715-4663-bddb-53c53d3a593d-rabbitmq-plugins\") pod \"rabbitmq-notifications-server-0\" (UID: \"9667e68c-f715-4663-bddb-53c53d3a593d\") " pod="openstack/rabbitmq-notifications-server-0" Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.751090 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9667e68c-f715-4663-bddb-53c53d3a593d-plugins-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"9667e68c-f715-4663-bddb-53c53d3a593d\") " pod="openstack/rabbitmq-notifications-server-0" Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.751688 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9667e68c-f715-4663-bddb-53c53d3a593d-config-data\") pod \"rabbitmq-notifications-server-0\" (UID: \"9667e68c-f715-4663-bddb-53c53d3a593d\") " pod="openstack/rabbitmq-notifications-server-0" Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.751216 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9667e68c-f715-4663-bddb-53c53d3a593d-rabbitmq-erlang-cookie\") pod \"rabbitmq-notifications-server-0\" (UID: \"9667e68c-f715-4663-bddb-53c53d3a593d\") " pod="openstack/rabbitmq-notifications-server-0" Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.756804 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9667e68c-f715-4663-bddb-53c53d3a593d-pod-info\") pod \"rabbitmq-notifications-server-0\" (UID: \"9667e68c-f715-4663-bddb-53c53d3a593d\") " pod="openstack/rabbitmq-notifications-server-0" Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.757135 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/9667e68c-f715-4663-bddb-53c53d3a593d-rabbitmq-tls\") pod \"rabbitmq-notifications-server-0\" (UID: \"9667e68c-f715-4663-bddb-53c53d3a593d\") " pod="openstack/rabbitmq-notifications-server-0" Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.761646 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.762846 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9667e68c-f715-4663-bddb-53c53d3a593d-erlang-cookie-secret\") pod \"rabbitmq-notifications-server-0\" (UID: \"9667e68c-f715-4663-bddb-53c53d3a593d\") " pod="openstack/rabbitmq-notifications-server-0" Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.763184 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9667e68c-f715-4663-bddb-53c53d3a593d-rabbitmq-confd\") pod \"rabbitmq-notifications-server-0\" (UID: \"9667e68c-f715-4663-bddb-53c53d3a593d\") " pod="openstack/rabbitmq-notifications-server-0" Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.766296 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2dv6\" (UniqueName: \"kubernetes.io/projected/9667e68c-f715-4663-bddb-53c53d3a593d-kube-api-access-l2dv6\") pod \"rabbitmq-notifications-server-0\" (UID: \"9667e68c-f715-4663-bddb-53c53d3a593d\") " pod="openstack/rabbitmq-notifications-server-0" Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 05:42:18.770883 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-notifications-server-0\" (UID: \"9667e68c-f715-4663-bddb-53c53d3a593d\") " pod="openstack/rabbitmq-notifications-server-0" Nov 29 05:42:18 crc kubenswrapper[4594]: I1129 
05:42:18.780830 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-notifications-server-0" Nov 29 05:42:19 crc kubenswrapper[4594]: I1129 05:42:19.089180 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1ff736d8-8719-402e-95c9-1d790c1dff5e","Type":"ContainerStarted","Data":"8ebccee727cb9f1612da3d8b110e643f95d4ae3119cfd567247515f73536dc30"} Nov 29 05:42:19 crc kubenswrapper[4594]: I1129 05:42:19.092709 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5449989c59-lxq8g" event={"ID":"a27c925c-13c4-44c2-9ca6-c4ca0c7e2e6c","Type":"ContainerStarted","Data":"5f2bc09c6221ef405dadcf20802ee288451e2059ec2bbfe85ad0a4be49591298"} Nov 29 05:42:19 crc kubenswrapper[4594]: I1129 05:42:19.219066 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 29 05:42:19 crc kubenswrapper[4594]: I1129 05:42:19.276456 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-notifications-server-0"] Nov 29 05:42:19 crc kubenswrapper[4594]: W1129 05:42:19.291789 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9667e68c_f715_4663_bddb_53c53d3a593d.slice/crio-119eab1b44c9e653bceec3280b27e15cdc23653698d70d434819261c9fab0dff WatchSource:0}: Error finding container 119eab1b44c9e653bceec3280b27e15cdc23653698d70d434819261c9fab0dff: Status 404 returned error can't find the container with id 119eab1b44c9e653bceec3280b27e15cdc23653698d70d434819261c9fab0dff Nov 29 05:42:20 crc kubenswrapper[4594]: I1129 05:42:20.102372 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"47b6950d-0d97-486d-aaec-2a9eaaf74027","Type":"ContainerStarted","Data":"f00322f1ba8cc7234e7076580dad5ececeb7e2af45de93f2a934554a112d1ef9"} Nov 29 05:42:20 crc kubenswrapper[4594]: I1129 05:42:20.103829 
4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-notifications-server-0" event={"ID":"9667e68c-f715-4663-bddb-53c53d3a593d","Type":"ContainerStarted","Data":"119eab1b44c9e653bceec3280b27e15cdc23653698d70d434819261c9fab0dff"} Nov 29 05:42:20 crc kubenswrapper[4594]: I1129 05:42:20.521205 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Nov 29 05:42:20 crc kubenswrapper[4594]: I1129 05:42:20.523638 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Nov 29 05:42:20 crc kubenswrapper[4594]: I1129 05:42:20.525226 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-fm9rs" Nov 29 05:42:20 crc kubenswrapper[4594]: I1129 05:42:20.526335 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Nov 29 05:42:20 crc kubenswrapper[4594]: I1129 05:42:20.526369 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Nov 29 05:42:20 crc kubenswrapper[4594]: I1129 05:42:20.526842 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Nov 29 05:42:20 crc kubenswrapper[4594]: I1129 05:42:20.534602 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Nov 29 05:42:20 crc kubenswrapper[4594]: I1129 05:42:20.545417 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Nov 29 05:42:20 crc kubenswrapper[4594]: I1129 05:42:20.687692 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/caf7dae9-7fe9-4cf5-a5d0-39122397592e-operator-scripts\") pod \"openstack-galera-0\" (UID: \"caf7dae9-7fe9-4cf5-a5d0-39122397592e\") " pod="openstack/openstack-galera-0" Nov 29 05:42:20 crc 
kubenswrapper[4594]: I1129 05:42:20.687781 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/caf7dae9-7fe9-4cf5-a5d0-39122397592e-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"caf7dae9-7fe9-4cf5-a5d0-39122397592e\") " pod="openstack/openstack-galera-0" Nov 29 05:42:20 crc kubenswrapper[4594]: I1129 05:42:20.687880 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/caf7dae9-7fe9-4cf5-a5d0-39122397592e-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"caf7dae9-7fe9-4cf5-a5d0-39122397592e\") " pod="openstack/openstack-galera-0" Nov 29 05:42:20 crc kubenswrapper[4594]: I1129 05:42:20.687920 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/caf7dae9-7fe9-4cf5-a5d0-39122397592e-config-data-default\") pod \"openstack-galera-0\" (UID: \"caf7dae9-7fe9-4cf5-a5d0-39122397592e\") " pod="openstack/openstack-galera-0" Nov 29 05:42:20 crc kubenswrapper[4594]: I1129 05:42:20.688038 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"caf7dae9-7fe9-4cf5-a5d0-39122397592e\") " pod="openstack/openstack-galera-0" Nov 29 05:42:20 crc kubenswrapper[4594]: I1129 05:42:20.688067 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/caf7dae9-7fe9-4cf5-a5d0-39122397592e-config-data-generated\") pod \"openstack-galera-0\" (UID: \"caf7dae9-7fe9-4cf5-a5d0-39122397592e\") " pod="openstack/openstack-galera-0" Nov 29 05:42:20 crc kubenswrapper[4594]: I1129 05:42:20.688138 4594 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhtzx\" (UniqueName: \"kubernetes.io/projected/caf7dae9-7fe9-4cf5-a5d0-39122397592e-kube-api-access-hhtzx\") pod \"openstack-galera-0\" (UID: \"caf7dae9-7fe9-4cf5-a5d0-39122397592e\") " pod="openstack/openstack-galera-0" Nov 29 05:42:20 crc kubenswrapper[4594]: I1129 05:42:20.688210 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/caf7dae9-7fe9-4cf5-a5d0-39122397592e-kolla-config\") pod \"openstack-galera-0\" (UID: \"caf7dae9-7fe9-4cf5-a5d0-39122397592e\") " pod="openstack/openstack-galera-0" Nov 29 05:42:20 crc kubenswrapper[4594]: I1129 05:42:20.789752 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/caf7dae9-7fe9-4cf5-a5d0-39122397592e-config-data-default\") pod \"openstack-galera-0\" (UID: \"caf7dae9-7fe9-4cf5-a5d0-39122397592e\") " pod="openstack/openstack-galera-0" Nov 29 05:42:20 crc kubenswrapper[4594]: I1129 05:42:20.789835 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"caf7dae9-7fe9-4cf5-a5d0-39122397592e\") " pod="openstack/openstack-galera-0" Nov 29 05:42:20 crc kubenswrapper[4594]: I1129 05:42:20.789859 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/caf7dae9-7fe9-4cf5-a5d0-39122397592e-config-data-generated\") pod \"openstack-galera-0\" (UID: \"caf7dae9-7fe9-4cf5-a5d0-39122397592e\") " pod="openstack/openstack-galera-0" Nov 29 05:42:20 crc kubenswrapper[4594]: I1129 05:42:20.789880 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhtzx\" 
(UniqueName: \"kubernetes.io/projected/caf7dae9-7fe9-4cf5-a5d0-39122397592e-kube-api-access-hhtzx\") pod \"openstack-galera-0\" (UID: \"caf7dae9-7fe9-4cf5-a5d0-39122397592e\") " pod="openstack/openstack-galera-0" Nov 29 05:42:20 crc kubenswrapper[4594]: I1129 05:42:20.789901 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/caf7dae9-7fe9-4cf5-a5d0-39122397592e-kolla-config\") pod \"openstack-galera-0\" (UID: \"caf7dae9-7fe9-4cf5-a5d0-39122397592e\") " pod="openstack/openstack-galera-0" Nov 29 05:42:20 crc kubenswrapper[4594]: I1129 05:42:20.789938 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/caf7dae9-7fe9-4cf5-a5d0-39122397592e-operator-scripts\") pod \"openstack-galera-0\" (UID: \"caf7dae9-7fe9-4cf5-a5d0-39122397592e\") " pod="openstack/openstack-galera-0" Nov 29 05:42:20 crc kubenswrapper[4594]: I1129 05:42:20.789961 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/caf7dae9-7fe9-4cf5-a5d0-39122397592e-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"caf7dae9-7fe9-4cf5-a5d0-39122397592e\") " pod="openstack/openstack-galera-0" Nov 29 05:42:20 crc kubenswrapper[4594]: I1129 05:42:20.789986 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/caf7dae9-7fe9-4cf5-a5d0-39122397592e-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"caf7dae9-7fe9-4cf5-a5d0-39122397592e\") " pod="openstack/openstack-galera-0" Nov 29 05:42:20 crc kubenswrapper[4594]: I1129 05:42:20.790562 4594 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: 
\"caf7dae9-7fe9-4cf5-a5d0-39122397592e\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/openstack-galera-0" Nov 29 05:42:20 crc kubenswrapper[4594]: I1129 05:42:20.791744 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/caf7dae9-7fe9-4cf5-a5d0-39122397592e-config-data-generated\") pod \"openstack-galera-0\" (UID: \"caf7dae9-7fe9-4cf5-a5d0-39122397592e\") " pod="openstack/openstack-galera-0" Nov 29 05:42:20 crc kubenswrapper[4594]: I1129 05:42:20.792373 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/caf7dae9-7fe9-4cf5-a5d0-39122397592e-kolla-config\") pod \"openstack-galera-0\" (UID: \"caf7dae9-7fe9-4cf5-a5d0-39122397592e\") " pod="openstack/openstack-galera-0" Nov 29 05:42:20 crc kubenswrapper[4594]: I1129 05:42:20.793368 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/caf7dae9-7fe9-4cf5-a5d0-39122397592e-config-data-default\") pod \"openstack-galera-0\" (UID: \"caf7dae9-7fe9-4cf5-a5d0-39122397592e\") " pod="openstack/openstack-galera-0" Nov 29 05:42:20 crc kubenswrapper[4594]: I1129 05:42:20.799970 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/caf7dae9-7fe9-4cf5-a5d0-39122397592e-operator-scripts\") pod \"openstack-galera-0\" (UID: \"caf7dae9-7fe9-4cf5-a5d0-39122397592e\") " pod="openstack/openstack-galera-0" Nov 29 05:42:20 crc kubenswrapper[4594]: I1129 05:42:20.803767 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/caf7dae9-7fe9-4cf5-a5d0-39122397592e-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"caf7dae9-7fe9-4cf5-a5d0-39122397592e\") " pod="openstack/openstack-galera-0" Nov 29 05:42:20 crc kubenswrapper[4594]: I1129 
05:42:20.804442 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhtzx\" (UniqueName: \"kubernetes.io/projected/caf7dae9-7fe9-4cf5-a5d0-39122397592e-kube-api-access-hhtzx\") pod \"openstack-galera-0\" (UID: \"caf7dae9-7fe9-4cf5-a5d0-39122397592e\") " pod="openstack/openstack-galera-0" Nov 29 05:42:20 crc kubenswrapper[4594]: I1129 05:42:20.805726 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/caf7dae9-7fe9-4cf5-a5d0-39122397592e-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"caf7dae9-7fe9-4cf5-a5d0-39122397592e\") " pod="openstack/openstack-galera-0" Nov 29 05:42:20 crc kubenswrapper[4594]: I1129 05:42:20.845177 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"caf7dae9-7fe9-4cf5-a5d0-39122397592e\") " pod="openstack/openstack-galera-0" Nov 29 05:42:20 crc kubenswrapper[4594]: I1129 05:42:20.862127 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Nov 29 05:42:21 crc kubenswrapper[4594]: I1129 05:42:21.293197 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Nov 29 05:42:21 crc kubenswrapper[4594]: I1129 05:42:21.779634 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Nov 29 05:42:21 crc kubenswrapper[4594]: I1129 05:42:21.781429 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Nov 29 05:42:21 crc kubenswrapper[4594]: I1129 05:42:21.785006 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-mm9g8" Nov 29 05:42:21 crc kubenswrapper[4594]: I1129 05:42:21.785197 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Nov 29 05:42:21 crc kubenswrapper[4594]: I1129 05:42:21.785553 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Nov 29 05:42:21 crc kubenswrapper[4594]: I1129 05:42:21.785588 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Nov 29 05:42:21 crc kubenswrapper[4594]: I1129 05:42:21.785620 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Nov 29 05:42:21 crc kubenswrapper[4594]: I1129 05:42:21.916794 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"1ab0ecb8-fc35-4934-b62a-6912d56e9001\") " pod="openstack/openstack-cell1-galera-0" Nov 29 05:42:21 crc kubenswrapper[4594]: I1129 05:42:21.916841 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1ab0ecb8-fc35-4934-b62a-6912d56e9001-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"1ab0ecb8-fc35-4934-b62a-6912d56e9001\") " pod="openstack/openstack-cell1-galera-0" Nov 29 05:42:21 crc kubenswrapper[4594]: I1129 05:42:21.918462 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ab0ecb8-fc35-4934-b62a-6912d56e9001-galera-tls-certs\") pod 
\"openstack-cell1-galera-0\" (UID: \"1ab0ecb8-fc35-4934-b62a-6912d56e9001\") " pod="openstack/openstack-cell1-galera-0" Nov 29 05:42:21 crc kubenswrapper[4594]: I1129 05:42:21.918519 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/1ab0ecb8-fc35-4934-b62a-6912d56e9001-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"1ab0ecb8-fc35-4934-b62a-6912d56e9001\") " pod="openstack/openstack-cell1-galera-0" Nov 29 05:42:21 crc kubenswrapper[4594]: I1129 05:42:21.918551 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1ab0ecb8-fc35-4934-b62a-6912d56e9001-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"1ab0ecb8-fc35-4934-b62a-6912d56e9001\") " pod="openstack/openstack-cell1-galera-0" Nov 29 05:42:21 crc kubenswrapper[4594]: I1129 05:42:21.918578 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/1ab0ecb8-fc35-4934-b62a-6912d56e9001-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"1ab0ecb8-fc35-4934-b62a-6912d56e9001\") " pod="openstack/openstack-cell1-galera-0" Nov 29 05:42:21 crc kubenswrapper[4594]: I1129 05:42:21.918610 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ab0ecb8-fc35-4934-b62a-6912d56e9001-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"1ab0ecb8-fc35-4934-b62a-6912d56e9001\") " pod="openstack/openstack-cell1-galera-0" Nov 29 05:42:21 crc kubenswrapper[4594]: I1129 05:42:21.918663 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmbz2\" (UniqueName: 
\"kubernetes.io/projected/1ab0ecb8-fc35-4934-b62a-6912d56e9001-kube-api-access-qmbz2\") pod \"openstack-cell1-galera-0\" (UID: \"1ab0ecb8-fc35-4934-b62a-6912d56e9001\") " pod="openstack/openstack-cell1-galera-0" Nov 29 05:42:22 crc kubenswrapper[4594]: I1129 05:42:22.022029 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/1ab0ecb8-fc35-4934-b62a-6912d56e9001-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"1ab0ecb8-fc35-4934-b62a-6912d56e9001\") " pod="openstack/openstack-cell1-galera-0" Nov 29 05:42:22 crc kubenswrapper[4594]: I1129 05:42:22.022101 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1ab0ecb8-fc35-4934-b62a-6912d56e9001-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"1ab0ecb8-fc35-4934-b62a-6912d56e9001\") " pod="openstack/openstack-cell1-galera-0" Nov 29 05:42:22 crc kubenswrapper[4594]: I1129 05:42:22.022142 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/1ab0ecb8-fc35-4934-b62a-6912d56e9001-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"1ab0ecb8-fc35-4934-b62a-6912d56e9001\") " pod="openstack/openstack-cell1-galera-0" Nov 29 05:42:22 crc kubenswrapper[4594]: I1129 05:42:22.022197 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ab0ecb8-fc35-4934-b62a-6912d56e9001-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"1ab0ecb8-fc35-4934-b62a-6912d56e9001\") " pod="openstack/openstack-cell1-galera-0" Nov 29 05:42:22 crc kubenswrapper[4594]: I1129 05:42:22.022242 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmbz2\" (UniqueName: 
\"kubernetes.io/projected/1ab0ecb8-fc35-4934-b62a-6912d56e9001-kube-api-access-qmbz2\") pod \"openstack-cell1-galera-0\" (UID: \"1ab0ecb8-fc35-4934-b62a-6912d56e9001\") " pod="openstack/openstack-cell1-galera-0" Nov 29 05:42:22 crc kubenswrapper[4594]: I1129 05:42:22.023988 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/1ab0ecb8-fc35-4934-b62a-6912d56e9001-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"1ab0ecb8-fc35-4934-b62a-6912d56e9001\") " pod="openstack/openstack-cell1-galera-0" Nov 29 05:42:22 crc kubenswrapper[4594]: I1129 05:42:22.024063 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/1ab0ecb8-fc35-4934-b62a-6912d56e9001-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"1ab0ecb8-fc35-4934-b62a-6912d56e9001\") " pod="openstack/openstack-cell1-galera-0" Nov 29 05:42:22 crc kubenswrapper[4594]: I1129 05:42:22.024143 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"1ab0ecb8-fc35-4934-b62a-6912d56e9001\") " pod="openstack/openstack-cell1-galera-0" Nov 29 05:42:22 crc kubenswrapper[4594]: I1129 05:42:22.024174 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1ab0ecb8-fc35-4934-b62a-6912d56e9001-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"1ab0ecb8-fc35-4934-b62a-6912d56e9001\") " pod="openstack/openstack-cell1-galera-0" Nov 29 05:42:22 crc kubenswrapper[4594]: I1129 05:42:22.024364 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ab0ecb8-fc35-4934-b62a-6912d56e9001-galera-tls-certs\") pod 
\"openstack-cell1-galera-0\" (UID: \"1ab0ecb8-fc35-4934-b62a-6912d56e9001\") " pod="openstack/openstack-cell1-galera-0" Nov 29 05:42:22 crc kubenswrapper[4594]: I1129 05:42:22.024616 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1ab0ecb8-fc35-4934-b62a-6912d56e9001-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"1ab0ecb8-fc35-4934-b62a-6912d56e9001\") " pod="openstack/openstack-cell1-galera-0" Nov 29 05:42:22 crc kubenswrapper[4594]: I1129 05:42:22.026087 4594 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"1ab0ecb8-fc35-4934-b62a-6912d56e9001\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/openstack-cell1-galera-0" Nov 29 05:42:22 crc kubenswrapper[4594]: I1129 05:42:22.026197 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1ab0ecb8-fc35-4934-b62a-6912d56e9001-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"1ab0ecb8-fc35-4934-b62a-6912d56e9001\") " pod="openstack/openstack-cell1-galera-0" Nov 29 05:42:22 crc kubenswrapper[4594]: I1129 05:42:22.029886 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ab0ecb8-fc35-4934-b62a-6912d56e9001-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"1ab0ecb8-fc35-4934-b62a-6912d56e9001\") " pod="openstack/openstack-cell1-galera-0" Nov 29 05:42:22 crc kubenswrapper[4594]: I1129 05:42:22.036230 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ab0ecb8-fc35-4934-b62a-6912d56e9001-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"1ab0ecb8-fc35-4934-b62a-6912d56e9001\") " 
pod="openstack/openstack-cell1-galera-0" Nov 29 05:42:22 crc kubenswrapper[4594]: I1129 05:42:22.037702 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmbz2\" (UniqueName: \"kubernetes.io/projected/1ab0ecb8-fc35-4934-b62a-6912d56e9001-kube-api-access-qmbz2\") pod \"openstack-cell1-galera-0\" (UID: \"1ab0ecb8-fc35-4934-b62a-6912d56e9001\") " pod="openstack/openstack-cell1-galera-0" Nov 29 05:42:22 crc kubenswrapper[4594]: I1129 05:42:22.055332 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"1ab0ecb8-fc35-4934-b62a-6912d56e9001\") " pod="openstack/openstack-cell1-galera-0" Nov 29 05:42:22 crc kubenswrapper[4594]: I1129 05:42:22.110929 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Nov 29 05:42:22 crc kubenswrapper[4594]: I1129 05:42:22.114202 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Nov 29 05:42:22 crc kubenswrapper[4594]: I1129 05:42:22.114557 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Nov 29 05:42:22 crc kubenswrapper[4594]: I1129 05:42:22.117699 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Nov 29 05:42:22 crc kubenswrapper[4594]: I1129 05:42:22.118651 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Nov 29 05:42:22 crc kubenswrapper[4594]: I1129 05:42:22.118845 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Nov 29 05:42:22 crc kubenswrapper[4594]: I1129 05:42:22.119231 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-bkb79" Nov 29 05:42:22 crc kubenswrapper[4594]: I1129 05:42:22.140813 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"caf7dae9-7fe9-4cf5-a5d0-39122397592e","Type":"ContainerStarted","Data":"dbbd522cd47fdbee558918ead087cbb48212bb75d8de82ab44caa8808034ba92"} Nov 29 05:42:22 crc kubenswrapper[4594]: I1129 05:42:22.227464 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9952b1d2-2cce-45f4-b370-5d0107f80260-combined-ca-bundle\") pod \"memcached-0\" (UID: \"9952b1d2-2cce-45f4-b370-5d0107f80260\") " pod="openstack/memcached-0" Nov 29 05:42:22 crc kubenswrapper[4594]: I1129 05:42:22.227561 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9952b1d2-2cce-45f4-b370-5d0107f80260-config-data\") pod \"memcached-0\" (UID: \"9952b1d2-2cce-45f4-b370-5d0107f80260\") " pod="openstack/memcached-0" Nov 29 05:42:22 crc kubenswrapper[4594]: I1129 05:42:22.227612 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/9952b1d2-2cce-45f4-b370-5d0107f80260-memcached-tls-certs\") pod \"memcached-0\" (UID: \"9952b1d2-2cce-45f4-b370-5d0107f80260\") " 
pod="openstack/memcached-0" Nov 29 05:42:22 crc kubenswrapper[4594]: I1129 05:42:22.227636 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vmhz\" (UniqueName: \"kubernetes.io/projected/9952b1d2-2cce-45f4-b370-5d0107f80260-kube-api-access-2vmhz\") pod \"memcached-0\" (UID: \"9952b1d2-2cce-45f4-b370-5d0107f80260\") " pod="openstack/memcached-0" Nov 29 05:42:22 crc kubenswrapper[4594]: I1129 05:42:22.227689 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9952b1d2-2cce-45f4-b370-5d0107f80260-kolla-config\") pod \"memcached-0\" (UID: \"9952b1d2-2cce-45f4-b370-5d0107f80260\") " pod="openstack/memcached-0" Nov 29 05:42:22 crc kubenswrapper[4594]: I1129 05:42:22.328801 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9952b1d2-2cce-45f4-b370-5d0107f80260-kolla-config\") pod \"memcached-0\" (UID: \"9952b1d2-2cce-45f4-b370-5d0107f80260\") " pod="openstack/memcached-0" Nov 29 05:42:22 crc kubenswrapper[4594]: I1129 05:42:22.328842 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9952b1d2-2cce-45f4-b370-5d0107f80260-combined-ca-bundle\") pod \"memcached-0\" (UID: \"9952b1d2-2cce-45f4-b370-5d0107f80260\") " pod="openstack/memcached-0" Nov 29 05:42:22 crc kubenswrapper[4594]: I1129 05:42:22.328900 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9952b1d2-2cce-45f4-b370-5d0107f80260-config-data\") pod \"memcached-0\" (UID: \"9952b1d2-2cce-45f4-b370-5d0107f80260\") " pod="openstack/memcached-0" Nov 29 05:42:22 crc kubenswrapper[4594]: I1129 05:42:22.328929 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/9952b1d2-2cce-45f4-b370-5d0107f80260-memcached-tls-certs\") pod \"memcached-0\" (UID: \"9952b1d2-2cce-45f4-b370-5d0107f80260\") " pod="openstack/memcached-0" Nov 29 05:42:22 crc kubenswrapper[4594]: I1129 05:42:22.328950 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vmhz\" (UniqueName: \"kubernetes.io/projected/9952b1d2-2cce-45f4-b370-5d0107f80260-kube-api-access-2vmhz\") pod \"memcached-0\" (UID: \"9952b1d2-2cce-45f4-b370-5d0107f80260\") " pod="openstack/memcached-0" Nov 29 05:42:22 crc kubenswrapper[4594]: I1129 05:42:22.330538 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9952b1d2-2cce-45f4-b370-5d0107f80260-config-data\") pod \"memcached-0\" (UID: \"9952b1d2-2cce-45f4-b370-5d0107f80260\") " pod="openstack/memcached-0" Nov 29 05:42:22 crc kubenswrapper[4594]: I1129 05:42:22.330607 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9952b1d2-2cce-45f4-b370-5d0107f80260-kolla-config\") pod \"memcached-0\" (UID: \"9952b1d2-2cce-45f4-b370-5d0107f80260\") " pod="openstack/memcached-0" Nov 29 05:42:22 crc kubenswrapper[4594]: I1129 05:42:22.336034 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/9952b1d2-2cce-45f4-b370-5d0107f80260-memcached-tls-certs\") pod \"memcached-0\" (UID: \"9952b1d2-2cce-45f4-b370-5d0107f80260\") " pod="openstack/memcached-0" Nov 29 05:42:22 crc kubenswrapper[4594]: I1129 05:42:22.337275 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9952b1d2-2cce-45f4-b370-5d0107f80260-combined-ca-bundle\") pod \"memcached-0\" (UID: \"9952b1d2-2cce-45f4-b370-5d0107f80260\") " pod="openstack/memcached-0" Nov 29 05:42:22 crc 
kubenswrapper[4594]: I1129 05:42:22.343092 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vmhz\" (UniqueName: \"kubernetes.io/projected/9952b1d2-2cce-45f4-b370-5d0107f80260-kube-api-access-2vmhz\") pod \"memcached-0\" (UID: \"9952b1d2-2cce-45f4-b370-5d0107f80260\") " pod="openstack/memcached-0" Nov 29 05:42:22 crc kubenswrapper[4594]: I1129 05:42:22.444618 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Nov 29 05:42:22 crc kubenswrapper[4594]: I1129 05:42:22.577519 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Nov 29 05:42:22 crc kubenswrapper[4594]: I1129 05:42:22.848546 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Nov 29 05:42:22 crc kubenswrapper[4594]: W1129 05:42:22.856143 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9952b1d2_2cce_45f4_b370_5d0107f80260.slice/crio-bdb43e6cd446137bc9dd82b560c6704123090194c0d97a882e97bf5b519150df WatchSource:0}: Error finding container bdb43e6cd446137bc9dd82b560c6704123090194c0d97a882e97bf5b519150df: Status 404 returned error can't find the container with id bdb43e6cd446137bc9dd82b560c6704123090194c0d97a882e97bf5b519150df Nov 29 05:42:23 crc kubenswrapper[4594]: I1129 05:42:23.148010 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"1ab0ecb8-fc35-4934-b62a-6912d56e9001","Type":"ContainerStarted","Data":"e86961f2456c7c843b5149906ff4ab853d1ceda10a14be07da963ecbb3015b50"} Nov 29 05:42:23 crc kubenswrapper[4594]: I1129 05:42:23.150009 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"9952b1d2-2cce-45f4-b370-5d0107f80260","Type":"ContainerStarted","Data":"bdb43e6cd446137bc9dd82b560c6704123090194c0d97a882e97bf5b519150df"} Nov 29 05:42:24 crc 
kubenswrapper[4594]: I1129 05:42:24.158404 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Nov 29 05:42:24 crc kubenswrapper[4594]: I1129 05:42:24.162544 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 29 05:42:24 crc kubenswrapper[4594]: I1129 05:42:24.187749 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-jkl4z" Nov 29 05:42:24 crc kubenswrapper[4594]: I1129 05:42:24.194286 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 29 05:42:24 crc kubenswrapper[4594]: I1129 05:42:24.212602 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wvrb\" (UniqueName: \"kubernetes.io/projected/9c233f96-4ed2-4e1f-b408-1b75092f366a-kube-api-access-8wvrb\") pod \"kube-state-metrics-0\" (UID: \"9c233f96-4ed2-4e1f-b408-1b75092f366a\") " pod="openstack/kube-state-metrics-0" Nov 29 05:42:24 crc kubenswrapper[4594]: I1129 05:42:24.340755 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wvrb\" (UniqueName: \"kubernetes.io/projected/9c233f96-4ed2-4e1f-b408-1b75092f366a-kube-api-access-8wvrb\") pod \"kube-state-metrics-0\" (UID: \"9c233f96-4ed2-4e1f-b408-1b75092f366a\") " pod="openstack/kube-state-metrics-0" Nov 29 05:42:24 crc kubenswrapper[4594]: I1129 05:42:24.430021 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wvrb\" (UniqueName: \"kubernetes.io/projected/9c233f96-4ed2-4e1f-b408-1b75092f366a-kube-api-access-8wvrb\") pod \"kube-state-metrics-0\" (UID: \"9c233f96-4ed2-4e1f-b408-1b75092f366a\") " pod="openstack/kube-state-metrics-0" Nov 29 05:42:24 crc kubenswrapper[4594]: I1129 05:42:24.541295 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 29 05:42:25 crc kubenswrapper[4594]: I1129 05:42:25.084625 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 29 05:42:25 crc kubenswrapper[4594]: W1129 05:42:25.096399 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9c233f96_4ed2_4e1f_b408_1b75092f366a.slice/crio-48ab3e19937506faa5597efb437e916793575c2a0932d36d83f31c05ebfab741 WatchSource:0}: Error finding container 48ab3e19937506faa5597efb437e916793575c2a0932d36d83f31c05ebfab741: Status 404 returned error can't find the container with id 48ab3e19937506faa5597efb437e916793575c2a0932d36d83f31c05ebfab741 Nov 29 05:42:25 crc kubenswrapper[4594]: I1129 05:42:25.243576 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"9c233f96-4ed2-4e1f-b408-1b75092f366a","Type":"ContainerStarted","Data":"48ab3e19937506faa5597efb437e916793575c2a0932d36d83f31c05ebfab741"} Nov 29 05:42:25 crc kubenswrapper[4594]: I1129 05:42:25.525210 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Nov 29 05:42:25 crc kubenswrapper[4594]: I1129 05:42:25.530186 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Nov 29 05:42:25 crc kubenswrapper[4594]: I1129 05:42:25.539771 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Nov 29 05:42:25 crc kubenswrapper[4594]: I1129 05:42:25.539874 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Nov 29 05:42:25 crc kubenswrapper[4594]: I1129 05:42:25.540240 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-w8z2f" Nov 29 05:42:25 crc kubenswrapper[4594]: I1129 05:42:25.542992 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Nov 29 05:42:25 crc kubenswrapper[4594]: I1129 05:42:25.545236 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Nov 29 05:42:25 crc kubenswrapper[4594]: I1129 05:42:25.545575 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Nov 29 05:42:25 crc kubenswrapper[4594]: I1129 05:42:25.556841 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Nov 29 05:42:25 crc kubenswrapper[4594]: I1129 05:42:25.569036 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/4e4fe89a-3363-4724-89a8-a9b61fe6f039-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"4e4fe89a-3363-4724-89a8-a9b61fe6f039\") " pod="openstack/prometheus-metric-storage-0" Nov 29 05:42:25 crc kubenswrapper[4594]: I1129 05:42:25.569305 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" 
(UniqueName: \"kubernetes.io/secret/4e4fe89a-3363-4724-89a8-a9b61fe6f039-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"4e4fe89a-3363-4724-89a8-a9b61fe6f039\") " pod="openstack/prometheus-metric-storage-0" Nov 29 05:42:25 crc kubenswrapper[4594]: I1129 05:42:25.569333 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4e4fe89a-3363-4724-89a8-a9b61fe6f039-config\") pod \"prometheus-metric-storage-0\" (UID: \"4e4fe89a-3363-4724-89a8-a9b61fe6f039\") " pod="openstack/prometheus-metric-storage-0" Nov 29 05:42:25 crc kubenswrapper[4594]: I1129 05:42:25.569352 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4e4fe89a-3363-4724-89a8-a9b61fe6f039-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"4e4fe89a-3363-4724-89a8-a9b61fe6f039\") " pod="openstack/prometheus-metric-storage-0" Nov 29 05:42:25 crc kubenswrapper[4594]: I1129 05:42:25.570921 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-85b0c002-5ac7-4caa-8203-64f617b7cd7d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-85b0c002-5ac7-4caa-8203-64f617b7cd7d\") pod \"prometheus-metric-storage-0\" (UID: \"4e4fe89a-3363-4724-89a8-a9b61fe6f039\") " pod="openstack/prometheus-metric-storage-0" Nov 29 05:42:25 crc kubenswrapper[4594]: I1129 05:42:25.571081 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4e4fe89a-3363-4724-89a8-a9b61fe6f039-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"4e4fe89a-3363-4724-89a8-a9b61fe6f039\") " pod="openstack/prometheus-metric-storage-0" Nov 29 05:42:25 crc kubenswrapper[4594]: I1129 05:42:25.571118 4594 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdq58\" (UniqueName: \"kubernetes.io/projected/4e4fe89a-3363-4724-89a8-a9b61fe6f039-kube-api-access-gdq58\") pod \"prometheus-metric-storage-0\" (UID: \"4e4fe89a-3363-4724-89a8-a9b61fe6f039\") " pod="openstack/prometheus-metric-storage-0" Nov 29 05:42:25 crc kubenswrapper[4594]: I1129 05:42:25.571237 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4e4fe89a-3363-4724-89a8-a9b61fe6f039-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"4e4fe89a-3363-4724-89a8-a9b61fe6f039\") " pod="openstack/prometheus-metric-storage-0" Nov 29 05:42:25 crc kubenswrapper[4594]: I1129 05:42:25.673198 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4e4fe89a-3363-4724-89a8-a9b61fe6f039-config\") pod \"prometheus-metric-storage-0\" (UID: \"4e4fe89a-3363-4724-89a8-a9b61fe6f039\") " pod="openstack/prometheus-metric-storage-0" Nov 29 05:42:25 crc kubenswrapper[4594]: I1129 05:42:25.673242 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4e4fe89a-3363-4724-89a8-a9b61fe6f039-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"4e4fe89a-3363-4724-89a8-a9b61fe6f039\") " pod="openstack/prometheus-metric-storage-0" Nov 29 05:42:25 crc kubenswrapper[4594]: I1129 05:42:25.673299 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-85b0c002-5ac7-4caa-8203-64f617b7cd7d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-85b0c002-5ac7-4caa-8203-64f617b7cd7d\") pod \"prometheus-metric-storage-0\" (UID: \"4e4fe89a-3363-4724-89a8-a9b61fe6f039\") " pod="openstack/prometheus-metric-storage-0" Nov 29 05:42:25 crc kubenswrapper[4594]: I1129 05:42:25.673354 4594 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4e4fe89a-3363-4724-89a8-a9b61fe6f039-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"4e4fe89a-3363-4724-89a8-a9b61fe6f039\") " pod="openstack/prometheus-metric-storage-0" Nov 29 05:42:25 crc kubenswrapper[4594]: I1129 05:42:25.673376 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdq58\" (UniqueName: \"kubernetes.io/projected/4e4fe89a-3363-4724-89a8-a9b61fe6f039-kube-api-access-gdq58\") pod \"prometheus-metric-storage-0\" (UID: \"4e4fe89a-3363-4724-89a8-a9b61fe6f039\") " pod="openstack/prometheus-metric-storage-0" Nov 29 05:42:25 crc kubenswrapper[4594]: I1129 05:42:25.673420 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4e4fe89a-3363-4724-89a8-a9b61fe6f039-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"4e4fe89a-3363-4724-89a8-a9b61fe6f039\") " pod="openstack/prometheus-metric-storage-0" Nov 29 05:42:25 crc kubenswrapper[4594]: I1129 05:42:25.673467 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/4e4fe89a-3363-4724-89a8-a9b61fe6f039-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"4e4fe89a-3363-4724-89a8-a9b61fe6f039\") " pod="openstack/prometheus-metric-storage-0" Nov 29 05:42:25 crc kubenswrapper[4594]: I1129 05:42:25.673483 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/4e4fe89a-3363-4724-89a8-a9b61fe6f039-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"4e4fe89a-3363-4724-89a8-a9b61fe6f039\") " pod="openstack/prometheus-metric-storage-0" Nov 29 05:42:25 crc kubenswrapper[4594]: I1129 05:42:25.674522 4594 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/4e4fe89a-3363-4724-89a8-a9b61fe6f039-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"4e4fe89a-3363-4724-89a8-a9b61fe6f039\") " pod="openstack/prometheus-metric-storage-0" Nov 29 05:42:25 crc kubenswrapper[4594]: I1129 05:42:25.678204 4594 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Nov 29 05:42:25 crc kubenswrapper[4594]: I1129 05:42:25.678245 4594 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-85b0c002-5ac7-4caa-8203-64f617b7cd7d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-85b0c002-5ac7-4caa-8203-64f617b7cd7d\") pod \"prometheus-metric-storage-0\" (UID: \"4e4fe89a-3363-4724-89a8-a9b61fe6f039\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/740def2cd7269ac7c6453fb83dbbc5807ffe59ef523e7a581c6b5220b8504e7f/globalmount\"" pod="openstack/prometheus-metric-storage-0" Nov 29 05:42:25 crc kubenswrapper[4594]: I1129 05:42:25.681198 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/4e4fe89a-3363-4724-89a8-a9b61fe6f039-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"4e4fe89a-3363-4724-89a8-a9b61fe6f039\") " pod="openstack/prometheus-metric-storage-0" Nov 29 05:42:25 crc kubenswrapper[4594]: I1129 05:42:25.682409 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4e4fe89a-3363-4724-89a8-a9b61fe6f039-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"4e4fe89a-3363-4724-89a8-a9b61fe6f039\") " pod="openstack/prometheus-metric-storage-0" Nov 29 05:42:25 crc kubenswrapper[4594]: I1129 
05:42:25.686744 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4e4fe89a-3363-4724-89a8-a9b61fe6f039-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"4e4fe89a-3363-4724-89a8-a9b61fe6f039\") " pod="openstack/prometheus-metric-storage-0" Nov 29 05:42:25 crc kubenswrapper[4594]: I1129 05:42:25.687015 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4e4fe89a-3363-4724-89a8-a9b61fe6f039-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"4e4fe89a-3363-4724-89a8-a9b61fe6f039\") " pod="openstack/prometheus-metric-storage-0" Nov 29 05:42:25 crc kubenswrapper[4594]: I1129 05:42:25.691375 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdq58\" (UniqueName: \"kubernetes.io/projected/4e4fe89a-3363-4724-89a8-a9b61fe6f039-kube-api-access-gdq58\") pod \"prometheus-metric-storage-0\" (UID: \"4e4fe89a-3363-4724-89a8-a9b61fe6f039\") " pod="openstack/prometheus-metric-storage-0" Nov 29 05:42:25 crc kubenswrapper[4594]: I1129 05:42:25.700094 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/4e4fe89a-3363-4724-89a8-a9b61fe6f039-config\") pod \"prometheus-metric-storage-0\" (UID: \"4e4fe89a-3363-4724-89a8-a9b61fe6f039\") " pod="openstack/prometheus-metric-storage-0" Nov 29 05:42:25 crc kubenswrapper[4594]: I1129 05:42:25.711833 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-85b0c002-5ac7-4caa-8203-64f617b7cd7d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-85b0c002-5ac7-4caa-8203-64f617b7cd7d\") pod \"prometheus-metric-storage-0\" (UID: \"4e4fe89a-3363-4724-89a8-a9b61fe6f039\") " pod="openstack/prometheus-metric-storage-0" Nov 29 05:42:25 crc kubenswrapper[4594]: I1129 05:42:25.851231 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Nov 29 05:42:27 crc kubenswrapper[4594]: I1129 05:42:27.509130 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-b9h9k"] Nov 29 05:42:27 crc kubenswrapper[4594]: I1129 05:42:27.510927 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-b9h9k" Nov 29 05:42:27 crc kubenswrapper[4594]: I1129 05:42:27.519501 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-npzpz" Nov 29 05:42:27 crc kubenswrapper[4594]: I1129 05:42:27.520283 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Nov 29 05:42:27 crc kubenswrapper[4594]: I1129 05:42:27.521485 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-zvkw9"] Nov 29 05:42:27 crc kubenswrapper[4594]: I1129 05:42:27.521579 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Nov 29 05:42:27 crc kubenswrapper[4594]: I1129 05:42:27.523457 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-zvkw9" Nov 29 05:42:27 crc kubenswrapper[4594]: I1129 05:42:27.529669 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-b9h9k"] Nov 29 05:42:27 crc kubenswrapper[4594]: I1129 05:42:27.534287 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-zvkw9"] Nov 29 05:42:27 crc kubenswrapper[4594]: I1129 05:42:27.621896 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jw9ct\" (UniqueName: \"kubernetes.io/projected/0b167600-fe30-4499-876c-57685a803c45-kube-api-access-jw9ct\") pod \"ovn-controller-ovs-zvkw9\" (UID: \"0b167600-fe30-4499-876c-57685a803c45\") " pod="openstack/ovn-controller-ovs-zvkw9" Nov 29 05:42:27 crc kubenswrapper[4594]: I1129 05:42:27.621941 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0b167600-fe30-4499-876c-57685a803c45-scripts\") pod \"ovn-controller-ovs-zvkw9\" (UID: \"0b167600-fe30-4499-876c-57685a803c45\") " pod="openstack/ovn-controller-ovs-zvkw9" Nov 29 05:42:27 crc kubenswrapper[4594]: I1129 05:42:27.621963 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svm6h\" (UniqueName: \"kubernetes.io/projected/a7baec31-60ce-4be4-8901-a8cbe7bf7ea9-kube-api-access-svm6h\") pod \"ovn-controller-b9h9k\" (UID: \"a7baec31-60ce-4be4-8901-a8cbe7bf7ea9\") " pod="openstack/ovn-controller-b9h9k" Nov 29 05:42:27 crc kubenswrapper[4594]: I1129 05:42:27.621980 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a7baec31-60ce-4be4-8901-a8cbe7bf7ea9-var-run-ovn\") pod \"ovn-controller-b9h9k\" (UID: \"a7baec31-60ce-4be4-8901-a8cbe7bf7ea9\") " pod="openstack/ovn-controller-b9h9k" Nov 29 05:42:27 crc 
kubenswrapper[4594]: I1129 05:42:27.621995 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a7baec31-60ce-4be4-8901-a8cbe7bf7ea9-var-log-ovn\") pod \"ovn-controller-b9h9k\" (UID: \"a7baec31-60ce-4be4-8901-a8cbe7bf7ea9\") " pod="openstack/ovn-controller-b9h9k" Nov 29 05:42:27 crc kubenswrapper[4594]: I1129 05:42:27.622018 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7baec31-60ce-4be4-8901-a8cbe7bf7ea9-combined-ca-bundle\") pod \"ovn-controller-b9h9k\" (UID: \"a7baec31-60ce-4be4-8901-a8cbe7bf7ea9\") " pod="openstack/ovn-controller-b9h9k" Nov 29 05:42:27 crc kubenswrapper[4594]: I1129 05:42:27.623920 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/0b167600-fe30-4499-876c-57685a803c45-var-lib\") pod \"ovn-controller-ovs-zvkw9\" (UID: \"0b167600-fe30-4499-876c-57685a803c45\") " pod="openstack/ovn-controller-ovs-zvkw9" Nov 29 05:42:27 crc kubenswrapper[4594]: I1129 05:42:27.623947 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a7baec31-60ce-4be4-8901-a8cbe7bf7ea9-scripts\") pod \"ovn-controller-b9h9k\" (UID: \"a7baec31-60ce-4be4-8901-a8cbe7bf7ea9\") " pod="openstack/ovn-controller-b9h9k" Nov 29 05:42:27 crc kubenswrapper[4594]: I1129 05:42:27.623974 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0b167600-fe30-4499-876c-57685a803c45-var-run\") pod \"ovn-controller-ovs-zvkw9\" (UID: \"0b167600-fe30-4499-876c-57685a803c45\") " pod="openstack/ovn-controller-ovs-zvkw9" Nov 29 05:42:27 crc kubenswrapper[4594]: I1129 05:42:27.624275 4594 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/0b167600-fe30-4499-876c-57685a803c45-etc-ovs\") pod \"ovn-controller-ovs-zvkw9\" (UID: \"0b167600-fe30-4499-876c-57685a803c45\") " pod="openstack/ovn-controller-ovs-zvkw9" Nov 29 05:42:27 crc kubenswrapper[4594]: I1129 05:42:27.624312 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a7baec31-60ce-4be4-8901-a8cbe7bf7ea9-var-run\") pod \"ovn-controller-b9h9k\" (UID: \"a7baec31-60ce-4be4-8901-a8cbe7bf7ea9\") " pod="openstack/ovn-controller-b9h9k" Nov 29 05:42:27 crc kubenswrapper[4594]: I1129 05:42:27.624389 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7baec31-60ce-4be4-8901-a8cbe7bf7ea9-ovn-controller-tls-certs\") pod \"ovn-controller-b9h9k\" (UID: \"a7baec31-60ce-4be4-8901-a8cbe7bf7ea9\") " pod="openstack/ovn-controller-b9h9k" Nov 29 05:42:27 crc kubenswrapper[4594]: I1129 05:42:27.624584 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/0b167600-fe30-4499-876c-57685a803c45-var-log\") pod \"ovn-controller-ovs-zvkw9\" (UID: \"0b167600-fe30-4499-876c-57685a803c45\") " pod="openstack/ovn-controller-ovs-zvkw9" Nov 29 05:42:27 crc kubenswrapper[4594]: I1129 05:42:27.727066 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/0b167600-fe30-4499-876c-57685a803c45-etc-ovs\") pod \"ovn-controller-ovs-zvkw9\" (UID: \"0b167600-fe30-4499-876c-57685a803c45\") " pod="openstack/ovn-controller-ovs-zvkw9" Nov 29 05:42:27 crc kubenswrapper[4594]: I1129 05:42:27.727182 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a7baec31-60ce-4be4-8901-a8cbe7bf7ea9-var-run\") pod \"ovn-controller-b9h9k\" (UID: \"a7baec31-60ce-4be4-8901-a8cbe7bf7ea9\") " pod="openstack/ovn-controller-b9h9k" Nov 29 05:42:27 crc kubenswrapper[4594]: I1129 05:42:27.727289 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7baec31-60ce-4be4-8901-a8cbe7bf7ea9-ovn-controller-tls-certs\") pod \"ovn-controller-b9h9k\" (UID: \"a7baec31-60ce-4be4-8901-a8cbe7bf7ea9\") " pod="openstack/ovn-controller-b9h9k" Nov 29 05:42:27 crc kubenswrapper[4594]: I1129 05:42:27.727335 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/0b167600-fe30-4499-876c-57685a803c45-var-log\") pod \"ovn-controller-ovs-zvkw9\" (UID: \"0b167600-fe30-4499-876c-57685a803c45\") " pod="openstack/ovn-controller-ovs-zvkw9" Nov 29 05:42:27 crc kubenswrapper[4594]: I1129 05:42:27.727428 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jw9ct\" (UniqueName: \"kubernetes.io/projected/0b167600-fe30-4499-876c-57685a803c45-kube-api-access-jw9ct\") pod \"ovn-controller-ovs-zvkw9\" (UID: \"0b167600-fe30-4499-876c-57685a803c45\") " pod="openstack/ovn-controller-ovs-zvkw9" Nov 29 05:42:27 crc kubenswrapper[4594]: I1129 05:42:27.727467 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0b167600-fe30-4499-876c-57685a803c45-scripts\") pod \"ovn-controller-ovs-zvkw9\" (UID: \"0b167600-fe30-4499-876c-57685a803c45\") " pod="openstack/ovn-controller-ovs-zvkw9" Nov 29 05:42:27 crc kubenswrapper[4594]: I1129 05:42:27.727496 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svm6h\" (UniqueName: 
\"kubernetes.io/projected/a7baec31-60ce-4be4-8901-a8cbe7bf7ea9-kube-api-access-svm6h\") pod \"ovn-controller-b9h9k\" (UID: \"a7baec31-60ce-4be4-8901-a8cbe7bf7ea9\") " pod="openstack/ovn-controller-b9h9k" Nov 29 05:42:27 crc kubenswrapper[4594]: I1129 05:42:27.727523 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a7baec31-60ce-4be4-8901-a8cbe7bf7ea9-var-run-ovn\") pod \"ovn-controller-b9h9k\" (UID: \"a7baec31-60ce-4be4-8901-a8cbe7bf7ea9\") " pod="openstack/ovn-controller-b9h9k" Nov 29 05:42:27 crc kubenswrapper[4594]: I1129 05:42:27.727540 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a7baec31-60ce-4be4-8901-a8cbe7bf7ea9-var-log-ovn\") pod \"ovn-controller-b9h9k\" (UID: \"a7baec31-60ce-4be4-8901-a8cbe7bf7ea9\") " pod="openstack/ovn-controller-b9h9k" Nov 29 05:42:27 crc kubenswrapper[4594]: I1129 05:42:27.727579 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7baec31-60ce-4be4-8901-a8cbe7bf7ea9-combined-ca-bundle\") pod \"ovn-controller-b9h9k\" (UID: \"a7baec31-60ce-4be4-8901-a8cbe7bf7ea9\") " pod="openstack/ovn-controller-b9h9k" Nov 29 05:42:27 crc kubenswrapper[4594]: I1129 05:42:27.727615 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/0b167600-fe30-4499-876c-57685a803c45-var-lib\") pod \"ovn-controller-ovs-zvkw9\" (UID: \"0b167600-fe30-4499-876c-57685a803c45\") " pod="openstack/ovn-controller-ovs-zvkw9" Nov 29 05:42:27 crc kubenswrapper[4594]: I1129 05:42:27.727613 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/0b167600-fe30-4499-876c-57685a803c45-etc-ovs\") pod \"ovn-controller-ovs-zvkw9\" (UID: \"0b167600-fe30-4499-876c-57685a803c45\") 
" pod="openstack/ovn-controller-ovs-zvkw9" Nov 29 05:42:27 crc kubenswrapper[4594]: I1129 05:42:27.727631 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a7baec31-60ce-4be4-8901-a8cbe7bf7ea9-scripts\") pod \"ovn-controller-b9h9k\" (UID: \"a7baec31-60ce-4be4-8901-a8cbe7bf7ea9\") " pod="openstack/ovn-controller-b9h9k" Nov 29 05:42:27 crc kubenswrapper[4594]: I1129 05:42:27.727723 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0b167600-fe30-4499-876c-57685a803c45-var-run\") pod \"ovn-controller-ovs-zvkw9\" (UID: \"0b167600-fe30-4499-876c-57685a803c45\") " pod="openstack/ovn-controller-ovs-zvkw9" Nov 29 05:42:27 crc kubenswrapper[4594]: I1129 05:42:27.728067 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0b167600-fe30-4499-876c-57685a803c45-var-run\") pod \"ovn-controller-ovs-zvkw9\" (UID: \"0b167600-fe30-4499-876c-57685a803c45\") " pod="openstack/ovn-controller-ovs-zvkw9" Nov 29 05:42:27 crc kubenswrapper[4594]: I1129 05:42:27.728247 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a7baec31-60ce-4be4-8901-a8cbe7bf7ea9-var-run\") pod \"ovn-controller-b9h9k\" (UID: \"a7baec31-60ce-4be4-8901-a8cbe7bf7ea9\") " pod="openstack/ovn-controller-b9h9k" Nov 29 05:42:27 crc kubenswrapper[4594]: I1129 05:42:27.728437 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a7baec31-60ce-4be4-8901-a8cbe7bf7ea9-var-run-ovn\") pod \"ovn-controller-b9h9k\" (UID: \"a7baec31-60ce-4be4-8901-a8cbe7bf7ea9\") " pod="openstack/ovn-controller-b9h9k" Nov 29 05:42:27 crc kubenswrapper[4594]: I1129 05:42:27.728744 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" 
(UniqueName: \"kubernetes.io/host-path/a7baec31-60ce-4be4-8901-a8cbe7bf7ea9-var-log-ovn\") pod \"ovn-controller-b9h9k\" (UID: \"a7baec31-60ce-4be4-8901-a8cbe7bf7ea9\") " pod="openstack/ovn-controller-b9h9k" Nov 29 05:42:27 crc kubenswrapper[4594]: I1129 05:42:27.728753 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/0b167600-fe30-4499-876c-57685a803c45-var-lib\") pod \"ovn-controller-ovs-zvkw9\" (UID: \"0b167600-fe30-4499-876c-57685a803c45\") " pod="openstack/ovn-controller-ovs-zvkw9" Nov 29 05:42:27 crc kubenswrapper[4594]: I1129 05:42:27.728882 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/0b167600-fe30-4499-876c-57685a803c45-var-log\") pod \"ovn-controller-ovs-zvkw9\" (UID: \"0b167600-fe30-4499-876c-57685a803c45\") " pod="openstack/ovn-controller-ovs-zvkw9" Nov 29 05:42:27 crc kubenswrapper[4594]: I1129 05:42:27.729884 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a7baec31-60ce-4be4-8901-a8cbe7bf7ea9-scripts\") pod \"ovn-controller-b9h9k\" (UID: \"a7baec31-60ce-4be4-8901-a8cbe7bf7ea9\") " pod="openstack/ovn-controller-b9h9k" Nov 29 05:42:27 crc kubenswrapper[4594]: I1129 05:42:27.730801 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0b167600-fe30-4499-876c-57685a803c45-scripts\") pod \"ovn-controller-ovs-zvkw9\" (UID: \"0b167600-fe30-4499-876c-57685a803c45\") " pod="openstack/ovn-controller-ovs-zvkw9" Nov 29 05:42:27 crc kubenswrapper[4594]: I1129 05:42:27.734578 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7baec31-60ce-4be4-8901-a8cbe7bf7ea9-combined-ca-bundle\") pod \"ovn-controller-b9h9k\" (UID: \"a7baec31-60ce-4be4-8901-a8cbe7bf7ea9\") " 
pod="openstack/ovn-controller-b9h9k" Nov 29 05:42:27 crc kubenswrapper[4594]: I1129 05:42:27.736488 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7baec31-60ce-4be4-8901-a8cbe7bf7ea9-ovn-controller-tls-certs\") pod \"ovn-controller-b9h9k\" (UID: \"a7baec31-60ce-4be4-8901-a8cbe7bf7ea9\") " pod="openstack/ovn-controller-b9h9k" Nov 29 05:42:27 crc kubenswrapper[4594]: I1129 05:42:27.742839 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jw9ct\" (UniqueName: \"kubernetes.io/projected/0b167600-fe30-4499-876c-57685a803c45-kube-api-access-jw9ct\") pod \"ovn-controller-ovs-zvkw9\" (UID: \"0b167600-fe30-4499-876c-57685a803c45\") " pod="openstack/ovn-controller-ovs-zvkw9" Nov 29 05:42:27 crc kubenswrapper[4594]: I1129 05:42:27.750091 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svm6h\" (UniqueName: \"kubernetes.io/projected/a7baec31-60ce-4be4-8901-a8cbe7bf7ea9-kube-api-access-svm6h\") pod \"ovn-controller-b9h9k\" (UID: \"a7baec31-60ce-4be4-8901-a8cbe7bf7ea9\") " pod="openstack/ovn-controller-b9h9k" Nov 29 05:42:27 crc kubenswrapper[4594]: I1129 05:42:27.833845 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-b9h9k" Nov 29 05:42:27 crc kubenswrapper[4594]: I1129 05:42:27.843540 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-zvkw9" Nov 29 05:42:28 crc kubenswrapper[4594]: I1129 05:42:28.268001 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Nov 29 05:42:28 crc kubenswrapper[4594]: I1129 05:42:28.307127 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"9c233f96-4ed2-4e1f-b408-1b75092f366a","Type":"ContainerStarted","Data":"052d685992b51e74e61432ec2ec4b5534a4822ad4dfc532dacba3735a6fdb5e3"} Nov 29 05:42:28 crc kubenswrapper[4594]: I1129 05:42:28.307237 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Nov 29 05:42:28 crc kubenswrapper[4594]: I1129 05:42:28.349377 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.557237578 podStartE2EDuration="4.349360093s" podCreationTimestamp="2025-11-29 05:42:24 +0000 UTC" firstStartedPulling="2025-11-29 05:42:25.099428984 +0000 UTC m=+869.339938204" lastFinishedPulling="2025-11-29 05:42:27.891551499 +0000 UTC m=+872.132060719" observedRunningTime="2025-11-29 05:42:28.32302471 +0000 UTC m=+872.563533950" watchObservedRunningTime="2025-11-29 05:42:28.349360093 +0000 UTC m=+872.589869303" Nov 29 05:42:28 crc kubenswrapper[4594]: I1129 05:42:28.355017 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-b9h9k"] Nov 29 05:42:28 crc kubenswrapper[4594]: W1129 05:42:28.356875 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda7baec31_60ce_4be4_8901_a8cbe7bf7ea9.slice/crio-3e46fffc68bdc516e2ea6e17b13a7426094384677ad65514d8d7f2e791e4564e WatchSource:0}: Error finding container 3e46fffc68bdc516e2ea6e17b13a7426094384677ad65514d8d7f2e791e4564e: Status 404 returned error can't find the container with id 
3e46fffc68bdc516e2ea6e17b13a7426094384677ad65514d8d7f2e791e4564e Nov 29 05:42:28 crc kubenswrapper[4594]: I1129 05:42:28.591572 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-zvkw9"] Nov 29 05:42:28 crc kubenswrapper[4594]: I1129 05:42:28.628241 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-qlxwj"] Nov 29 05:42:28 crc kubenswrapper[4594]: I1129 05:42:28.629652 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-qlxwj" Nov 29 05:42:28 crc kubenswrapper[4594]: I1129 05:42:28.631661 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Nov 29 05:42:28 crc kubenswrapper[4594]: I1129 05:42:28.631874 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Nov 29 05:42:28 crc kubenswrapper[4594]: I1129 05:42:28.647973 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/a2d82ed9-466f-47e4-973d-0e88270f1021-ovn-rundir\") pod \"ovn-controller-metrics-qlxwj\" (UID: \"a2d82ed9-466f-47e4-973d-0e88270f1021\") " pod="openstack/ovn-controller-metrics-qlxwj" Nov 29 05:42:28 crc kubenswrapper[4594]: I1129 05:42:28.648026 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cm7qt\" (UniqueName: \"kubernetes.io/projected/a2d82ed9-466f-47e4-973d-0e88270f1021-kube-api-access-cm7qt\") pod \"ovn-controller-metrics-qlxwj\" (UID: \"a2d82ed9-466f-47e4-973d-0e88270f1021\") " pod="openstack/ovn-controller-metrics-qlxwj" Nov 29 05:42:28 crc kubenswrapper[4594]: I1129 05:42:28.648068 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/a2d82ed9-466f-47e4-973d-0e88270f1021-ovs-rundir\") 
pod \"ovn-controller-metrics-qlxwj\" (UID: \"a2d82ed9-466f-47e4-973d-0e88270f1021\") " pod="openstack/ovn-controller-metrics-qlxwj" Nov 29 05:42:28 crc kubenswrapper[4594]: I1129 05:42:28.648107 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2d82ed9-466f-47e4-973d-0e88270f1021-config\") pod \"ovn-controller-metrics-qlxwj\" (UID: \"a2d82ed9-466f-47e4-973d-0e88270f1021\") " pod="openstack/ovn-controller-metrics-qlxwj" Nov 29 05:42:28 crc kubenswrapper[4594]: I1129 05:42:28.648174 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a2d82ed9-466f-47e4-973d-0e88270f1021-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-qlxwj\" (UID: \"a2d82ed9-466f-47e4-973d-0e88270f1021\") " pod="openstack/ovn-controller-metrics-qlxwj" Nov 29 05:42:28 crc kubenswrapper[4594]: I1129 05:42:28.648206 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2d82ed9-466f-47e4-973d-0e88270f1021-combined-ca-bundle\") pod \"ovn-controller-metrics-qlxwj\" (UID: \"a2d82ed9-466f-47e4-973d-0e88270f1021\") " pod="openstack/ovn-controller-metrics-qlxwj" Nov 29 05:42:28 crc kubenswrapper[4594]: I1129 05:42:28.648278 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-qlxwj"] Nov 29 05:42:28 crc kubenswrapper[4594]: W1129 05:42:28.655050 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0b167600_fe30_4499_876c_57685a803c45.slice/crio-f1b845b24d4780dcd31b9cb90d5521e3b1df9ff5d4895ebd76ab06eb12ef8724 WatchSource:0}: Error finding container f1b845b24d4780dcd31b9cb90d5521e3b1df9ff5d4895ebd76ab06eb12ef8724: Status 404 returned error can't find the container with id 
f1b845b24d4780dcd31b9cb90d5521e3b1df9ff5d4895ebd76ab06eb12ef8724 Nov 29 05:42:28 crc kubenswrapper[4594]: I1129 05:42:28.748909 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/a2d82ed9-466f-47e4-973d-0e88270f1021-ovn-rundir\") pod \"ovn-controller-metrics-qlxwj\" (UID: \"a2d82ed9-466f-47e4-973d-0e88270f1021\") " pod="openstack/ovn-controller-metrics-qlxwj" Nov 29 05:42:28 crc kubenswrapper[4594]: I1129 05:42:28.748996 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cm7qt\" (UniqueName: \"kubernetes.io/projected/a2d82ed9-466f-47e4-973d-0e88270f1021-kube-api-access-cm7qt\") pod \"ovn-controller-metrics-qlxwj\" (UID: \"a2d82ed9-466f-47e4-973d-0e88270f1021\") " pod="openstack/ovn-controller-metrics-qlxwj" Nov 29 05:42:28 crc kubenswrapper[4594]: I1129 05:42:28.749034 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/a2d82ed9-466f-47e4-973d-0e88270f1021-ovs-rundir\") pod \"ovn-controller-metrics-qlxwj\" (UID: \"a2d82ed9-466f-47e4-973d-0e88270f1021\") " pod="openstack/ovn-controller-metrics-qlxwj" Nov 29 05:42:28 crc kubenswrapper[4594]: I1129 05:42:28.749068 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2d82ed9-466f-47e4-973d-0e88270f1021-config\") pod \"ovn-controller-metrics-qlxwj\" (UID: \"a2d82ed9-466f-47e4-973d-0e88270f1021\") " pod="openstack/ovn-controller-metrics-qlxwj" Nov 29 05:42:28 crc kubenswrapper[4594]: I1129 05:42:28.749142 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a2d82ed9-466f-47e4-973d-0e88270f1021-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-qlxwj\" (UID: \"a2d82ed9-466f-47e4-973d-0e88270f1021\") " 
pod="openstack/ovn-controller-metrics-qlxwj" Nov 29 05:42:28 crc kubenswrapper[4594]: I1129 05:42:28.749166 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2d82ed9-466f-47e4-973d-0e88270f1021-combined-ca-bundle\") pod \"ovn-controller-metrics-qlxwj\" (UID: \"a2d82ed9-466f-47e4-973d-0e88270f1021\") " pod="openstack/ovn-controller-metrics-qlxwj" Nov 29 05:42:28 crc kubenswrapper[4594]: I1129 05:42:28.749244 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/a2d82ed9-466f-47e4-973d-0e88270f1021-ovn-rundir\") pod \"ovn-controller-metrics-qlxwj\" (UID: \"a2d82ed9-466f-47e4-973d-0e88270f1021\") " pod="openstack/ovn-controller-metrics-qlxwj" Nov 29 05:42:28 crc kubenswrapper[4594]: I1129 05:42:28.749529 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/a2d82ed9-466f-47e4-973d-0e88270f1021-ovs-rundir\") pod \"ovn-controller-metrics-qlxwj\" (UID: \"a2d82ed9-466f-47e4-973d-0e88270f1021\") " pod="openstack/ovn-controller-metrics-qlxwj" Nov 29 05:42:28 crc kubenswrapper[4594]: I1129 05:42:28.750047 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2d82ed9-466f-47e4-973d-0e88270f1021-config\") pod \"ovn-controller-metrics-qlxwj\" (UID: \"a2d82ed9-466f-47e4-973d-0e88270f1021\") " pod="openstack/ovn-controller-metrics-qlxwj" Nov 29 05:42:28 crc kubenswrapper[4594]: I1129 05:42:28.755849 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a2d82ed9-466f-47e4-973d-0e88270f1021-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-qlxwj\" (UID: \"a2d82ed9-466f-47e4-973d-0e88270f1021\") " pod="openstack/ovn-controller-metrics-qlxwj" Nov 29 05:42:28 crc kubenswrapper[4594]: I1129 
05:42:28.756383 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2d82ed9-466f-47e4-973d-0e88270f1021-combined-ca-bundle\") pod \"ovn-controller-metrics-qlxwj\" (UID: \"a2d82ed9-466f-47e4-973d-0e88270f1021\") " pod="openstack/ovn-controller-metrics-qlxwj" Nov 29 05:42:28 crc kubenswrapper[4594]: I1129 05:42:28.767888 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cm7qt\" (UniqueName: \"kubernetes.io/projected/a2d82ed9-466f-47e4-973d-0e88270f1021-kube-api-access-cm7qt\") pod \"ovn-controller-metrics-qlxwj\" (UID: \"a2d82ed9-466f-47e4-973d-0e88270f1021\") " pod="openstack/ovn-controller-metrics-qlxwj" Nov 29 05:42:28 crc kubenswrapper[4594]: I1129 05:42:28.948755 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-qlxwj" Nov 29 05:42:29 crc kubenswrapper[4594]: I1129 05:42:29.342025 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"4e4fe89a-3363-4724-89a8-a9b61fe6f039","Type":"ContainerStarted","Data":"3f31231729b0a8d86e298a73e5b59f8c5108af4c552716077e571032c6d1d3ab"} Nov 29 05:42:29 crc kubenswrapper[4594]: I1129 05:42:29.346696 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-zvkw9" event={"ID":"0b167600-fe30-4499-876c-57685a803c45","Type":"ContainerStarted","Data":"f1b845b24d4780dcd31b9cb90d5521e3b1df9ff5d4895ebd76ab06eb12ef8724"} Nov 29 05:42:29 crc kubenswrapper[4594]: I1129 05:42:29.350088 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-b9h9k" event={"ID":"a7baec31-60ce-4be4-8901-a8cbe7bf7ea9","Type":"ContainerStarted","Data":"3e46fffc68bdc516e2ea6e17b13a7426094384677ad65514d8d7f2e791e4564e"} Nov 29 05:42:29 crc kubenswrapper[4594]: I1129 05:42:29.466625 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/ovn-controller-metrics-qlxwj"] Nov 29 05:42:29 crc kubenswrapper[4594]: W1129 05:42:29.476794 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda2d82ed9_466f_47e4_973d_0e88270f1021.slice/crio-d04c581955982aeb1502dfba37d2d5e88d42e6719f6f2574d716bf8974b32105 WatchSource:0}: Error finding container d04c581955982aeb1502dfba37d2d5e88d42e6719f6f2574d716bf8974b32105: Status 404 returned error can't find the container with id d04c581955982aeb1502dfba37d2d5e88d42e6719f6f2574d716bf8974b32105 Nov 29 05:42:29 crc kubenswrapper[4594]: I1129 05:42:29.758235 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Nov 29 05:42:29 crc kubenswrapper[4594]: I1129 05:42:29.760839 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Nov 29 05:42:29 crc kubenswrapper[4594]: I1129 05:42:29.762155 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Nov 29 05:42:29 crc kubenswrapper[4594]: I1129 05:42:29.763168 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Nov 29 05:42:29 crc kubenswrapper[4594]: I1129 05:42:29.763247 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Nov 29 05:42:29 crc kubenswrapper[4594]: I1129 05:42:29.763294 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-7h48v" Nov 29 05:42:29 crc kubenswrapper[4594]: I1129 05:42:29.764398 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Nov 29 05:42:29 crc kubenswrapper[4594]: I1129 05:42:29.876244 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/31fad155-3970-4a5d-a357-e96fa27bbb54-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"31fad155-3970-4a5d-a357-e96fa27bbb54\") " pod="openstack/ovsdbserver-nb-0" Nov 29 05:42:29 crc kubenswrapper[4594]: I1129 05:42:29.876307 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hs7h\" (UniqueName: \"kubernetes.io/projected/31fad155-3970-4a5d-a357-e96fa27bbb54-kube-api-access-6hs7h\") pod \"ovsdbserver-nb-0\" (UID: \"31fad155-3970-4a5d-a357-e96fa27bbb54\") " pod="openstack/ovsdbserver-nb-0" Nov 29 05:42:29 crc kubenswrapper[4594]: I1129 05:42:29.876367 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/31fad155-3970-4a5d-a357-e96fa27bbb54-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"31fad155-3970-4a5d-a357-e96fa27bbb54\") " pod="openstack/ovsdbserver-nb-0" Nov 29 05:42:29 crc kubenswrapper[4594]: I1129 05:42:29.876399 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/31fad155-3970-4a5d-a357-e96fa27bbb54-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"31fad155-3970-4a5d-a357-e96fa27bbb54\") " pod="openstack/ovsdbserver-nb-0" Nov 29 05:42:29 crc kubenswrapper[4594]: I1129 05:42:29.876429 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31fad155-3970-4a5d-a357-e96fa27bbb54-config\") pod \"ovsdbserver-nb-0\" (UID: \"31fad155-3970-4a5d-a357-e96fa27bbb54\") " pod="openstack/ovsdbserver-nb-0" Nov 29 05:42:29 crc kubenswrapper[4594]: I1129 05:42:29.876459 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/31fad155-3970-4a5d-a357-e96fa27bbb54-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"31fad155-3970-4a5d-a357-e96fa27bbb54\") " pod="openstack/ovsdbserver-nb-0" Nov 29 05:42:29 crc kubenswrapper[4594]: I1129 05:42:29.877411 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31fad155-3970-4a5d-a357-e96fa27bbb54-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"31fad155-3970-4a5d-a357-e96fa27bbb54\") " pod="openstack/ovsdbserver-nb-0" Nov 29 05:42:29 crc kubenswrapper[4594]: I1129 05:42:29.877647 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-nb-0\" (UID: \"31fad155-3970-4a5d-a357-e96fa27bbb54\") " pod="openstack/ovsdbserver-nb-0" Nov 29 05:42:29 crc kubenswrapper[4594]: I1129 05:42:29.979030 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/31fad155-3970-4a5d-a357-e96fa27bbb54-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"31fad155-3970-4a5d-a357-e96fa27bbb54\") " pod="openstack/ovsdbserver-nb-0" Nov 29 05:42:29 crc kubenswrapper[4594]: I1129 05:42:29.979070 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hs7h\" (UniqueName: \"kubernetes.io/projected/31fad155-3970-4a5d-a357-e96fa27bbb54-kube-api-access-6hs7h\") pod \"ovsdbserver-nb-0\" (UID: \"31fad155-3970-4a5d-a357-e96fa27bbb54\") " pod="openstack/ovsdbserver-nb-0" Nov 29 05:42:29 crc kubenswrapper[4594]: I1129 05:42:29.979125 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/31fad155-3970-4a5d-a357-e96fa27bbb54-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: 
\"31fad155-3970-4a5d-a357-e96fa27bbb54\") " pod="openstack/ovsdbserver-nb-0" Nov 29 05:42:29 crc kubenswrapper[4594]: I1129 05:42:29.979152 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/31fad155-3970-4a5d-a357-e96fa27bbb54-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"31fad155-3970-4a5d-a357-e96fa27bbb54\") " pod="openstack/ovsdbserver-nb-0" Nov 29 05:42:29 crc kubenswrapper[4594]: I1129 05:42:29.979180 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31fad155-3970-4a5d-a357-e96fa27bbb54-config\") pod \"ovsdbserver-nb-0\" (UID: \"31fad155-3970-4a5d-a357-e96fa27bbb54\") " pod="openstack/ovsdbserver-nb-0" Nov 29 05:42:29 crc kubenswrapper[4594]: I1129 05:42:29.979198 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/31fad155-3970-4a5d-a357-e96fa27bbb54-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"31fad155-3970-4a5d-a357-e96fa27bbb54\") " pod="openstack/ovsdbserver-nb-0" Nov 29 05:42:29 crc kubenswrapper[4594]: I1129 05:42:29.979225 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31fad155-3970-4a5d-a357-e96fa27bbb54-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"31fad155-3970-4a5d-a357-e96fa27bbb54\") " pod="openstack/ovsdbserver-nb-0" Nov 29 05:42:29 crc kubenswrapper[4594]: I1129 05:42:29.979276 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-nb-0\" (UID: \"31fad155-3970-4a5d-a357-e96fa27bbb54\") " pod="openstack/ovsdbserver-nb-0" Nov 29 05:42:29 crc kubenswrapper[4594]: I1129 05:42:29.979618 4594 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume 
\"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-nb-0\" (UID: \"31fad155-3970-4a5d-a357-e96fa27bbb54\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/ovsdbserver-nb-0" Nov 29 05:42:29 crc kubenswrapper[4594]: I1129 05:42:29.982884 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/31fad155-3970-4a5d-a357-e96fa27bbb54-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"31fad155-3970-4a5d-a357-e96fa27bbb54\") " pod="openstack/ovsdbserver-nb-0" Nov 29 05:42:29 crc kubenswrapper[4594]: I1129 05:42:29.986142 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/31fad155-3970-4a5d-a357-e96fa27bbb54-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"31fad155-3970-4a5d-a357-e96fa27bbb54\") " pod="openstack/ovsdbserver-nb-0" Nov 29 05:42:29 crc kubenswrapper[4594]: I1129 05:42:29.989243 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31fad155-3970-4a5d-a357-e96fa27bbb54-config\") pod \"ovsdbserver-nb-0\" (UID: \"31fad155-3970-4a5d-a357-e96fa27bbb54\") " pod="openstack/ovsdbserver-nb-0" Nov 29 05:42:29 crc kubenswrapper[4594]: I1129 05:42:29.994420 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31fad155-3970-4a5d-a357-e96fa27bbb54-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"31fad155-3970-4a5d-a357-e96fa27bbb54\") " pod="openstack/ovsdbserver-nb-0" Nov 29 05:42:29 crc kubenswrapper[4594]: I1129 05:42:29.994597 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/31fad155-3970-4a5d-a357-e96fa27bbb54-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"31fad155-3970-4a5d-a357-e96fa27bbb54\") " 
pod="openstack/ovsdbserver-nb-0" Nov 29 05:42:29 crc kubenswrapper[4594]: I1129 05:42:29.995900 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/31fad155-3970-4a5d-a357-e96fa27bbb54-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"31fad155-3970-4a5d-a357-e96fa27bbb54\") " pod="openstack/ovsdbserver-nb-0" Nov 29 05:42:30 crc kubenswrapper[4594]: I1129 05:42:29.999585 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-nb-0\" (UID: \"31fad155-3970-4a5d-a357-e96fa27bbb54\") " pod="openstack/ovsdbserver-nb-0" Nov 29 05:42:30 crc kubenswrapper[4594]: I1129 05:42:30.005038 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hs7h\" (UniqueName: \"kubernetes.io/projected/31fad155-3970-4a5d-a357-e96fa27bbb54-kube-api-access-6hs7h\") pod \"ovsdbserver-nb-0\" (UID: \"31fad155-3970-4a5d-a357-e96fa27bbb54\") " pod="openstack/ovsdbserver-nb-0" Nov 29 05:42:30 crc kubenswrapper[4594]: I1129 05:42:30.085963 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Nov 29 05:42:30 crc kubenswrapper[4594]: I1129 05:42:30.364117 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-qlxwj" event={"ID":"a2d82ed9-466f-47e4-973d-0e88270f1021","Type":"ContainerStarted","Data":"d04c581955982aeb1502dfba37d2d5e88d42e6719f6f2574d716bf8974b32105"} Nov 29 05:42:30 crc kubenswrapper[4594]: I1129 05:42:30.643857 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Nov 29 05:42:30 crc kubenswrapper[4594]: W1129 05:42:30.647462 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31fad155_3970_4a5d_a357_e96fa27bbb54.slice/crio-5e603966a703de7525cf970fb0c6213be61e3a047853df4e7145d71d17d1721c WatchSource:0}: Error finding container 5e603966a703de7525cf970fb0c6213be61e3a047853df4e7145d71d17d1721c: Status 404 returned error can't find the container with id 5e603966a703de7525cf970fb0c6213be61e3a047853df4e7145d71d17d1721c Nov 29 05:42:31 crc kubenswrapper[4594]: I1129 05:42:31.390829 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"31fad155-3970-4a5d-a357-e96fa27bbb54","Type":"ContainerStarted","Data":"5e603966a703de7525cf970fb0c6213be61e3a047853df4e7145d71d17d1721c"} Nov 29 05:42:31 crc kubenswrapper[4594]: I1129 05:42:31.495883 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b9b4959cc-rkqq4"] Nov 29 05:42:31 crc kubenswrapper[4594]: I1129 05:42:31.516206 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6fb75c485f-xvpfr"] Nov 29 05:42:31 crc kubenswrapper[4594]: I1129 05:42:31.517751 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6fb75c485f-xvpfr" Nov 29 05:42:31 crc kubenswrapper[4594]: I1129 05:42:31.519593 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Nov 29 05:42:31 crc kubenswrapper[4594]: I1129 05:42:31.533307 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6fb75c485f-xvpfr"] Nov 29 05:42:31 crc kubenswrapper[4594]: I1129 05:42:31.615930 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27bd40a9-3181-42a6-8141-55078612c429-config\") pod \"dnsmasq-dns-6fb75c485f-xvpfr\" (UID: \"27bd40a9-3181-42a6-8141-55078612c429\") " pod="openstack/dnsmasq-dns-6fb75c485f-xvpfr" Nov 29 05:42:31 crc kubenswrapper[4594]: I1129 05:42:31.615974 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/27bd40a9-3181-42a6-8141-55078612c429-ovsdbserver-nb\") pod \"dnsmasq-dns-6fb75c485f-xvpfr\" (UID: \"27bd40a9-3181-42a6-8141-55078612c429\") " pod="openstack/dnsmasq-dns-6fb75c485f-xvpfr" Nov 29 05:42:31 crc kubenswrapper[4594]: I1129 05:42:31.615997 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/27bd40a9-3181-42a6-8141-55078612c429-dns-svc\") pod \"dnsmasq-dns-6fb75c485f-xvpfr\" (UID: \"27bd40a9-3181-42a6-8141-55078612c429\") " pod="openstack/dnsmasq-dns-6fb75c485f-xvpfr" Nov 29 05:42:31 crc kubenswrapper[4594]: I1129 05:42:31.616101 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24qt6\" (UniqueName: \"kubernetes.io/projected/27bd40a9-3181-42a6-8141-55078612c429-kube-api-access-24qt6\") pod \"dnsmasq-dns-6fb75c485f-xvpfr\" (UID: \"27bd40a9-3181-42a6-8141-55078612c429\") " 
pod="openstack/dnsmasq-dns-6fb75c485f-xvpfr" Nov 29 05:42:31 crc kubenswrapper[4594]: I1129 05:42:31.717737 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27bd40a9-3181-42a6-8141-55078612c429-config\") pod \"dnsmasq-dns-6fb75c485f-xvpfr\" (UID: \"27bd40a9-3181-42a6-8141-55078612c429\") " pod="openstack/dnsmasq-dns-6fb75c485f-xvpfr" Nov 29 05:42:31 crc kubenswrapper[4594]: I1129 05:42:31.717779 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/27bd40a9-3181-42a6-8141-55078612c429-ovsdbserver-nb\") pod \"dnsmasq-dns-6fb75c485f-xvpfr\" (UID: \"27bd40a9-3181-42a6-8141-55078612c429\") " pod="openstack/dnsmasq-dns-6fb75c485f-xvpfr" Nov 29 05:42:31 crc kubenswrapper[4594]: I1129 05:42:31.717795 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/27bd40a9-3181-42a6-8141-55078612c429-dns-svc\") pod \"dnsmasq-dns-6fb75c485f-xvpfr\" (UID: \"27bd40a9-3181-42a6-8141-55078612c429\") " pod="openstack/dnsmasq-dns-6fb75c485f-xvpfr" Nov 29 05:42:31 crc kubenswrapper[4594]: I1129 05:42:31.717815 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24qt6\" (UniqueName: \"kubernetes.io/projected/27bd40a9-3181-42a6-8141-55078612c429-kube-api-access-24qt6\") pod \"dnsmasq-dns-6fb75c485f-xvpfr\" (UID: \"27bd40a9-3181-42a6-8141-55078612c429\") " pod="openstack/dnsmasq-dns-6fb75c485f-xvpfr" Nov 29 05:42:31 crc kubenswrapper[4594]: I1129 05:42:31.718908 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27bd40a9-3181-42a6-8141-55078612c429-config\") pod \"dnsmasq-dns-6fb75c485f-xvpfr\" (UID: \"27bd40a9-3181-42a6-8141-55078612c429\") " pod="openstack/dnsmasq-dns-6fb75c485f-xvpfr" Nov 29 05:42:31 crc kubenswrapper[4594]: I1129 
05:42:31.719415 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/27bd40a9-3181-42a6-8141-55078612c429-ovsdbserver-nb\") pod \"dnsmasq-dns-6fb75c485f-xvpfr\" (UID: \"27bd40a9-3181-42a6-8141-55078612c429\") " pod="openstack/dnsmasq-dns-6fb75c485f-xvpfr" Nov 29 05:42:31 crc kubenswrapper[4594]: I1129 05:42:31.719890 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/27bd40a9-3181-42a6-8141-55078612c429-dns-svc\") pod \"dnsmasq-dns-6fb75c485f-xvpfr\" (UID: \"27bd40a9-3181-42a6-8141-55078612c429\") " pod="openstack/dnsmasq-dns-6fb75c485f-xvpfr" Nov 29 05:42:31 crc kubenswrapper[4594]: I1129 05:42:31.752186 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24qt6\" (UniqueName: \"kubernetes.io/projected/27bd40a9-3181-42a6-8141-55078612c429-kube-api-access-24qt6\") pod \"dnsmasq-dns-6fb75c485f-xvpfr\" (UID: \"27bd40a9-3181-42a6-8141-55078612c429\") " pod="openstack/dnsmasq-dns-6fb75c485f-xvpfr" Nov 29 05:42:31 crc kubenswrapper[4594]: I1129 05:42:31.851351 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6fb75c485f-xvpfr" Nov 29 05:42:32 crc kubenswrapper[4594]: I1129 05:42:32.017109 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Nov 29 05:42:32 crc kubenswrapper[4594]: I1129 05:42:32.018977 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Nov 29 05:42:32 crc kubenswrapper[4594]: I1129 05:42:32.023641 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-tjt66" Nov 29 05:42:32 crc kubenswrapper[4594]: I1129 05:42:32.023704 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Nov 29 05:42:32 crc kubenswrapper[4594]: I1129 05:42:32.023923 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Nov 29 05:42:32 crc kubenswrapper[4594]: I1129 05:42:32.024674 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Nov 29 05:42:32 crc kubenswrapper[4594]: I1129 05:42:32.025182 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Nov 29 05:42:32 crc kubenswrapper[4594]: I1129 05:42:32.125690 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f5075e0-210c-455f-8203-3dde7c7be5eb-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"4f5075e0-210c-455f-8203-3dde7c7be5eb\") " pod="openstack/ovsdbserver-sb-0" Nov 29 05:42:32 crc kubenswrapper[4594]: I1129 05:42:32.125762 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4f5075e0-210c-455f-8203-3dde7c7be5eb-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"4f5075e0-210c-455f-8203-3dde7c7be5eb\") " pod="openstack/ovsdbserver-sb-0" Nov 29 05:42:32 crc kubenswrapper[4594]: I1129 05:42:32.125868 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f5075e0-210c-455f-8203-3dde7c7be5eb-config\") pod \"ovsdbserver-sb-0\" (UID: 
\"4f5075e0-210c-455f-8203-3dde7c7be5eb\") " pod="openstack/ovsdbserver-sb-0" Nov 29 05:42:32 crc kubenswrapper[4594]: I1129 05:42:32.125891 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4f5075e0-210c-455f-8203-3dde7c7be5eb-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"4f5075e0-210c-455f-8203-3dde7c7be5eb\") " pod="openstack/ovsdbserver-sb-0" Nov 29 05:42:32 crc kubenswrapper[4594]: I1129 05:42:32.125934 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4vs4\" (UniqueName: \"kubernetes.io/projected/4f5075e0-210c-455f-8203-3dde7c7be5eb-kube-api-access-n4vs4\") pod \"ovsdbserver-sb-0\" (UID: \"4f5075e0-210c-455f-8203-3dde7c7be5eb\") " pod="openstack/ovsdbserver-sb-0" Nov 29 05:42:32 crc kubenswrapper[4594]: I1129 05:42:32.125963 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f5075e0-210c-455f-8203-3dde7c7be5eb-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"4f5075e0-210c-455f-8203-3dde7c7be5eb\") " pod="openstack/ovsdbserver-sb-0" Nov 29 05:42:32 crc kubenswrapper[4594]: I1129 05:42:32.125984 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f5075e0-210c-455f-8203-3dde7c7be5eb-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"4f5075e0-210c-455f-8203-3dde7c7be5eb\") " pod="openstack/ovsdbserver-sb-0" Nov 29 05:42:32 crc kubenswrapper[4594]: I1129 05:42:32.126023 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"4f5075e0-210c-455f-8203-3dde7c7be5eb\") " 
pod="openstack/ovsdbserver-sb-0" Nov 29 05:42:32 crc kubenswrapper[4594]: I1129 05:42:32.229015 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4f5075e0-210c-455f-8203-3dde7c7be5eb-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"4f5075e0-210c-455f-8203-3dde7c7be5eb\") " pod="openstack/ovsdbserver-sb-0" Nov 29 05:42:32 crc kubenswrapper[4594]: I1129 05:42:32.229120 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f5075e0-210c-455f-8203-3dde7c7be5eb-config\") pod \"ovsdbserver-sb-0\" (UID: \"4f5075e0-210c-455f-8203-3dde7c7be5eb\") " pod="openstack/ovsdbserver-sb-0" Nov 29 05:42:32 crc kubenswrapper[4594]: I1129 05:42:32.229142 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4f5075e0-210c-455f-8203-3dde7c7be5eb-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"4f5075e0-210c-455f-8203-3dde7c7be5eb\") " pod="openstack/ovsdbserver-sb-0" Nov 29 05:42:32 crc kubenswrapper[4594]: I1129 05:42:32.229169 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4vs4\" (UniqueName: \"kubernetes.io/projected/4f5075e0-210c-455f-8203-3dde7c7be5eb-kube-api-access-n4vs4\") pod \"ovsdbserver-sb-0\" (UID: \"4f5075e0-210c-455f-8203-3dde7c7be5eb\") " pod="openstack/ovsdbserver-sb-0" Nov 29 05:42:32 crc kubenswrapper[4594]: I1129 05:42:32.229198 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f5075e0-210c-455f-8203-3dde7c7be5eb-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"4f5075e0-210c-455f-8203-3dde7c7be5eb\") " pod="openstack/ovsdbserver-sb-0" Nov 29 05:42:32 crc kubenswrapper[4594]: I1129 05:42:32.229217 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f5075e0-210c-455f-8203-3dde7c7be5eb-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"4f5075e0-210c-455f-8203-3dde7c7be5eb\") " pod="openstack/ovsdbserver-sb-0" Nov 29 05:42:32 crc kubenswrapper[4594]: I1129 05:42:32.229268 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"4f5075e0-210c-455f-8203-3dde7c7be5eb\") " pod="openstack/ovsdbserver-sb-0" Nov 29 05:42:32 crc kubenswrapper[4594]: I1129 05:42:32.229307 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f5075e0-210c-455f-8203-3dde7c7be5eb-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"4f5075e0-210c-455f-8203-3dde7c7be5eb\") " pod="openstack/ovsdbserver-sb-0" Nov 29 05:42:32 crc kubenswrapper[4594]: I1129 05:42:32.230757 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4f5075e0-210c-455f-8203-3dde7c7be5eb-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"4f5075e0-210c-455f-8203-3dde7c7be5eb\") " pod="openstack/ovsdbserver-sb-0" Nov 29 05:42:32 crc kubenswrapper[4594]: I1129 05:42:32.231350 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f5075e0-210c-455f-8203-3dde7c7be5eb-config\") pod \"ovsdbserver-sb-0\" (UID: \"4f5075e0-210c-455f-8203-3dde7c7be5eb\") " pod="openstack/ovsdbserver-sb-0" Nov 29 05:42:32 crc kubenswrapper[4594]: I1129 05:42:32.231607 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4f5075e0-210c-455f-8203-3dde7c7be5eb-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"4f5075e0-210c-455f-8203-3dde7c7be5eb\") " pod="openstack/ovsdbserver-sb-0" Nov 29 05:42:32 
crc kubenswrapper[4594]: I1129 05:42:32.231800 4594 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"4f5075e0-210c-455f-8203-3dde7c7be5eb\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/ovsdbserver-sb-0" Nov 29 05:42:32 crc kubenswrapper[4594]: I1129 05:42:32.236935 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f5075e0-210c-455f-8203-3dde7c7be5eb-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"4f5075e0-210c-455f-8203-3dde7c7be5eb\") " pod="openstack/ovsdbserver-sb-0" Nov 29 05:42:32 crc kubenswrapper[4594]: I1129 05:42:32.240194 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f5075e0-210c-455f-8203-3dde7c7be5eb-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"4f5075e0-210c-455f-8203-3dde7c7be5eb\") " pod="openstack/ovsdbserver-sb-0" Nov 29 05:42:32 crc kubenswrapper[4594]: I1129 05:42:32.241137 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f5075e0-210c-455f-8203-3dde7c7be5eb-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"4f5075e0-210c-455f-8203-3dde7c7be5eb\") " pod="openstack/ovsdbserver-sb-0" Nov 29 05:42:32 crc kubenswrapper[4594]: I1129 05:42:32.245135 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4vs4\" (UniqueName: \"kubernetes.io/projected/4f5075e0-210c-455f-8203-3dde7c7be5eb-kube-api-access-n4vs4\") pod \"ovsdbserver-sb-0\" (UID: \"4f5075e0-210c-455f-8203-3dde7c7be5eb\") " pod="openstack/ovsdbserver-sb-0" Nov 29 05:42:32 crc kubenswrapper[4594]: I1129 05:42:32.254618 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"4f5075e0-210c-455f-8203-3dde7c7be5eb\") " pod="openstack/ovsdbserver-sb-0" Nov 29 05:42:32 crc kubenswrapper[4594]: I1129 05:42:32.362234 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6fb75c485f-xvpfr"] Nov 29 05:42:32 crc kubenswrapper[4594]: W1129 05:42:32.383540 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod27bd40a9_3181_42a6_8141_55078612c429.slice/crio-2e9faab66b8856e22c01b346ed52ad4084e1b7a0a8b0cccd85b27c66e9268449 WatchSource:0}: Error finding container 2e9faab66b8856e22c01b346ed52ad4084e1b7a0a8b0cccd85b27c66e9268449: Status 404 returned error can't find the container with id 2e9faab66b8856e22c01b346ed52ad4084e1b7a0a8b0cccd85b27c66e9268449 Nov 29 05:42:32 crc kubenswrapper[4594]: I1129 05:42:32.388891 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Nov 29 05:42:32 crc kubenswrapper[4594]: I1129 05:42:32.404233 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fb75c485f-xvpfr" event={"ID":"27bd40a9-3181-42a6-8141-55078612c429","Type":"ContainerStarted","Data":"2e9faab66b8856e22c01b346ed52ad4084e1b7a0a8b0cccd85b27c66e9268449"} Nov 29 05:42:32 crc kubenswrapper[4594]: I1129 05:42:32.901573 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Nov 29 05:42:33 crc kubenswrapper[4594]: I1129 05:42:33.459918 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"4f5075e0-210c-455f-8203-3dde7c7be5eb","Type":"ContainerStarted","Data":"9a31c9667b7825da9d08b798f2f717ac968b2fb832dca468661cc109aa4bb9e8"} Nov 29 05:42:34 crc kubenswrapper[4594]: I1129 05:42:34.546627 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Nov 29 05:42:35 crc kubenswrapper[4594]: I1129 05:42:35.478935 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"4e4fe89a-3363-4724-89a8-a9b61fe6f039","Type":"ContainerStarted","Data":"a16740da5b9f233ab8f5eceecfe70c451e89542cb502445ca3143bb7accd8e1e"} Nov 29 05:42:36 crc kubenswrapper[4594]: I1129 05:42:36.512968 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-qlxwj" event={"ID":"a2d82ed9-466f-47e4-973d-0e88270f1021","Type":"ContainerStarted","Data":"98c0ac8b0d04c811df6221dd319a5676e365b44f4db6e6c9167119ac098d9f05"} Nov 29 05:42:36 crc kubenswrapper[4594]: I1129 05:42:36.529948 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-qlxwj" podStartSLOduration=2.270826586 podStartE2EDuration="8.529933195s" podCreationTimestamp="2025-11-29 05:42:28 +0000 UTC" firstStartedPulling="2025-11-29 05:42:29.48086393 +0000 UTC 
m=+873.721373150" lastFinishedPulling="2025-11-29 05:42:35.739970538 +0000 UTC m=+879.980479759" observedRunningTime="2025-11-29 05:42:36.526040509 +0000 UTC m=+880.766549730" watchObservedRunningTime="2025-11-29 05:42:36.529933195 +0000 UTC m=+880.770442415" Nov 29 05:42:36 crc kubenswrapper[4594]: I1129 05:42:36.859545 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5449989c59-lxq8g"] Nov 29 05:42:36 crc kubenswrapper[4594]: I1129 05:42:36.898602 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6dbf544cc9-vr7qn"] Nov 29 05:42:36 crc kubenswrapper[4594]: I1129 05:42:36.900309 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6dbf544cc9-vr7qn" Nov 29 05:42:36 crc kubenswrapper[4594]: I1129 05:42:36.902407 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Nov 29 05:42:36 crc kubenswrapper[4594]: I1129 05:42:36.914923 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6dbf544cc9-vr7qn"] Nov 29 05:42:37 crc kubenswrapper[4594]: I1129 05:42:37.061597 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgw9h\" (UniqueName: \"kubernetes.io/projected/bcfcc5d2-2a01-409c-9479-5c3df1f9a319-kube-api-access-lgw9h\") pod \"dnsmasq-dns-6dbf544cc9-vr7qn\" (UID: \"bcfcc5d2-2a01-409c-9479-5c3df1f9a319\") " pod="openstack/dnsmasq-dns-6dbf544cc9-vr7qn" Nov 29 05:42:37 crc kubenswrapper[4594]: I1129 05:42:37.061671 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bcfcc5d2-2a01-409c-9479-5c3df1f9a319-ovsdbserver-nb\") pod \"dnsmasq-dns-6dbf544cc9-vr7qn\" (UID: \"bcfcc5d2-2a01-409c-9479-5c3df1f9a319\") " pod="openstack/dnsmasq-dns-6dbf544cc9-vr7qn" Nov 29 05:42:37 crc kubenswrapper[4594]: I1129 05:42:37.061759 4594 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bcfcc5d2-2a01-409c-9479-5c3df1f9a319-ovsdbserver-sb\") pod \"dnsmasq-dns-6dbf544cc9-vr7qn\" (UID: \"bcfcc5d2-2a01-409c-9479-5c3df1f9a319\") " pod="openstack/dnsmasq-dns-6dbf544cc9-vr7qn" Nov 29 05:42:37 crc kubenswrapper[4594]: I1129 05:42:37.061833 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bcfcc5d2-2a01-409c-9479-5c3df1f9a319-dns-svc\") pod \"dnsmasq-dns-6dbf544cc9-vr7qn\" (UID: \"bcfcc5d2-2a01-409c-9479-5c3df1f9a319\") " pod="openstack/dnsmasq-dns-6dbf544cc9-vr7qn" Nov 29 05:42:37 crc kubenswrapper[4594]: I1129 05:42:37.061941 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bcfcc5d2-2a01-409c-9479-5c3df1f9a319-config\") pod \"dnsmasq-dns-6dbf544cc9-vr7qn\" (UID: \"bcfcc5d2-2a01-409c-9479-5c3df1f9a319\") " pod="openstack/dnsmasq-dns-6dbf544cc9-vr7qn" Nov 29 05:42:37 crc kubenswrapper[4594]: I1129 05:42:37.163815 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bcfcc5d2-2a01-409c-9479-5c3df1f9a319-ovsdbserver-sb\") pod \"dnsmasq-dns-6dbf544cc9-vr7qn\" (UID: \"bcfcc5d2-2a01-409c-9479-5c3df1f9a319\") " pod="openstack/dnsmasq-dns-6dbf544cc9-vr7qn" Nov 29 05:42:37 crc kubenswrapper[4594]: I1129 05:42:37.163891 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bcfcc5d2-2a01-409c-9479-5c3df1f9a319-dns-svc\") pod \"dnsmasq-dns-6dbf544cc9-vr7qn\" (UID: \"bcfcc5d2-2a01-409c-9479-5c3df1f9a319\") " pod="openstack/dnsmasq-dns-6dbf544cc9-vr7qn" Nov 29 05:42:37 crc kubenswrapper[4594]: I1129 05:42:37.163958 4594 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bcfcc5d2-2a01-409c-9479-5c3df1f9a319-config\") pod \"dnsmasq-dns-6dbf544cc9-vr7qn\" (UID: \"bcfcc5d2-2a01-409c-9479-5c3df1f9a319\") " pod="openstack/dnsmasq-dns-6dbf544cc9-vr7qn" Nov 29 05:42:37 crc kubenswrapper[4594]: I1129 05:42:37.164005 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgw9h\" (UniqueName: \"kubernetes.io/projected/bcfcc5d2-2a01-409c-9479-5c3df1f9a319-kube-api-access-lgw9h\") pod \"dnsmasq-dns-6dbf544cc9-vr7qn\" (UID: \"bcfcc5d2-2a01-409c-9479-5c3df1f9a319\") " pod="openstack/dnsmasq-dns-6dbf544cc9-vr7qn" Nov 29 05:42:37 crc kubenswrapper[4594]: I1129 05:42:37.164039 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bcfcc5d2-2a01-409c-9479-5c3df1f9a319-ovsdbserver-nb\") pod \"dnsmasq-dns-6dbf544cc9-vr7qn\" (UID: \"bcfcc5d2-2a01-409c-9479-5c3df1f9a319\") " pod="openstack/dnsmasq-dns-6dbf544cc9-vr7qn" Nov 29 05:42:37 crc kubenswrapper[4594]: I1129 05:42:37.165041 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bcfcc5d2-2a01-409c-9479-5c3df1f9a319-ovsdbserver-nb\") pod \"dnsmasq-dns-6dbf544cc9-vr7qn\" (UID: \"bcfcc5d2-2a01-409c-9479-5c3df1f9a319\") " pod="openstack/dnsmasq-dns-6dbf544cc9-vr7qn" Nov 29 05:42:37 crc kubenswrapper[4594]: I1129 05:42:37.165698 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bcfcc5d2-2a01-409c-9479-5c3df1f9a319-ovsdbserver-sb\") pod \"dnsmasq-dns-6dbf544cc9-vr7qn\" (UID: \"bcfcc5d2-2a01-409c-9479-5c3df1f9a319\") " pod="openstack/dnsmasq-dns-6dbf544cc9-vr7qn" Nov 29 05:42:37 crc kubenswrapper[4594]: I1129 05:42:37.166535 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/bcfcc5d2-2a01-409c-9479-5c3df1f9a319-dns-svc\") pod \"dnsmasq-dns-6dbf544cc9-vr7qn\" (UID: \"bcfcc5d2-2a01-409c-9479-5c3df1f9a319\") " pod="openstack/dnsmasq-dns-6dbf544cc9-vr7qn" Nov 29 05:42:37 crc kubenswrapper[4594]: I1129 05:42:37.167864 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bcfcc5d2-2a01-409c-9479-5c3df1f9a319-config\") pod \"dnsmasq-dns-6dbf544cc9-vr7qn\" (UID: \"bcfcc5d2-2a01-409c-9479-5c3df1f9a319\") " pod="openstack/dnsmasq-dns-6dbf544cc9-vr7qn" Nov 29 05:42:37 crc kubenswrapper[4594]: I1129 05:42:37.192894 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgw9h\" (UniqueName: \"kubernetes.io/projected/bcfcc5d2-2a01-409c-9479-5c3df1f9a319-kube-api-access-lgw9h\") pod \"dnsmasq-dns-6dbf544cc9-vr7qn\" (UID: \"bcfcc5d2-2a01-409c-9479-5c3df1f9a319\") " pod="openstack/dnsmasq-dns-6dbf544cc9-vr7qn" Nov 29 05:42:37 crc kubenswrapper[4594]: I1129 05:42:37.227474 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6dbf544cc9-vr7qn" Nov 29 05:42:37 crc kubenswrapper[4594]: I1129 05:42:37.649563 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6dbf544cc9-vr7qn"] Nov 29 05:42:38 crc kubenswrapper[4594]: I1129 05:42:38.535579 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dbf544cc9-vr7qn" event={"ID":"bcfcc5d2-2a01-409c-9479-5c3df1f9a319","Type":"ContainerStarted","Data":"84625ade04012652264528f4462eaae297c7518575296dbe822cef7fbf14a1f8"} Nov 29 05:42:41 crc kubenswrapper[4594]: I1129 05:42:41.592681 4594 generic.go:334] "Generic (PLEG): container finished" podID="4e4fe89a-3363-4724-89a8-a9b61fe6f039" containerID="a16740da5b9f233ab8f5eceecfe70c451e89542cb502445ca3143bb7accd8e1e" exitCode=0 Nov 29 05:42:41 crc kubenswrapper[4594]: I1129 05:42:41.592762 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"4e4fe89a-3363-4724-89a8-a9b61fe6f039","Type":"ContainerDied","Data":"a16740da5b9f233ab8f5eceecfe70c451e89542cb502445ca3143bb7accd8e1e"} Nov 29 05:42:57 crc kubenswrapper[4594]: I1129 05:42:57.028712 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zwhtm"] Nov 29 05:42:57 crc kubenswrapper[4594]: I1129 05:42:57.031469 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zwhtm" Nov 29 05:42:57 crc kubenswrapper[4594]: I1129 05:42:57.036898 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zwhtm"] Nov 29 05:42:57 crc kubenswrapper[4594]: I1129 05:42:57.180858 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f40ab58-b977-4f3d-a122-691a05dd14cf-utilities\") pod \"community-operators-zwhtm\" (UID: \"0f40ab58-b977-4f3d-a122-691a05dd14cf\") " pod="openshift-marketplace/community-operators-zwhtm" Nov 29 05:42:57 crc kubenswrapper[4594]: I1129 05:42:57.180923 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f40ab58-b977-4f3d-a122-691a05dd14cf-catalog-content\") pod \"community-operators-zwhtm\" (UID: \"0f40ab58-b977-4f3d-a122-691a05dd14cf\") " pod="openshift-marketplace/community-operators-zwhtm" Nov 29 05:42:57 crc kubenswrapper[4594]: I1129 05:42:57.180953 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8h27\" (UniqueName: \"kubernetes.io/projected/0f40ab58-b977-4f3d-a122-691a05dd14cf-kube-api-access-b8h27\") pod \"community-operators-zwhtm\" (UID: \"0f40ab58-b977-4f3d-a122-691a05dd14cf\") " pod="openshift-marketplace/community-operators-zwhtm" Nov 29 05:42:57 crc kubenswrapper[4594]: I1129 05:42:57.283863 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f40ab58-b977-4f3d-a122-691a05dd14cf-utilities\") pod \"community-operators-zwhtm\" (UID: \"0f40ab58-b977-4f3d-a122-691a05dd14cf\") " pod="openshift-marketplace/community-operators-zwhtm" Nov 29 05:42:57 crc kubenswrapper[4594]: I1129 05:42:57.284270 4594 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f40ab58-b977-4f3d-a122-691a05dd14cf-catalog-content\") pod \"community-operators-zwhtm\" (UID: \"0f40ab58-b977-4f3d-a122-691a05dd14cf\") " pod="openshift-marketplace/community-operators-zwhtm" Nov 29 05:42:57 crc kubenswrapper[4594]: I1129 05:42:57.284309 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8h27\" (UniqueName: \"kubernetes.io/projected/0f40ab58-b977-4f3d-a122-691a05dd14cf-kube-api-access-b8h27\") pod \"community-operators-zwhtm\" (UID: \"0f40ab58-b977-4f3d-a122-691a05dd14cf\") " pod="openshift-marketplace/community-operators-zwhtm" Nov 29 05:42:57 crc kubenswrapper[4594]: I1129 05:42:57.284413 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f40ab58-b977-4f3d-a122-691a05dd14cf-utilities\") pod \"community-operators-zwhtm\" (UID: \"0f40ab58-b977-4f3d-a122-691a05dd14cf\") " pod="openshift-marketplace/community-operators-zwhtm" Nov 29 05:42:57 crc kubenswrapper[4594]: I1129 05:42:57.284684 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f40ab58-b977-4f3d-a122-691a05dd14cf-catalog-content\") pod \"community-operators-zwhtm\" (UID: \"0f40ab58-b977-4f3d-a122-691a05dd14cf\") " pod="openshift-marketplace/community-operators-zwhtm" Nov 29 05:42:57 crc kubenswrapper[4594]: I1129 05:42:57.305019 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8h27\" (UniqueName: \"kubernetes.io/projected/0f40ab58-b977-4f3d-a122-691a05dd14cf-kube-api-access-b8h27\") pod \"community-operators-zwhtm\" (UID: \"0f40ab58-b977-4f3d-a122-691a05dd14cf\") " pod="openshift-marketplace/community-operators-zwhtm" Nov 29 05:42:57 crc kubenswrapper[4594]: I1129 05:42:57.362031 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zwhtm" Nov 29 05:43:00 crc kubenswrapper[4594]: E1129 05:43:00.566469 4594 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-neutron-server:current" Nov 29 05:43:00 crc kubenswrapper[4594]: E1129 05:43:00.566946 4594 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-neutron-server:current" Nov 29 05:43:00 crc kubenswrapper[4594]: E1129 05:43:00.567191 4594 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.rdoproject.org/podified-master-centos10/openstack-neutron-server:current,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9nzr8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-545d49fd5c-7c8l9_openstack(887a7316-5262-4580-afea-bfeafc172e70): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 29 05:43:00 crc kubenswrapper[4594]: E1129 05:43:00.572352 4594 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-545d49fd5c-7c8l9" podUID="887a7316-5262-4580-afea-bfeafc172e70" Nov 29 05:43:00 crc kubenswrapper[4594]: E1129 05:43:00.577580 4594 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-rabbitmq:current" Nov 29 05:43:00 crc kubenswrapper[4594]: E1129 05:43:00.577627 4594 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-rabbitmq:current" Nov 29 05:43:00 crc kubenswrapper[4594]: E1129 05:43:00.577778 4594 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.rdoproject.org/podified-master-centos10/openstack-rabbitmq:current,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wbbsd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_openstack(1ff736d8-8719-402e-95c9-1d790c1dff5e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 29 05:43:00 crc 
kubenswrapper[4594]: E1129 05:43:00.579001 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-0" podUID="1ff736d8-8719-402e-95c9-1d790c1dff5e" Nov 29 05:43:00 crc kubenswrapper[4594]: E1129 05:43:00.823028 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-rabbitmq:current\\\"\"" pod="openstack/rabbitmq-server-0" podUID="1ff736d8-8719-402e-95c9-1d790c1dff5e" Nov 29 05:43:01 crc kubenswrapper[4594]: E1129 05:43:01.113511 4594 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-neutron-server:current" Nov 29 05:43:01 crc kubenswrapper[4594]: E1129 05:43:01.113791 4594 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-neutron-server:current" Nov 29 05:43:01 crc kubenswrapper[4594]: E1129 05:43:01.113933 4594 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.rdoproject.org/podified-master-centos10/openstack-neutron-server:current,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nmfr2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-b9b4959cc-rkqq4_openstack(80f43bd8-bee7-4801-a4c7-fd10bec175a2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 29 05:43:01 crc kubenswrapper[4594]: E1129 05:43:01.115154 4594 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-b9b4959cc-rkqq4" podUID="80f43bd8-bee7-4801-a4c7-fd10bec175a2" Nov 29 05:43:01 crc kubenswrapper[4594]: I1129 05:43:01.376022 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-dc92h"] Nov 29 05:43:01 crc kubenswrapper[4594]: I1129 05:43:01.378862 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dc92h" Nov 29 05:43:01 crc kubenswrapper[4594]: I1129 05:43:01.389090 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dc92h"] Nov 29 05:43:01 crc kubenswrapper[4594]: I1129 05:43:01.579228 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gdld\" (UniqueName: \"kubernetes.io/projected/d2f69c3e-673d-4b8d-bb61-f1e02758f43b-kube-api-access-9gdld\") pod \"redhat-operators-dc92h\" (UID: \"d2f69c3e-673d-4b8d-bb61-f1e02758f43b\") " pod="openshift-marketplace/redhat-operators-dc92h" Nov 29 05:43:01 crc kubenswrapper[4594]: I1129 05:43:01.579462 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2f69c3e-673d-4b8d-bb61-f1e02758f43b-catalog-content\") pod \"redhat-operators-dc92h\" (UID: \"d2f69c3e-673d-4b8d-bb61-f1e02758f43b\") " pod="openshift-marketplace/redhat-operators-dc92h" Nov 29 05:43:01 crc kubenswrapper[4594]: I1129 05:43:01.579559 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2f69c3e-673d-4b8d-bb61-f1e02758f43b-utilities\") pod \"redhat-operators-dc92h\" (UID: \"d2f69c3e-673d-4b8d-bb61-f1e02758f43b\") " pod="openshift-marketplace/redhat-operators-dc92h" 
Nov 29 05:43:01 crc kubenswrapper[4594]: I1129 05:43:01.683151 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gdld\" (UniqueName: \"kubernetes.io/projected/d2f69c3e-673d-4b8d-bb61-f1e02758f43b-kube-api-access-9gdld\") pod \"redhat-operators-dc92h\" (UID: \"d2f69c3e-673d-4b8d-bb61-f1e02758f43b\") " pod="openshift-marketplace/redhat-operators-dc92h" Nov 29 05:43:01 crc kubenswrapper[4594]: I1129 05:43:01.683271 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2f69c3e-673d-4b8d-bb61-f1e02758f43b-catalog-content\") pod \"redhat-operators-dc92h\" (UID: \"d2f69c3e-673d-4b8d-bb61-f1e02758f43b\") " pod="openshift-marketplace/redhat-operators-dc92h" Nov 29 05:43:01 crc kubenswrapper[4594]: I1129 05:43:01.683309 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2f69c3e-673d-4b8d-bb61-f1e02758f43b-utilities\") pod \"redhat-operators-dc92h\" (UID: \"d2f69c3e-673d-4b8d-bb61-f1e02758f43b\") " pod="openshift-marketplace/redhat-operators-dc92h" Nov 29 05:43:01 crc kubenswrapper[4594]: I1129 05:43:01.683956 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2f69c3e-673d-4b8d-bb61-f1e02758f43b-utilities\") pod \"redhat-operators-dc92h\" (UID: \"d2f69c3e-673d-4b8d-bb61-f1e02758f43b\") " pod="openshift-marketplace/redhat-operators-dc92h" Nov 29 05:43:01 crc kubenswrapper[4594]: I1129 05:43:01.684050 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2f69c3e-673d-4b8d-bb61-f1e02758f43b-catalog-content\") pod \"redhat-operators-dc92h\" (UID: \"d2f69c3e-673d-4b8d-bb61-f1e02758f43b\") " pod="openshift-marketplace/redhat-operators-dc92h" Nov 29 05:43:01 crc kubenswrapper[4594]: I1129 05:43:01.712332 4594 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gdld\" (UniqueName: \"kubernetes.io/projected/d2f69c3e-673d-4b8d-bb61-f1e02758f43b-kube-api-access-9gdld\") pod \"redhat-operators-dc92h\" (UID: \"d2f69c3e-673d-4b8d-bb61-f1e02758f43b\") " pod="openshift-marketplace/redhat-operators-dc92h" Nov 29 05:43:02 crc kubenswrapper[4594]: I1129 05:43:02.007151 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dc92h" Nov 29 05:43:03 crc kubenswrapper[4594]: E1129 05:43:03.640600 4594 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-mariadb:current" Nov 29 05:43:03 crc kubenswrapper[4594]: E1129 05:43:03.640874 4594 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-mariadb:current" Nov 29 05:43:03 crc kubenswrapper[4594]: E1129 05:43:03.641084 4594 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.rdoproject.org/podified-master-centos10/openstack-mariadb:current,Command:[bash 
/var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qmbz2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
openstack-cell1-galera-0_openstack(1ab0ecb8-fc35-4934-b62a-6912d56e9001): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 29 05:43:03 crc kubenswrapper[4594]: E1129 05:43:03.642419 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-cell1-galera-0" podUID="1ab0ecb8-fc35-4934-b62a-6912d56e9001" Nov 29 05:43:03 crc kubenswrapper[4594]: E1129 05:43:03.657088 4594 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-neutron-server:current" Nov 29 05:43:03 crc kubenswrapper[4594]: E1129 05:43:03.657136 4594 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-neutron-server:current" Nov 29 05:43:03 crc kubenswrapper[4594]: E1129 05:43:03.657274 4594 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.rdoproject.org/podified-master-centos10/openstack-neutron-server:current,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lrvn9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-86b8f4ff9-lspz6_openstack(c2662fc9-873c-4a27-b335-a1c9ad1e8be1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 29 05:43:03 crc kubenswrapper[4594]: E1129 05:43:03.658567 4594 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-86b8f4ff9-lspz6" podUID="c2662fc9-873c-4a27-b335-a1c9ad1e8be1" Nov 29 05:43:03 crc kubenswrapper[4594]: E1129 05:43:03.672990 4594 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-neutron-server:current" Nov 29 05:43:03 crc kubenswrapper[4594]: E1129 05:43:03.673021 4594 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-neutron-server:current" Nov 29 05:43:03 crc kubenswrapper[4594]: E1129 05:43:03.673098 4594 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.rdoproject.org/podified-master-centos10/openstack-neutron-server:current,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5c7h56dh5cfh8bh54fhbbhf4h5b9hdch67fhd7h55fh55fh6ch9h548h54ch665h647h6h8fhd6h5dfh5cdh58bh577h66fh695h5fbh55h77h5fcq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9nzrw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5449989c59-lxq8g_openstack(a27c925c-13c4-44c2-9ca6-c4ca0c7e2e6c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 29 05:43:03 crc kubenswrapper[4594]: E1129 05:43:03.675641 4594 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-5449989c59-lxq8g" podUID="a27c925c-13c4-44c2-9ca6-c4ca0c7e2e6c" Nov 29 05:43:03 crc kubenswrapper[4594]: E1129 05:43:03.853021 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-mariadb:current\\\"\"" pod="openstack/openstack-cell1-galera-0" podUID="1ab0ecb8-fc35-4934-b62a-6912d56e9001" Nov 29 05:43:04 crc kubenswrapper[4594]: I1129 05:43:04.169050 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2k6rk"] Nov 29 05:43:04 crc kubenswrapper[4594]: I1129 05:43:04.171029 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2k6rk" Nov 29 05:43:04 crc kubenswrapper[4594]: I1129 05:43:04.176676 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2k6rk"] Nov 29 05:43:04 crc kubenswrapper[4594]: I1129 05:43:04.343645 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rg5nf\" (UniqueName: \"kubernetes.io/projected/90eb3e92-56cd-49d2-934d-9d2e62eb6537-kube-api-access-rg5nf\") pod \"certified-operators-2k6rk\" (UID: \"90eb3e92-56cd-49d2-934d-9d2e62eb6537\") " pod="openshift-marketplace/certified-operators-2k6rk" Nov 29 05:43:04 crc kubenswrapper[4594]: I1129 05:43:04.343698 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90eb3e92-56cd-49d2-934d-9d2e62eb6537-utilities\") pod \"certified-operators-2k6rk\" (UID: \"90eb3e92-56cd-49d2-934d-9d2e62eb6537\") " 
pod="openshift-marketplace/certified-operators-2k6rk" Nov 29 05:43:04 crc kubenswrapper[4594]: I1129 05:43:04.343852 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90eb3e92-56cd-49d2-934d-9d2e62eb6537-catalog-content\") pod \"certified-operators-2k6rk\" (UID: \"90eb3e92-56cd-49d2-934d-9d2e62eb6537\") " pod="openshift-marketplace/certified-operators-2k6rk" Nov 29 05:43:04 crc kubenswrapper[4594]: I1129 05:43:04.447097 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rg5nf\" (UniqueName: \"kubernetes.io/projected/90eb3e92-56cd-49d2-934d-9d2e62eb6537-kube-api-access-rg5nf\") pod \"certified-operators-2k6rk\" (UID: \"90eb3e92-56cd-49d2-934d-9d2e62eb6537\") " pod="openshift-marketplace/certified-operators-2k6rk" Nov 29 05:43:04 crc kubenswrapper[4594]: I1129 05:43:04.447142 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90eb3e92-56cd-49d2-934d-9d2e62eb6537-utilities\") pod \"certified-operators-2k6rk\" (UID: \"90eb3e92-56cd-49d2-934d-9d2e62eb6537\") " pod="openshift-marketplace/certified-operators-2k6rk" Nov 29 05:43:04 crc kubenswrapper[4594]: I1129 05:43:04.447359 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90eb3e92-56cd-49d2-934d-9d2e62eb6537-catalog-content\") pod \"certified-operators-2k6rk\" (UID: \"90eb3e92-56cd-49d2-934d-9d2e62eb6537\") " pod="openshift-marketplace/certified-operators-2k6rk" Nov 29 05:43:04 crc kubenswrapper[4594]: I1129 05:43:04.447660 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90eb3e92-56cd-49d2-934d-9d2e62eb6537-utilities\") pod \"certified-operators-2k6rk\" (UID: \"90eb3e92-56cd-49d2-934d-9d2e62eb6537\") " 
pod="openshift-marketplace/certified-operators-2k6rk" Nov 29 05:43:04 crc kubenswrapper[4594]: I1129 05:43:04.447793 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90eb3e92-56cd-49d2-934d-9d2e62eb6537-catalog-content\") pod \"certified-operators-2k6rk\" (UID: \"90eb3e92-56cd-49d2-934d-9d2e62eb6537\") " pod="openshift-marketplace/certified-operators-2k6rk" Nov 29 05:43:04 crc kubenswrapper[4594]: I1129 05:43:04.475632 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rg5nf\" (UniqueName: \"kubernetes.io/projected/90eb3e92-56cd-49d2-934d-9d2e62eb6537-kube-api-access-rg5nf\") pod \"certified-operators-2k6rk\" (UID: \"90eb3e92-56cd-49d2-934d-9d2e62eb6537\") " pod="openshift-marketplace/certified-operators-2k6rk" Nov 29 05:43:04 crc kubenswrapper[4594]: I1129 05:43:04.488178 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2k6rk" Nov 29 05:43:05 crc kubenswrapper[4594]: E1129 05:43:05.752840 4594 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-neutron-server:current" Nov 29 05:43:05 crc kubenswrapper[4594]: E1129 05:43:05.753101 4594 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-neutron-server:current" Nov 29 05:43:05 crc kubenswrapper[4594]: E1129 05:43:05.753236 4594 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.rdoproject.org/podified-master-centos10/openstack-neutron-server:current,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces 
--listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7t5dc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-8468885bfc-h4dvz_openstack(fb896c84-cf45-4033-b81c-456b230c2334): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 29 05:43:05 crc kubenswrapper[4594]: E1129 05:43:05.754440 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-8468885bfc-h4dvz" podUID="fb896c84-cf45-4033-b81c-456b230c2334" Nov 29 05:43:05 crc kubenswrapper[4594]: I1129 05:43:05.790364 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-545d49fd5c-7c8l9" Nov 29 05:43:05 crc kubenswrapper[4594]: I1129 05:43:05.796564 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b9b4959cc-rkqq4" Nov 29 05:43:05 crc kubenswrapper[4594]: I1129 05:43:05.865109 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b9b4959cc-rkqq4" event={"ID":"80f43bd8-bee7-4801-a4c7-fd10bec175a2","Type":"ContainerDied","Data":"05cab3c3ed5a193f065295ab810be9812266fc55ad00719a6c07852d91ca6e9f"} Nov 29 05:43:05 crc kubenswrapper[4594]: I1129 05:43:05.865177 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b9b4959cc-rkqq4" Nov 29 05:43:05 crc kubenswrapper[4594]: I1129 05:43:05.867389 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-545d49fd5c-7c8l9" event={"ID":"887a7316-5262-4580-afea-bfeafc172e70","Type":"ContainerDied","Data":"a18978850745007e59c6422483d800ecd80381d890bc4e4c1a3e7e5328d086a3"} Nov 29 05:43:05 crc kubenswrapper[4594]: I1129 05:43:05.867955 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-545d49fd5c-7c8l9" Nov 29 05:43:05 crc kubenswrapper[4594]: I1129 05:43:05.972277 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/887a7316-5262-4580-afea-bfeafc172e70-config\") pod \"887a7316-5262-4580-afea-bfeafc172e70\" (UID: \"887a7316-5262-4580-afea-bfeafc172e70\") " Nov 29 05:43:05 crc kubenswrapper[4594]: I1129 05:43:05.972359 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9nzr8\" (UniqueName: \"kubernetes.io/projected/887a7316-5262-4580-afea-bfeafc172e70-kube-api-access-9nzr8\") pod \"887a7316-5262-4580-afea-bfeafc172e70\" (UID: \"887a7316-5262-4580-afea-bfeafc172e70\") " Nov 29 05:43:05 crc kubenswrapper[4594]: I1129 05:43:05.972434 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmfr2\" (UniqueName: \"kubernetes.io/projected/80f43bd8-bee7-4801-a4c7-fd10bec175a2-kube-api-access-nmfr2\") pod \"80f43bd8-bee7-4801-a4c7-fd10bec175a2\" (UID: \"80f43bd8-bee7-4801-a4c7-fd10bec175a2\") " Nov 29 05:43:05 crc kubenswrapper[4594]: I1129 05:43:05.972549 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80f43bd8-bee7-4801-a4c7-fd10bec175a2-config\") pod \"80f43bd8-bee7-4801-a4c7-fd10bec175a2\" (UID: \"80f43bd8-bee7-4801-a4c7-fd10bec175a2\") " Nov 29 05:43:05 crc kubenswrapper[4594]: I1129 05:43:05.972583 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/887a7316-5262-4580-afea-bfeafc172e70-dns-svc\") pod \"887a7316-5262-4580-afea-bfeafc172e70\" (UID: \"887a7316-5262-4580-afea-bfeafc172e70\") " Nov 29 05:43:05 crc kubenswrapper[4594]: I1129 05:43:05.972667 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/80f43bd8-bee7-4801-a4c7-fd10bec175a2-dns-svc\") pod \"80f43bd8-bee7-4801-a4c7-fd10bec175a2\" (UID: \"80f43bd8-bee7-4801-a4c7-fd10bec175a2\") " Nov 29 05:43:05 crc kubenswrapper[4594]: I1129 05:43:05.973612 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/887a7316-5262-4580-afea-bfeafc172e70-config" (OuterVolumeSpecName: "config") pod "887a7316-5262-4580-afea-bfeafc172e70" (UID: "887a7316-5262-4580-afea-bfeafc172e70"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:43:05 crc kubenswrapper[4594]: I1129 05:43:05.973671 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80f43bd8-bee7-4801-a4c7-fd10bec175a2-config" (OuterVolumeSpecName: "config") pod "80f43bd8-bee7-4801-a4c7-fd10bec175a2" (UID: "80f43bd8-bee7-4801-a4c7-fd10bec175a2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:43:05 crc kubenswrapper[4594]: I1129 05:43:05.973704 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80f43bd8-bee7-4801-a4c7-fd10bec175a2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "80f43bd8-bee7-4801-a4c7-fd10bec175a2" (UID: "80f43bd8-bee7-4801-a4c7-fd10bec175a2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:43:05 crc kubenswrapper[4594]: I1129 05:43:05.974130 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/887a7316-5262-4580-afea-bfeafc172e70-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "887a7316-5262-4580-afea-bfeafc172e70" (UID: "887a7316-5262-4580-afea-bfeafc172e70"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:43:05 crc kubenswrapper[4594]: I1129 05:43:05.974144 4594 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80f43bd8-bee7-4801-a4c7-fd10bec175a2-config\") on node \"crc\" DevicePath \"\"" Nov 29 05:43:05 crc kubenswrapper[4594]: I1129 05:43:05.974163 4594 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/80f43bd8-bee7-4801-a4c7-fd10bec175a2-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 29 05:43:05 crc kubenswrapper[4594]: I1129 05:43:05.974171 4594 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/887a7316-5262-4580-afea-bfeafc172e70-config\") on node \"crc\" DevicePath \"\"" Nov 29 05:43:05 crc kubenswrapper[4594]: I1129 05:43:05.978604 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80f43bd8-bee7-4801-a4c7-fd10bec175a2-kube-api-access-nmfr2" (OuterVolumeSpecName: "kube-api-access-nmfr2") pod "80f43bd8-bee7-4801-a4c7-fd10bec175a2" (UID: "80f43bd8-bee7-4801-a4c7-fd10bec175a2"). InnerVolumeSpecName "kube-api-access-nmfr2". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:43:05 crc kubenswrapper[4594]: I1129 05:43:05.980374 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/887a7316-5262-4580-afea-bfeafc172e70-kube-api-access-9nzr8" (OuterVolumeSpecName: "kube-api-access-9nzr8") pod "887a7316-5262-4580-afea-bfeafc172e70" (UID: "887a7316-5262-4580-afea-bfeafc172e70"). InnerVolumeSpecName "kube-api-access-9nzr8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:43:05 crc kubenswrapper[4594]: E1129 05:43:05.992711 4594 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current" Nov 29 05:43:05 crc kubenswrapper[4594]: E1129 05:43:05.992753 4594 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current" Nov 29 05:43:05 crc kubenswrapper[4594]: E1129 05:43:05.992893 4594 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ovn-controller,Image:quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current,Command:[ovn-controller --pidfile unix:/run/openvswitch/db.sock --certificate=/etc/pki/tls/certs/ovndb.crt --private-key=/etc/pki/tls/private/ovndb.key 
--ca-cert=/etc/pki/tls/certs/ovndbca.crt],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n558h6dh8fh5b9h56fh598h579hf9h56h57chdch584h589h79h67h55dh598hbdh5fch58fh77h665hd6h8h56h57bh5ch674h587h5c6h5ddhd6q,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:var-run,ReadOnly:false,MountPath:/var/run/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-run-ovn,ReadOnly:false,MountPath:/var/run/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-log-ovn,ReadOnly:false,MountPath:/var/log/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-controller-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndb.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-controller-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/private/ovndb.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-controller-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndbca.crt,SubPath:ca.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-svm6h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/ovn_controller_liveness.sh],},HTTPGet:nil,TCPSocket:nil,GRPC:nil
,},InitialDelaySeconds:30,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/ovn_controller_readiness.sh],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:30,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:&Lifecycle{PostStart:nil,PreStop:&LifecycleHandler{Exec:&ExecAction{Command:[/usr/share/ovn/scripts/ovn-ctl stop_controller],},HTTPGet:nil,TCPSocket:nil,Sleep:nil,},},TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[NET_ADMIN SYS_ADMIN SYS_NICE],Drop:[],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-controller-b9h9k_openstack(a7baec31-60ce-4be4-8901-a8cbe7bf7ea9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 29 05:43:05 crc kubenswrapper[4594]: E1129 05:43:05.994912 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovn-controller\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ovn-controller-b9h9k" podUID="a7baec31-60ce-4be4-8901-a8cbe7bf7ea9" Nov 29 05:43:06 crc kubenswrapper[4594]: I1129 05:43:06.039632 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5449989c59-lxq8g" Nov 29 05:43:06 crc kubenswrapper[4594]: I1129 05:43:06.046682 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86b8f4ff9-lspz6" Nov 29 05:43:06 crc kubenswrapper[4594]: I1129 05:43:06.085245 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9nzr8\" (UniqueName: \"kubernetes.io/projected/887a7316-5262-4580-afea-bfeafc172e70-kube-api-access-9nzr8\") on node \"crc\" DevicePath \"\"" Nov 29 05:43:06 crc kubenswrapper[4594]: I1129 05:43:06.085551 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nmfr2\" (UniqueName: \"kubernetes.io/projected/80f43bd8-bee7-4801-a4c7-fd10bec175a2-kube-api-access-nmfr2\") on node \"crc\" DevicePath \"\"" Nov 29 05:43:06 crc kubenswrapper[4594]: I1129 05:43:06.085573 4594 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/887a7316-5262-4580-afea-bfeafc172e70-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 29 05:43:06 crc kubenswrapper[4594]: I1129 05:43:06.185996 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a27c925c-13c4-44c2-9ca6-c4ca0c7e2e6c-config\") pod \"a27c925c-13c4-44c2-9ca6-c4ca0c7e2e6c\" (UID: \"a27c925c-13c4-44c2-9ca6-c4ca0c7e2e6c\") " Nov 29 05:43:06 crc kubenswrapper[4594]: I1129 05:43:06.186078 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9nzrw\" (UniqueName: \"kubernetes.io/projected/a27c925c-13c4-44c2-9ca6-c4ca0c7e2e6c-kube-api-access-9nzrw\") pod \"a27c925c-13c4-44c2-9ca6-c4ca0c7e2e6c\" (UID: \"a27c925c-13c4-44c2-9ca6-c4ca0c7e2e6c\") " Nov 29 05:43:06 crc kubenswrapper[4594]: I1129 05:43:06.186155 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrvn9\" (UniqueName: 
\"kubernetes.io/projected/c2662fc9-873c-4a27-b335-a1c9ad1e8be1-kube-api-access-lrvn9\") pod \"c2662fc9-873c-4a27-b335-a1c9ad1e8be1\" (UID: \"c2662fc9-873c-4a27-b335-a1c9ad1e8be1\") " Nov 29 05:43:06 crc kubenswrapper[4594]: I1129 05:43:06.186239 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a27c925c-13c4-44c2-9ca6-c4ca0c7e2e6c-dns-svc\") pod \"a27c925c-13c4-44c2-9ca6-c4ca0c7e2e6c\" (UID: \"a27c925c-13c4-44c2-9ca6-c4ca0c7e2e6c\") " Nov 29 05:43:06 crc kubenswrapper[4594]: I1129 05:43:06.186283 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2662fc9-873c-4a27-b335-a1c9ad1e8be1-config\") pod \"c2662fc9-873c-4a27-b335-a1c9ad1e8be1\" (UID: \"c2662fc9-873c-4a27-b335-a1c9ad1e8be1\") " Nov 29 05:43:06 crc kubenswrapper[4594]: I1129 05:43:06.186308 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c2662fc9-873c-4a27-b335-a1c9ad1e8be1-dns-svc\") pod \"c2662fc9-873c-4a27-b335-a1c9ad1e8be1\" (UID: \"c2662fc9-873c-4a27-b335-a1c9ad1e8be1\") " Nov 29 05:43:06 crc kubenswrapper[4594]: I1129 05:43:06.186562 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a27c925c-13c4-44c2-9ca6-c4ca0c7e2e6c-config" (OuterVolumeSpecName: "config") pod "a27c925c-13c4-44c2-9ca6-c4ca0c7e2e6c" (UID: "a27c925c-13c4-44c2-9ca6-c4ca0c7e2e6c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:43:06 crc kubenswrapper[4594]: I1129 05:43:06.186817 4594 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a27c925c-13c4-44c2-9ca6-c4ca0c7e2e6c-config\") on node \"crc\" DevicePath \"\"" Nov 29 05:43:06 crc kubenswrapper[4594]: I1129 05:43:06.187051 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a27c925c-13c4-44c2-9ca6-c4ca0c7e2e6c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a27c925c-13c4-44c2-9ca6-c4ca0c7e2e6c" (UID: "a27c925c-13c4-44c2-9ca6-c4ca0c7e2e6c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:43:06 crc kubenswrapper[4594]: I1129 05:43:06.187551 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2662fc9-873c-4a27-b335-a1c9ad1e8be1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c2662fc9-873c-4a27-b335-a1c9ad1e8be1" (UID: "c2662fc9-873c-4a27-b335-a1c9ad1e8be1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:43:06 crc kubenswrapper[4594]: I1129 05:43:06.187634 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2662fc9-873c-4a27-b335-a1c9ad1e8be1-config" (OuterVolumeSpecName: "config") pod "c2662fc9-873c-4a27-b335-a1c9ad1e8be1" (UID: "c2662fc9-873c-4a27-b335-a1c9ad1e8be1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:43:06 crc kubenswrapper[4594]: I1129 05:43:06.190499 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2662fc9-873c-4a27-b335-a1c9ad1e8be1-kube-api-access-lrvn9" (OuterVolumeSpecName: "kube-api-access-lrvn9") pod "c2662fc9-873c-4a27-b335-a1c9ad1e8be1" (UID: "c2662fc9-873c-4a27-b335-a1c9ad1e8be1"). InnerVolumeSpecName "kube-api-access-lrvn9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:43:06 crc kubenswrapper[4594]: I1129 05:43:06.190567 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a27c925c-13c4-44c2-9ca6-c4ca0c7e2e6c-kube-api-access-9nzrw" (OuterVolumeSpecName: "kube-api-access-9nzrw") pod "a27c925c-13c4-44c2-9ca6-c4ca0c7e2e6c" (UID: "a27c925c-13c4-44c2-9ca6-c4ca0c7e2e6c"). InnerVolumeSpecName "kube-api-access-9nzrw". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:43:06 crc kubenswrapper[4594]: I1129 05:43:06.286904 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-545d49fd5c-7c8l9"] Nov 29 05:43:06 crc kubenswrapper[4594]: I1129 05:43:06.288355 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9nzrw\" (UniqueName: \"kubernetes.io/projected/a27c925c-13c4-44c2-9ca6-c4ca0c7e2e6c-kube-api-access-9nzrw\") on node \"crc\" DevicePath \"\"" Nov 29 05:43:06 crc kubenswrapper[4594]: I1129 05:43:06.288372 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lrvn9\" (UniqueName: \"kubernetes.io/projected/c2662fc9-873c-4a27-b335-a1c9ad1e8be1-kube-api-access-lrvn9\") on node \"crc\" DevicePath \"\"" Nov 29 05:43:06 crc kubenswrapper[4594]: I1129 05:43:06.288382 4594 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a27c925c-13c4-44c2-9ca6-c4ca0c7e2e6c-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 29 05:43:06 crc kubenswrapper[4594]: I1129 05:43:06.288391 4594 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2662fc9-873c-4a27-b335-a1c9ad1e8be1-config\") on node \"crc\" DevicePath \"\"" Nov 29 05:43:06 crc kubenswrapper[4594]: I1129 05:43:06.288400 4594 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c2662fc9-873c-4a27-b335-a1c9ad1e8be1-dns-svc\") on node \"crc\" 
DevicePath \"\"" Nov 29 05:43:06 crc kubenswrapper[4594]: I1129 05:43:06.296617 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-545d49fd5c-7c8l9"] Nov 29 05:43:06 crc kubenswrapper[4594]: I1129 05:43:06.320293 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b9b4959cc-rkqq4"] Nov 29 05:43:06 crc kubenswrapper[4594]: I1129 05:43:06.330872 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b9b4959cc-rkqq4"] Nov 29 05:43:06 crc kubenswrapper[4594]: E1129 05:43:06.498300 4594 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-ovn-nb-db-server:current" Nov 29 05:43:06 crc kubenswrapper[4594]: E1129 05:43:06.498368 4594 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-ovn-nb-db-server:current" Nov 29 05:43:06 crc kubenswrapper[4594]: E1129 05:43:06.498525 4594 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:ovsdbserver-nb,Image:quay.rdoproject.org/podified-master-centos10/openstack-ovn-nb-db-server:current,Command:[/usr/bin/dumb-init],Args:[/usr/local/bin/container-scripts/setup.sh],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n649h79h696h578h684h5ffh546h54h5bch688h685h6h54ch57h5cbhfch586hdfh5b9h576hf4h5fch5b5h65dh6bh5d5h6dh595h7bhc9hf7hd7q,ValueFrom:nil,},EnvVar{Name:OVN_LOGDIR,Value:/tmp,ValueFrom:nil,},EnvVar{Name:OVN_RUNDIR,Value:/tmp,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovndbcluster-nb-etc-ovn,ReadOnly:false,MountPath:/etc/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdb-rundir,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndb.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/private/ovndb.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndbca.crt,SubPath:ca.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6hs7h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecActi
on{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:&Lifecycle{PostStart:nil,PreStop:&LifecycleHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/cleanup.sh],},HTTPGet:nil,TCPSocket:nil,Sleep:nil,},},TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:20,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovsdbserver-nb-0_openstack(31fad155-3970-4a5d-a357-e96fa27bbb54): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 29 05:43:06 crc kubenswrapper[4594]: I1129 05:43:06.509936 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8468885bfc-h4dvz" Nov 29 05:43:06 crc kubenswrapper[4594]: I1129 05:43:06.695282 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb896c84-cf45-4033-b81c-456b230c2334-config\") pod \"fb896c84-cf45-4033-b81c-456b230c2334\" (UID: \"fb896c84-cf45-4033-b81c-456b230c2334\") " Nov 29 05:43:06 crc kubenswrapper[4594]: I1129 05:43:06.695646 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb896c84-cf45-4033-b81c-456b230c2334-config" (OuterVolumeSpecName: "config") pod "fb896c84-cf45-4033-b81c-456b230c2334" (UID: "fb896c84-cf45-4033-b81c-456b230c2334"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:43:06 crc kubenswrapper[4594]: I1129 05:43:06.695736 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7t5dc\" (UniqueName: \"kubernetes.io/projected/fb896c84-cf45-4033-b81c-456b230c2334-kube-api-access-7t5dc\") pod \"fb896c84-cf45-4033-b81c-456b230c2334\" (UID: \"fb896c84-cf45-4033-b81c-456b230c2334\") " Nov 29 05:43:06 crc kubenswrapper[4594]: I1129 05:43:06.696896 4594 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb896c84-cf45-4033-b81c-456b230c2334-config\") on node \"crc\" DevicePath \"\"" Nov 29 05:43:06 crc kubenswrapper[4594]: I1129 05:43:06.701740 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb896c84-cf45-4033-b81c-456b230c2334-kube-api-access-7t5dc" (OuterVolumeSpecName: "kube-api-access-7t5dc") pod "fb896c84-cf45-4033-b81c-456b230c2334" (UID: "fb896c84-cf45-4033-b81c-456b230c2334"). InnerVolumeSpecName "kube-api-access-7t5dc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:43:06 crc kubenswrapper[4594]: I1129 05:43:06.799325 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7t5dc\" (UniqueName: \"kubernetes.io/projected/fb896c84-cf45-4033-b81c-456b230c2334-kube-api-access-7t5dc\") on node \"crc\" DevicePath \"\"" Nov 29 05:43:06 crc kubenswrapper[4594]: I1129 05:43:06.875426 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5449989c59-lxq8g" Nov 29 05:43:06 crc kubenswrapper[4594]: I1129 05:43:06.875446 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5449989c59-lxq8g" event={"ID":"a27c925c-13c4-44c2-9ca6-c4ca0c7e2e6c","Type":"ContainerDied","Data":"5f2bc09c6221ef405dadcf20802ee288451e2059ec2bbfe85ad0a4be49591298"} Nov 29 05:43:06 crc kubenswrapper[4594]: I1129 05:43:06.876847 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8468885bfc-h4dvz" event={"ID":"fb896c84-cf45-4033-b81c-456b230c2334","Type":"ContainerDied","Data":"46b06bf2829752512a4ec61f26d52426a52573082941e6da594c97ee276d9f88"} Nov 29 05:43:06 crc kubenswrapper[4594]: I1129 05:43:06.876870 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8468885bfc-h4dvz" Nov 29 05:43:06 crc kubenswrapper[4594]: I1129 05:43:06.878461 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"9952b1d2-2cce-45f4-b370-5d0107f80260","Type":"ContainerStarted","Data":"465c3a314422661e84e270bc5e2030f4a16743cef535978493e3a11ec6cea284"} Nov 29 05:43:06 crc kubenswrapper[4594]: I1129 05:43:06.879265 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Nov 29 05:43:06 crc kubenswrapper[4594]: I1129 05:43:06.883737 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"4e4fe89a-3363-4724-89a8-a9b61fe6f039","Type":"ContainerStarted","Data":"6d1c4ce5f27e08c3ba707abeac8cf0a5481b8d43e0c65692bc7cc6019efdb31a"} Nov 29 05:43:06 crc kubenswrapper[4594]: I1129 05:43:06.885877 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86b8f4ff9-lspz6" Nov 29 05:43:06 crc kubenswrapper[4594]: I1129 05:43:06.886000 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86b8f4ff9-lspz6" event={"ID":"c2662fc9-873c-4a27-b335-a1c9ad1e8be1","Type":"ContainerDied","Data":"ef0439ce2ac95e1240ded9eca93f9bfe0fe0808e42018a12d645f8fc199948b3"} Nov 29 05:43:06 crc kubenswrapper[4594]: E1129 05:43:06.887720 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovn-controller\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current\\\"\"" pod="openstack/ovn-controller-b9h9k" podUID="a7baec31-60ce-4be4-8901-a8cbe7bf7ea9" Nov 29 05:43:06 crc kubenswrapper[4594]: I1129 05:43:06.897205 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=1.528354047 podStartE2EDuration="44.897193079s" 
podCreationTimestamp="2025-11-29 05:42:22 +0000 UTC" firstStartedPulling="2025-11-29 05:42:22.861643228 +0000 UTC m=+867.102152449" lastFinishedPulling="2025-11-29 05:43:06.230482251 +0000 UTC m=+910.470991481" observedRunningTime="2025-11-29 05:43:06.893666682 +0000 UTC m=+911.134175902" watchObservedRunningTime="2025-11-29 05:43:06.897193079 +0000 UTC m=+911.137702299" Nov 29 05:43:06 crc kubenswrapper[4594]: I1129 05:43:06.936814 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zwhtm"] Nov 29 05:43:07 crc kubenswrapper[4594]: I1129 05:43:07.010792 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2k6rk"] Nov 29 05:43:07 crc kubenswrapper[4594]: E1129 05:43:07.033899 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdbserver-nb\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ovsdbserver-nb-0" podUID="31fad155-3970-4a5d-a357-e96fa27bbb54" Nov 29 05:43:07 crc kubenswrapper[4594]: I1129 05:43:07.034144 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dc92h"] Nov 29 05:43:07 crc kubenswrapper[4594]: W1129 05:43:07.055735 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2f69c3e_673d_4b8d_bb61_f1e02758f43b.slice/crio-a81c589154a85cf6de7a6ff183e4d64692f6f24323a075e5e4c900494f228eaa WatchSource:0}: Error finding container a81c589154a85cf6de7a6ff183e4d64692f6f24323a075e5e4c900494f228eaa: Status 404 returned error can't find the container with id a81c589154a85cf6de7a6ff183e4d64692f6f24323a075e5e4c900494f228eaa Nov 29 05:43:07 crc kubenswrapper[4594]: I1129 05:43:07.176979 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8468885bfc-h4dvz"] Nov 29 05:43:07 crc kubenswrapper[4594]: I1129 
05:43:07.206293 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8468885bfc-h4dvz"] Nov 29 05:43:07 crc kubenswrapper[4594]: I1129 05:43:07.225196 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86b8f4ff9-lspz6"] Nov 29 05:43:07 crc kubenswrapper[4594]: I1129 05:43:07.229129 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86b8f4ff9-lspz6"] Nov 29 05:43:07 crc kubenswrapper[4594]: I1129 05:43:07.238860 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5449989c59-lxq8g"] Nov 29 05:43:07 crc kubenswrapper[4594]: I1129 05:43:07.242215 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5449989c59-lxq8g"] Nov 29 05:43:07 crc kubenswrapper[4594]: I1129 05:43:07.898720 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"31fad155-3970-4a5d-a357-e96fa27bbb54","Type":"ContainerStarted","Data":"e392adf05abf7d4cc4330d0e3d9ec7457b34051eff34dafa958ea0a4478c2a39"} Nov 29 05:43:07 crc kubenswrapper[4594]: E1129 05:43:07.900391 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdbserver-nb\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-ovn-nb-db-server:current\\\"\"" pod="openstack/ovsdbserver-nb-0" podUID="31fad155-3970-4a5d-a357-e96fa27bbb54" Nov 29 05:43:07 crc kubenswrapper[4594]: I1129 05:43:07.901428 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"4f5075e0-210c-455f-8203-3dde7c7be5eb","Type":"ContainerStarted","Data":"e02b6b95203fd12a4b89ed41e5ab364c5d354bc8844747e8ef5d8b3988f47235"} Nov 29 05:43:07 crc kubenswrapper[4594]: I1129 05:43:07.901471 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" 
event={"ID":"4f5075e0-210c-455f-8203-3dde7c7be5eb","Type":"ContainerStarted","Data":"1423ecff9c88794ff61099956bd743dcbdbd5a528d622faae74414fa25bff8fe"} Nov 29 05:43:07 crc kubenswrapper[4594]: I1129 05:43:07.903187 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"caf7dae9-7fe9-4cf5-a5d0-39122397592e","Type":"ContainerStarted","Data":"8cd5b22666fef572ab99857e203811d0aad1284ca9f209c98b01293b98765d05"} Nov 29 05:43:07 crc kubenswrapper[4594]: I1129 05:43:07.904916 4594 generic.go:334] "Generic (PLEG): container finished" podID="90eb3e92-56cd-49d2-934d-9d2e62eb6537" containerID="4ef92d709a169a29feee6d3e6b517098f522bead00649aea962df8ff8ebfe354" exitCode=0 Nov 29 05:43:07 crc kubenswrapper[4594]: I1129 05:43:07.904999 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2k6rk" event={"ID":"90eb3e92-56cd-49d2-934d-9d2e62eb6537","Type":"ContainerDied","Data":"4ef92d709a169a29feee6d3e6b517098f522bead00649aea962df8ff8ebfe354"} Nov 29 05:43:07 crc kubenswrapper[4594]: I1129 05:43:07.905041 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2k6rk" event={"ID":"90eb3e92-56cd-49d2-934d-9d2e62eb6537","Type":"ContainerStarted","Data":"94bdccc2101eeda1c8d3cd9899cc6bc268f171258669d15270d4eb2c4944c78c"} Nov 29 05:43:07 crc kubenswrapper[4594]: I1129 05:43:07.905990 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zwhtm" event={"ID":"0f40ab58-b977-4f3d-a122-691a05dd14cf","Type":"ContainerStarted","Data":"ae25a0dba737b8a61408467a5091c86f36df56ce3564dea6dd123633f01a0cbb"} Nov 29 05:43:07 crc kubenswrapper[4594]: I1129 05:43:07.908484 4594 generic.go:334] "Generic (PLEG): container finished" podID="0b167600-fe30-4499-876c-57685a803c45" containerID="ddedc0470fd78f9e6754619a35c10662c9b497dddba7f3c94700edcfb17e23b8" exitCode=0 Nov 29 05:43:07 crc kubenswrapper[4594]: I1129 
05:43:07.908687 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-zvkw9" event={"ID":"0b167600-fe30-4499-876c-57685a803c45","Type":"ContainerDied","Data":"ddedc0470fd78f9e6754619a35c10662c9b497dddba7f3c94700edcfb17e23b8"} Nov 29 05:43:07 crc kubenswrapper[4594]: I1129 05:43:07.910055 4594 generic.go:334] "Generic (PLEG): container finished" podID="27bd40a9-3181-42a6-8141-55078612c429" containerID="8c41ab12289688c81766b802c4e623bb692bf3b7d062f81a8429801dcf22e99d" exitCode=0 Nov 29 05:43:07 crc kubenswrapper[4594]: I1129 05:43:07.910105 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fb75c485f-xvpfr" event={"ID":"27bd40a9-3181-42a6-8141-55078612c429","Type":"ContainerDied","Data":"8c41ab12289688c81766b802c4e623bb692bf3b7d062f81a8429801dcf22e99d"} Nov 29 05:43:07 crc kubenswrapper[4594]: I1129 05:43:07.912207 4594 generic.go:334] "Generic (PLEG): container finished" podID="d2f69c3e-673d-4b8d-bb61-f1e02758f43b" containerID="1400cc0048042611c5f01b98608de26c8f1984ca87d5bbb19a72a0e7e63c594d" exitCode=0 Nov 29 05:43:07 crc kubenswrapper[4594]: I1129 05:43:07.912299 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dc92h" event={"ID":"d2f69c3e-673d-4b8d-bb61-f1e02758f43b","Type":"ContainerDied","Data":"1400cc0048042611c5f01b98608de26c8f1984ca87d5bbb19a72a0e7e63c594d"} Nov 29 05:43:07 crc kubenswrapper[4594]: I1129 05:43:07.912371 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dc92h" event={"ID":"d2f69c3e-673d-4b8d-bb61-f1e02758f43b","Type":"ContainerStarted","Data":"a81c589154a85cf6de7a6ff183e4d64692f6f24323a075e5e4c900494f228eaa"} Nov 29 05:43:07 crc kubenswrapper[4594]: I1129 05:43:07.916156 4594 generic.go:334] "Generic (PLEG): container finished" podID="bcfcc5d2-2a01-409c-9479-5c3df1f9a319" containerID="113dd35da0020d1a0340c9f74c2cd4f046817876703d8b16c21f4d90ffc94cd2" exitCode=0 Nov 29 05:43:07 crc 
kubenswrapper[4594]: I1129 05:43:07.916215 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dbf544cc9-vr7qn" event={"ID":"bcfcc5d2-2a01-409c-9479-5c3df1f9a319","Type":"ContainerDied","Data":"113dd35da0020d1a0340c9f74c2cd4f046817876703d8b16c21f4d90ffc94cd2"} Nov 29 05:43:08 crc kubenswrapper[4594]: I1129 05:43:08.000725 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=4.313018509 podStartE2EDuration="38.00069195s" podCreationTimestamp="2025-11-29 05:42:30 +0000 UTC" firstStartedPulling="2025-11-29 05:42:32.909064513 +0000 UTC m=+877.149573732" lastFinishedPulling="2025-11-29 05:43:06.596737953 +0000 UTC m=+910.837247173" observedRunningTime="2025-11-29 05:43:07.991170531 +0000 UTC m=+912.231679751" watchObservedRunningTime="2025-11-29 05:43:08.00069195 +0000 UTC m=+912.241201170" Nov 29 05:43:08 crc kubenswrapper[4594]: I1129 05:43:08.173275 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80f43bd8-bee7-4801-a4c7-fd10bec175a2" path="/var/lib/kubelet/pods/80f43bd8-bee7-4801-a4c7-fd10bec175a2/volumes" Nov 29 05:43:08 crc kubenswrapper[4594]: I1129 05:43:08.173851 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="887a7316-5262-4580-afea-bfeafc172e70" path="/var/lib/kubelet/pods/887a7316-5262-4580-afea-bfeafc172e70/volumes" Nov 29 05:43:08 crc kubenswrapper[4594]: I1129 05:43:08.174314 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a27c925c-13c4-44c2-9ca6-c4ca0c7e2e6c" path="/var/lib/kubelet/pods/a27c925c-13c4-44c2-9ca6-c4ca0c7e2e6c/volumes" Nov 29 05:43:08 crc kubenswrapper[4594]: I1129 05:43:08.174693 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2662fc9-873c-4a27-b335-a1c9ad1e8be1" path="/var/lib/kubelet/pods/c2662fc9-873c-4a27-b335-a1c9ad1e8be1/volumes" Nov 29 05:43:08 crc kubenswrapper[4594]: I1129 05:43:08.175041 4594 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="fb896c84-cf45-4033-b81c-456b230c2334" path="/var/lib/kubelet/pods/fb896c84-cf45-4033-b81c-456b230c2334/volumes" Nov 29 05:43:08 crc kubenswrapper[4594]: I1129 05:43:08.389725 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Nov 29 05:43:08 crc kubenswrapper[4594]: I1129 05:43:08.933323 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-notifications-server-0" event={"ID":"9667e68c-f715-4663-bddb-53c53d3a593d","Type":"ContainerStarted","Data":"7c1bd83b41e6b5f5154d21d1cf867e01c0a6b29abac556440a9d0952903f05fc"} Nov 29 05:43:08 crc kubenswrapper[4594]: I1129 05:43:08.937964 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dbf544cc9-vr7qn" event={"ID":"bcfcc5d2-2a01-409c-9479-5c3df1f9a319","Type":"ContainerStarted","Data":"a4da753a3aa2fa04e80e3b1d9ba64e7c1306d1e3cf43e56bfdceb4aa80c88833"} Nov 29 05:43:08 crc kubenswrapper[4594]: I1129 05:43:08.938153 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6dbf544cc9-vr7qn" Nov 29 05:43:08 crc kubenswrapper[4594]: I1129 05:43:08.940766 4594 generic.go:334] "Generic (PLEG): container finished" podID="0f40ab58-b977-4f3d-a122-691a05dd14cf" containerID="b081875b24f2557ba78bf4ca604cc83ca645e99e38c74ad794067a174bdad3cf" exitCode=0 Nov 29 05:43:08 crc kubenswrapper[4594]: I1129 05:43:08.940842 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zwhtm" event={"ID":"0f40ab58-b977-4f3d-a122-691a05dd14cf","Type":"ContainerDied","Data":"b081875b24f2557ba78bf4ca604cc83ca645e99e38c74ad794067a174bdad3cf"} Nov 29 05:43:08 crc kubenswrapper[4594]: I1129 05:43:08.943122 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-zvkw9" 
event={"ID":"0b167600-fe30-4499-876c-57685a803c45","Type":"ContainerStarted","Data":"b246a48d91407181d786b3e728b891708efc7b1323c55c933ce9a13570d996a0"} Nov 29 05:43:08 crc kubenswrapper[4594]: I1129 05:43:08.947149 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fb75c485f-xvpfr" event={"ID":"27bd40a9-3181-42a6-8141-55078612c429","Type":"ContainerStarted","Data":"124645bad6bf2a689530c19f6a92e72b2f0843907104ae2fc3b4af38e76e9473"} Nov 29 05:43:08 crc kubenswrapper[4594]: I1129 05:43:08.949129 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6fb75c485f-xvpfr" Nov 29 05:43:08 crc kubenswrapper[4594]: I1129 05:43:08.966417 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"47b6950d-0d97-486d-aaec-2a9eaaf74027","Type":"ContainerStarted","Data":"18588c32fb9b6698f37b98a5ca9f655c67c03d703b3e049a536f58ced9b1e325"} Nov 29 05:43:08 crc kubenswrapper[4594]: E1129 05:43:08.967607 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdbserver-nb\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-ovn-nb-db-server:current\\\"\"" pod="openstack/ovsdbserver-nb-0" podUID="31fad155-3970-4a5d-a357-e96fa27bbb54" Nov 29 05:43:08 crc kubenswrapper[4594]: I1129 05:43:08.980872 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6dbf544cc9-vr7qn" podStartSLOduration=4.029320081 podStartE2EDuration="32.980854212s" podCreationTimestamp="2025-11-29 05:42:36 +0000 UTC" firstStartedPulling="2025-11-29 05:42:37.659555892 +0000 UTC m=+881.900065113" lastFinishedPulling="2025-11-29 05:43:06.611090023 +0000 UTC m=+910.851599244" observedRunningTime="2025-11-29 05:43:08.974569294 +0000 UTC m=+913.215078504" watchObservedRunningTime="2025-11-29 05:43:08.980854212 +0000 UTC m=+913.221363432" Nov 29 05:43:09 crc 
kubenswrapper[4594]: I1129 05:43:09.000559 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6fb75c485f-xvpfr" podStartSLOduration=3.917193856 podStartE2EDuration="38.00053391s" podCreationTimestamp="2025-11-29 05:42:31 +0000 UTC" firstStartedPulling="2025-11-29 05:42:32.389406297 +0000 UTC m=+876.629915517" lastFinishedPulling="2025-11-29 05:43:06.472746351 +0000 UTC m=+910.713255571" observedRunningTime="2025-11-29 05:43:08.999597668 +0000 UTC m=+913.240106888" watchObservedRunningTime="2025-11-29 05:43:09.00053391 +0000 UTC m=+913.241043130" Nov 29 05:43:09 crc kubenswrapper[4594]: I1129 05:43:09.980938 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"4e4fe89a-3363-4724-89a8-a9b61fe6f039","Type":"ContainerStarted","Data":"5814a998d2de6d2998082fe37b44771b97b0f348fcca1915adbd53597116025a"} Nov 29 05:43:09 crc kubenswrapper[4594]: I1129 05:43:09.983914 4594 generic.go:334] "Generic (PLEG): container finished" podID="d2f69c3e-673d-4b8d-bb61-f1e02758f43b" containerID="2bbc778abfb96d55c368c4fde502e876ef21fc8d0f2f1f2cb0b31f3e4f1d255d" exitCode=0 Nov 29 05:43:09 crc kubenswrapper[4594]: I1129 05:43:09.984012 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dc92h" event={"ID":"d2f69c3e-673d-4b8d-bb61-f1e02758f43b","Type":"ContainerDied","Data":"2bbc778abfb96d55c368c4fde502e876ef21fc8d0f2f1f2cb0b31f3e4f1d255d"} Nov 29 05:43:09 crc kubenswrapper[4594]: I1129 05:43:09.989060 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-zvkw9" event={"ID":"0b167600-fe30-4499-876c-57685a803c45","Type":"ContainerStarted","Data":"7f6de573c95e1c4d92b697c1681b0168aebed6b807f0a73b8547fc6382d32237"} Nov 29 05:43:09 crc kubenswrapper[4594]: I1129 05:43:09.989244 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-zvkw9" Nov 29 05:43:09 crc 
kubenswrapper[4594]: I1129 05:43:09.989281 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-zvkw9" Nov 29 05:43:09 crc kubenswrapper[4594]: I1129 05:43:09.991883 4594 generic.go:334] "Generic (PLEG): container finished" podID="90eb3e92-56cd-49d2-934d-9d2e62eb6537" containerID="7d9f810d37fdcbdb37b1914f2b0e671c18525ce164177124e3413b92ddc7352d" exitCode=0 Nov 29 05:43:09 crc kubenswrapper[4594]: I1129 05:43:09.991928 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2k6rk" event={"ID":"90eb3e92-56cd-49d2-934d-9d2e62eb6537","Type":"ContainerDied","Data":"7d9f810d37fdcbdb37b1914f2b0e671c18525ce164177124e3413b92ddc7352d"} Nov 29 05:43:10 crc kubenswrapper[4594]: I1129 05:43:10.045375 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-zvkw9" podStartSLOduration=5.224153462 podStartE2EDuration="43.045356873s" podCreationTimestamp="2025-11-29 05:42:27 +0000 UTC" firstStartedPulling="2025-11-29 05:42:28.679500316 +0000 UTC m=+872.920009536" lastFinishedPulling="2025-11-29 05:43:06.500703737 +0000 UTC m=+910.741212947" observedRunningTime="2025-11-29 05:43:10.038826633 +0000 UTC m=+914.279335853" watchObservedRunningTime="2025-11-29 05:43:10.045356873 +0000 UTC m=+914.285866093" Nov 29 05:43:11 crc kubenswrapper[4594]: I1129 05:43:11.001146 4594 generic.go:334] "Generic (PLEG): container finished" podID="caf7dae9-7fe9-4cf5-a5d0-39122397592e" containerID="8cd5b22666fef572ab99857e203811d0aad1284ca9f209c98b01293b98765d05" exitCode=0 Nov 29 05:43:11 crc kubenswrapper[4594]: I1129 05:43:11.001225 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"caf7dae9-7fe9-4cf5-a5d0-39122397592e","Type":"ContainerDied","Data":"8cd5b22666fef572ab99857e203811d0aad1284ca9f209c98b01293b98765d05"} Nov 29 05:43:11 crc kubenswrapper[4594]: I1129 05:43:11.003798 4594 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2k6rk" event={"ID":"90eb3e92-56cd-49d2-934d-9d2e62eb6537","Type":"ContainerStarted","Data":"dd048d28cc8d3edf2f251324026e8d77c3ed4a17b77eeb284995bbf2ab9a0bf2"} Nov 29 05:43:11 crc kubenswrapper[4594]: I1129 05:43:11.006019 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dc92h" event={"ID":"d2f69c3e-673d-4b8d-bb61-f1e02758f43b","Type":"ContainerStarted","Data":"aa7c7258a9785a7d4ad95ccb14f108b0183be8324a355e16389bb1cee662f909"} Nov 29 05:43:11 crc kubenswrapper[4594]: I1129 05:43:11.049301 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2k6rk" podStartSLOduration=4.404266936 podStartE2EDuration="7.049288038s" podCreationTimestamp="2025-11-29 05:43:04 +0000 UTC" firstStartedPulling="2025-11-29 05:43:07.905919378 +0000 UTC m=+912.146428598" lastFinishedPulling="2025-11-29 05:43:10.55094048 +0000 UTC m=+914.791449700" observedRunningTime="2025-11-29 05:43:11.045628562 +0000 UTC m=+915.286137781" watchObservedRunningTime="2025-11-29 05:43:11.049288038 +0000 UTC m=+915.289797249" Nov 29 05:43:11 crc kubenswrapper[4594]: I1129 05:43:11.066326 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-dc92h" podStartSLOduration=7.318046821 podStartE2EDuration="10.066312428s" podCreationTimestamp="2025-11-29 05:43:01 +0000 UTC" firstStartedPulling="2025-11-29 05:43:07.913294819 +0000 UTC m=+912.153804039" lastFinishedPulling="2025-11-29 05:43:10.661560427 +0000 UTC m=+914.902069646" observedRunningTime="2025-11-29 05:43:11.061235944 +0000 UTC m=+915.301745164" watchObservedRunningTime="2025-11-29 05:43:11.066312428 +0000 UTC m=+915.306821648" Nov 29 05:43:11 crc kubenswrapper[4594]: I1129 05:43:11.439147 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Nov 29 
05:43:11 crc kubenswrapper[4594]: I1129 05:43:11.439808 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Nov 29 05:43:11 crc kubenswrapper[4594]: I1129 05:43:11.485958 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Nov 29 05:43:12 crc kubenswrapper[4594]: I1129 05:43:12.008432 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-dc92h" Nov 29 05:43:12 crc kubenswrapper[4594]: I1129 05:43:12.008482 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-dc92h" Nov 29 05:43:12 crc kubenswrapper[4594]: I1129 05:43:12.024198 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"caf7dae9-7fe9-4cf5-a5d0-39122397592e","Type":"ContainerStarted","Data":"6ca3ee454bbd236a3b6dec66f9e0a0732d52ffa07ec0d29591cce1f7ef5bd3b5"} Nov 29 05:43:12 crc kubenswrapper[4594]: I1129 05:43:12.051119 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=8.135181292 podStartE2EDuration="53.051090838s" podCreationTimestamp="2025-11-29 05:42:19 +0000 UTC" firstStartedPulling="2025-11-29 05:42:21.313779533 +0000 UTC m=+865.554288753" lastFinishedPulling="2025-11-29 05:43:06.229689079 +0000 UTC m=+910.470198299" observedRunningTime="2025-11-29 05:43:12.046483276 +0000 UTC m=+916.286992496" watchObservedRunningTime="2025-11-29 05:43:12.051090838 +0000 UTC m=+916.291600058" Nov 29 05:43:12 crc kubenswrapper[4594]: I1129 05:43:12.452406 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Nov 29 05:43:13 crc kubenswrapper[4594]: I1129 05:43:13.090275 4594 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-dc92h" podUID="d2f69c3e-673d-4b8d-bb61-f1e02758f43b" 
containerName="registry-server" probeResult="failure" output=< Nov 29 05:43:13 crc kubenswrapper[4594]: timeout: failed to connect service ":50051" within 1s Nov 29 05:43:13 crc kubenswrapper[4594]: > Nov 29 05:43:13 crc kubenswrapper[4594]: I1129 05:43:13.367181 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-z4q76"] Nov 29 05:43:13 crc kubenswrapper[4594]: I1129 05:43:13.369059 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z4q76" Nov 29 05:43:13 crc kubenswrapper[4594]: I1129 05:43:13.381436 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-z4q76"] Nov 29 05:43:13 crc kubenswrapper[4594]: I1129 05:43:13.532559 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5t5p\" (UniqueName: \"kubernetes.io/projected/3f256392-25b5-4ead-8ec1-8973b757b86b-kube-api-access-r5t5p\") pod \"redhat-marketplace-z4q76\" (UID: \"3f256392-25b5-4ead-8ec1-8973b757b86b\") " pod="openshift-marketplace/redhat-marketplace-z4q76" Nov 29 05:43:13 crc kubenswrapper[4594]: I1129 05:43:13.532631 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f256392-25b5-4ead-8ec1-8973b757b86b-catalog-content\") pod \"redhat-marketplace-z4q76\" (UID: \"3f256392-25b5-4ead-8ec1-8973b757b86b\") " pod="openshift-marketplace/redhat-marketplace-z4q76" Nov 29 05:43:13 crc kubenswrapper[4594]: I1129 05:43:13.532847 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f256392-25b5-4ead-8ec1-8973b757b86b-utilities\") pod \"redhat-marketplace-z4q76\" (UID: \"3f256392-25b5-4ead-8ec1-8973b757b86b\") " pod="openshift-marketplace/redhat-marketplace-z4q76" Nov 29 05:43:13 crc 
kubenswrapper[4594]: I1129 05:43:13.634369 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f256392-25b5-4ead-8ec1-8973b757b86b-catalog-content\") pod \"redhat-marketplace-z4q76\" (UID: \"3f256392-25b5-4ead-8ec1-8973b757b86b\") " pod="openshift-marketplace/redhat-marketplace-z4q76" Nov 29 05:43:13 crc kubenswrapper[4594]: I1129 05:43:13.634469 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f256392-25b5-4ead-8ec1-8973b757b86b-utilities\") pod \"redhat-marketplace-z4q76\" (UID: \"3f256392-25b5-4ead-8ec1-8973b757b86b\") " pod="openshift-marketplace/redhat-marketplace-z4q76" Nov 29 05:43:13 crc kubenswrapper[4594]: I1129 05:43:13.634573 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5t5p\" (UniqueName: \"kubernetes.io/projected/3f256392-25b5-4ead-8ec1-8973b757b86b-kube-api-access-r5t5p\") pod \"redhat-marketplace-z4q76\" (UID: \"3f256392-25b5-4ead-8ec1-8973b757b86b\") " pod="openshift-marketplace/redhat-marketplace-z4q76" Nov 29 05:43:13 crc kubenswrapper[4594]: I1129 05:43:13.635030 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f256392-25b5-4ead-8ec1-8973b757b86b-catalog-content\") pod \"redhat-marketplace-z4q76\" (UID: \"3f256392-25b5-4ead-8ec1-8973b757b86b\") " pod="openshift-marketplace/redhat-marketplace-z4q76" Nov 29 05:43:13 crc kubenswrapper[4594]: I1129 05:43:13.635035 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f256392-25b5-4ead-8ec1-8973b757b86b-utilities\") pod \"redhat-marketplace-z4q76\" (UID: \"3f256392-25b5-4ead-8ec1-8973b757b86b\") " pod="openshift-marketplace/redhat-marketplace-z4q76" Nov 29 05:43:13 crc kubenswrapper[4594]: I1129 05:43:13.657655 4594 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5t5p\" (UniqueName: \"kubernetes.io/projected/3f256392-25b5-4ead-8ec1-8973b757b86b-kube-api-access-r5t5p\") pod \"redhat-marketplace-z4q76\" (UID: \"3f256392-25b5-4ead-8ec1-8973b757b86b\") " pod="openshift-marketplace/redhat-marketplace-z4q76" Nov 29 05:43:13 crc kubenswrapper[4594]: I1129 05:43:13.693934 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z4q76" Nov 29 05:43:14 crc kubenswrapper[4594]: I1129 05:43:14.488736 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2k6rk" Nov 29 05:43:14 crc kubenswrapper[4594]: I1129 05:43:14.489871 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2k6rk" Nov 29 05:43:14 crc kubenswrapper[4594]: I1129 05:43:14.562048 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2k6rk" Nov 29 05:43:14 crc kubenswrapper[4594]: I1129 05:43:14.603910 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6fb75c485f-xvpfr"] Nov 29 05:43:14 crc kubenswrapper[4594]: I1129 05:43:14.604142 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6fb75c485f-xvpfr" podUID="27bd40a9-3181-42a6-8141-55078612c429" containerName="dnsmasq-dns" containerID="cri-o://124645bad6bf2a689530c19f6a92e72b2f0843907104ae2fc3b4af38e76e9473" gracePeriod=10 Nov 29 05:43:14 crc kubenswrapper[4594]: I1129 05:43:14.606446 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6fb75c485f-xvpfr" Nov 29 05:43:14 crc kubenswrapper[4594]: I1129 05:43:14.649461 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-76f9c4c8bc-hpsj2"] Nov 29 05:43:14 crc kubenswrapper[4594]: 
I1129 05:43:14.667550 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76f9c4c8bc-hpsj2" Nov 29 05:43:14 crc kubenswrapper[4594]: I1129 05:43:14.675922 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76f9c4c8bc-hpsj2"] Nov 29 05:43:14 crc kubenswrapper[4594]: I1129 05:43:14.789891 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e6036791-b766-491c-807f-cdb3f9616288-ovsdbserver-sb\") pod \"dnsmasq-dns-76f9c4c8bc-hpsj2\" (UID: \"e6036791-b766-491c-807f-cdb3f9616288\") " pod="openstack/dnsmasq-dns-76f9c4c8bc-hpsj2" Nov 29 05:43:14 crc kubenswrapper[4594]: I1129 05:43:14.790105 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vf2ch\" (UniqueName: \"kubernetes.io/projected/e6036791-b766-491c-807f-cdb3f9616288-kube-api-access-vf2ch\") pod \"dnsmasq-dns-76f9c4c8bc-hpsj2\" (UID: \"e6036791-b766-491c-807f-cdb3f9616288\") " pod="openstack/dnsmasq-dns-76f9c4c8bc-hpsj2" Nov 29 05:43:14 crc kubenswrapper[4594]: I1129 05:43:14.790209 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e6036791-b766-491c-807f-cdb3f9616288-ovsdbserver-nb\") pod \"dnsmasq-dns-76f9c4c8bc-hpsj2\" (UID: \"e6036791-b766-491c-807f-cdb3f9616288\") " pod="openstack/dnsmasq-dns-76f9c4c8bc-hpsj2" Nov 29 05:43:14 crc kubenswrapper[4594]: I1129 05:43:14.790333 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6036791-b766-491c-807f-cdb3f9616288-config\") pod \"dnsmasq-dns-76f9c4c8bc-hpsj2\" (UID: \"e6036791-b766-491c-807f-cdb3f9616288\") " pod="openstack/dnsmasq-dns-76f9c4c8bc-hpsj2" Nov 29 05:43:14 crc kubenswrapper[4594]: I1129 05:43:14.790473 
4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e6036791-b766-491c-807f-cdb3f9616288-dns-svc\") pod \"dnsmasq-dns-76f9c4c8bc-hpsj2\" (UID: \"e6036791-b766-491c-807f-cdb3f9616288\") " pod="openstack/dnsmasq-dns-76f9c4c8bc-hpsj2" Nov 29 05:43:14 crc kubenswrapper[4594]: I1129 05:43:14.893603 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e6036791-b766-491c-807f-cdb3f9616288-dns-svc\") pod \"dnsmasq-dns-76f9c4c8bc-hpsj2\" (UID: \"e6036791-b766-491c-807f-cdb3f9616288\") " pod="openstack/dnsmasq-dns-76f9c4c8bc-hpsj2" Nov 29 05:43:14 crc kubenswrapper[4594]: I1129 05:43:14.893696 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e6036791-b766-491c-807f-cdb3f9616288-ovsdbserver-sb\") pod \"dnsmasq-dns-76f9c4c8bc-hpsj2\" (UID: \"e6036791-b766-491c-807f-cdb3f9616288\") " pod="openstack/dnsmasq-dns-76f9c4c8bc-hpsj2" Nov 29 05:43:14 crc kubenswrapper[4594]: I1129 05:43:14.893737 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vf2ch\" (UniqueName: \"kubernetes.io/projected/e6036791-b766-491c-807f-cdb3f9616288-kube-api-access-vf2ch\") pod \"dnsmasq-dns-76f9c4c8bc-hpsj2\" (UID: \"e6036791-b766-491c-807f-cdb3f9616288\") " pod="openstack/dnsmasq-dns-76f9c4c8bc-hpsj2" Nov 29 05:43:14 crc kubenswrapper[4594]: I1129 05:43:14.893760 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e6036791-b766-491c-807f-cdb3f9616288-ovsdbserver-nb\") pod \"dnsmasq-dns-76f9c4c8bc-hpsj2\" (UID: \"e6036791-b766-491c-807f-cdb3f9616288\") " pod="openstack/dnsmasq-dns-76f9c4c8bc-hpsj2" Nov 29 05:43:14 crc kubenswrapper[4594]: I1129 05:43:14.893793 4594 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6036791-b766-491c-807f-cdb3f9616288-config\") pod \"dnsmasq-dns-76f9c4c8bc-hpsj2\" (UID: \"e6036791-b766-491c-807f-cdb3f9616288\") " pod="openstack/dnsmasq-dns-76f9c4c8bc-hpsj2" Nov 29 05:43:14 crc kubenswrapper[4594]: I1129 05:43:14.894601 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e6036791-b766-491c-807f-cdb3f9616288-dns-svc\") pod \"dnsmasq-dns-76f9c4c8bc-hpsj2\" (UID: \"e6036791-b766-491c-807f-cdb3f9616288\") " pod="openstack/dnsmasq-dns-76f9c4c8bc-hpsj2" Nov 29 05:43:14 crc kubenswrapper[4594]: I1129 05:43:14.894638 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e6036791-b766-491c-807f-cdb3f9616288-ovsdbserver-sb\") pod \"dnsmasq-dns-76f9c4c8bc-hpsj2\" (UID: \"e6036791-b766-491c-807f-cdb3f9616288\") " pod="openstack/dnsmasq-dns-76f9c4c8bc-hpsj2" Nov 29 05:43:14 crc kubenswrapper[4594]: I1129 05:43:14.894716 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6036791-b766-491c-807f-cdb3f9616288-config\") pod \"dnsmasq-dns-76f9c4c8bc-hpsj2\" (UID: \"e6036791-b766-491c-807f-cdb3f9616288\") " pod="openstack/dnsmasq-dns-76f9c4c8bc-hpsj2" Nov 29 05:43:14 crc kubenswrapper[4594]: I1129 05:43:14.894984 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e6036791-b766-491c-807f-cdb3f9616288-ovsdbserver-nb\") pod \"dnsmasq-dns-76f9c4c8bc-hpsj2\" (UID: \"e6036791-b766-491c-807f-cdb3f9616288\") " pod="openstack/dnsmasq-dns-76f9c4c8bc-hpsj2" Nov 29 05:43:14 crc kubenswrapper[4594]: I1129 05:43:14.931079 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vf2ch\" (UniqueName: 
\"kubernetes.io/projected/e6036791-b766-491c-807f-cdb3f9616288-kube-api-access-vf2ch\") pod \"dnsmasq-dns-76f9c4c8bc-hpsj2\" (UID: \"e6036791-b766-491c-807f-cdb3f9616288\") " pod="openstack/dnsmasq-dns-76f9c4c8bc-hpsj2" Nov 29 05:43:14 crc kubenswrapper[4594]: I1129 05:43:14.994518 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76f9c4c8bc-hpsj2" Nov 29 05:43:15 crc kubenswrapper[4594]: I1129 05:43:15.061034 4594 generic.go:334] "Generic (PLEG): container finished" podID="27bd40a9-3181-42a6-8141-55078612c429" containerID="124645bad6bf2a689530c19f6a92e72b2f0843907104ae2fc3b4af38e76e9473" exitCode=0 Nov 29 05:43:15 crc kubenswrapper[4594]: I1129 05:43:15.061118 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fb75c485f-xvpfr" event={"ID":"27bd40a9-3181-42a6-8141-55078612c429","Type":"ContainerDied","Data":"124645bad6bf2a689530c19f6a92e72b2f0843907104ae2fc3b4af38e76e9473"} Nov 29 05:43:15 crc kubenswrapper[4594]: I1129 05:43:15.103045 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2k6rk" Nov 29 05:43:15 crc kubenswrapper[4594]: I1129 05:43:15.757586 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2k6rk"] Nov 29 05:43:15 crc kubenswrapper[4594]: I1129 05:43:15.788694 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Nov 29 05:43:15 crc kubenswrapper[4594]: I1129 05:43:15.793866 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Nov 29 05:43:15 crc kubenswrapper[4594]: I1129 05:43:15.796204 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Nov 29 05:43:15 crc kubenswrapper[4594]: I1129 05:43:15.796223 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Nov 29 05:43:15 crc kubenswrapper[4594]: I1129 05:43:15.796299 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-l8xlm" Nov 29 05:43:15 crc kubenswrapper[4594]: I1129 05:43:15.797010 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Nov 29 05:43:15 crc kubenswrapper[4594]: I1129 05:43:15.810830 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pcfh\" (UniqueName: \"kubernetes.io/projected/15a27bc9-a74a-4123-b693-baf16a0ed04d-kube-api-access-7pcfh\") pod \"swift-storage-0\" (UID: \"15a27bc9-a74a-4123-b693-baf16a0ed04d\") " pod="openstack/swift-storage-0" Nov 29 05:43:15 crc kubenswrapper[4594]: I1129 05:43:15.810918 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/15a27bc9-a74a-4123-b693-baf16a0ed04d-cache\") pod \"swift-storage-0\" (UID: \"15a27bc9-a74a-4123-b693-baf16a0ed04d\") " pod="openstack/swift-storage-0" Nov 29 05:43:15 crc kubenswrapper[4594]: I1129 05:43:15.810970 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/15a27bc9-a74a-4123-b693-baf16a0ed04d-etc-swift\") pod \"swift-storage-0\" (UID: \"15a27bc9-a74a-4123-b693-baf16a0ed04d\") " pod="openstack/swift-storage-0" Nov 29 05:43:15 crc kubenswrapper[4594]: I1129 05:43:15.811029 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"15a27bc9-a74a-4123-b693-baf16a0ed04d\") " pod="openstack/swift-storage-0" Nov 29 05:43:15 crc kubenswrapper[4594]: I1129 05:43:15.811096 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/15a27bc9-a74a-4123-b693-baf16a0ed04d-lock\") pod \"swift-storage-0\" (UID: \"15a27bc9-a74a-4123-b693-baf16a0ed04d\") " pod="openstack/swift-storage-0" Nov 29 05:43:15 crc kubenswrapper[4594]: I1129 05:43:15.811655 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Nov 29 05:43:15 crc kubenswrapper[4594]: I1129 05:43:15.912301 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7pcfh\" (UniqueName: \"kubernetes.io/projected/15a27bc9-a74a-4123-b693-baf16a0ed04d-kube-api-access-7pcfh\") pod \"swift-storage-0\" (UID: \"15a27bc9-a74a-4123-b693-baf16a0ed04d\") " pod="openstack/swift-storage-0" Nov 29 05:43:15 crc kubenswrapper[4594]: I1129 05:43:15.912398 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/15a27bc9-a74a-4123-b693-baf16a0ed04d-cache\") pod \"swift-storage-0\" (UID: \"15a27bc9-a74a-4123-b693-baf16a0ed04d\") " pod="openstack/swift-storage-0" Nov 29 05:43:15 crc kubenswrapper[4594]: I1129 05:43:15.912458 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/15a27bc9-a74a-4123-b693-baf16a0ed04d-etc-swift\") pod \"swift-storage-0\" (UID: \"15a27bc9-a74a-4123-b693-baf16a0ed04d\") " pod="openstack/swift-storage-0" Nov 29 05:43:15 crc kubenswrapper[4594]: I1129 05:43:15.912571 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"15a27bc9-a74a-4123-b693-baf16a0ed04d\") " pod="openstack/swift-storage-0" Nov 29 05:43:15 crc kubenswrapper[4594]: I1129 05:43:15.912637 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/15a27bc9-a74a-4123-b693-baf16a0ed04d-lock\") pod \"swift-storage-0\" (UID: \"15a27bc9-a74a-4123-b693-baf16a0ed04d\") " pod="openstack/swift-storage-0" Nov 29 05:43:15 crc kubenswrapper[4594]: E1129 05:43:15.912819 4594 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Nov 29 05:43:15 crc kubenswrapper[4594]: E1129 05:43:15.912861 4594 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Nov 29 05:43:15 crc kubenswrapper[4594]: I1129 05:43:15.913154 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/15a27bc9-a74a-4123-b693-baf16a0ed04d-cache\") pod \"swift-storage-0\" (UID: \"15a27bc9-a74a-4123-b693-baf16a0ed04d\") " pod="openstack/swift-storage-0" Nov 29 05:43:15 crc kubenswrapper[4594]: I1129 05:43:15.913277 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/15a27bc9-a74a-4123-b693-baf16a0ed04d-lock\") pod \"swift-storage-0\" (UID: \"15a27bc9-a74a-4123-b693-baf16a0ed04d\") " pod="openstack/swift-storage-0" Nov 29 05:43:15 crc kubenswrapper[4594]: I1129 05:43:15.913286 4594 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"15a27bc9-a74a-4123-b693-baf16a0ed04d\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/swift-storage-0" Nov 29 05:43:15 crc kubenswrapper[4594]: E1129 
05:43:15.913413 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/15a27bc9-a74a-4123-b693-baf16a0ed04d-etc-swift podName:15a27bc9-a74a-4123-b693-baf16a0ed04d nodeName:}" failed. No retries permitted until 2025-11-29 05:43:16.412971274 +0000 UTC m=+920.653480495 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/15a27bc9-a74a-4123-b693-baf16a0ed04d-etc-swift") pod "swift-storage-0" (UID: "15a27bc9-a74a-4123-b693-baf16a0ed04d") : configmap "swift-ring-files" not found Nov 29 05:43:15 crc kubenswrapper[4594]: I1129 05:43:15.933436 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pcfh\" (UniqueName: \"kubernetes.io/projected/15a27bc9-a74a-4123-b693-baf16a0ed04d-kube-api-access-7pcfh\") pod \"swift-storage-0\" (UID: \"15a27bc9-a74a-4123-b693-baf16a0ed04d\") " pod="openstack/swift-storage-0" Nov 29 05:43:15 crc kubenswrapper[4594]: I1129 05:43:15.936502 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"15a27bc9-a74a-4123-b693-baf16a0ed04d\") " pod="openstack/swift-storage-0" Nov 29 05:43:16 crc kubenswrapper[4594]: I1129 05:43:16.275792 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-jlpg4"] Nov 29 05:43:16 crc kubenswrapper[4594]: I1129 05:43:16.277638 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-jlpg4" Nov 29 05:43:16 crc kubenswrapper[4594]: I1129 05:43:16.279525 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Nov 29 05:43:16 crc kubenswrapper[4594]: I1129 05:43:16.279645 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Nov 29 05:43:16 crc kubenswrapper[4594]: I1129 05:43:16.281201 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Nov 29 05:43:16 crc kubenswrapper[4594]: I1129 05:43:16.301315 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-jlpg4"] Nov 29 05:43:16 crc kubenswrapper[4594]: E1129 05:43:16.302210 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-fcl7t ring-data-devices scripts swiftconf], unattached volumes=[], failed to process volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-fcl7t ring-data-devices scripts swiftconf]: context canceled" pod="openstack/swift-ring-rebalance-jlpg4" podUID="1404738e-3715-4577-89dc-ba555987fe49" Nov 29 05:43:16 crc kubenswrapper[4594]: I1129 05:43:16.305473 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-wtghb"] Nov 29 05:43:16 crc kubenswrapper[4594]: I1129 05:43:16.306889 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-wtghb" Nov 29 05:43:16 crc kubenswrapper[4594]: I1129 05:43:16.331581 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-wtghb"] Nov 29 05:43:16 crc kubenswrapper[4594]: I1129 05:43:16.349825 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-jlpg4"] Nov 29 05:43:16 crc kubenswrapper[4594]: I1129 05:43:16.421401 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcl7t\" (UniqueName: \"kubernetes.io/projected/1404738e-3715-4577-89dc-ba555987fe49-kube-api-access-fcl7t\") pod \"swift-ring-rebalance-jlpg4\" (UID: \"1404738e-3715-4577-89dc-ba555987fe49\") " pod="openstack/swift-ring-rebalance-jlpg4" Nov 29 05:43:16 crc kubenswrapper[4594]: I1129 05:43:16.421478 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drqp9\" (UniqueName: \"kubernetes.io/projected/f63cf943-e9c0-4f70-9f7b-8ecf859c92ae-kube-api-access-drqp9\") pod \"swift-ring-rebalance-wtghb\" (UID: \"f63cf943-e9c0-4f70-9f7b-8ecf859c92ae\") " pod="openstack/swift-ring-rebalance-wtghb" Nov 29 05:43:16 crc kubenswrapper[4594]: I1129 05:43:16.421548 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1404738e-3715-4577-89dc-ba555987fe49-scripts\") pod \"swift-ring-rebalance-jlpg4\" (UID: \"1404738e-3715-4577-89dc-ba555987fe49\") " pod="openstack/swift-ring-rebalance-jlpg4" Nov 29 05:43:16 crc kubenswrapper[4594]: I1129 05:43:16.421583 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f63cf943-e9c0-4f70-9f7b-8ecf859c92ae-ring-data-devices\") pod \"swift-ring-rebalance-wtghb\" (UID: \"f63cf943-e9c0-4f70-9f7b-8ecf859c92ae\") " 
pod="openstack/swift-ring-rebalance-wtghb" Nov 29 05:43:16 crc kubenswrapper[4594]: I1129 05:43:16.421740 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1404738e-3715-4577-89dc-ba555987fe49-dispersionconf\") pod \"swift-ring-rebalance-jlpg4\" (UID: \"1404738e-3715-4577-89dc-ba555987fe49\") " pod="openstack/swift-ring-rebalance-jlpg4" Nov 29 05:43:16 crc kubenswrapper[4594]: I1129 05:43:16.421846 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1404738e-3715-4577-89dc-ba555987fe49-combined-ca-bundle\") pod \"swift-ring-rebalance-jlpg4\" (UID: \"1404738e-3715-4577-89dc-ba555987fe49\") " pod="openstack/swift-ring-rebalance-jlpg4" Nov 29 05:43:16 crc kubenswrapper[4594]: I1129 05:43:16.422084 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1404738e-3715-4577-89dc-ba555987fe49-etc-swift\") pod \"swift-ring-rebalance-jlpg4\" (UID: \"1404738e-3715-4577-89dc-ba555987fe49\") " pod="openstack/swift-ring-rebalance-jlpg4" Nov 29 05:43:16 crc kubenswrapper[4594]: I1129 05:43:16.422114 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1404738e-3715-4577-89dc-ba555987fe49-swiftconf\") pod \"swift-ring-rebalance-jlpg4\" (UID: \"1404738e-3715-4577-89dc-ba555987fe49\") " pod="openstack/swift-ring-rebalance-jlpg4" Nov 29 05:43:16 crc kubenswrapper[4594]: I1129 05:43:16.422144 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1404738e-3715-4577-89dc-ba555987fe49-ring-data-devices\") pod \"swift-ring-rebalance-jlpg4\" (UID: 
\"1404738e-3715-4577-89dc-ba555987fe49\") " pod="openstack/swift-ring-rebalance-jlpg4" Nov 29 05:43:16 crc kubenswrapper[4594]: I1129 05:43:16.422326 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f63cf943-e9c0-4f70-9f7b-8ecf859c92ae-etc-swift\") pod \"swift-ring-rebalance-wtghb\" (UID: \"f63cf943-e9c0-4f70-9f7b-8ecf859c92ae\") " pod="openstack/swift-ring-rebalance-wtghb" Nov 29 05:43:16 crc kubenswrapper[4594]: I1129 05:43:16.422412 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f63cf943-e9c0-4f70-9f7b-8ecf859c92ae-combined-ca-bundle\") pod \"swift-ring-rebalance-wtghb\" (UID: \"f63cf943-e9c0-4f70-9f7b-8ecf859c92ae\") " pod="openstack/swift-ring-rebalance-wtghb" Nov 29 05:43:16 crc kubenswrapper[4594]: I1129 05:43:16.422496 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/15a27bc9-a74a-4123-b693-baf16a0ed04d-etc-swift\") pod \"swift-storage-0\" (UID: \"15a27bc9-a74a-4123-b693-baf16a0ed04d\") " pod="openstack/swift-storage-0" Nov 29 05:43:16 crc kubenswrapper[4594]: I1129 05:43:16.422552 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f63cf943-e9c0-4f70-9f7b-8ecf859c92ae-scripts\") pod \"swift-ring-rebalance-wtghb\" (UID: \"f63cf943-e9c0-4f70-9f7b-8ecf859c92ae\") " pod="openstack/swift-ring-rebalance-wtghb" Nov 29 05:43:16 crc kubenswrapper[4594]: I1129 05:43:16.422604 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f63cf943-e9c0-4f70-9f7b-8ecf859c92ae-swiftconf\") pod \"swift-ring-rebalance-wtghb\" (UID: \"f63cf943-e9c0-4f70-9f7b-8ecf859c92ae\") " 
pod="openstack/swift-ring-rebalance-wtghb" Nov 29 05:43:16 crc kubenswrapper[4594]: I1129 05:43:16.422674 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f63cf943-e9c0-4f70-9f7b-8ecf859c92ae-dispersionconf\") pod \"swift-ring-rebalance-wtghb\" (UID: \"f63cf943-e9c0-4f70-9f7b-8ecf859c92ae\") " pod="openstack/swift-ring-rebalance-wtghb" Nov 29 05:43:16 crc kubenswrapper[4594]: E1129 05:43:16.422741 4594 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Nov 29 05:43:16 crc kubenswrapper[4594]: E1129 05:43:16.422771 4594 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Nov 29 05:43:16 crc kubenswrapper[4594]: E1129 05:43:16.422854 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/15a27bc9-a74a-4123-b693-baf16a0ed04d-etc-swift podName:15a27bc9-a74a-4123-b693-baf16a0ed04d nodeName:}" failed. No retries permitted until 2025-11-29 05:43:17.422829816 +0000 UTC m=+921.663339036 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/15a27bc9-a74a-4123-b693-baf16a0ed04d-etc-swift") pod "swift-storage-0" (UID: "15a27bc9-a74a-4123-b693-baf16a0ed04d") : configmap "swift-ring-files" not found Nov 29 05:43:16 crc kubenswrapper[4594]: I1129 05:43:16.524518 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1404738e-3715-4577-89dc-ba555987fe49-etc-swift\") pod \"swift-ring-rebalance-jlpg4\" (UID: \"1404738e-3715-4577-89dc-ba555987fe49\") " pod="openstack/swift-ring-rebalance-jlpg4" Nov 29 05:43:16 crc kubenswrapper[4594]: I1129 05:43:16.524569 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1404738e-3715-4577-89dc-ba555987fe49-swiftconf\") pod \"swift-ring-rebalance-jlpg4\" (UID: \"1404738e-3715-4577-89dc-ba555987fe49\") " pod="openstack/swift-ring-rebalance-jlpg4" Nov 29 05:43:16 crc kubenswrapper[4594]: I1129 05:43:16.524603 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1404738e-3715-4577-89dc-ba555987fe49-ring-data-devices\") pod \"swift-ring-rebalance-jlpg4\" (UID: \"1404738e-3715-4577-89dc-ba555987fe49\") " pod="openstack/swift-ring-rebalance-jlpg4" Nov 29 05:43:16 crc kubenswrapper[4594]: I1129 05:43:16.524641 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f63cf943-e9c0-4f70-9f7b-8ecf859c92ae-etc-swift\") pod \"swift-ring-rebalance-wtghb\" (UID: \"f63cf943-e9c0-4f70-9f7b-8ecf859c92ae\") " pod="openstack/swift-ring-rebalance-wtghb" Nov 29 05:43:16 crc kubenswrapper[4594]: I1129 05:43:16.524679 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f63cf943-e9c0-4f70-9f7b-8ecf859c92ae-combined-ca-bundle\") pod \"swift-ring-rebalance-wtghb\" (UID: \"f63cf943-e9c0-4f70-9f7b-8ecf859c92ae\") " pod="openstack/swift-ring-rebalance-wtghb" Nov 29 05:43:16 crc kubenswrapper[4594]: I1129 05:43:16.524728 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f63cf943-e9c0-4f70-9f7b-8ecf859c92ae-scripts\") pod \"swift-ring-rebalance-wtghb\" (UID: \"f63cf943-e9c0-4f70-9f7b-8ecf859c92ae\") " pod="openstack/swift-ring-rebalance-wtghb" Nov 29 05:43:16 crc kubenswrapper[4594]: I1129 05:43:16.524760 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f63cf943-e9c0-4f70-9f7b-8ecf859c92ae-swiftconf\") pod \"swift-ring-rebalance-wtghb\" (UID: \"f63cf943-e9c0-4f70-9f7b-8ecf859c92ae\") " pod="openstack/swift-ring-rebalance-wtghb" Nov 29 05:43:16 crc kubenswrapper[4594]: I1129 05:43:16.524794 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f63cf943-e9c0-4f70-9f7b-8ecf859c92ae-dispersionconf\") pod \"swift-ring-rebalance-wtghb\" (UID: \"f63cf943-e9c0-4f70-9f7b-8ecf859c92ae\") " pod="openstack/swift-ring-rebalance-wtghb" Nov 29 05:43:16 crc kubenswrapper[4594]: I1129 05:43:16.524983 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcl7t\" (UniqueName: \"kubernetes.io/projected/1404738e-3715-4577-89dc-ba555987fe49-kube-api-access-fcl7t\") pod \"swift-ring-rebalance-jlpg4\" (UID: \"1404738e-3715-4577-89dc-ba555987fe49\") " pod="openstack/swift-ring-rebalance-jlpg4" Nov 29 05:43:16 crc kubenswrapper[4594]: I1129 05:43:16.525319 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f63cf943-e9c0-4f70-9f7b-8ecf859c92ae-etc-swift\") pod 
\"swift-ring-rebalance-wtghb\" (UID: \"f63cf943-e9c0-4f70-9f7b-8ecf859c92ae\") " pod="openstack/swift-ring-rebalance-wtghb" Nov 29 05:43:16 crc kubenswrapper[4594]: I1129 05:43:16.525713 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1404738e-3715-4577-89dc-ba555987fe49-etc-swift\") pod \"swift-ring-rebalance-jlpg4\" (UID: \"1404738e-3715-4577-89dc-ba555987fe49\") " pod="openstack/swift-ring-rebalance-jlpg4" Nov 29 05:43:16 crc kubenswrapper[4594]: I1129 05:43:16.525750 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drqp9\" (UniqueName: \"kubernetes.io/projected/f63cf943-e9c0-4f70-9f7b-8ecf859c92ae-kube-api-access-drqp9\") pod \"swift-ring-rebalance-wtghb\" (UID: \"f63cf943-e9c0-4f70-9f7b-8ecf859c92ae\") " pod="openstack/swift-ring-rebalance-wtghb" Nov 29 05:43:16 crc kubenswrapper[4594]: I1129 05:43:16.526089 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1404738e-3715-4577-89dc-ba555987fe49-scripts\") pod \"swift-ring-rebalance-jlpg4\" (UID: \"1404738e-3715-4577-89dc-ba555987fe49\") " pod="openstack/swift-ring-rebalance-jlpg4" Nov 29 05:43:16 crc kubenswrapper[4594]: I1129 05:43:16.526144 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f63cf943-e9c0-4f70-9f7b-8ecf859c92ae-ring-data-devices\") pod \"swift-ring-rebalance-wtghb\" (UID: \"f63cf943-e9c0-4f70-9f7b-8ecf859c92ae\") " pod="openstack/swift-ring-rebalance-wtghb" Nov 29 05:43:16 crc kubenswrapper[4594]: I1129 05:43:16.526181 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1404738e-3715-4577-89dc-ba555987fe49-dispersionconf\") pod \"swift-ring-rebalance-jlpg4\" (UID: \"1404738e-3715-4577-89dc-ba555987fe49\") " 
pod="openstack/swift-ring-rebalance-jlpg4" Nov 29 05:43:16 crc kubenswrapper[4594]: I1129 05:43:16.526235 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1404738e-3715-4577-89dc-ba555987fe49-combined-ca-bundle\") pod \"swift-ring-rebalance-jlpg4\" (UID: \"1404738e-3715-4577-89dc-ba555987fe49\") " pod="openstack/swift-ring-rebalance-jlpg4" Nov 29 05:43:16 crc kubenswrapper[4594]: I1129 05:43:16.527011 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f63cf943-e9c0-4f70-9f7b-8ecf859c92ae-scripts\") pod \"swift-ring-rebalance-wtghb\" (UID: \"f63cf943-e9c0-4f70-9f7b-8ecf859c92ae\") " pod="openstack/swift-ring-rebalance-wtghb" Nov 29 05:43:16 crc kubenswrapper[4594]: I1129 05:43:16.527085 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f63cf943-e9c0-4f70-9f7b-8ecf859c92ae-ring-data-devices\") pod \"swift-ring-rebalance-wtghb\" (UID: \"f63cf943-e9c0-4f70-9f7b-8ecf859c92ae\") " pod="openstack/swift-ring-rebalance-wtghb" Nov 29 05:43:16 crc kubenswrapper[4594]: I1129 05:43:16.527708 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1404738e-3715-4577-89dc-ba555987fe49-ring-data-devices\") pod \"swift-ring-rebalance-jlpg4\" (UID: \"1404738e-3715-4577-89dc-ba555987fe49\") " pod="openstack/swift-ring-rebalance-jlpg4" Nov 29 05:43:16 crc kubenswrapper[4594]: I1129 05:43:16.528761 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1404738e-3715-4577-89dc-ba555987fe49-scripts\") pod \"swift-ring-rebalance-jlpg4\" (UID: \"1404738e-3715-4577-89dc-ba555987fe49\") " pod="openstack/swift-ring-rebalance-jlpg4" Nov 29 05:43:16 crc kubenswrapper[4594]: I1129 05:43:16.529526 4594 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f63cf943-e9c0-4f70-9f7b-8ecf859c92ae-swiftconf\") pod \"swift-ring-rebalance-wtghb\" (UID: \"f63cf943-e9c0-4f70-9f7b-8ecf859c92ae\") " pod="openstack/swift-ring-rebalance-wtghb" Nov 29 05:43:16 crc kubenswrapper[4594]: I1129 05:43:16.530206 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f63cf943-e9c0-4f70-9f7b-8ecf859c92ae-dispersionconf\") pod \"swift-ring-rebalance-wtghb\" (UID: \"f63cf943-e9c0-4f70-9f7b-8ecf859c92ae\") " pod="openstack/swift-ring-rebalance-wtghb" Nov 29 05:43:16 crc kubenswrapper[4594]: I1129 05:43:16.530811 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1404738e-3715-4577-89dc-ba555987fe49-swiftconf\") pod \"swift-ring-rebalance-jlpg4\" (UID: \"1404738e-3715-4577-89dc-ba555987fe49\") " pod="openstack/swift-ring-rebalance-jlpg4" Nov 29 05:43:16 crc kubenswrapper[4594]: I1129 05:43:16.531579 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1404738e-3715-4577-89dc-ba555987fe49-dispersionconf\") pod \"swift-ring-rebalance-jlpg4\" (UID: \"1404738e-3715-4577-89dc-ba555987fe49\") " pod="openstack/swift-ring-rebalance-jlpg4" Nov 29 05:43:16 crc kubenswrapper[4594]: I1129 05:43:16.531746 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f63cf943-e9c0-4f70-9f7b-8ecf859c92ae-combined-ca-bundle\") pod \"swift-ring-rebalance-wtghb\" (UID: \"f63cf943-e9c0-4f70-9f7b-8ecf859c92ae\") " pod="openstack/swift-ring-rebalance-wtghb" Nov 29 05:43:16 crc kubenswrapper[4594]: I1129 05:43:16.533929 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1404738e-3715-4577-89dc-ba555987fe49-combined-ca-bundle\") pod \"swift-ring-rebalance-jlpg4\" (UID: \"1404738e-3715-4577-89dc-ba555987fe49\") " pod="openstack/swift-ring-rebalance-jlpg4" Nov 29 05:43:16 crc kubenswrapper[4594]: I1129 05:43:16.540814 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcl7t\" (UniqueName: \"kubernetes.io/projected/1404738e-3715-4577-89dc-ba555987fe49-kube-api-access-fcl7t\") pod \"swift-ring-rebalance-jlpg4\" (UID: \"1404738e-3715-4577-89dc-ba555987fe49\") " pod="openstack/swift-ring-rebalance-jlpg4" Nov 29 05:43:16 crc kubenswrapper[4594]: I1129 05:43:16.541172 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drqp9\" (UniqueName: \"kubernetes.io/projected/f63cf943-e9c0-4f70-9f7b-8ecf859c92ae-kube-api-access-drqp9\") pod \"swift-ring-rebalance-wtghb\" (UID: \"f63cf943-e9c0-4f70-9f7b-8ecf859c92ae\") " pod="openstack/swift-ring-rebalance-wtghb" Nov 29 05:43:16 crc kubenswrapper[4594]: I1129 05:43:16.617740 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-wtghb" Nov 29 05:43:16 crc kubenswrapper[4594]: I1129 05:43:16.852655 4594 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6fb75c485f-xvpfr" podUID="27bd40a9-3181-42a6-8141-55078612c429" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.113:5353: connect: connection refused" Nov 29 05:43:17 crc kubenswrapper[4594]: I1129 05:43:17.077751 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-jlpg4" Nov 29 05:43:17 crc kubenswrapper[4594]: I1129 05:43:17.077896 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2k6rk" podUID="90eb3e92-56cd-49d2-934d-9d2e62eb6537" containerName="registry-server" containerID="cri-o://dd048d28cc8d3edf2f251324026e8d77c3ed4a17b77eeb284995bbf2ab9a0bf2" gracePeriod=2 Nov 29 05:43:17 crc kubenswrapper[4594]: I1129 05:43:17.091887 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-jlpg4" Nov 29 05:43:17 crc kubenswrapper[4594]: I1129 05:43:17.228458 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6dbf544cc9-vr7qn" Nov 29 05:43:17 crc kubenswrapper[4594]: I1129 05:43:17.237212 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcl7t\" (UniqueName: \"kubernetes.io/projected/1404738e-3715-4577-89dc-ba555987fe49-kube-api-access-fcl7t\") pod \"1404738e-3715-4577-89dc-ba555987fe49\" (UID: \"1404738e-3715-4577-89dc-ba555987fe49\") " Nov 29 05:43:17 crc kubenswrapper[4594]: I1129 05:43:17.237268 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1404738e-3715-4577-89dc-ba555987fe49-combined-ca-bundle\") pod \"1404738e-3715-4577-89dc-ba555987fe49\" (UID: \"1404738e-3715-4577-89dc-ba555987fe49\") " Nov 29 05:43:17 crc kubenswrapper[4594]: I1129 05:43:17.237368 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1404738e-3715-4577-89dc-ba555987fe49-dispersionconf\") pod \"1404738e-3715-4577-89dc-ba555987fe49\" (UID: \"1404738e-3715-4577-89dc-ba555987fe49\") " Nov 29 05:43:17 crc kubenswrapper[4594]: I1129 05:43:17.237408 4594 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1404738e-3715-4577-89dc-ba555987fe49-etc-swift\") pod \"1404738e-3715-4577-89dc-ba555987fe49\" (UID: \"1404738e-3715-4577-89dc-ba555987fe49\") " Nov 29 05:43:17 crc kubenswrapper[4594]: I1129 05:43:17.237463 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1404738e-3715-4577-89dc-ba555987fe49-ring-data-devices\") pod \"1404738e-3715-4577-89dc-ba555987fe49\" (UID: \"1404738e-3715-4577-89dc-ba555987fe49\") " Nov 29 05:43:17 crc kubenswrapper[4594]: I1129 05:43:17.237509 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1404738e-3715-4577-89dc-ba555987fe49-swiftconf\") pod \"1404738e-3715-4577-89dc-ba555987fe49\" (UID: \"1404738e-3715-4577-89dc-ba555987fe49\") " Nov 29 05:43:17 crc kubenswrapper[4594]: I1129 05:43:17.237650 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1404738e-3715-4577-89dc-ba555987fe49-scripts\") pod \"1404738e-3715-4577-89dc-ba555987fe49\" (UID: \"1404738e-3715-4577-89dc-ba555987fe49\") " Nov 29 05:43:17 crc kubenswrapper[4594]: I1129 05:43:17.237781 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1404738e-3715-4577-89dc-ba555987fe49-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "1404738e-3715-4577-89dc-ba555987fe49" (UID: "1404738e-3715-4577-89dc-ba555987fe49"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 05:43:17 crc kubenswrapper[4594]: I1129 05:43:17.237956 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1404738e-3715-4577-89dc-ba555987fe49-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "1404738e-3715-4577-89dc-ba555987fe49" (UID: "1404738e-3715-4577-89dc-ba555987fe49"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:43:17 crc kubenswrapper[4594]: I1129 05:43:17.238083 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1404738e-3715-4577-89dc-ba555987fe49-scripts" (OuterVolumeSpecName: "scripts") pod "1404738e-3715-4577-89dc-ba555987fe49" (UID: "1404738e-3715-4577-89dc-ba555987fe49"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:43:17 crc kubenswrapper[4594]: I1129 05:43:17.238341 4594 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1404738e-3715-4577-89dc-ba555987fe49-etc-swift\") on node \"crc\" DevicePath \"\"" Nov 29 05:43:17 crc kubenswrapper[4594]: I1129 05:43:17.238363 4594 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1404738e-3715-4577-89dc-ba555987fe49-ring-data-devices\") on node \"crc\" DevicePath \"\"" Nov 29 05:43:17 crc kubenswrapper[4594]: I1129 05:43:17.238379 4594 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1404738e-3715-4577-89dc-ba555987fe49-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 05:43:17 crc kubenswrapper[4594]: I1129 05:43:17.245374 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1404738e-3715-4577-89dc-ba555987fe49-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod 
"1404738e-3715-4577-89dc-ba555987fe49" (UID: "1404738e-3715-4577-89dc-ba555987fe49"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:43:17 crc kubenswrapper[4594]: I1129 05:43:17.245901 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1404738e-3715-4577-89dc-ba555987fe49-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1404738e-3715-4577-89dc-ba555987fe49" (UID: "1404738e-3715-4577-89dc-ba555987fe49"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:43:17 crc kubenswrapper[4594]: I1129 05:43:17.247308 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1404738e-3715-4577-89dc-ba555987fe49-kube-api-access-fcl7t" (OuterVolumeSpecName: "kube-api-access-fcl7t") pod "1404738e-3715-4577-89dc-ba555987fe49" (UID: "1404738e-3715-4577-89dc-ba555987fe49"). InnerVolumeSpecName "kube-api-access-fcl7t". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:43:17 crc kubenswrapper[4594]: I1129 05:43:17.267349 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1404738e-3715-4577-89dc-ba555987fe49-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "1404738e-3715-4577-89dc-ba555987fe49" (UID: "1404738e-3715-4577-89dc-ba555987fe49"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:43:17 crc kubenswrapper[4594]: I1129 05:43:17.340674 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcl7t\" (UniqueName: \"kubernetes.io/projected/1404738e-3715-4577-89dc-ba555987fe49-kube-api-access-fcl7t\") on node \"crc\" DevicePath \"\"" Nov 29 05:43:17 crc kubenswrapper[4594]: I1129 05:43:17.340722 4594 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1404738e-3715-4577-89dc-ba555987fe49-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 05:43:17 crc kubenswrapper[4594]: I1129 05:43:17.340731 4594 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1404738e-3715-4577-89dc-ba555987fe49-dispersionconf\") on node \"crc\" DevicePath \"\"" Nov 29 05:43:17 crc kubenswrapper[4594]: I1129 05:43:17.340740 4594 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1404738e-3715-4577-89dc-ba555987fe49-swiftconf\") on node \"crc\" DevicePath \"\"" Nov 29 05:43:17 crc kubenswrapper[4594]: I1129 05:43:17.442331 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/15a27bc9-a74a-4123-b693-baf16a0ed04d-etc-swift\") pod \"swift-storage-0\" (UID: \"15a27bc9-a74a-4123-b693-baf16a0ed04d\") " pod="openstack/swift-storage-0" Nov 29 05:43:17 crc kubenswrapper[4594]: E1129 05:43:17.442717 4594 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Nov 29 05:43:17 crc kubenswrapper[4594]: E1129 05:43:17.442755 4594 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Nov 29 05:43:17 crc kubenswrapper[4594]: E1129 05:43:17.442811 4594 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/15a27bc9-a74a-4123-b693-baf16a0ed04d-etc-swift podName:15a27bc9-a74a-4123-b693-baf16a0ed04d nodeName:}" failed. No retries permitted until 2025-11-29 05:43:19.442792814 +0000 UTC m=+923.683302034 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/15a27bc9-a74a-4123-b693-baf16a0ed04d-etc-swift") pod "swift-storage-0" (UID: "15a27bc9-a74a-4123-b693-baf16a0ed04d") : configmap "swift-ring-files" not found Nov 29 05:43:18 crc kubenswrapper[4594]: I1129 05:43:18.093990 4594 generic.go:334] "Generic (PLEG): container finished" podID="90eb3e92-56cd-49d2-934d-9d2e62eb6537" containerID="dd048d28cc8d3edf2f251324026e8d77c3ed4a17b77eeb284995bbf2ab9a0bf2" exitCode=0 Nov 29 05:43:18 crc kubenswrapper[4594]: I1129 05:43:18.094238 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-jlpg4" Nov 29 05:43:18 crc kubenswrapper[4594]: I1129 05:43:18.101958 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2k6rk" event={"ID":"90eb3e92-56cd-49d2-934d-9d2e62eb6537","Type":"ContainerDied","Data":"dd048d28cc8d3edf2f251324026e8d77c3ed4a17b77eeb284995bbf2ab9a0bf2"} Nov 29 05:43:18 crc kubenswrapper[4594]: I1129 05:43:18.181390 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-jlpg4"] Nov 29 05:43:18 crc kubenswrapper[4594]: I1129 05:43:18.198485 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-jlpg4"] Nov 29 05:43:18 crc kubenswrapper[4594]: I1129 05:43:18.505804 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6fb75c485f-xvpfr" Nov 29 05:43:18 crc kubenswrapper[4594]: I1129 05:43:18.520121 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2k6rk" Nov 29 05:43:18 crc kubenswrapper[4594]: I1129 05:43:18.599383 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76f9c4c8bc-hpsj2"] Nov 29 05:43:18 crc kubenswrapper[4594]: I1129 05:43:18.605811 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-wtghb"] Nov 29 05:43:18 crc kubenswrapper[4594]: I1129 05:43:18.620496 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-z4q76"] Nov 29 05:43:18 crc kubenswrapper[4594]: I1129 05:43:18.674111 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rg5nf\" (UniqueName: \"kubernetes.io/projected/90eb3e92-56cd-49d2-934d-9d2e62eb6537-kube-api-access-rg5nf\") pod \"90eb3e92-56cd-49d2-934d-9d2e62eb6537\" (UID: \"90eb3e92-56cd-49d2-934d-9d2e62eb6537\") " Nov 29 05:43:18 crc kubenswrapper[4594]: I1129 05:43:18.674202 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24qt6\" (UniqueName: \"kubernetes.io/projected/27bd40a9-3181-42a6-8141-55078612c429-kube-api-access-24qt6\") pod \"27bd40a9-3181-42a6-8141-55078612c429\" (UID: \"27bd40a9-3181-42a6-8141-55078612c429\") " Nov 29 05:43:18 crc kubenswrapper[4594]: I1129 05:43:18.674307 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27bd40a9-3181-42a6-8141-55078612c429-config\") pod \"27bd40a9-3181-42a6-8141-55078612c429\" (UID: \"27bd40a9-3181-42a6-8141-55078612c429\") " Nov 29 05:43:18 crc kubenswrapper[4594]: I1129 05:43:18.674359 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90eb3e92-56cd-49d2-934d-9d2e62eb6537-utilities\") pod \"90eb3e92-56cd-49d2-934d-9d2e62eb6537\" (UID: \"90eb3e92-56cd-49d2-934d-9d2e62eb6537\") " Nov 
29 05:43:18 crc kubenswrapper[4594]: I1129 05:43:18.674464 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/27bd40a9-3181-42a6-8141-55078612c429-dns-svc\") pod \"27bd40a9-3181-42a6-8141-55078612c429\" (UID: \"27bd40a9-3181-42a6-8141-55078612c429\") " Nov 29 05:43:18 crc kubenswrapper[4594]: I1129 05:43:18.674492 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90eb3e92-56cd-49d2-934d-9d2e62eb6537-catalog-content\") pod \"90eb3e92-56cd-49d2-934d-9d2e62eb6537\" (UID: \"90eb3e92-56cd-49d2-934d-9d2e62eb6537\") " Nov 29 05:43:18 crc kubenswrapper[4594]: I1129 05:43:18.674521 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/27bd40a9-3181-42a6-8141-55078612c429-ovsdbserver-nb\") pod \"27bd40a9-3181-42a6-8141-55078612c429\" (UID: \"27bd40a9-3181-42a6-8141-55078612c429\") " Nov 29 05:43:18 crc kubenswrapper[4594]: I1129 05:43:18.677574 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90eb3e92-56cd-49d2-934d-9d2e62eb6537-utilities" (OuterVolumeSpecName: "utilities") pod "90eb3e92-56cd-49d2-934d-9d2e62eb6537" (UID: "90eb3e92-56cd-49d2-934d-9d2e62eb6537"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 05:43:18 crc kubenswrapper[4594]: I1129 05:43:18.679098 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90eb3e92-56cd-49d2-934d-9d2e62eb6537-kube-api-access-rg5nf" (OuterVolumeSpecName: "kube-api-access-rg5nf") pod "90eb3e92-56cd-49d2-934d-9d2e62eb6537" (UID: "90eb3e92-56cd-49d2-934d-9d2e62eb6537"). InnerVolumeSpecName "kube-api-access-rg5nf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:43:18 crc kubenswrapper[4594]: I1129 05:43:18.679316 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27bd40a9-3181-42a6-8141-55078612c429-kube-api-access-24qt6" (OuterVolumeSpecName: "kube-api-access-24qt6") pod "27bd40a9-3181-42a6-8141-55078612c429" (UID: "27bd40a9-3181-42a6-8141-55078612c429"). InnerVolumeSpecName "kube-api-access-24qt6". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:43:18 crc kubenswrapper[4594]: I1129 05:43:18.715097 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27bd40a9-3181-42a6-8141-55078612c429-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "27bd40a9-3181-42a6-8141-55078612c429" (UID: "27bd40a9-3181-42a6-8141-55078612c429"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:43:18 crc kubenswrapper[4594]: I1129 05:43:18.719924 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27bd40a9-3181-42a6-8141-55078612c429-config" (OuterVolumeSpecName: "config") pod "27bd40a9-3181-42a6-8141-55078612c429" (UID: "27bd40a9-3181-42a6-8141-55078612c429"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:43:18 crc kubenswrapper[4594]: I1129 05:43:18.723984 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90eb3e92-56cd-49d2-934d-9d2e62eb6537-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "90eb3e92-56cd-49d2-934d-9d2e62eb6537" (UID: "90eb3e92-56cd-49d2-934d-9d2e62eb6537"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 05:43:18 crc kubenswrapper[4594]: I1129 05:43:18.733737 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27bd40a9-3181-42a6-8141-55078612c429-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "27bd40a9-3181-42a6-8141-55078612c429" (UID: "27bd40a9-3181-42a6-8141-55078612c429"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:43:18 crc kubenswrapper[4594]: I1129 05:43:18.777322 4594 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90eb3e92-56cd-49d2-934d-9d2e62eb6537-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 05:43:18 crc kubenswrapper[4594]: I1129 05:43:18.777355 4594 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/27bd40a9-3181-42a6-8141-55078612c429-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 29 05:43:18 crc kubenswrapper[4594]: I1129 05:43:18.777365 4594 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90eb3e92-56cd-49d2-934d-9d2e62eb6537-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 05:43:18 crc kubenswrapper[4594]: I1129 05:43:18.777378 4594 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/27bd40a9-3181-42a6-8141-55078612c429-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 29 05:43:18 crc kubenswrapper[4594]: I1129 05:43:18.777390 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rg5nf\" (UniqueName: \"kubernetes.io/projected/90eb3e92-56cd-49d2-934d-9d2e62eb6537-kube-api-access-rg5nf\") on node \"crc\" DevicePath \"\"" Nov 29 05:43:18 crc kubenswrapper[4594]: I1129 05:43:18.777399 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-24qt6\" (UniqueName: 
\"kubernetes.io/projected/27bd40a9-3181-42a6-8141-55078612c429-kube-api-access-24qt6\") on node \"crc\" DevicePath \"\"" Nov 29 05:43:18 crc kubenswrapper[4594]: I1129 05:43:18.777408 4594 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27bd40a9-3181-42a6-8141-55078612c429-config\") on node \"crc\" DevicePath \"\"" Nov 29 05:43:19 crc kubenswrapper[4594]: I1129 05:43:19.107459 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fb75c485f-xvpfr" event={"ID":"27bd40a9-3181-42a6-8141-55078612c429","Type":"ContainerDied","Data":"2e9faab66b8856e22c01b346ed52ad4084e1b7a0a8b0cccd85b27c66e9268449"} Nov 29 05:43:19 crc kubenswrapper[4594]: I1129 05:43:19.107492 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6fb75c485f-xvpfr" Nov 29 05:43:19 crc kubenswrapper[4594]: I1129 05:43:19.107818 4594 scope.go:117] "RemoveContainer" containerID="124645bad6bf2a689530c19f6a92e72b2f0843907104ae2fc3b4af38e76e9473" Nov 29 05:43:19 crc kubenswrapper[4594]: I1129 05:43:19.109361 4594 generic.go:334] "Generic (PLEG): container finished" podID="e6036791-b766-491c-807f-cdb3f9616288" containerID="a435ac8d473ab7df525e6f724350839ba52ac1cb6a472b0f52c11182468a6985" exitCode=0 Nov 29 05:43:19 crc kubenswrapper[4594]: I1129 05:43:19.109409 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76f9c4c8bc-hpsj2" event={"ID":"e6036791-b766-491c-807f-cdb3f9616288","Type":"ContainerDied","Data":"a435ac8d473ab7df525e6f724350839ba52ac1cb6a472b0f52c11182468a6985"} Nov 29 05:43:19 crc kubenswrapper[4594]: I1129 05:43:19.109424 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76f9c4c8bc-hpsj2" event={"ID":"e6036791-b766-491c-807f-cdb3f9616288","Type":"ContainerStarted","Data":"6af2187caa36a4b9ff3b01b2a3535ace323f7cfc86ca9fe73d3f926329c276e6"} Nov 29 05:43:19 crc kubenswrapper[4594]: I1129 05:43:19.118391 
4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"4e4fe89a-3363-4724-89a8-a9b61fe6f039","Type":"ContainerStarted","Data":"a05f858d08ad1ac63b4c149b4ad984bfe6f13bdd53d00d2f44f4b2906b2e1430"} Nov 29 05:43:19 crc kubenswrapper[4594]: I1129 05:43:19.123558 4594 generic.go:334] "Generic (PLEG): container finished" podID="0f40ab58-b977-4f3d-a122-691a05dd14cf" containerID="4417d6f8cc464a2dde368fceb8d5825095c6e7dd207568017107c130fd998799" exitCode=0 Nov 29 05:43:19 crc kubenswrapper[4594]: I1129 05:43:19.123617 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zwhtm" event={"ID":"0f40ab58-b977-4f3d-a122-691a05dd14cf","Type":"ContainerDied","Data":"4417d6f8cc464a2dde368fceb8d5825095c6e7dd207568017107c130fd998799"} Nov 29 05:43:19 crc kubenswrapper[4594]: I1129 05:43:19.127047 4594 generic.go:334] "Generic (PLEG): container finished" podID="3f256392-25b5-4ead-8ec1-8973b757b86b" containerID="6b1a86eca226be1885e50285e0517144d8ad36cc8dc2787b3d7b86c25c715c00" exitCode=0 Nov 29 05:43:19 crc kubenswrapper[4594]: I1129 05:43:19.127146 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z4q76" event={"ID":"3f256392-25b5-4ead-8ec1-8973b757b86b","Type":"ContainerDied","Data":"6b1a86eca226be1885e50285e0517144d8ad36cc8dc2787b3d7b86c25c715c00"} Nov 29 05:43:19 crc kubenswrapper[4594]: I1129 05:43:19.127172 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z4q76" event={"ID":"3f256392-25b5-4ead-8ec1-8973b757b86b","Type":"ContainerStarted","Data":"216971c001115cb4f8e9a2456daba7d508d19ed91bc0499eef3080f713d22e6e"} Nov 29 05:43:19 crc kubenswrapper[4594]: I1129 05:43:19.130581 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2k6rk" 
event={"ID":"90eb3e92-56cd-49d2-934d-9d2e62eb6537","Type":"ContainerDied","Data":"94bdccc2101eeda1c8d3cd9899cc6bc268f171258669d15270d4eb2c4944c78c"} Nov 29 05:43:19 crc kubenswrapper[4594]: I1129 05:43:19.130664 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2k6rk" Nov 29 05:43:19 crc kubenswrapper[4594]: I1129 05:43:19.137960 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-wtghb" event={"ID":"f63cf943-e9c0-4f70-9f7b-8ecf859c92ae","Type":"ContainerStarted","Data":"649cbd3d1b8b7b0568a202886c02dc95a86206f3412bd2d390c8081a136114d7"} Nov 29 05:43:19 crc kubenswrapper[4594]: I1129 05:43:19.138694 4594 scope.go:117] "RemoveContainer" containerID="8c41ab12289688c81766b802c4e623bb692bf3b7d062f81a8429801dcf22e99d" Nov 29 05:43:19 crc kubenswrapper[4594]: I1129 05:43:19.141188 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"1ab0ecb8-fc35-4934-b62a-6912d56e9001","Type":"ContainerStarted","Data":"74d68e86ab86a8cf0fe4e8f6a84a025411484415ac3b052c1444b427d818754c"} Nov 29 05:43:19 crc kubenswrapper[4594]: I1129 05:43:19.157563 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6fb75c485f-xvpfr"] Nov 29 05:43:19 crc kubenswrapper[4594]: I1129 05:43:19.163133 4594 scope.go:117] "RemoveContainer" containerID="dd048d28cc8d3edf2f251324026e8d77c3ed4a17b77eeb284995bbf2ab9a0bf2" Nov 29 05:43:19 crc kubenswrapper[4594]: I1129 05:43:19.167472 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6fb75c485f-xvpfr"] Nov 29 05:43:19 crc kubenswrapper[4594]: I1129 05:43:19.168520 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=5.32946595 podStartE2EDuration="55.168501482s" podCreationTimestamp="2025-11-29 05:42:24 +0000 UTC" firstStartedPulling="2025-11-29 
05:42:28.282960419 +0000 UTC m=+872.523469640" lastFinishedPulling="2025-11-29 05:43:18.121995952 +0000 UTC m=+922.362505172" observedRunningTime="2025-11-29 05:43:19.16650265 +0000 UTC m=+923.407011870" watchObservedRunningTime="2025-11-29 05:43:19.168501482 +0000 UTC m=+923.409010702" Nov 29 05:43:19 crc kubenswrapper[4594]: I1129 05:43:19.192696 4594 scope.go:117] "RemoveContainer" containerID="7d9f810d37fdcbdb37b1914f2b0e671c18525ce164177124e3413b92ddc7352d" Nov 29 05:43:19 crc kubenswrapper[4594]: I1129 05:43:19.217171 4594 scope.go:117] "RemoveContainer" containerID="4ef92d709a169a29feee6d3e6b517098f522bead00649aea962df8ff8ebfe354" Nov 29 05:43:19 crc kubenswrapper[4594]: I1129 05:43:19.221929 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2k6rk"] Nov 29 05:43:19 crc kubenswrapper[4594]: I1129 05:43:19.239739 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2k6rk"] Nov 29 05:43:19 crc kubenswrapper[4594]: I1129 05:43:19.493763 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/15a27bc9-a74a-4123-b693-baf16a0ed04d-etc-swift\") pod \"swift-storage-0\" (UID: \"15a27bc9-a74a-4123-b693-baf16a0ed04d\") " pod="openstack/swift-storage-0" Nov 29 05:43:19 crc kubenswrapper[4594]: E1129 05:43:19.493961 4594 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Nov 29 05:43:19 crc kubenswrapper[4594]: E1129 05:43:19.493998 4594 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Nov 29 05:43:19 crc kubenswrapper[4594]: E1129 05:43:19.494068 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/15a27bc9-a74a-4123-b693-baf16a0ed04d-etc-swift podName:15a27bc9-a74a-4123-b693-baf16a0ed04d nodeName:}" failed. 
No retries permitted until 2025-11-29 05:43:23.494044352 +0000 UTC m=+927.734553572 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/15a27bc9-a74a-4123-b693-baf16a0ed04d-etc-swift") pod "swift-storage-0" (UID: "15a27bc9-a74a-4123-b693-baf16a0ed04d") : configmap "swift-ring-files" not found Nov 29 05:43:20 crc kubenswrapper[4594]: I1129 05:43:20.094512 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1404738e-3715-4577-89dc-ba555987fe49" path="/var/lib/kubelet/pods/1404738e-3715-4577-89dc-ba555987fe49/volumes" Nov 29 05:43:20 crc kubenswrapper[4594]: I1129 05:43:20.095126 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27bd40a9-3181-42a6-8141-55078612c429" path="/var/lib/kubelet/pods/27bd40a9-3181-42a6-8141-55078612c429/volumes" Nov 29 05:43:20 crc kubenswrapper[4594]: I1129 05:43:20.095876 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90eb3e92-56cd-49d2-934d-9d2e62eb6537" path="/var/lib/kubelet/pods/90eb3e92-56cd-49d2-934d-9d2e62eb6537/volumes" Nov 29 05:43:20 crc kubenswrapper[4594]: I1129 05:43:20.181734 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-b9h9k" event={"ID":"a7baec31-60ce-4be4-8901-a8cbe7bf7ea9","Type":"ContainerStarted","Data":"8fb996f30d47fda187bafa9962d03de31ba5604ca0240ba35b4ce37eb0794599"} Nov 29 05:43:20 crc kubenswrapper[4594]: I1129 05:43:20.182484 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-b9h9k" Nov 29 05:43:20 crc kubenswrapper[4594]: I1129 05:43:20.185452 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76f9c4c8bc-hpsj2" event={"ID":"e6036791-b766-491c-807f-cdb3f9616288","Type":"ContainerStarted","Data":"666b7102786b97c74f0b151f2c001e0f396afe8041aa9ab8a907a15084474eb0"} Nov 29 05:43:20 crc kubenswrapper[4594]: I1129 05:43:20.185799 4594 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-76f9c4c8bc-hpsj2" Nov 29 05:43:20 crc kubenswrapper[4594]: I1129 05:43:20.187235 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1ff736d8-8719-402e-95c9-1d790c1dff5e","Type":"ContainerStarted","Data":"3d435468534c16e70f94ac0b2adfa267064e52b83844f59cf3f55c599965f5f9"} Nov 29 05:43:20 crc kubenswrapper[4594]: I1129 05:43:20.196763 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zwhtm" event={"ID":"0f40ab58-b977-4f3d-a122-691a05dd14cf","Type":"ContainerStarted","Data":"d6d5b0495122c81e21e6a5beafb23514d6c3a77723831a29fc164acf18301c84"} Nov 29 05:43:20 crc kubenswrapper[4594]: I1129 05:43:20.213791 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-b9h9k" podStartSLOduration=2.781302184 podStartE2EDuration="53.213772108s" podCreationTimestamp="2025-11-29 05:42:27 +0000 UTC" firstStartedPulling="2025-11-29 05:42:28.35949116 +0000 UTC m=+872.600000380" lastFinishedPulling="2025-11-29 05:43:18.791961084 +0000 UTC m=+923.032470304" observedRunningTime="2025-11-29 05:43:20.20480984 +0000 UTC m=+924.445319060" watchObservedRunningTime="2025-11-29 05:43:20.213772108 +0000 UTC m=+924.454281318" Nov 29 05:43:20 crc kubenswrapper[4594]: I1129 05:43:20.246630 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-76f9c4c8bc-hpsj2" podStartSLOduration=6.246615618 podStartE2EDuration="6.246615618s" podCreationTimestamp="2025-11-29 05:43:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:43:20.226417596 +0000 UTC m=+924.466926817" watchObservedRunningTime="2025-11-29 05:43:20.246615618 +0000 UTC m=+924.487124839" Nov 29 05:43:20 crc kubenswrapper[4594]: I1129 05:43:20.300498 4594 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-marketplace/community-operators-zwhtm" podStartSLOduration=12.429841705 podStartE2EDuration="23.300476284s" podCreationTimestamp="2025-11-29 05:42:57 +0000 UTC" firstStartedPulling="2025-11-29 05:43:08.942421734 +0000 UTC m=+913.182930953" lastFinishedPulling="2025-11-29 05:43:19.813056312 +0000 UTC m=+924.053565532" observedRunningTime="2025-11-29 05:43:20.261879916 +0000 UTC m=+924.502389136" watchObservedRunningTime="2025-11-29 05:43:20.300476284 +0000 UTC m=+924.540985505" Nov 29 05:43:20 crc kubenswrapper[4594]: I1129 05:43:20.852075 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Nov 29 05:43:20 crc kubenswrapper[4594]: I1129 05:43:20.862535 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Nov 29 05:43:20 crc kubenswrapper[4594]: I1129 05:43:20.862595 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Nov 29 05:43:20 crc kubenswrapper[4594]: I1129 05:43:20.960153 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Nov 29 05:43:21 crc kubenswrapper[4594]: I1129 05:43:21.204751 4594 generic.go:334] "Generic (PLEG): container finished" podID="3f256392-25b5-4ead-8ec1-8973b757b86b" containerID="18ba91f387c3fde05ea659104bc61c9a7bcada83017d4efe174b2f1652671c60" exitCode=0 Nov 29 05:43:21 crc kubenswrapper[4594]: I1129 05:43:21.204881 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z4q76" event={"ID":"3f256392-25b5-4ead-8ec1-8973b757b86b","Type":"ContainerDied","Data":"18ba91f387c3fde05ea659104bc61c9a7bcada83017d4efe174b2f1652671c60"} Nov 29 05:43:21 crc kubenswrapper[4594]: I1129 05:43:21.318852 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Nov 29 05:43:22 crc 
kubenswrapper[4594]: I1129 05:43:22.055881 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-dc92h" Nov 29 05:43:22 crc kubenswrapper[4594]: I1129 05:43:22.111645 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-dc92h" Nov 29 05:43:22 crc kubenswrapper[4594]: I1129 05:43:22.130174 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-vcfr2"] Nov 29 05:43:22 crc kubenswrapper[4594]: E1129 05:43:22.130710 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90eb3e92-56cd-49d2-934d-9d2e62eb6537" containerName="extract-content" Nov 29 05:43:22 crc kubenswrapper[4594]: I1129 05:43:22.130739 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="90eb3e92-56cd-49d2-934d-9d2e62eb6537" containerName="extract-content" Nov 29 05:43:22 crc kubenswrapper[4594]: E1129 05:43:22.130759 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27bd40a9-3181-42a6-8141-55078612c429" containerName="dnsmasq-dns" Nov 29 05:43:22 crc kubenswrapper[4594]: I1129 05:43:22.130766 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="27bd40a9-3181-42a6-8141-55078612c429" containerName="dnsmasq-dns" Nov 29 05:43:22 crc kubenswrapper[4594]: E1129 05:43:22.130777 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90eb3e92-56cd-49d2-934d-9d2e62eb6537" containerName="extract-utilities" Nov 29 05:43:22 crc kubenswrapper[4594]: I1129 05:43:22.130785 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="90eb3e92-56cd-49d2-934d-9d2e62eb6537" containerName="extract-utilities" Nov 29 05:43:22 crc kubenswrapper[4594]: E1129 05:43:22.130801 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90eb3e92-56cd-49d2-934d-9d2e62eb6537" containerName="registry-server" Nov 29 05:43:22 crc kubenswrapper[4594]: I1129 05:43:22.130808 4594 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="90eb3e92-56cd-49d2-934d-9d2e62eb6537" containerName="registry-server" Nov 29 05:43:22 crc kubenswrapper[4594]: E1129 05:43:22.130837 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27bd40a9-3181-42a6-8141-55078612c429" containerName="init" Nov 29 05:43:22 crc kubenswrapper[4594]: I1129 05:43:22.130844 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="27bd40a9-3181-42a6-8141-55078612c429" containerName="init" Nov 29 05:43:22 crc kubenswrapper[4594]: I1129 05:43:22.131067 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="90eb3e92-56cd-49d2-934d-9d2e62eb6537" containerName="registry-server" Nov 29 05:43:22 crc kubenswrapper[4594]: I1129 05:43:22.131121 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="27bd40a9-3181-42a6-8141-55078612c429" containerName="dnsmasq-dns" Nov 29 05:43:22 crc kubenswrapper[4594]: I1129 05:43:22.131802 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-vcfr2" Nov 29 05:43:22 crc kubenswrapper[4594]: I1129 05:43:22.145338 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-6fef-account-create-update-8cksk"] Nov 29 05:43:22 crc kubenswrapper[4594]: I1129 05:43:22.146655 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-6fef-account-create-update-8cksk" Nov 29 05:43:22 crc kubenswrapper[4594]: I1129 05:43:22.149909 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Nov 29 05:43:22 crc kubenswrapper[4594]: I1129 05:43:22.153613 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-vcfr2"] Nov 29 05:43:22 crc kubenswrapper[4594]: I1129 05:43:22.160947 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6fef-account-create-update-8cksk"] Nov 29 05:43:22 crc kubenswrapper[4594]: I1129 05:43:22.214987 4594 generic.go:334] "Generic (PLEG): container finished" podID="1ab0ecb8-fc35-4934-b62a-6912d56e9001" containerID="74d68e86ab86a8cf0fe4e8f6a84a025411484415ac3b052c1444b427d818754c" exitCode=0 Nov 29 05:43:22 crc kubenswrapper[4594]: I1129 05:43:22.215171 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"1ab0ecb8-fc35-4934-b62a-6912d56e9001","Type":"ContainerDied","Data":"74d68e86ab86a8cf0fe4e8f6a84a025411484415ac3b052c1444b427d818754c"} Nov 29 05:43:22 crc kubenswrapper[4594]: I1129 05:43:22.253378 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/652d562d-549c-45e1-ad93-325a607548c6-operator-scripts\") pod \"keystone-db-create-vcfr2\" (UID: \"652d562d-549c-45e1-ad93-325a607548c6\") " pod="openstack/keystone-db-create-vcfr2" Nov 29 05:43:22 crc kubenswrapper[4594]: I1129 05:43:22.253483 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99hnp\" (UniqueName: \"kubernetes.io/projected/652d562d-549c-45e1-ad93-325a607548c6-kube-api-access-99hnp\") pod \"keystone-db-create-vcfr2\" (UID: \"652d562d-549c-45e1-ad93-325a607548c6\") " pod="openstack/keystone-db-create-vcfr2" Nov 29 05:43:22 crc 
kubenswrapper[4594]: I1129 05:43:22.253530 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd700633-7336-45dc-a346-4740e6605784-operator-scripts\") pod \"keystone-6fef-account-create-update-8cksk\" (UID: \"cd700633-7336-45dc-a346-4740e6605784\") " pod="openstack/keystone-6fef-account-create-update-8cksk" Nov 29 05:43:22 crc kubenswrapper[4594]: I1129 05:43:22.253582 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gcrr\" (UniqueName: \"kubernetes.io/projected/cd700633-7336-45dc-a346-4740e6605784-kube-api-access-5gcrr\") pod \"keystone-6fef-account-create-update-8cksk\" (UID: \"cd700633-7336-45dc-a346-4740e6605784\") " pod="openstack/keystone-6fef-account-create-update-8cksk" Nov 29 05:43:22 crc kubenswrapper[4594]: I1129 05:43:22.331417 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-8bpck"] Nov 29 05:43:22 crc kubenswrapper[4594]: I1129 05:43:22.332927 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-8bpck" Nov 29 05:43:22 crc kubenswrapper[4594]: I1129 05:43:22.338239 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-8bpck"] Nov 29 05:43:22 crc kubenswrapper[4594]: I1129 05:43:22.361070 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hwwv\" (UniqueName: \"kubernetes.io/projected/8b6efdb5-647a-4e53-b1c7-0b48c0acc137-kube-api-access-5hwwv\") pod \"placement-db-create-8bpck\" (UID: \"8b6efdb5-647a-4e53-b1c7-0b48c0acc137\") " pod="openstack/placement-db-create-8bpck" Nov 29 05:43:22 crc kubenswrapper[4594]: I1129 05:43:22.361209 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/652d562d-549c-45e1-ad93-325a607548c6-operator-scripts\") pod \"keystone-db-create-vcfr2\" (UID: \"652d562d-549c-45e1-ad93-325a607548c6\") " pod="openstack/keystone-db-create-vcfr2" Nov 29 05:43:22 crc kubenswrapper[4594]: I1129 05:43:22.361396 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99hnp\" (UniqueName: \"kubernetes.io/projected/652d562d-549c-45e1-ad93-325a607548c6-kube-api-access-99hnp\") pod \"keystone-db-create-vcfr2\" (UID: \"652d562d-549c-45e1-ad93-325a607548c6\") " pod="openstack/keystone-db-create-vcfr2" Nov 29 05:43:22 crc kubenswrapper[4594]: I1129 05:43:22.361555 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b6efdb5-647a-4e53-b1c7-0b48c0acc137-operator-scripts\") pod \"placement-db-create-8bpck\" (UID: \"8b6efdb5-647a-4e53-b1c7-0b48c0acc137\") " pod="openstack/placement-db-create-8bpck" Nov 29 05:43:22 crc kubenswrapper[4594]: I1129 05:43:22.361688 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/cd700633-7336-45dc-a346-4740e6605784-operator-scripts\") pod \"keystone-6fef-account-create-update-8cksk\" (UID: \"cd700633-7336-45dc-a346-4740e6605784\") " pod="openstack/keystone-6fef-account-create-update-8cksk" Nov 29 05:43:22 crc kubenswrapper[4594]: I1129 05:43:22.361798 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gcrr\" (UniqueName: \"kubernetes.io/projected/cd700633-7336-45dc-a346-4740e6605784-kube-api-access-5gcrr\") pod \"keystone-6fef-account-create-update-8cksk\" (UID: \"cd700633-7336-45dc-a346-4740e6605784\") " pod="openstack/keystone-6fef-account-create-update-8cksk" Nov 29 05:43:22 crc kubenswrapper[4594]: I1129 05:43:22.363076 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/652d562d-549c-45e1-ad93-325a607548c6-operator-scripts\") pod \"keystone-db-create-vcfr2\" (UID: \"652d562d-549c-45e1-ad93-325a607548c6\") " pod="openstack/keystone-db-create-vcfr2" Nov 29 05:43:22 crc kubenswrapper[4594]: I1129 05:43:22.363738 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd700633-7336-45dc-a346-4740e6605784-operator-scripts\") pod \"keystone-6fef-account-create-update-8cksk\" (UID: \"cd700633-7336-45dc-a346-4740e6605784\") " pod="openstack/keystone-6fef-account-create-update-8cksk" Nov 29 05:43:22 crc kubenswrapper[4594]: I1129 05:43:22.381150 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99hnp\" (UniqueName: \"kubernetes.io/projected/652d562d-549c-45e1-ad93-325a607548c6-kube-api-access-99hnp\") pod \"keystone-db-create-vcfr2\" (UID: \"652d562d-549c-45e1-ad93-325a607548c6\") " pod="openstack/keystone-db-create-vcfr2" Nov 29 05:43:22 crc kubenswrapper[4594]: I1129 05:43:22.381238 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-5gcrr\" (UniqueName: \"kubernetes.io/projected/cd700633-7336-45dc-a346-4740e6605784-kube-api-access-5gcrr\") pod \"keystone-6fef-account-create-update-8cksk\" (UID: \"cd700633-7336-45dc-a346-4740e6605784\") " pod="openstack/keystone-6fef-account-create-update-8cksk" Nov 29 05:43:22 crc kubenswrapper[4594]: I1129 05:43:22.440399 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-3b14-account-create-update-bk8sl"] Nov 29 05:43:22 crc kubenswrapper[4594]: I1129 05:43:22.441937 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-3b14-account-create-update-bk8sl" Nov 29 05:43:22 crc kubenswrapper[4594]: I1129 05:43:22.444427 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Nov 29 05:43:22 crc kubenswrapper[4594]: I1129 05:43:22.457201 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-3b14-account-create-update-bk8sl"] Nov 29 05:43:22 crc kubenswrapper[4594]: I1129 05:43:22.459888 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-vcfr2" Nov 29 05:43:22 crc kubenswrapper[4594]: I1129 05:43:22.464004 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-6fef-account-create-update-8cksk" Nov 29 05:43:22 crc kubenswrapper[4594]: I1129 05:43:22.464758 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hwwv\" (UniqueName: \"kubernetes.io/projected/8b6efdb5-647a-4e53-b1c7-0b48c0acc137-kube-api-access-5hwwv\") pod \"placement-db-create-8bpck\" (UID: \"8b6efdb5-647a-4e53-b1c7-0b48c0acc137\") " pod="openstack/placement-db-create-8bpck" Nov 29 05:43:22 crc kubenswrapper[4594]: I1129 05:43:22.464916 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4srrb\" (UniqueName: \"kubernetes.io/projected/72a76725-4bb7-4fa5-98e9-066434887870-kube-api-access-4srrb\") pod \"placement-3b14-account-create-update-bk8sl\" (UID: \"72a76725-4bb7-4fa5-98e9-066434887870\") " pod="openstack/placement-3b14-account-create-update-bk8sl" Nov 29 05:43:22 crc kubenswrapper[4594]: I1129 05:43:22.465398 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b6efdb5-647a-4e53-b1c7-0b48c0acc137-operator-scripts\") pod \"placement-db-create-8bpck\" (UID: \"8b6efdb5-647a-4e53-b1c7-0b48c0acc137\") " pod="openstack/placement-db-create-8bpck" Nov 29 05:43:22 crc kubenswrapper[4594]: I1129 05:43:22.465498 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72a76725-4bb7-4fa5-98e9-066434887870-operator-scripts\") pod \"placement-3b14-account-create-update-bk8sl\" (UID: \"72a76725-4bb7-4fa5-98e9-066434887870\") " pod="openstack/placement-3b14-account-create-update-bk8sl" Nov 29 05:43:22 crc kubenswrapper[4594]: I1129 05:43:22.466575 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/8b6efdb5-647a-4e53-b1c7-0b48c0acc137-operator-scripts\") pod \"placement-db-create-8bpck\" (UID: \"8b6efdb5-647a-4e53-b1c7-0b48c0acc137\") " pod="openstack/placement-db-create-8bpck" Nov 29 05:43:22 crc kubenswrapper[4594]: I1129 05:43:22.478540 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hwwv\" (UniqueName: \"kubernetes.io/projected/8b6efdb5-647a-4e53-b1c7-0b48c0acc137-kube-api-access-5hwwv\") pod \"placement-db-create-8bpck\" (UID: \"8b6efdb5-647a-4e53-b1c7-0b48c0acc137\") " pod="openstack/placement-db-create-8bpck" Nov 29 05:43:22 crc kubenswrapper[4594]: I1129 05:43:22.568494 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4srrb\" (UniqueName: \"kubernetes.io/projected/72a76725-4bb7-4fa5-98e9-066434887870-kube-api-access-4srrb\") pod \"placement-3b14-account-create-update-bk8sl\" (UID: \"72a76725-4bb7-4fa5-98e9-066434887870\") " pod="openstack/placement-3b14-account-create-update-bk8sl" Nov 29 05:43:22 crc kubenswrapper[4594]: I1129 05:43:22.568869 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72a76725-4bb7-4fa5-98e9-066434887870-operator-scripts\") pod \"placement-3b14-account-create-update-bk8sl\" (UID: \"72a76725-4bb7-4fa5-98e9-066434887870\") " pod="openstack/placement-3b14-account-create-update-bk8sl" Nov 29 05:43:22 crc kubenswrapper[4594]: I1129 05:43:22.569805 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72a76725-4bb7-4fa5-98e9-066434887870-operator-scripts\") pod \"placement-3b14-account-create-update-bk8sl\" (UID: \"72a76725-4bb7-4fa5-98e9-066434887870\") " pod="openstack/placement-3b14-account-create-update-bk8sl" Nov 29 05:43:22 crc kubenswrapper[4594]: I1129 05:43:22.588655 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-4srrb\" (UniqueName: \"kubernetes.io/projected/72a76725-4bb7-4fa5-98e9-066434887870-kube-api-access-4srrb\") pod \"placement-3b14-account-create-update-bk8sl\" (UID: \"72a76725-4bb7-4fa5-98e9-066434887870\") " pod="openstack/placement-3b14-account-create-update-bk8sl" Nov 29 05:43:22 crc kubenswrapper[4594]: I1129 05:43:22.708035 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-8bpck" Nov 29 05:43:22 crc kubenswrapper[4594]: I1129 05:43:22.760903 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-3b14-account-create-update-bk8sl" Nov 29 05:43:22 crc kubenswrapper[4594]: I1129 05:43:22.883153 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-vcfr2"] Nov 29 05:43:22 crc kubenswrapper[4594]: W1129 05:43:22.903400 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod652d562d_549c_45e1_ad93_325a607548c6.slice/crio-206d9902951a34fd80372687e7214be2e65c0d15d556b9e4ee80c980e28b7893 WatchSource:0}: Error finding container 206d9902951a34fd80372687e7214be2e65c0d15d556b9e4ee80c980e28b7893: Status 404 returned error can't find the container with id 206d9902951a34fd80372687e7214be2e65c0d15d556b9e4ee80c980e28b7893 Nov 29 05:43:22 crc kubenswrapper[4594]: I1129 05:43:22.952665 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6fef-account-create-update-8cksk"] Nov 29 05:43:23 crc kubenswrapper[4594]: I1129 05:43:23.163680 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-3b14-account-create-update-bk8sl"] Nov 29 05:43:23 crc kubenswrapper[4594]: I1129 05:43:23.175446 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-8bpck"] Nov 29 05:43:23 crc kubenswrapper[4594]: I1129 05:43:23.291757 4594 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"1ab0ecb8-fc35-4934-b62a-6912d56e9001","Type":"ContainerStarted","Data":"c6b207bfb7cb9bed18d0cc24347f35f6f643b0d68fe3d21682b7b4f9415eae61"} Nov 29 05:43:23 crc kubenswrapper[4594]: I1129 05:43:23.299203 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"31fad155-3970-4a5d-a357-e96fa27bbb54","Type":"ContainerStarted","Data":"9e63fc4892718ee5b13aed17d919520bf4c006cfca555aa5e0292f9cecd4e9eb"} Nov 29 05:43:23 crc kubenswrapper[4594]: I1129 05:43:23.301281 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-vcfr2" event={"ID":"652d562d-549c-45e1-ad93-325a607548c6","Type":"ContainerStarted","Data":"206d9902951a34fd80372687e7214be2e65c0d15d556b9e4ee80c980e28b7893"} Nov 29 05:43:23 crc kubenswrapper[4594]: I1129 05:43:23.302467 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-3b14-account-create-update-bk8sl" event={"ID":"72a76725-4bb7-4fa5-98e9-066434887870","Type":"ContainerStarted","Data":"13656d1f3e9024e34428a312ede2d00486ad708798d3be48ddf4409e617a456d"} Nov 29 05:43:23 crc kubenswrapper[4594]: I1129 05:43:23.303794 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6fef-account-create-update-8cksk" event={"ID":"cd700633-7336-45dc-a346-4740e6605784","Type":"ContainerStarted","Data":"69ea0351cfb6058f55856f6391c781c48eeb5d32a591528b616a705616e67c61"} Nov 29 05:43:23 crc kubenswrapper[4594]: I1129 05:43:23.311841 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z4q76" event={"ID":"3f256392-25b5-4ead-8ec1-8973b757b86b","Type":"ContainerStarted","Data":"6270eb6d93036d48cf1d38e80a5444a536fc0315b3782ae2ccddc15310197438"} Nov 29 05:43:23 crc kubenswrapper[4594]: I1129 05:43:23.318969 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-wtghb" 
event={"ID":"f63cf943-e9c0-4f70-9f7b-8ecf859c92ae","Type":"ContainerStarted","Data":"7e0230e41b5936cf69830230f2b1b8d114fe4b3b43c56335758c7e6305e211a1"} Nov 29 05:43:23 crc kubenswrapper[4594]: I1129 05:43:23.322834 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-8bpck" event={"ID":"8b6efdb5-647a-4e53-b1c7-0b48c0acc137","Type":"ContainerStarted","Data":"b970d21e5e6338722f6f1cb7f862954d125440cf75061faaf9eef2aa6d01b4a4"} Nov 29 05:43:23 crc kubenswrapper[4594]: I1129 05:43:23.327971 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=-9223371973.526817 podStartE2EDuration="1m3.327958516s" podCreationTimestamp="2025-11-29 05:42:20 +0000 UTC" firstStartedPulling="2025-11-29 05:42:22.632420852 +0000 UTC m=+866.872930072" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:43:23.31400218 +0000 UTC m=+927.554511400" watchObservedRunningTime="2025-11-29 05:43:23.327958516 +0000 UTC m=+927.568467736" Nov 29 05:43:23 crc kubenswrapper[4594]: I1129 05:43:23.341934 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=3.795178071 podStartE2EDuration="55.341923938s" podCreationTimestamp="2025-11-29 05:42:28 +0000 UTC" firstStartedPulling="2025-11-29 05:42:30.661303684 +0000 UTC m=+874.901812904" lastFinishedPulling="2025-11-29 05:43:22.208049551 +0000 UTC m=+926.448558771" observedRunningTime="2025-11-29 05:43:23.33852973 +0000 UTC m=+927.579038951" watchObservedRunningTime="2025-11-29 05:43:23.341923938 +0000 UTC m=+927.582433158" Nov 29 05:43:23 crc kubenswrapper[4594]: I1129 05:43:23.364539 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-z4q76" podStartSLOduration=6.80943704 podStartE2EDuration="10.364531004s" podCreationTimestamp="2025-11-29 05:43:13 +0000 UTC" 
firstStartedPulling="2025-11-29 05:43:19.139145223 +0000 UTC m=+923.379654444" lastFinishedPulling="2025-11-29 05:43:22.694239188 +0000 UTC m=+926.934748408" observedRunningTime="2025-11-29 05:43:23.360711697 +0000 UTC m=+927.601220918" watchObservedRunningTime="2025-11-29 05:43:23.364531004 +0000 UTC m=+927.605040225" Nov 29 05:43:23 crc kubenswrapper[4594]: I1129 05:43:23.384744 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-wtghb" podStartSLOduration=3.77715254 podStartE2EDuration="7.384733927s" podCreationTimestamp="2025-11-29 05:43:16 +0000 UTC" firstStartedPulling="2025-11-29 05:43:18.605790171 +0000 UTC m=+922.846299391" lastFinishedPulling="2025-11-29 05:43:22.213371557 +0000 UTC m=+926.453880778" observedRunningTime="2025-11-29 05:43:23.383594322 +0000 UTC m=+927.624103542" watchObservedRunningTime="2025-11-29 05:43:23.384733927 +0000 UTC m=+927.625243137" Nov 29 05:43:23 crc kubenswrapper[4594]: I1129 05:43:23.595042 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/15a27bc9-a74a-4123-b693-baf16a0ed04d-etc-swift\") pod \"swift-storage-0\" (UID: \"15a27bc9-a74a-4123-b693-baf16a0ed04d\") " pod="openstack/swift-storage-0" Nov 29 05:43:23 crc kubenswrapper[4594]: E1129 05:43:23.595323 4594 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Nov 29 05:43:23 crc kubenswrapper[4594]: E1129 05:43:23.595497 4594 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Nov 29 05:43:23 crc kubenswrapper[4594]: E1129 05:43:23.595578 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/15a27bc9-a74a-4123-b693-baf16a0ed04d-etc-swift podName:15a27bc9-a74a-4123-b693-baf16a0ed04d nodeName:}" failed. 
No retries permitted until 2025-11-29 05:43:31.595557386 +0000 UTC m=+935.836066606 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/15a27bc9-a74a-4123-b693-baf16a0ed04d-etc-swift") pod "swift-storage-0" (UID: "15a27bc9-a74a-4123-b693-baf16a0ed04d") : configmap "swift-ring-files" not found Nov 29 05:43:23 crc kubenswrapper[4594]: I1129 05:43:23.695090 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-z4q76" Nov 29 05:43:23 crc kubenswrapper[4594]: I1129 05:43:23.695131 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-z4q76" Nov 29 05:43:24 crc kubenswrapper[4594]: I1129 05:43:24.092477 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Nov 29 05:43:24 crc kubenswrapper[4594]: I1129 05:43:24.341823 4594 generic.go:334] "Generic (PLEG): container finished" podID="652d562d-549c-45e1-ad93-325a607548c6" containerID="285a1205f8b1dcbd2ae81b0654c0e36a710c713a1f7696b81fe9bba05eae03d2" exitCode=0 Nov 29 05:43:24 crc kubenswrapper[4594]: I1129 05:43:24.341879 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-vcfr2" event={"ID":"652d562d-549c-45e1-ad93-325a607548c6","Type":"ContainerDied","Data":"285a1205f8b1dcbd2ae81b0654c0e36a710c713a1f7696b81fe9bba05eae03d2"} Nov 29 05:43:24 crc kubenswrapper[4594]: I1129 05:43:24.343804 4594 generic.go:334] "Generic (PLEG): container finished" podID="72a76725-4bb7-4fa5-98e9-066434887870" containerID="a4b2825001eb7b9000e1433443f55cfa355772df65f6d82a1930b3b8c85779ee" exitCode=0 Nov 29 05:43:24 crc kubenswrapper[4594]: I1129 05:43:24.343902 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-3b14-account-create-update-bk8sl" 
event={"ID":"72a76725-4bb7-4fa5-98e9-066434887870","Type":"ContainerDied","Data":"a4b2825001eb7b9000e1433443f55cfa355772df65f6d82a1930b3b8c85779ee"} Nov 29 05:43:24 crc kubenswrapper[4594]: I1129 05:43:24.345377 4594 generic.go:334] "Generic (PLEG): container finished" podID="cd700633-7336-45dc-a346-4740e6605784" containerID="0baf7d4a765c1131d79936e097efbb8d4f9b563431f42950c46c7b5e6cf3c452" exitCode=0 Nov 29 05:43:24 crc kubenswrapper[4594]: I1129 05:43:24.345497 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6fef-account-create-update-8cksk" event={"ID":"cd700633-7336-45dc-a346-4740e6605784","Type":"ContainerDied","Data":"0baf7d4a765c1131d79936e097efbb8d4f9b563431f42950c46c7b5e6cf3c452"} Nov 29 05:43:24 crc kubenswrapper[4594]: I1129 05:43:24.347231 4594 generic.go:334] "Generic (PLEG): container finished" podID="8b6efdb5-647a-4e53-b1c7-0b48c0acc137" containerID="6f1a81abf856d1f61c9191f30fa0254a7b7fe26265d4c275d8b539d11a915435" exitCode=0 Nov 29 05:43:24 crc kubenswrapper[4594]: I1129 05:43:24.347277 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-8bpck" event={"ID":"8b6efdb5-647a-4e53-b1c7-0b48c0acc137","Type":"ContainerDied","Data":"6f1a81abf856d1f61c9191f30fa0254a7b7fe26265d4c275d8b539d11a915435"} Nov 29 05:43:24 crc kubenswrapper[4594]: I1129 05:43:24.621661 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-db-create-8bxs9"] Nov 29 05:43:24 crc kubenswrapper[4594]: I1129 05:43:24.623200 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-create-8bxs9" Nov 29 05:43:24 crc kubenswrapper[4594]: I1129 05:43:24.631419 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-create-8bxs9"] Nov 29 05:43:24 crc kubenswrapper[4594]: I1129 05:43:24.729175 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-55dd-account-create-update-5b9fv"] Nov 29 05:43:24 crc kubenswrapper[4594]: I1129 05:43:24.731982 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-55dd-account-create-update-5b9fv" Nov 29 05:43:24 crc kubenswrapper[4594]: I1129 05:43:24.732305 4594 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-z4q76" podUID="3f256392-25b5-4ead-8ec1-8973b757b86b" containerName="registry-server" probeResult="failure" output=< Nov 29 05:43:24 crc kubenswrapper[4594]: timeout: failed to connect service ":50051" within 1s Nov 29 05:43:24 crc kubenswrapper[4594]: > Nov 29 05:43:24 crc kubenswrapper[4594]: I1129 05:43:24.734028 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-db-secret" Nov 29 05:43:24 crc kubenswrapper[4594]: I1129 05:43:24.738983 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-55dd-account-create-update-5b9fv"] Nov 29 05:43:24 crc kubenswrapper[4594]: I1129 05:43:24.822628 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trr5c\" (UniqueName: \"kubernetes.io/projected/e37693f9-769d-409e-a90a-946d5fbe00d9-kube-api-access-trr5c\") pod \"watcher-db-create-8bxs9\" (UID: \"e37693f9-769d-409e-a90a-946d5fbe00d9\") " pod="openstack/watcher-db-create-8bxs9" Nov 29 05:43:24 crc kubenswrapper[4594]: I1129 05:43:24.822679 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/e37693f9-769d-409e-a90a-946d5fbe00d9-operator-scripts\") pod \"watcher-db-create-8bxs9\" (UID: \"e37693f9-769d-409e-a90a-946d5fbe00d9\") " pod="openstack/watcher-db-create-8bxs9" Nov 29 05:43:24 crc kubenswrapper[4594]: I1129 05:43:24.925291 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b1647cf-8657-466a-8579-2ce1f5ac45a3-operator-scripts\") pod \"watcher-55dd-account-create-update-5b9fv\" (UID: \"9b1647cf-8657-466a-8579-2ce1f5ac45a3\") " pod="openstack/watcher-55dd-account-create-update-5b9fv" Nov 29 05:43:24 crc kubenswrapper[4594]: I1129 05:43:24.925396 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trr5c\" (UniqueName: \"kubernetes.io/projected/e37693f9-769d-409e-a90a-946d5fbe00d9-kube-api-access-trr5c\") pod \"watcher-db-create-8bxs9\" (UID: \"e37693f9-769d-409e-a90a-946d5fbe00d9\") " pod="openstack/watcher-db-create-8bxs9" Nov 29 05:43:24 crc kubenswrapper[4594]: I1129 05:43:24.925755 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e37693f9-769d-409e-a90a-946d5fbe00d9-operator-scripts\") pod \"watcher-db-create-8bxs9\" (UID: \"e37693f9-769d-409e-a90a-946d5fbe00d9\") " pod="openstack/watcher-db-create-8bxs9" Nov 29 05:43:24 crc kubenswrapper[4594]: I1129 05:43:24.925845 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25tlq\" (UniqueName: \"kubernetes.io/projected/9b1647cf-8657-466a-8579-2ce1f5ac45a3-kube-api-access-25tlq\") pod \"watcher-55dd-account-create-update-5b9fv\" (UID: \"9b1647cf-8657-466a-8579-2ce1f5ac45a3\") " pod="openstack/watcher-55dd-account-create-update-5b9fv" Nov 29 05:43:24 crc kubenswrapper[4594]: I1129 05:43:24.926246 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e37693f9-769d-409e-a90a-946d5fbe00d9-operator-scripts\") pod \"watcher-db-create-8bxs9\" (UID: \"e37693f9-769d-409e-a90a-946d5fbe00d9\") " pod="openstack/watcher-db-create-8bxs9" Nov 29 05:43:24 crc kubenswrapper[4594]: I1129 05:43:24.948931 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trr5c\" (UniqueName: \"kubernetes.io/projected/e37693f9-769d-409e-a90a-946d5fbe00d9-kube-api-access-trr5c\") pod \"watcher-db-create-8bxs9\" (UID: \"e37693f9-769d-409e-a90a-946d5fbe00d9\") " pod="openstack/watcher-db-create-8bxs9" Nov 29 05:43:24 crc kubenswrapper[4594]: I1129 05:43:24.973701 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dc92h"] Nov 29 05:43:24 crc kubenswrapper[4594]: I1129 05:43:24.974281 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-dc92h" podUID="d2f69c3e-673d-4b8d-bb61-f1e02758f43b" containerName="registry-server" containerID="cri-o://aa7c7258a9785a7d4ad95ccb14f108b0183be8324a355e16389bb1cee662f909" gracePeriod=2 Nov 29 05:43:24 crc kubenswrapper[4594]: I1129 05:43:24.996905 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-76f9c4c8bc-hpsj2" Nov 29 05:43:25 crc kubenswrapper[4594]: I1129 05:43:25.027595 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b1647cf-8657-466a-8579-2ce1f5ac45a3-operator-scripts\") pod \"watcher-55dd-account-create-update-5b9fv\" (UID: \"9b1647cf-8657-466a-8579-2ce1f5ac45a3\") " pod="openstack/watcher-55dd-account-create-update-5b9fv" Nov 29 05:43:25 crc kubenswrapper[4594]: I1129 05:43:25.027704 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25tlq\" (UniqueName: 
\"kubernetes.io/projected/9b1647cf-8657-466a-8579-2ce1f5ac45a3-kube-api-access-25tlq\") pod \"watcher-55dd-account-create-update-5b9fv\" (UID: \"9b1647cf-8657-466a-8579-2ce1f5ac45a3\") " pod="openstack/watcher-55dd-account-create-update-5b9fv" Nov 29 05:43:25 crc kubenswrapper[4594]: I1129 05:43:25.028787 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b1647cf-8657-466a-8579-2ce1f5ac45a3-operator-scripts\") pod \"watcher-55dd-account-create-update-5b9fv\" (UID: \"9b1647cf-8657-466a-8579-2ce1f5ac45a3\") " pod="openstack/watcher-55dd-account-create-update-5b9fv" Nov 29 05:43:25 crc kubenswrapper[4594]: I1129 05:43:25.049475 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25tlq\" (UniqueName: \"kubernetes.io/projected/9b1647cf-8657-466a-8579-2ce1f5ac45a3-kube-api-access-25tlq\") pod \"watcher-55dd-account-create-update-5b9fv\" (UID: \"9b1647cf-8657-466a-8579-2ce1f5ac45a3\") " pod="openstack/watcher-55dd-account-create-update-5b9fv" Nov 29 05:43:25 crc kubenswrapper[4594]: I1129 05:43:25.049915 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-55dd-account-create-update-5b9fv" Nov 29 05:43:25 crc kubenswrapper[4594]: I1129 05:43:25.071131 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6dbf544cc9-vr7qn"] Nov 29 05:43:25 crc kubenswrapper[4594]: I1129 05:43:25.071395 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6dbf544cc9-vr7qn" podUID="bcfcc5d2-2a01-409c-9479-5c3df1f9a319" containerName="dnsmasq-dns" containerID="cri-o://a4da753a3aa2fa04e80e3b1d9ba64e7c1306d1e3cf43e56bfdceb4aa80c88833" gracePeriod=10 Nov 29 05:43:25 crc kubenswrapper[4594]: I1129 05:43:25.086156 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Nov 29 05:43:25 crc kubenswrapper[4594]: I1129 05:43:25.238423 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-8bxs9" Nov 29 05:43:25 crc kubenswrapper[4594]: I1129 05:43:25.364899 4594 generic.go:334] "Generic (PLEG): container finished" podID="d2f69c3e-673d-4b8d-bb61-f1e02758f43b" containerID="aa7c7258a9785a7d4ad95ccb14f108b0183be8324a355e16389bb1cee662f909" exitCode=0 Nov 29 05:43:25 crc kubenswrapper[4594]: I1129 05:43:25.364965 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dc92h" event={"ID":"d2f69c3e-673d-4b8d-bb61-f1e02758f43b","Type":"ContainerDied","Data":"aa7c7258a9785a7d4ad95ccb14f108b0183be8324a355e16389bb1cee662f909"} Nov 29 05:43:25 crc kubenswrapper[4594]: I1129 05:43:25.364994 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dc92h" event={"ID":"d2f69c3e-673d-4b8d-bb61-f1e02758f43b","Type":"ContainerDied","Data":"a81c589154a85cf6de7a6ff183e4d64692f6f24323a075e5e4c900494f228eaa"} Nov 29 05:43:25 crc kubenswrapper[4594]: I1129 05:43:25.365007 4594 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="a81c589154a85cf6de7a6ff183e4d64692f6f24323a075e5e4c900494f228eaa" Nov 29 05:43:25 crc kubenswrapper[4594]: I1129 05:43:25.365111 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dc92h" Nov 29 05:43:25 crc kubenswrapper[4594]: I1129 05:43:25.367418 4594 generic.go:334] "Generic (PLEG): container finished" podID="bcfcc5d2-2a01-409c-9479-5c3df1f9a319" containerID="a4da753a3aa2fa04e80e3b1d9ba64e7c1306d1e3cf43e56bfdceb4aa80c88833" exitCode=0 Nov 29 05:43:25 crc kubenswrapper[4594]: I1129 05:43:25.367540 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dbf544cc9-vr7qn" event={"ID":"bcfcc5d2-2a01-409c-9479-5c3df1f9a319","Type":"ContainerDied","Data":"a4da753a3aa2fa04e80e3b1d9ba64e7c1306d1e3cf43e56bfdceb4aa80c88833"} Nov 29 05:43:25 crc kubenswrapper[4594]: I1129 05:43:25.536766 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9gdld\" (UniqueName: \"kubernetes.io/projected/d2f69c3e-673d-4b8d-bb61-f1e02758f43b-kube-api-access-9gdld\") pod \"d2f69c3e-673d-4b8d-bb61-f1e02758f43b\" (UID: \"d2f69c3e-673d-4b8d-bb61-f1e02758f43b\") " Nov 29 05:43:25 crc kubenswrapper[4594]: I1129 05:43:25.536833 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2f69c3e-673d-4b8d-bb61-f1e02758f43b-utilities\") pod \"d2f69c3e-673d-4b8d-bb61-f1e02758f43b\" (UID: \"d2f69c3e-673d-4b8d-bb61-f1e02758f43b\") " Nov 29 05:43:25 crc kubenswrapper[4594]: I1129 05:43:25.537238 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2f69c3e-673d-4b8d-bb61-f1e02758f43b-catalog-content\") pod \"d2f69c3e-673d-4b8d-bb61-f1e02758f43b\" (UID: \"d2f69c3e-673d-4b8d-bb61-f1e02758f43b\") " Nov 29 05:43:25 crc kubenswrapper[4594]: I1129 05:43:25.538008 4594 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2f69c3e-673d-4b8d-bb61-f1e02758f43b-utilities" (OuterVolumeSpecName: "utilities") pod "d2f69c3e-673d-4b8d-bb61-f1e02758f43b" (UID: "d2f69c3e-673d-4b8d-bb61-f1e02758f43b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 05:43:25 crc kubenswrapper[4594]: I1129 05:43:25.541352 4594 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2f69c3e-673d-4b8d-bb61-f1e02758f43b-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 05:43:25 crc kubenswrapper[4594]: I1129 05:43:25.543403 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2f69c3e-673d-4b8d-bb61-f1e02758f43b-kube-api-access-9gdld" (OuterVolumeSpecName: "kube-api-access-9gdld") pod "d2f69c3e-673d-4b8d-bb61-f1e02758f43b" (UID: "d2f69c3e-673d-4b8d-bb61-f1e02758f43b"). InnerVolumeSpecName "kube-api-access-9gdld". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:43:25 crc kubenswrapper[4594]: I1129 05:43:25.565744 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6dbf544cc9-vr7qn" Nov 29 05:43:25 crc kubenswrapper[4594]: I1129 05:43:25.566110 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-55dd-account-create-update-5b9fv"] Nov 29 05:43:25 crc kubenswrapper[4594]: I1129 05:43:25.642733 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9gdld\" (UniqueName: \"kubernetes.io/projected/d2f69c3e-673d-4b8d-bb61-f1e02758f43b-kube-api-access-9gdld\") on node \"crc\" DevicePath \"\"" Nov 29 05:43:25 crc kubenswrapper[4594]: I1129 05:43:25.657736 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2f69c3e-673d-4b8d-bb61-f1e02758f43b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d2f69c3e-673d-4b8d-bb61-f1e02758f43b" (UID: "d2f69c3e-673d-4b8d-bb61-f1e02758f43b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 05:43:25 crc kubenswrapper[4594]: I1129 05:43:25.736894 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-3b14-account-create-update-bk8sl" Nov 29 05:43:25 crc kubenswrapper[4594]: I1129 05:43:25.744309 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lgw9h\" (UniqueName: \"kubernetes.io/projected/bcfcc5d2-2a01-409c-9479-5c3df1f9a319-kube-api-access-lgw9h\") pod \"bcfcc5d2-2a01-409c-9479-5c3df1f9a319\" (UID: \"bcfcc5d2-2a01-409c-9479-5c3df1f9a319\") " Nov 29 05:43:25 crc kubenswrapper[4594]: I1129 05:43:25.744385 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bcfcc5d2-2a01-409c-9479-5c3df1f9a319-ovsdbserver-sb\") pod \"bcfcc5d2-2a01-409c-9479-5c3df1f9a319\" (UID: \"bcfcc5d2-2a01-409c-9479-5c3df1f9a319\") " Nov 29 05:43:25 crc kubenswrapper[4594]: I1129 05:43:25.744438 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bcfcc5d2-2a01-409c-9479-5c3df1f9a319-config\") pod \"bcfcc5d2-2a01-409c-9479-5c3df1f9a319\" (UID: \"bcfcc5d2-2a01-409c-9479-5c3df1f9a319\") " Nov 29 05:43:25 crc kubenswrapper[4594]: I1129 05:43:25.744508 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bcfcc5d2-2a01-409c-9479-5c3df1f9a319-dns-svc\") pod \"bcfcc5d2-2a01-409c-9479-5c3df1f9a319\" (UID: \"bcfcc5d2-2a01-409c-9479-5c3df1f9a319\") " Nov 29 05:43:25 crc kubenswrapper[4594]: I1129 05:43:25.744566 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bcfcc5d2-2a01-409c-9479-5c3df1f9a319-ovsdbserver-nb\") pod \"bcfcc5d2-2a01-409c-9479-5c3df1f9a319\" (UID: \"bcfcc5d2-2a01-409c-9479-5c3df1f9a319\") " Nov 29 05:43:25 crc kubenswrapper[4594]: I1129 05:43:25.745058 4594 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/d2f69c3e-673d-4b8d-bb61-f1e02758f43b-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 05:43:25 crc kubenswrapper[4594]: I1129 05:43:25.753382 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcfcc5d2-2a01-409c-9479-5c3df1f9a319-kube-api-access-lgw9h" (OuterVolumeSpecName: "kube-api-access-lgw9h") pod "bcfcc5d2-2a01-409c-9479-5c3df1f9a319" (UID: "bcfcc5d2-2a01-409c-9479-5c3df1f9a319"). InnerVolumeSpecName "kube-api-access-lgw9h". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:43:25 crc kubenswrapper[4594]: I1129 05:43:25.788217 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bcfcc5d2-2a01-409c-9479-5c3df1f9a319-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bcfcc5d2-2a01-409c-9479-5c3df1f9a319" (UID: "bcfcc5d2-2a01-409c-9479-5c3df1f9a319"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:43:25 crc kubenswrapper[4594]: I1129 05:43:25.789131 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-vcfr2" Nov 29 05:43:25 crc kubenswrapper[4594]: I1129 05:43:25.838661 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bcfcc5d2-2a01-409c-9479-5c3df1f9a319-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "bcfcc5d2-2a01-409c-9479-5c3df1f9a319" (UID: "bcfcc5d2-2a01-409c-9479-5c3df1f9a319"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:43:25 crc kubenswrapper[4594]: I1129 05:43:25.850925 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4srrb\" (UniqueName: \"kubernetes.io/projected/72a76725-4bb7-4fa5-98e9-066434887870-kube-api-access-4srrb\") pod \"72a76725-4bb7-4fa5-98e9-066434887870\" (UID: \"72a76725-4bb7-4fa5-98e9-066434887870\") " Nov 29 05:43:25 crc kubenswrapper[4594]: I1129 05:43:25.851003 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72a76725-4bb7-4fa5-98e9-066434887870-operator-scripts\") pod \"72a76725-4bb7-4fa5-98e9-066434887870\" (UID: \"72a76725-4bb7-4fa5-98e9-066434887870\") " Nov 29 05:43:25 crc kubenswrapper[4594]: I1129 05:43:25.851507 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lgw9h\" (UniqueName: \"kubernetes.io/projected/bcfcc5d2-2a01-409c-9479-5c3df1f9a319-kube-api-access-lgw9h\") on node \"crc\" DevicePath \"\"" Nov 29 05:43:25 crc kubenswrapper[4594]: I1129 05:43:25.851520 4594 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bcfcc5d2-2a01-409c-9479-5c3df1f9a319-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 29 05:43:25 crc kubenswrapper[4594]: I1129 05:43:25.851530 4594 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bcfcc5d2-2a01-409c-9479-5c3df1f9a319-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 29 05:43:25 crc kubenswrapper[4594]: I1129 05:43:25.852022 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Nov 29 05:43:25 crc kubenswrapper[4594]: I1129 05:43:25.852867 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72a76725-4bb7-4fa5-98e9-066434887870-operator-scripts" 
(OuterVolumeSpecName: "operator-scripts") pod "72a76725-4bb7-4fa5-98e9-066434887870" (UID: "72a76725-4bb7-4fa5-98e9-066434887870"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:43:25 crc kubenswrapper[4594]: I1129 05:43:25.857759 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bcfcc5d2-2a01-409c-9479-5c3df1f9a319-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bcfcc5d2-2a01-409c-9479-5c3df1f9a319" (UID: "bcfcc5d2-2a01-409c-9479-5c3df1f9a319"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:43:25 crc kubenswrapper[4594]: I1129 05:43:25.857894 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72a76725-4bb7-4fa5-98e9-066434887870-kube-api-access-4srrb" (OuterVolumeSpecName: "kube-api-access-4srrb") pod "72a76725-4bb7-4fa5-98e9-066434887870" (UID: "72a76725-4bb7-4fa5-98e9-066434887870"). InnerVolumeSpecName "kube-api-access-4srrb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:43:25 crc kubenswrapper[4594]: I1129 05:43:25.911207 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Nov 29 05:43:25 crc kubenswrapper[4594]: I1129 05:43:25.912649 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bcfcc5d2-2a01-409c-9479-5c3df1f9a319-config" (OuterVolumeSpecName: "config") pod "bcfcc5d2-2a01-409c-9479-5c3df1f9a319" (UID: "bcfcc5d2-2a01-409c-9479-5c3df1f9a319"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:43:25 crc kubenswrapper[4594]: I1129 05:43:25.959823 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99hnp\" (UniqueName: \"kubernetes.io/projected/652d562d-549c-45e1-ad93-325a607548c6-kube-api-access-99hnp\") pod \"652d562d-549c-45e1-ad93-325a607548c6\" (UID: \"652d562d-549c-45e1-ad93-325a607548c6\") " Nov 29 05:43:25 crc kubenswrapper[4594]: I1129 05:43:25.960030 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/652d562d-549c-45e1-ad93-325a607548c6-operator-scripts\") pod \"652d562d-549c-45e1-ad93-325a607548c6\" (UID: \"652d562d-549c-45e1-ad93-325a607548c6\") " Nov 29 05:43:25 crc kubenswrapper[4594]: I1129 05:43:25.962473 4594 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bcfcc5d2-2a01-409c-9479-5c3df1f9a319-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 29 05:43:25 crc kubenswrapper[4594]: I1129 05:43:25.962492 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4srrb\" (UniqueName: \"kubernetes.io/projected/72a76725-4bb7-4fa5-98e9-066434887870-kube-api-access-4srrb\") on node \"crc\" DevicePath \"\"" Nov 29 05:43:25 crc kubenswrapper[4594]: I1129 05:43:25.962607 4594 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72a76725-4bb7-4fa5-98e9-066434887870-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 05:43:25 crc kubenswrapper[4594]: I1129 05:43:25.962618 4594 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bcfcc5d2-2a01-409c-9479-5c3df1f9a319-config\") on node \"crc\" DevicePath \"\"" Nov 29 05:43:25 crc kubenswrapper[4594]: I1129 05:43:25.964056 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/652d562d-549c-45e1-ad93-325a607548c6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "652d562d-549c-45e1-ad93-325a607548c6" (UID: "652d562d-549c-45e1-ad93-325a607548c6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:43:25 crc kubenswrapper[4594]: I1129 05:43:25.965500 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6fef-account-create-update-8cksk" Nov 29 05:43:25 crc kubenswrapper[4594]: I1129 05:43:25.978581 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/652d562d-549c-45e1-ad93-325a607548c6-kube-api-access-99hnp" (OuterVolumeSpecName: "kube-api-access-99hnp") pod "652d562d-549c-45e1-ad93-325a607548c6" (UID: "652d562d-549c-45e1-ad93-325a607548c6"). InnerVolumeSpecName "kube-api-access-99hnp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:43:26 crc kubenswrapper[4594]: I1129 05:43:26.051104 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-create-8bxs9"] Nov 29 05:43:26 crc kubenswrapper[4594]: I1129 05:43:26.069122 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5gcrr\" (UniqueName: \"kubernetes.io/projected/cd700633-7336-45dc-a346-4740e6605784-kube-api-access-5gcrr\") pod \"cd700633-7336-45dc-a346-4740e6605784\" (UID: \"cd700633-7336-45dc-a346-4740e6605784\") " Nov 29 05:43:26 crc kubenswrapper[4594]: I1129 05:43:26.069341 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd700633-7336-45dc-a346-4740e6605784-operator-scripts\") pod \"cd700633-7336-45dc-a346-4740e6605784\" (UID: \"cd700633-7336-45dc-a346-4740e6605784\") " Nov 29 05:43:26 crc kubenswrapper[4594]: I1129 05:43:26.069981 4594 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-99hnp\" (UniqueName: \"kubernetes.io/projected/652d562d-549c-45e1-ad93-325a607548c6-kube-api-access-99hnp\") on node \"crc\" DevicePath \"\"" Nov 29 05:43:26 crc kubenswrapper[4594]: I1129 05:43:26.070001 4594 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/652d562d-549c-45e1-ad93-325a607548c6-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 05:43:26 crc kubenswrapper[4594]: I1129 05:43:26.075034 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd700633-7336-45dc-a346-4740e6605784-kube-api-access-5gcrr" (OuterVolumeSpecName: "kube-api-access-5gcrr") pod "cd700633-7336-45dc-a346-4740e6605784" (UID: "cd700633-7336-45dc-a346-4740e6605784"). InnerVolumeSpecName "kube-api-access-5gcrr". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:43:26 crc kubenswrapper[4594]: I1129 05:43:26.079495 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd700633-7336-45dc-a346-4740e6605784-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cd700633-7336-45dc-a346-4740e6605784" (UID: "cd700633-7336-45dc-a346-4740e6605784"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:43:26 crc kubenswrapper[4594]: I1129 05:43:26.085524 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-8bpck" Nov 29 05:43:26 crc kubenswrapper[4594]: I1129 05:43:26.196773 4594 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd700633-7336-45dc-a346-4740e6605784-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 05:43:26 crc kubenswrapper[4594]: I1129 05:43:26.197001 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5gcrr\" (UniqueName: \"kubernetes.io/projected/cd700633-7336-45dc-a346-4740e6605784-kube-api-access-5gcrr\") on node \"crc\" DevicePath \"\"" Nov 29 05:43:26 crc kubenswrapper[4594]: I1129 05:43:26.298086 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5hwwv\" (UniqueName: \"kubernetes.io/projected/8b6efdb5-647a-4e53-b1c7-0b48c0acc137-kube-api-access-5hwwv\") pod \"8b6efdb5-647a-4e53-b1c7-0b48c0acc137\" (UID: \"8b6efdb5-647a-4e53-b1c7-0b48c0acc137\") " Nov 29 05:43:26 crc kubenswrapper[4594]: I1129 05:43:26.298166 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b6efdb5-647a-4e53-b1c7-0b48c0acc137-operator-scripts\") pod \"8b6efdb5-647a-4e53-b1c7-0b48c0acc137\" (UID: \"8b6efdb5-647a-4e53-b1c7-0b48c0acc137\") " Nov 29 05:43:26 crc kubenswrapper[4594]: I1129 05:43:26.299414 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b6efdb5-647a-4e53-b1c7-0b48c0acc137-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8b6efdb5-647a-4e53-b1c7-0b48c0acc137" (UID: "8b6efdb5-647a-4e53-b1c7-0b48c0acc137"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:43:26 crc kubenswrapper[4594]: I1129 05:43:26.303346 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b6efdb5-647a-4e53-b1c7-0b48c0acc137-kube-api-access-5hwwv" (OuterVolumeSpecName: "kube-api-access-5hwwv") pod "8b6efdb5-647a-4e53-b1c7-0b48c0acc137" (UID: "8b6efdb5-647a-4e53-b1c7-0b48c0acc137"). InnerVolumeSpecName "kube-api-access-5hwwv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:43:26 crc kubenswrapper[4594]: I1129 05:43:26.377900 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dbf544cc9-vr7qn" event={"ID":"bcfcc5d2-2a01-409c-9479-5c3df1f9a319","Type":"ContainerDied","Data":"84625ade04012652264528f4462eaae297c7518575296dbe822cef7fbf14a1f8"} Nov 29 05:43:26 crc kubenswrapper[4594]: I1129 05:43:26.377948 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6dbf544cc9-vr7qn" Nov 29 05:43:26 crc kubenswrapper[4594]: I1129 05:43:26.377972 4594 scope.go:117] "RemoveContainer" containerID="a4da753a3aa2fa04e80e3b1d9ba64e7c1306d1e3cf43e56bfdceb4aa80c88833" Nov 29 05:43:26 crc kubenswrapper[4594]: I1129 05:43:26.380216 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6fef-account-create-update-8cksk" event={"ID":"cd700633-7336-45dc-a346-4740e6605784","Type":"ContainerDied","Data":"69ea0351cfb6058f55856f6391c781c48eeb5d32a591528b616a705616e67c61"} Nov 29 05:43:26 crc kubenswrapper[4594]: I1129 05:43:26.380271 4594 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="69ea0351cfb6058f55856f6391c781c48eeb5d32a591528b616a705616e67c61" Nov 29 05:43:26 crc kubenswrapper[4594]: I1129 05:43:26.380336 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-6fef-account-create-update-8cksk" Nov 29 05:43:26 crc kubenswrapper[4594]: I1129 05:43:26.382337 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-8bpck" event={"ID":"8b6efdb5-647a-4e53-b1c7-0b48c0acc137","Type":"ContainerDied","Data":"b970d21e5e6338722f6f1cb7f862954d125440cf75061faaf9eef2aa6d01b4a4"} Nov 29 05:43:26 crc kubenswrapper[4594]: I1129 05:43:26.382376 4594 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b970d21e5e6338722f6f1cb7f862954d125440cf75061faaf9eef2aa6d01b4a4" Nov 29 05:43:26 crc kubenswrapper[4594]: I1129 05:43:26.382703 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-8bpck" Nov 29 05:43:26 crc kubenswrapper[4594]: I1129 05:43:26.384480 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-vcfr2" event={"ID":"652d562d-549c-45e1-ad93-325a607548c6","Type":"ContainerDied","Data":"206d9902951a34fd80372687e7214be2e65c0d15d556b9e4ee80c980e28b7893"} Nov 29 05:43:26 crc kubenswrapper[4594]: I1129 05:43:26.384511 4594 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="206d9902951a34fd80372687e7214be2e65c0d15d556b9e4ee80c980e28b7893" Nov 29 05:43:26 crc kubenswrapper[4594]: I1129 05:43:26.384570 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-vcfr2" Nov 29 05:43:26 crc kubenswrapper[4594]: I1129 05:43:26.387363 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-3b14-account-create-update-bk8sl" event={"ID":"72a76725-4bb7-4fa5-98e9-066434887870","Type":"ContainerDied","Data":"13656d1f3e9024e34428a312ede2d00486ad708798d3be48ddf4409e617a456d"} Nov 29 05:43:26 crc kubenswrapper[4594]: I1129 05:43:26.387394 4594 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="13656d1f3e9024e34428a312ede2d00486ad708798d3be48ddf4409e617a456d" Nov 29 05:43:26 crc kubenswrapper[4594]: I1129 05:43:26.387440 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-3b14-account-create-update-bk8sl" Nov 29 05:43:26 crc kubenswrapper[4594]: I1129 05:43:26.395819 4594 generic.go:334] "Generic (PLEG): container finished" podID="9b1647cf-8657-466a-8579-2ce1f5ac45a3" containerID="96f85f3f02d900868dd9211f3d35d1c977c7a18a11beb67169815d70a5d29302" exitCode=0 Nov 29 05:43:26 crc kubenswrapper[4594]: I1129 05:43:26.395909 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-55dd-account-create-update-5b9fv" event={"ID":"9b1647cf-8657-466a-8579-2ce1f5ac45a3","Type":"ContainerDied","Data":"96f85f3f02d900868dd9211f3d35d1c977c7a18a11beb67169815d70a5d29302"} Nov 29 05:43:26 crc kubenswrapper[4594]: I1129 05:43:26.395932 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-55dd-account-create-update-5b9fv" event={"ID":"9b1647cf-8657-466a-8579-2ce1f5ac45a3","Type":"ContainerStarted","Data":"5b6ebb90fcb7ab5c11d8d7e098eee2aaa1830acdfba25e57f252014fd25d5539"} Nov 29 05:43:26 crc kubenswrapper[4594]: I1129 05:43:26.399759 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-8bxs9" 
event={"ID":"e37693f9-769d-409e-a90a-946d5fbe00d9","Type":"ContainerStarted","Data":"e169ba37e5cb7287590583b06c50678378f4fda5f488bf446171457fc5381dde"} Nov 29 05:43:26 crc kubenswrapper[4594]: I1129 05:43:26.399810 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-8bxs9" event={"ID":"e37693f9-769d-409e-a90a-946d5fbe00d9","Type":"ContainerStarted","Data":"6a2cfd89cc62aea7d931d7371c5d61afd7d64b9b1dc47e6b288bc8de296a54ba"} Nov 29 05:43:26 crc kubenswrapper[4594]: I1129 05:43:26.400455 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dc92h" Nov 29 05:43:26 crc kubenswrapper[4594]: I1129 05:43:26.400609 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5hwwv\" (UniqueName: \"kubernetes.io/projected/8b6efdb5-647a-4e53-b1c7-0b48c0acc137-kube-api-access-5hwwv\") on node \"crc\" DevicePath \"\"" Nov 29 05:43:26 crc kubenswrapper[4594]: I1129 05:43:26.400989 4594 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b6efdb5-647a-4e53-b1c7-0b48c0acc137-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 05:43:26 crc kubenswrapper[4594]: I1129 05:43:26.401066 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Nov 29 05:43:26 crc kubenswrapper[4594]: I1129 05:43:26.401441 4594 scope.go:117] "RemoveContainer" containerID="113dd35da0020d1a0340c9f74c2cd4f046817876703d8b16c21f4d90ffc94cd2" Nov 29 05:43:26 crc kubenswrapper[4594]: I1129 05:43:26.402720 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6dbf544cc9-vr7qn"] Nov 29 05:43:26 crc kubenswrapper[4594]: E1129 05:43:26.405238 4594 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbcfcc5d2_2a01_409c_9479_5c3df1f9a319.slice/crio-84625ade04012652264528f4462eaae297c7518575296dbe822cef7fbf14a1f8\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod72a76725_4bb7_4fa5_98e9_066434887870.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b1647cf_8657_466a_8579_2ce1f5ac45a3.slice/crio-96f85f3f02d900868dd9211f3d35d1c977c7a18a11beb67169815d70a5d29302.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbcfcc5d2_2a01_409c_9479_5c3df1f9a319.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod72a76725_4bb7_4fa5_98e9_066434887870.slice/crio-13656d1f3e9024e34428a312ede2d00486ad708798d3be48ddf4409e617a456d\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2f69c3e_673d_4b8d_bb61_f1e02758f43b.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcd700633_7336_45dc_a346_4740e6605784.slice/crio-69ea0351cfb6058f55856f6391c781c48eeb5d32a591528b616a705616e67c61\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcd700633_7336_45dc_a346_4740e6605784.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b1647cf_8657_466a_8579_2ce1f5ac45a3.slice/crio-conmon-96f85f3f02d900868dd9211f3d35d1c977c7a18a11beb67169815d70a5d29302.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod652d562d_549c_45e1_ad93_325a607548c6.slice\": RecentStats: unable to find data 
in memory cache]" Nov 29 05:43:26 crc kubenswrapper[4594]: I1129 05:43:26.417025 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6dbf544cc9-vr7qn"] Nov 29 05:43:26 crc kubenswrapper[4594]: I1129 05:43:26.431525 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dc92h"] Nov 29 05:43:26 crc kubenswrapper[4594]: I1129 05:43:26.437705 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-dc92h"] Nov 29 05:43:26 crc kubenswrapper[4594]: I1129 05:43:26.448137 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-db-create-8bxs9" podStartSLOduration=2.448117865 podStartE2EDuration="2.448117865s" podCreationTimestamp="2025-11-29 05:43:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:43:26.440960426 +0000 UTC m=+930.681469646" watchObservedRunningTime="2025-11-29 05:43:26.448117865 +0000 UTC m=+930.688627086" Nov 29 05:43:27 crc kubenswrapper[4594]: I1129 05:43:27.122871 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Nov 29 05:43:27 crc kubenswrapper[4594]: I1129 05:43:27.165658 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Nov 29 05:43:27 crc kubenswrapper[4594]: I1129 05:43:27.365150 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zwhtm" Nov 29 05:43:27 crc kubenswrapper[4594]: I1129 05:43:27.365245 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-zwhtm" Nov 29 05:43:27 crc kubenswrapper[4594]: I1129 05:43:27.403489 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-zwhtm" Nov 
29 05:43:27 crc kubenswrapper[4594]: I1129 05:43:27.411142 4594 generic.go:334] "Generic (PLEG): container finished" podID="e37693f9-769d-409e-a90a-946d5fbe00d9" containerID="e169ba37e5cb7287590583b06c50678378f4fda5f488bf446171457fc5381dde" exitCode=0 Nov 29 05:43:27 crc kubenswrapper[4594]: I1129 05:43:27.411222 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-8bxs9" event={"ID":"e37693f9-769d-409e-a90a-946d5fbe00d9","Type":"ContainerDied","Data":"e169ba37e5cb7287590583b06c50678378f4fda5f488bf446171457fc5381dde"} Nov 29 05:43:27 crc kubenswrapper[4594]: I1129 05:43:27.469888 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zwhtm" Nov 29 05:43:27 crc kubenswrapper[4594]: I1129 05:43:27.477025 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Nov 29 05:43:27 crc kubenswrapper[4594]: E1129 05:43:27.477466 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcfcc5d2-2a01-409c-9479-5c3df1f9a319" containerName="init" Nov 29 05:43:27 crc kubenswrapper[4594]: I1129 05:43:27.477485 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcfcc5d2-2a01-409c-9479-5c3df1f9a319" containerName="init" Nov 29 05:43:27 crc kubenswrapper[4594]: E1129 05:43:27.477496 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b6efdb5-647a-4e53-b1c7-0b48c0acc137" containerName="mariadb-database-create" Nov 29 05:43:27 crc kubenswrapper[4594]: I1129 05:43:27.477503 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b6efdb5-647a-4e53-b1c7-0b48c0acc137" containerName="mariadb-database-create" Nov 29 05:43:27 crc kubenswrapper[4594]: E1129 05:43:27.477522 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72a76725-4bb7-4fa5-98e9-066434887870" containerName="mariadb-account-create-update" Nov 29 05:43:27 crc kubenswrapper[4594]: I1129 05:43:27.477528 4594 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="72a76725-4bb7-4fa5-98e9-066434887870" containerName="mariadb-account-create-update" Nov 29 05:43:27 crc kubenswrapper[4594]: E1129 05:43:27.477538 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="652d562d-549c-45e1-ad93-325a607548c6" containerName="mariadb-database-create" Nov 29 05:43:27 crc kubenswrapper[4594]: I1129 05:43:27.477544 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="652d562d-549c-45e1-ad93-325a607548c6" containerName="mariadb-database-create" Nov 29 05:43:27 crc kubenswrapper[4594]: E1129 05:43:27.477575 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2f69c3e-673d-4b8d-bb61-f1e02758f43b" containerName="registry-server" Nov 29 05:43:27 crc kubenswrapper[4594]: I1129 05:43:27.477583 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2f69c3e-673d-4b8d-bb61-f1e02758f43b" containerName="registry-server" Nov 29 05:43:27 crc kubenswrapper[4594]: E1129 05:43:27.477592 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2f69c3e-673d-4b8d-bb61-f1e02758f43b" containerName="extract-content" Nov 29 05:43:27 crc kubenswrapper[4594]: I1129 05:43:27.477597 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2f69c3e-673d-4b8d-bb61-f1e02758f43b" containerName="extract-content" Nov 29 05:43:27 crc kubenswrapper[4594]: E1129 05:43:27.477613 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2f69c3e-673d-4b8d-bb61-f1e02758f43b" containerName="extract-utilities" Nov 29 05:43:27 crc kubenswrapper[4594]: I1129 05:43:27.477618 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2f69c3e-673d-4b8d-bb61-f1e02758f43b" containerName="extract-utilities" Nov 29 05:43:27 crc kubenswrapper[4594]: E1129 05:43:27.477626 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd700633-7336-45dc-a346-4740e6605784" containerName="mariadb-account-create-update" Nov 29 05:43:27 crc kubenswrapper[4594]: I1129 
05:43:27.477634 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd700633-7336-45dc-a346-4740e6605784" containerName="mariadb-account-create-update" Nov 29 05:43:27 crc kubenswrapper[4594]: E1129 05:43:27.477645 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcfcc5d2-2a01-409c-9479-5c3df1f9a319" containerName="dnsmasq-dns" Nov 29 05:43:27 crc kubenswrapper[4594]: I1129 05:43:27.477652 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcfcc5d2-2a01-409c-9479-5c3df1f9a319" containerName="dnsmasq-dns" Nov 29 05:43:27 crc kubenswrapper[4594]: I1129 05:43:27.477847 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd700633-7336-45dc-a346-4740e6605784" containerName="mariadb-account-create-update" Nov 29 05:43:27 crc kubenswrapper[4594]: I1129 05:43:27.477867 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b6efdb5-647a-4e53-b1c7-0b48c0acc137" containerName="mariadb-database-create" Nov 29 05:43:27 crc kubenswrapper[4594]: I1129 05:43:27.477880 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2f69c3e-673d-4b8d-bb61-f1e02758f43b" containerName="registry-server" Nov 29 05:43:27 crc kubenswrapper[4594]: I1129 05:43:27.477887 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="652d562d-549c-45e1-ad93-325a607548c6" containerName="mariadb-database-create" Nov 29 05:43:27 crc kubenswrapper[4594]: I1129 05:43:27.477895 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcfcc5d2-2a01-409c-9479-5c3df1f9a319" containerName="dnsmasq-dns" Nov 29 05:43:27 crc kubenswrapper[4594]: I1129 05:43:27.477908 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="72a76725-4bb7-4fa5-98e9-066434887870" containerName="mariadb-account-create-update" Nov 29 05:43:27 crc kubenswrapper[4594]: I1129 05:43:27.478905 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Nov 29 05:43:27 crc kubenswrapper[4594]: I1129 05:43:27.481486 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Nov 29 05:43:27 crc kubenswrapper[4594]: I1129 05:43:27.481673 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Nov 29 05:43:27 crc kubenswrapper[4594]: I1129 05:43:27.483870 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Nov 29 05:43:27 crc kubenswrapper[4594]: I1129 05:43:27.486110 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Nov 29 05:43:27 crc kubenswrapper[4594]: I1129 05:43:27.486288 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-68vh7" Nov 29 05:43:27 crc kubenswrapper[4594]: I1129 05:43:27.630302 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/21284ba8-9492-4f6e-84c8-88d3844f386b-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"21284ba8-9492-4f6e-84c8-88d3844f386b\") " pod="openstack/ovn-northd-0" Nov 29 05:43:27 crc kubenswrapper[4594]: I1129 05:43:27.630396 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pjcw\" (UniqueName: \"kubernetes.io/projected/21284ba8-9492-4f6e-84c8-88d3844f386b-kube-api-access-4pjcw\") pod \"ovn-northd-0\" (UID: \"21284ba8-9492-4f6e-84c8-88d3844f386b\") " pod="openstack/ovn-northd-0" Nov 29 05:43:27 crc kubenswrapper[4594]: I1129 05:43:27.630469 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/21284ba8-9492-4f6e-84c8-88d3844f386b-scripts\") pod \"ovn-northd-0\" (UID: \"21284ba8-9492-4f6e-84c8-88d3844f386b\") " 
pod="openstack/ovn-northd-0" Nov 29 05:43:27 crc kubenswrapper[4594]: I1129 05:43:27.630489 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21284ba8-9492-4f6e-84c8-88d3844f386b-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"21284ba8-9492-4f6e-84c8-88d3844f386b\") " pod="openstack/ovn-northd-0" Nov 29 05:43:27 crc kubenswrapper[4594]: I1129 05:43:27.630570 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21284ba8-9492-4f6e-84c8-88d3844f386b-config\") pod \"ovn-northd-0\" (UID: \"21284ba8-9492-4f6e-84c8-88d3844f386b\") " pod="openstack/ovn-northd-0" Nov 29 05:43:27 crc kubenswrapper[4594]: I1129 05:43:27.630624 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/21284ba8-9492-4f6e-84c8-88d3844f386b-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"21284ba8-9492-4f6e-84c8-88d3844f386b\") " pod="openstack/ovn-northd-0" Nov 29 05:43:27 crc kubenswrapper[4594]: I1129 05:43:27.630665 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/21284ba8-9492-4f6e-84c8-88d3844f386b-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"21284ba8-9492-4f6e-84c8-88d3844f386b\") " pod="openstack/ovn-northd-0" Nov 29 05:43:27 crc kubenswrapper[4594]: I1129 05:43:27.733066 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21284ba8-9492-4f6e-84c8-88d3844f386b-config\") pod \"ovn-northd-0\" (UID: \"21284ba8-9492-4f6e-84c8-88d3844f386b\") " pod="openstack/ovn-northd-0" Nov 29 05:43:27 crc kubenswrapper[4594]: I1129 05:43:27.733131 4594 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/21284ba8-9492-4f6e-84c8-88d3844f386b-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"21284ba8-9492-4f6e-84c8-88d3844f386b\") " pod="openstack/ovn-northd-0" Nov 29 05:43:27 crc kubenswrapper[4594]: I1129 05:43:27.733165 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/21284ba8-9492-4f6e-84c8-88d3844f386b-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"21284ba8-9492-4f6e-84c8-88d3844f386b\") " pod="openstack/ovn-northd-0" Nov 29 05:43:27 crc kubenswrapper[4594]: I1129 05:43:27.733216 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/21284ba8-9492-4f6e-84c8-88d3844f386b-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"21284ba8-9492-4f6e-84c8-88d3844f386b\") " pod="openstack/ovn-northd-0" Nov 29 05:43:27 crc kubenswrapper[4594]: I1129 05:43:27.733281 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pjcw\" (UniqueName: \"kubernetes.io/projected/21284ba8-9492-4f6e-84c8-88d3844f386b-kube-api-access-4pjcw\") pod \"ovn-northd-0\" (UID: \"21284ba8-9492-4f6e-84c8-88d3844f386b\") " pod="openstack/ovn-northd-0" Nov 29 05:43:27 crc kubenswrapper[4594]: I1129 05:43:27.733350 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/21284ba8-9492-4f6e-84c8-88d3844f386b-scripts\") pod \"ovn-northd-0\" (UID: \"21284ba8-9492-4f6e-84c8-88d3844f386b\") " pod="openstack/ovn-northd-0" Nov 29 05:43:27 crc kubenswrapper[4594]: I1129 05:43:27.733366 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21284ba8-9492-4f6e-84c8-88d3844f386b-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"21284ba8-9492-4f6e-84c8-88d3844f386b\") 
" pod="openstack/ovn-northd-0" Nov 29 05:43:27 crc kubenswrapper[4594]: I1129 05:43:27.733723 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/21284ba8-9492-4f6e-84c8-88d3844f386b-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"21284ba8-9492-4f6e-84c8-88d3844f386b\") " pod="openstack/ovn-northd-0" Nov 29 05:43:27 crc kubenswrapper[4594]: I1129 05:43:27.734323 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21284ba8-9492-4f6e-84c8-88d3844f386b-config\") pod \"ovn-northd-0\" (UID: \"21284ba8-9492-4f6e-84c8-88d3844f386b\") " pod="openstack/ovn-northd-0" Nov 29 05:43:27 crc kubenswrapper[4594]: I1129 05:43:27.734404 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/21284ba8-9492-4f6e-84c8-88d3844f386b-scripts\") pod \"ovn-northd-0\" (UID: \"21284ba8-9492-4f6e-84c8-88d3844f386b\") " pod="openstack/ovn-northd-0" Nov 29 05:43:27 crc kubenswrapper[4594]: I1129 05:43:27.749641 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/21284ba8-9492-4f6e-84c8-88d3844f386b-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"21284ba8-9492-4f6e-84c8-88d3844f386b\") " pod="openstack/ovn-northd-0" Nov 29 05:43:27 crc kubenswrapper[4594]: I1129 05:43:27.750684 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/21284ba8-9492-4f6e-84c8-88d3844f386b-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"21284ba8-9492-4f6e-84c8-88d3844f386b\") " pod="openstack/ovn-northd-0" Nov 29 05:43:27 crc kubenswrapper[4594]: I1129 05:43:27.750717 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/21284ba8-9492-4f6e-84c8-88d3844f386b-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"21284ba8-9492-4f6e-84c8-88d3844f386b\") " pod="openstack/ovn-northd-0" Nov 29 05:43:27 crc kubenswrapper[4594]: I1129 05:43:27.751950 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pjcw\" (UniqueName: \"kubernetes.io/projected/21284ba8-9492-4f6e-84c8-88d3844f386b-kube-api-access-4pjcw\") pod \"ovn-northd-0\" (UID: \"21284ba8-9492-4f6e-84c8-88d3844f386b\") " pod="openstack/ovn-northd-0" Nov 29 05:43:27 crc kubenswrapper[4594]: I1129 05:43:27.794730 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-55dd-account-create-update-5b9fv" Nov 29 05:43:27 crc kubenswrapper[4594]: I1129 05:43:27.798820 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Nov 29 05:43:27 crc kubenswrapper[4594]: I1129 05:43:27.936831 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b1647cf-8657-466a-8579-2ce1f5ac45a3-operator-scripts\") pod \"9b1647cf-8657-466a-8579-2ce1f5ac45a3\" (UID: \"9b1647cf-8657-466a-8579-2ce1f5ac45a3\") " Nov 29 05:43:27 crc kubenswrapper[4594]: I1129 05:43:27.936939 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-25tlq\" (UniqueName: \"kubernetes.io/projected/9b1647cf-8657-466a-8579-2ce1f5ac45a3-kube-api-access-25tlq\") pod \"9b1647cf-8657-466a-8579-2ce1f5ac45a3\" (UID: \"9b1647cf-8657-466a-8579-2ce1f5ac45a3\") " Nov 29 05:43:27 crc kubenswrapper[4594]: I1129 05:43:27.937665 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b1647cf-8657-466a-8579-2ce1f5ac45a3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9b1647cf-8657-466a-8579-2ce1f5ac45a3" (UID: "9b1647cf-8657-466a-8579-2ce1f5ac45a3"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:43:27 crc kubenswrapper[4594]: I1129 05:43:27.937907 4594 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b1647cf-8657-466a-8579-2ce1f5ac45a3-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 05:43:27 crc kubenswrapper[4594]: I1129 05:43:27.942729 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b1647cf-8657-466a-8579-2ce1f5ac45a3-kube-api-access-25tlq" (OuterVolumeSpecName: "kube-api-access-25tlq") pod "9b1647cf-8657-466a-8579-2ce1f5ac45a3" (UID: "9b1647cf-8657-466a-8579-2ce1f5ac45a3"). InnerVolumeSpecName "kube-api-access-25tlq". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:43:28 crc kubenswrapper[4594]: I1129 05:43:28.040081 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-25tlq\" (UniqueName: \"kubernetes.io/projected/9b1647cf-8657-466a-8579-2ce1f5ac45a3-kube-api-access-25tlq\") on node \"crc\" DevicePath \"\"" Nov 29 05:43:28 crc kubenswrapper[4594]: I1129 05:43:28.092348 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bcfcc5d2-2a01-409c-9479-5c3df1f9a319" path="/var/lib/kubelet/pods/bcfcc5d2-2a01-409c-9479-5c3df1f9a319/volumes" Nov 29 05:43:28 crc kubenswrapper[4594]: I1129 05:43:28.092993 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2f69c3e-673d-4b8d-bb61-f1e02758f43b" path="/var/lib/kubelet/pods/d2f69c3e-673d-4b8d-bb61-f1e02758f43b/volumes" Nov 29 05:43:28 crc kubenswrapper[4594]: I1129 05:43:28.238626 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Nov 29 05:43:28 crc kubenswrapper[4594]: W1129 05:43:28.238962 4594 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21284ba8_9492_4f6e_84c8_88d3844f386b.slice/crio-4fe980bf7e66a5241461fcefbba3104b8cf2f7f415ffd5606b1f191a63df3118 WatchSource:0}: Error finding container 4fe980bf7e66a5241461fcefbba3104b8cf2f7f415ffd5606b1f191a63df3118: Status 404 returned error can't find the container with id 4fe980bf7e66a5241461fcefbba3104b8cf2f7f415ffd5606b1f191a63df3118 Nov 29 05:43:28 crc kubenswrapper[4594]: I1129 05:43:28.421599 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-55dd-account-create-update-5b9fv" Nov 29 05:43:28 crc kubenswrapper[4594]: I1129 05:43:28.423340 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-55dd-account-create-update-5b9fv" event={"ID":"9b1647cf-8657-466a-8579-2ce1f5ac45a3","Type":"ContainerDied","Data":"5b6ebb90fcb7ab5c11d8d7e098eee2aaa1830acdfba25e57f252014fd25d5539"} Nov 29 05:43:28 crc kubenswrapper[4594]: I1129 05:43:28.423468 4594 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b6ebb90fcb7ab5c11d8d7e098eee2aaa1830acdfba25e57f252014fd25d5539" Nov 29 05:43:28 crc kubenswrapper[4594]: I1129 05:43:28.425635 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"21284ba8-9492-4f6e-84c8-88d3844f386b","Type":"ContainerStarted","Data":"4fe980bf7e66a5241461fcefbba3104b8cf2f7f415ffd5606b1f191a63df3118"} Nov 29 05:43:29 crc kubenswrapper[4594]: I1129 05:43:28.709474 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-create-8bxs9" Nov 29 05:43:29 crc kubenswrapper[4594]: I1129 05:43:28.780593 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zwhtm"] Nov 29 05:43:29 crc kubenswrapper[4594]: I1129 05:43:28.856343 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-trr5c\" (UniqueName: \"kubernetes.io/projected/e37693f9-769d-409e-a90a-946d5fbe00d9-kube-api-access-trr5c\") pod \"e37693f9-769d-409e-a90a-946d5fbe00d9\" (UID: \"e37693f9-769d-409e-a90a-946d5fbe00d9\") " Nov 29 05:43:29 crc kubenswrapper[4594]: I1129 05:43:28.856811 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e37693f9-769d-409e-a90a-946d5fbe00d9-operator-scripts\") pod \"e37693f9-769d-409e-a90a-946d5fbe00d9\" (UID: \"e37693f9-769d-409e-a90a-946d5fbe00d9\") " Nov 29 05:43:29 crc kubenswrapper[4594]: I1129 05:43:28.857499 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e37693f9-769d-409e-a90a-946d5fbe00d9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e37693f9-769d-409e-a90a-946d5fbe00d9" (UID: "e37693f9-769d-409e-a90a-946d5fbe00d9"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:43:29 crc kubenswrapper[4594]: I1129 05:43:28.857745 4594 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e37693f9-769d-409e-a90a-946d5fbe00d9-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 05:43:29 crc kubenswrapper[4594]: I1129 05:43:28.862526 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e37693f9-769d-409e-a90a-946d5fbe00d9-kube-api-access-trr5c" (OuterVolumeSpecName: "kube-api-access-trr5c") pod "e37693f9-769d-409e-a90a-946d5fbe00d9" (UID: "e37693f9-769d-409e-a90a-946d5fbe00d9"). InnerVolumeSpecName "kube-api-access-trr5c". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:43:29 crc kubenswrapper[4594]: I1129 05:43:28.957373 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-n4msp"] Nov 29 05:43:29 crc kubenswrapper[4594]: I1129 05:43:28.957618 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-n4msp" podUID="398463af-ebf5-4586-8a81-bc9b7cdce9c0" containerName="registry-server" containerID="cri-o://1e2b44cf933e93a3fe1935c83ebfa89fe04f0b8a9dc43cec070f6e4ac910802e" gracePeriod=2 Nov 29 05:43:29 crc kubenswrapper[4594]: I1129 05:43:28.959327 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-trr5c\" (UniqueName: \"kubernetes.io/projected/e37693f9-769d-409e-a90a-946d5fbe00d9-kube-api-access-trr5c\") on node \"crc\" DevicePath \"\"" Nov 29 05:43:29 crc kubenswrapper[4594]: I1129 05:43:29.401594 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-n4msp" Nov 29 05:43:29 crc kubenswrapper[4594]: I1129 05:43:29.449881 4594 generic.go:334] "Generic (PLEG): container finished" podID="398463af-ebf5-4586-8a81-bc9b7cdce9c0" containerID="1e2b44cf933e93a3fe1935c83ebfa89fe04f0b8a9dc43cec070f6e4ac910802e" exitCode=0 Nov 29 05:43:29 crc kubenswrapper[4594]: I1129 05:43:29.450353 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n4msp" event={"ID":"398463af-ebf5-4586-8a81-bc9b7cdce9c0","Type":"ContainerDied","Data":"1e2b44cf933e93a3fe1935c83ebfa89fe04f0b8a9dc43cec070f6e4ac910802e"} Nov 29 05:43:29 crc kubenswrapper[4594]: I1129 05:43:29.450385 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-n4msp" Nov 29 05:43:29 crc kubenswrapper[4594]: I1129 05:43:29.450399 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n4msp" event={"ID":"398463af-ebf5-4586-8a81-bc9b7cdce9c0","Type":"ContainerDied","Data":"487959b49f6406781968ec910e6ebb90842348f0573b4aa11bdb658f64713f12"} Nov 29 05:43:29 crc kubenswrapper[4594]: I1129 05:43:29.450435 4594 scope.go:117] "RemoveContainer" containerID="1e2b44cf933e93a3fe1935c83ebfa89fe04f0b8a9dc43cec070f6e4ac910802e" Nov 29 05:43:29 crc kubenswrapper[4594]: I1129 05:43:29.460155 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-create-8bxs9" Nov 29 05:43:29 crc kubenswrapper[4594]: I1129 05:43:29.460302 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-8bxs9" event={"ID":"e37693f9-769d-409e-a90a-946d5fbe00d9","Type":"ContainerDied","Data":"6a2cfd89cc62aea7d931d7371c5d61afd7d64b9b1dc47e6b288bc8de296a54ba"} Nov 29 05:43:29 crc kubenswrapper[4594]: I1129 05:43:29.460336 4594 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a2cfd89cc62aea7d931d7371c5d61afd7d64b9b1dc47e6b288bc8de296a54ba" Nov 29 05:43:29 crc kubenswrapper[4594]: I1129 05:43:29.461006 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Nov 29 05:43:29 crc kubenswrapper[4594]: I1129 05:43:29.461290 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="4e4fe89a-3363-4724-89a8-a9b61fe6f039" containerName="prometheus" containerID="cri-o://6d1c4ce5f27e08c3ba707abeac8cf0a5481b8d43e0c65692bc7cc6019efdb31a" gracePeriod=600 Nov 29 05:43:29 crc kubenswrapper[4594]: I1129 05:43:29.461356 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="4e4fe89a-3363-4724-89a8-a9b61fe6f039" containerName="config-reloader" containerID="cri-o://5814a998d2de6d2998082fe37b44771b97b0f348fcca1915adbd53597116025a" gracePeriod=600 Nov 29 05:43:29 crc kubenswrapper[4594]: I1129 05:43:29.461369 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="4e4fe89a-3363-4724-89a8-a9b61fe6f039" containerName="thanos-sidecar" containerID="cri-o://a05f858d08ad1ac63b4c149b4ad984bfe6f13bdd53d00d2f44f4b2906b2e1430" gracePeriod=600 Nov 29 05:43:29 crc kubenswrapper[4594]: I1129 05:43:29.469816 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-db9gt\" (UniqueName: \"kubernetes.io/projected/398463af-ebf5-4586-8a81-bc9b7cdce9c0-kube-api-access-db9gt\") pod \"398463af-ebf5-4586-8a81-bc9b7cdce9c0\" (UID: \"398463af-ebf5-4586-8a81-bc9b7cdce9c0\") " Nov 29 05:43:29 crc kubenswrapper[4594]: I1129 05:43:29.469940 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/398463af-ebf5-4586-8a81-bc9b7cdce9c0-catalog-content\") pod \"398463af-ebf5-4586-8a81-bc9b7cdce9c0\" (UID: \"398463af-ebf5-4586-8a81-bc9b7cdce9c0\") " Nov 29 05:43:29 crc kubenswrapper[4594]: I1129 05:43:29.470153 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/398463af-ebf5-4586-8a81-bc9b7cdce9c0-utilities\") pod \"398463af-ebf5-4586-8a81-bc9b7cdce9c0\" (UID: \"398463af-ebf5-4586-8a81-bc9b7cdce9c0\") " Nov 29 05:43:29 crc kubenswrapper[4594]: I1129 05:43:29.472456 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/398463af-ebf5-4586-8a81-bc9b7cdce9c0-utilities" (OuterVolumeSpecName: "utilities") pod "398463af-ebf5-4586-8a81-bc9b7cdce9c0" (UID: "398463af-ebf5-4586-8a81-bc9b7cdce9c0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 05:43:29 crc kubenswrapper[4594]: I1129 05:43:29.476988 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/398463af-ebf5-4586-8a81-bc9b7cdce9c0-kube-api-access-db9gt" (OuterVolumeSpecName: "kube-api-access-db9gt") pod "398463af-ebf5-4586-8a81-bc9b7cdce9c0" (UID: "398463af-ebf5-4586-8a81-bc9b7cdce9c0"). InnerVolumeSpecName "kube-api-access-db9gt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:43:29 crc kubenswrapper[4594]: I1129 05:43:29.481305 4594 scope.go:117] "RemoveContainer" containerID="63478de88d9dea415d385132bb93a6fb43f25f25f1a0cde8624603e685ca456e" Nov 29 05:43:29 crc kubenswrapper[4594]: I1129 05:43:29.519168 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/398463af-ebf5-4586-8a81-bc9b7cdce9c0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "398463af-ebf5-4586-8a81-bc9b7cdce9c0" (UID: "398463af-ebf5-4586-8a81-bc9b7cdce9c0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 05:43:29 crc kubenswrapper[4594]: I1129 05:43:29.573345 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-db9gt\" (UniqueName: \"kubernetes.io/projected/398463af-ebf5-4586-8a81-bc9b7cdce9c0-kube-api-access-db9gt\") on node \"crc\" DevicePath \"\"" Nov 29 05:43:29 crc kubenswrapper[4594]: I1129 05:43:29.573386 4594 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/398463af-ebf5-4586-8a81-bc9b7cdce9c0-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 05:43:29 crc kubenswrapper[4594]: I1129 05:43:29.573399 4594 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/398463af-ebf5-4586-8a81-bc9b7cdce9c0-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 05:43:29 crc kubenswrapper[4594]: I1129 05:43:29.632625 4594 scope.go:117] "RemoveContainer" containerID="e6656096bebb07d98138098c03032af517ed8e4b6dc43d03d7a57dd38c8d61d6" Nov 29 05:43:29 crc kubenswrapper[4594]: I1129 05:43:29.686873 4594 scope.go:117] "RemoveContainer" containerID="1e2b44cf933e93a3fe1935c83ebfa89fe04f0b8a9dc43cec070f6e4ac910802e" Nov 29 05:43:29 crc kubenswrapper[4594]: E1129 05:43:29.687441 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"1e2b44cf933e93a3fe1935c83ebfa89fe04f0b8a9dc43cec070f6e4ac910802e\": container with ID starting with 1e2b44cf933e93a3fe1935c83ebfa89fe04f0b8a9dc43cec070f6e4ac910802e not found: ID does not exist" containerID="1e2b44cf933e93a3fe1935c83ebfa89fe04f0b8a9dc43cec070f6e4ac910802e" Nov 29 05:43:29 crc kubenswrapper[4594]: I1129 05:43:29.687487 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e2b44cf933e93a3fe1935c83ebfa89fe04f0b8a9dc43cec070f6e4ac910802e"} err="failed to get container status \"1e2b44cf933e93a3fe1935c83ebfa89fe04f0b8a9dc43cec070f6e4ac910802e\": rpc error: code = NotFound desc = could not find container \"1e2b44cf933e93a3fe1935c83ebfa89fe04f0b8a9dc43cec070f6e4ac910802e\": container with ID starting with 1e2b44cf933e93a3fe1935c83ebfa89fe04f0b8a9dc43cec070f6e4ac910802e not found: ID does not exist" Nov 29 05:43:29 crc kubenswrapper[4594]: I1129 05:43:29.687518 4594 scope.go:117] "RemoveContainer" containerID="63478de88d9dea415d385132bb93a6fb43f25f25f1a0cde8624603e685ca456e" Nov 29 05:43:29 crc kubenswrapper[4594]: E1129 05:43:29.687879 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63478de88d9dea415d385132bb93a6fb43f25f25f1a0cde8624603e685ca456e\": container with ID starting with 63478de88d9dea415d385132bb93a6fb43f25f25f1a0cde8624603e685ca456e not found: ID does not exist" containerID="63478de88d9dea415d385132bb93a6fb43f25f25f1a0cde8624603e685ca456e" Nov 29 05:43:29 crc kubenswrapper[4594]: I1129 05:43:29.687914 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63478de88d9dea415d385132bb93a6fb43f25f25f1a0cde8624603e685ca456e"} err="failed to get container status \"63478de88d9dea415d385132bb93a6fb43f25f25f1a0cde8624603e685ca456e\": rpc error: code = NotFound desc = could not find container 
\"63478de88d9dea415d385132bb93a6fb43f25f25f1a0cde8624603e685ca456e\": container with ID starting with 63478de88d9dea415d385132bb93a6fb43f25f25f1a0cde8624603e685ca456e not found: ID does not exist" Nov 29 05:43:29 crc kubenswrapper[4594]: I1129 05:43:29.687940 4594 scope.go:117] "RemoveContainer" containerID="e6656096bebb07d98138098c03032af517ed8e4b6dc43d03d7a57dd38c8d61d6" Nov 29 05:43:29 crc kubenswrapper[4594]: E1129 05:43:29.688477 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6656096bebb07d98138098c03032af517ed8e4b6dc43d03d7a57dd38c8d61d6\": container with ID starting with e6656096bebb07d98138098c03032af517ed8e4b6dc43d03d7a57dd38c8d61d6 not found: ID does not exist" containerID="e6656096bebb07d98138098c03032af517ed8e4b6dc43d03d7a57dd38c8d61d6" Nov 29 05:43:29 crc kubenswrapper[4594]: I1129 05:43:29.688504 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6656096bebb07d98138098c03032af517ed8e4b6dc43d03d7a57dd38c8d61d6"} err="failed to get container status \"e6656096bebb07d98138098c03032af517ed8e4b6dc43d03d7a57dd38c8d61d6\": rpc error: code = NotFound desc = could not find container \"e6656096bebb07d98138098c03032af517ed8e4b6dc43d03d7a57dd38c8d61d6\": container with ID starting with e6656096bebb07d98138098c03032af517ed8e4b6dc43d03d7a57dd38c8d61d6 not found: ID does not exist" Nov 29 05:43:29 crc kubenswrapper[4594]: I1129 05:43:29.823823 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-n4msp"] Nov 29 05:43:29 crc kubenswrapper[4594]: I1129 05:43:29.830058 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-n4msp"] Nov 29 05:43:29 crc kubenswrapper[4594]: I1129 05:43:29.882110 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Nov 29 05:43:29 crc kubenswrapper[4594]: I1129 05:43:29.983602 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-85b0c002-5ac7-4caa-8203-64f617b7cd7d\") pod \"4e4fe89a-3363-4724-89a8-a9b61fe6f039\" (UID: \"4e4fe89a-3363-4724-89a8-a9b61fe6f039\") " Nov 29 05:43:29 crc kubenswrapper[4594]: I1129 05:43:29.983644 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/4e4fe89a-3363-4724-89a8-a9b61fe6f039-prometheus-metric-storage-rulefiles-0\") pod \"4e4fe89a-3363-4724-89a8-a9b61fe6f039\" (UID: \"4e4fe89a-3363-4724-89a8-a9b61fe6f039\") " Nov 29 05:43:29 crc kubenswrapper[4594]: I1129 05:43:29.983668 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4e4fe89a-3363-4724-89a8-a9b61fe6f039-config-out\") pod \"4e4fe89a-3363-4724-89a8-a9b61fe6f039\" (UID: \"4e4fe89a-3363-4724-89a8-a9b61fe6f039\") " Nov 29 05:43:29 crc kubenswrapper[4594]: I1129 05:43:29.983700 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4e4fe89a-3363-4724-89a8-a9b61fe6f039-config\") pod \"4e4fe89a-3363-4724-89a8-a9b61fe6f039\" (UID: \"4e4fe89a-3363-4724-89a8-a9b61fe6f039\") " Nov 29 05:43:29 crc kubenswrapper[4594]: I1129 05:43:29.983880 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gdq58\" (UniqueName: \"kubernetes.io/projected/4e4fe89a-3363-4724-89a8-a9b61fe6f039-kube-api-access-gdq58\") pod \"4e4fe89a-3363-4724-89a8-a9b61fe6f039\" (UID: \"4e4fe89a-3363-4724-89a8-a9b61fe6f039\") " Nov 29 05:43:29 crc kubenswrapper[4594]: I1129 05:43:29.983925 4594 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4e4fe89a-3363-4724-89a8-a9b61fe6f039-web-config\") pod \"4e4fe89a-3363-4724-89a8-a9b61fe6f039\" (UID: \"4e4fe89a-3363-4724-89a8-a9b61fe6f039\") " Nov 29 05:43:29 crc kubenswrapper[4594]: I1129 05:43:29.983948 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4e4fe89a-3363-4724-89a8-a9b61fe6f039-tls-assets\") pod \"4e4fe89a-3363-4724-89a8-a9b61fe6f039\" (UID: \"4e4fe89a-3363-4724-89a8-a9b61fe6f039\") " Nov 29 05:43:29 crc kubenswrapper[4594]: I1129 05:43:29.983966 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/4e4fe89a-3363-4724-89a8-a9b61fe6f039-thanos-prometheus-http-client-file\") pod \"4e4fe89a-3363-4724-89a8-a9b61fe6f039\" (UID: \"4e4fe89a-3363-4724-89a8-a9b61fe6f039\") " Nov 29 05:43:29 crc kubenswrapper[4594]: I1129 05:43:29.984300 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e4fe89a-3363-4724-89a8-a9b61fe6f039-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "4e4fe89a-3363-4724-89a8-a9b61fe6f039" (UID: "4e4fe89a-3363-4724-89a8-a9b61fe6f039"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:43:29 crc kubenswrapper[4594]: I1129 05:43:29.984540 4594 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/4e4fe89a-3363-4724-89a8-a9b61fe6f039-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Nov 29 05:43:29 crc kubenswrapper[4594]: I1129 05:43:29.989408 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e4fe89a-3363-4724-89a8-a9b61fe6f039-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "4e4fe89a-3363-4724-89a8-a9b61fe6f039" (UID: "4e4fe89a-3363-4724-89a8-a9b61fe6f039"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:43:29 crc kubenswrapper[4594]: I1129 05:43:29.990338 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e4fe89a-3363-4724-89a8-a9b61fe6f039-config" (OuterVolumeSpecName: "config") pod "4e4fe89a-3363-4724-89a8-a9b61fe6f039" (UID: "4e4fe89a-3363-4724-89a8-a9b61fe6f039"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:43:29 crc kubenswrapper[4594]: I1129 05:43:29.990381 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e4fe89a-3363-4724-89a8-a9b61fe6f039-kube-api-access-gdq58" (OuterVolumeSpecName: "kube-api-access-gdq58") pod "4e4fe89a-3363-4724-89a8-a9b61fe6f039" (UID: "4e4fe89a-3363-4724-89a8-a9b61fe6f039"). InnerVolumeSpecName "kube-api-access-gdq58". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:43:29 crc kubenswrapper[4594]: I1129 05:43:29.990446 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e4fe89a-3363-4724-89a8-a9b61fe6f039-config-out" (OuterVolumeSpecName: "config-out") pod "4e4fe89a-3363-4724-89a8-a9b61fe6f039" (UID: "4e4fe89a-3363-4724-89a8-a9b61fe6f039"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 05:43:29 crc kubenswrapper[4594]: I1129 05:43:29.990858 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e4fe89a-3363-4724-89a8-a9b61fe6f039-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "4e4fe89a-3363-4724-89a8-a9b61fe6f039" (UID: "4e4fe89a-3363-4724-89a8-a9b61fe6f039"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:43:30 crc kubenswrapper[4594]: I1129 05:43:30.001937 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-85b0c002-5ac7-4caa-8203-64f617b7cd7d" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "4e4fe89a-3363-4724-89a8-a9b61fe6f039" (UID: "4e4fe89a-3363-4724-89a8-a9b61fe6f039"). InnerVolumeSpecName "pvc-85b0c002-5ac7-4caa-8203-64f617b7cd7d". PluginName "kubernetes.io/csi", VolumeGidValue "" Nov 29 05:43:30 crc kubenswrapper[4594]: I1129 05:43:30.008406 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e4fe89a-3363-4724-89a8-a9b61fe6f039-web-config" (OuterVolumeSpecName: "web-config") pod "4e4fe89a-3363-4724-89a8-a9b61fe6f039" (UID: "4e4fe89a-3363-4724-89a8-a9b61fe6f039"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:43:30 crc kubenswrapper[4594]: I1129 05:43:30.086816 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gdq58\" (UniqueName: \"kubernetes.io/projected/4e4fe89a-3363-4724-89a8-a9b61fe6f039-kube-api-access-gdq58\") on node \"crc\" DevicePath \"\"" Nov 29 05:43:30 crc kubenswrapper[4594]: I1129 05:43:30.086847 4594 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4e4fe89a-3363-4724-89a8-a9b61fe6f039-web-config\") on node \"crc\" DevicePath \"\"" Nov 29 05:43:30 crc kubenswrapper[4594]: I1129 05:43:30.086863 4594 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4e4fe89a-3363-4724-89a8-a9b61fe6f039-tls-assets\") on node \"crc\" DevicePath \"\"" Nov 29 05:43:30 crc kubenswrapper[4594]: I1129 05:43:30.086875 4594 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/4e4fe89a-3363-4724-89a8-a9b61fe6f039-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Nov 29 05:43:30 crc kubenswrapper[4594]: I1129 05:43:30.086915 4594 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-85b0c002-5ac7-4caa-8203-64f617b7cd7d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-85b0c002-5ac7-4caa-8203-64f617b7cd7d\") on node \"crc\" " Nov 29 05:43:30 crc kubenswrapper[4594]: I1129 05:43:30.086928 4594 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4e4fe89a-3363-4724-89a8-a9b61fe6f039-config-out\") on node \"crc\" DevicePath \"\"" Nov 29 05:43:30 crc kubenswrapper[4594]: I1129 05:43:30.086937 4594 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/4e4fe89a-3363-4724-89a8-a9b61fe6f039-config\") on node \"crc\" DevicePath \"\"" Nov 29 
05:43:30 crc kubenswrapper[4594]: I1129 05:43:30.098160 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="398463af-ebf5-4586-8a81-bc9b7cdce9c0" path="/var/lib/kubelet/pods/398463af-ebf5-4586-8a81-bc9b7cdce9c0/volumes" Nov 29 05:43:30 crc kubenswrapper[4594]: I1129 05:43:30.107899 4594 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Nov 29 05:43:30 crc kubenswrapper[4594]: I1129 05:43:30.108050 4594 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-85b0c002-5ac7-4caa-8203-64f617b7cd7d" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-85b0c002-5ac7-4caa-8203-64f617b7cd7d") on node "crc" Nov 29 05:43:30 crc kubenswrapper[4594]: I1129 05:43:30.189233 4594 reconciler_common.go:293] "Volume detached for volume \"pvc-85b0c002-5ac7-4caa-8203-64f617b7cd7d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-85b0c002-5ac7-4caa-8203-64f617b7cd7d\") on node \"crc\" DevicePath \"\"" Nov 29 05:43:30 crc kubenswrapper[4594]: I1129 05:43:30.483145 4594 generic.go:334] "Generic (PLEG): container finished" podID="f63cf943-e9c0-4f70-9f7b-8ecf859c92ae" containerID="7e0230e41b5936cf69830230f2b1b8d114fe4b3b43c56335758c7e6305e211a1" exitCode=0 Nov 29 05:43:30 crc kubenswrapper[4594]: I1129 05:43:30.483226 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-wtghb" event={"ID":"f63cf943-e9c0-4f70-9f7b-8ecf859c92ae","Type":"ContainerDied","Data":"7e0230e41b5936cf69830230f2b1b8d114fe4b3b43c56335758c7e6305e211a1"} Nov 29 05:43:30 crc kubenswrapper[4594]: I1129 05:43:30.495653 4594 generic.go:334] "Generic (PLEG): container finished" podID="4e4fe89a-3363-4724-89a8-a9b61fe6f039" containerID="a05f858d08ad1ac63b4c149b4ad984bfe6f13bdd53d00d2f44f4b2906b2e1430" exitCode=0 Nov 29 05:43:30 crc kubenswrapper[4594]: I1129 05:43:30.495685 4594 generic.go:334] "Generic (PLEG): container finished" 
podID="4e4fe89a-3363-4724-89a8-a9b61fe6f039" containerID="5814a998d2de6d2998082fe37b44771b97b0f348fcca1915adbd53597116025a" exitCode=0 Nov 29 05:43:30 crc kubenswrapper[4594]: I1129 05:43:30.495713 4594 generic.go:334] "Generic (PLEG): container finished" podID="4e4fe89a-3363-4724-89a8-a9b61fe6f039" containerID="6d1c4ce5f27e08c3ba707abeac8cf0a5481b8d43e0c65692bc7cc6019efdb31a" exitCode=0 Nov 29 05:43:30 crc kubenswrapper[4594]: I1129 05:43:30.495742 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Nov 29 05:43:30 crc kubenswrapper[4594]: I1129 05:43:30.495766 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"4e4fe89a-3363-4724-89a8-a9b61fe6f039","Type":"ContainerDied","Data":"a05f858d08ad1ac63b4c149b4ad984bfe6f13bdd53d00d2f44f4b2906b2e1430"} Nov 29 05:43:30 crc kubenswrapper[4594]: I1129 05:43:30.495840 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"4e4fe89a-3363-4724-89a8-a9b61fe6f039","Type":"ContainerDied","Data":"5814a998d2de6d2998082fe37b44771b97b0f348fcca1915adbd53597116025a"} Nov 29 05:43:30 crc kubenswrapper[4594]: I1129 05:43:30.495853 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"4e4fe89a-3363-4724-89a8-a9b61fe6f039","Type":"ContainerDied","Data":"6d1c4ce5f27e08c3ba707abeac8cf0a5481b8d43e0c65692bc7cc6019efdb31a"} Nov 29 05:43:30 crc kubenswrapper[4594]: I1129 05:43:30.495863 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"4e4fe89a-3363-4724-89a8-a9b61fe6f039","Type":"ContainerDied","Data":"3f31231729b0a8d86e298a73e5b59f8c5108af4c552716077e571032c6d1d3ab"} Nov 29 05:43:30 crc kubenswrapper[4594]: I1129 05:43:30.495884 4594 scope.go:117] "RemoveContainer" 
containerID="a05f858d08ad1ac63b4c149b4ad984bfe6f13bdd53d00d2f44f4b2906b2e1430" Nov 29 05:43:30 crc kubenswrapper[4594]: I1129 05:43:30.526217 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Nov 29 05:43:30 crc kubenswrapper[4594]: I1129 05:43:30.538628 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Nov 29 05:43:30 crc kubenswrapper[4594]: I1129 05:43:30.550749 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Nov 29 05:43:30 crc kubenswrapper[4594]: E1129 05:43:30.551199 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e4fe89a-3363-4724-89a8-a9b61fe6f039" containerName="prometheus" Nov 29 05:43:30 crc kubenswrapper[4594]: I1129 05:43:30.551219 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e4fe89a-3363-4724-89a8-a9b61fe6f039" containerName="prometheus" Nov 29 05:43:30 crc kubenswrapper[4594]: E1129 05:43:30.551229 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e4fe89a-3363-4724-89a8-a9b61fe6f039" containerName="config-reloader" Nov 29 05:43:30 crc kubenswrapper[4594]: I1129 05:43:30.551236 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e4fe89a-3363-4724-89a8-a9b61fe6f039" containerName="config-reloader" Nov 29 05:43:30 crc kubenswrapper[4594]: E1129 05:43:30.551286 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e37693f9-769d-409e-a90a-946d5fbe00d9" containerName="mariadb-database-create" Nov 29 05:43:30 crc kubenswrapper[4594]: I1129 05:43:30.551294 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="e37693f9-769d-409e-a90a-946d5fbe00d9" containerName="mariadb-database-create" Nov 29 05:43:30 crc kubenswrapper[4594]: E1129 05:43:30.551306 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="398463af-ebf5-4586-8a81-bc9b7cdce9c0" containerName="registry-server" Nov 29 05:43:30 crc 
kubenswrapper[4594]: I1129 05:43:30.551312 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="398463af-ebf5-4586-8a81-bc9b7cdce9c0" containerName="registry-server" Nov 29 05:43:30 crc kubenswrapper[4594]: E1129 05:43:30.551327 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e4fe89a-3363-4724-89a8-a9b61fe6f039" containerName="thanos-sidecar" Nov 29 05:43:30 crc kubenswrapper[4594]: I1129 05:43:30.551334 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e4fe89a-3363-4724-89a8-a9b61fe6f039" containerName="thanos-sidecar" Nov 29 05:43:30 crc kubenswrapper[4594]: E1129 05:43:30.551343 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="398463af-ebf5-4586-8a81-bc9b7cdce9c0" containerName="extract-utilities" Nov 29 05:43:30 crc kubenswrapper[4594]: I1129 05:43:30.551350 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="398463af-ebf5-4586-8a81-bc9b7cdce9c0" containerName="extract-utilities" Nov 29 05:43:30 crc kubenswrapper[4594]: E1129 05:43:30.551365 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b1647cf-8657-466a-8579-2ce1f5ac45a3" containerName="mariadb-account-create-update" Nov 29 05:43:30 crc kubenswrapper[4594]: I1129 05:43:30.551371 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b1647cf-8657-466a-8579-2ce1f5ac45a3" containerName="mariadb-account-create-update" Nov 29 05:43:30 crc kubenswrapper[4594]: E1129 05:43:30.551382 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e4fe89a-3363-4724-89a8-a9b61fe6f039" containerName="init-config-reloader" Nov 29 05:43:30 crc kubenswrapper[4594]: I1129 05:43:30.551388 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e4fe89a-3363-4724-89a8-a9b61fe6f039" containerName="init-config-reloader" Nov 29 05:43:30 crc kubenswrapper[4594]: E1129 05:43:30.551397 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="398463af-ebf5-4586-8a81-bc9b7cdce9c0" 
containerName="extract-content" Nov 29 05:43:30 crc kubenswrapper[4594]: I1129 05:43:30.551402 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="398463af-ebf5-4586-8a81-bc9b7cdce9c0" containerName="extract-content" Nov 29 05:43:30 crc kubenswrapper[4594]: I1129 05:43:30.551608 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e4fe89a-3363-4724-89a8-a9b61fe6f039" containerName="prometheus" Nov 29 05:43:30 crc kubenswrapper[4594]: I1129 05:43:30.551627 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="e37693f9-769d-409e-a90a-946d5fbe00d9" containerName="mariadb-database-create" Nov 29 05:43:30 crc kubenswrapper[4594]: I1129 05:43:30.551640 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e4fe89a-3363-4724-89a8-a9b61fe6f039" containerName="thanos-sidecar" Nov 29 05:43:30 crc kubenswrapper[4594]: I1129 05:43:30.551652 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e4fe89a-3363-4724-89a8-a9b61fe6f039" containerName="config-reloader" Nov 29 05:43:30 crc kubenswrapper[4594]: I1129 05:43:30.551662 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b1647cf-8657-466a-8579-2ce1f5ac45a3" containerName="mariadb-account-create-update" Nov 29 05:43:30 crc kubenswrapper[4594]: I1129 05:43:30.551669 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="398463af-ebf5-4586-8a81-bc9b7cdce9c0" containerName="registry-server" Nov 29 05:43:30 crc kubenswrapper[4594]: I1129 05:43:30.553427 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Nov 29 05:43:30 crc kubenswrapper[4594]: I1129 05:43:30.556453 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Nov 29 05:43:30 crc kubenswrapper[4594]: I1129 05:43:30.556677 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Nov 29 05:43:30 crc kubenswrapper[4594]: I1129 05:43:30.556759 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Nov 29 05:43:30 crc kubenswrapper[4594]: I1129 05:43:30.556829 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Nov 29 05:43:30 crc kubenswrapper[4594]: I1129 05:43:30.563633 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Nov 29 05:43:30 crc kubenswrapper[4594]: I1129 05:43:30.564176 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-w8z2f" Nov 29 05:43:30 crc kubenswrapper[4594]: I1129 05:43:30.564669 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Nov 29 05:43:30 crc kubenswrapper[4594]: I1129 05:43:30.565604 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Nov 29 05:43:30 crc kubenswrapper[4594]: I1129 05:43:30.699321 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0e44006f-3a80-4bf2-aac6-8d5c664d0db8-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"0e44006f-3a80-4bf2-aac6-8d5c664d0db8\") " pod="openstack/prometheus-metric-storage-0" Nov 29 05:43:30 crc kubenswrapper[4594]: I1129 05:43:30.699398 4594 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-85b0c002-5ac7-4caa-8203-64f617b7cd7d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-85b0c002-5ac7-4caa-8203-64f617b7cd7d\") pod \"prometheus-metric-storage-0\" (UID: \"0e44006f-3a80-4bf2-aac6-8d5c664d0db8\") " pod="openstack/prometheus-metric-storage-0" Nov 29 05:43:30 crc kubenswrapper[4594]: I1129 05:43:30.699430 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e44006f-3a80-4bf2-aac6-8d5c664d0db8-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"0e44006f-3a80-4bf2-aac6-8d5c664d0db8\") " pod="openstack/prometheus-metric-storage-0" Nov 29 05:43:30 crc kubenswrapper[4594]: I1129 05:43:30.699494 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0e44006f-3a80-4bf2-aac6-8d5c664d0db8-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"0e44006f-3a80-4bf2-aac6-8d5c664d0db8\") " pod="openstack/prometheus-metric-storage-0" Nov 29 05:43:30 crc kubenswrapper[4594]: I1129 05:43:30.699519 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/0e44006f-3a80-4bf2-aac6-8d5c664d0db8-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"0e44006f-3a80-4bf2-aac6-8d5c664d0db8\") " pod="openstack/prometheus-metric-storage-0" Nov 29 05:43:30 crc kubenswrapper[4594]: I1129 05:43:30.699654 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/0e44006f-3a80-4bf2-aac6-8d5c664d0db8-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"0e44006f-3a80-4bf2-aac6-8d5c664d0db8\") " pod="openstack/prometheus-metric-storage-0" Nov 29 05:43:30 crc kubenswrapper[4594]: I1129 05:43:30.699706 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0e44006f-3a80-4bf2-aac6-8d5c664d0db8-config\") pod \"prometheus-metric-storage-0\" (UID: \"0e44006f-3a80-4bf2-aac6-8d5c664d0db8\") " pod="openstack/prometheus-metric-storage-0" Nov 29 05:43:30 crc kubenswrapper[4594]: I1129 05:43:30.699738 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0e44006f-3a80-4bf2-aac6-8d5c664d0db8-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"0e44006f-3a80-4bf2-aac6-8d5c664d0db8\") " pod="openstack/prometheus-metric-storage-0" Nov 29 05:43:30 crc kubenswrapper[4594]: I1129 05:43:30.699754 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/0e44006f-3a80-4bf2-aac6-8d5c664d0db8-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"0e44006f-3a80-4bf2-aac6-8d5c664d0db8\") " pod="openstack/prometheus-metric-storage-0" Nov 29 05:43:30 crc kubenswrapper[4594]: I1129 05:43:30.699945 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j796b\" (UniqueName: \"kubernetes.io/projected/0e44006f-3a80-4bf2-aac6-8d5c664d0db8-kube-api-access-j796b\") pod \"prometheus-metric-storage-0\" (UID: \"0e44006f-3a80-4bf2-aac6-8d5c664d0db8\") " pod="openstack/prometheus-metric-storage-0" Nov 29 05:43:30 crc kubenswrapper[4594]: I1129 05:43:30.700039 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/0e44006f-3a80-4bf2-aac6-8d5c664d0db8-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"0e44006f-3a80-4bf2-aac6-8d5c664d0db8\") " pod="openstack/prometheus-metric-storage-0" Nov 29 05:43:30 crc kubenswrapper[4594]: I1129 05:43:30.801894 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j796b\" (UniqueName: \"kubernetes.io/projected/0e44006f-3a80-4bf2-aac6-8d5c664d0db8-kube-api-access-j796b\") pod \"prometheus-metric-storage-0\" (UID: \"0e44006f-3a80-4bf2-aac6-8d5c664d0db8\") " pod="openstack/prometheus-metric-storage-0" Nov 29 05:43:30 crc kubenswrapper[4594]: I1129 05:43:30.802222 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/0e44006f-3a80-4bf2-aac6-8d5c664d0db8-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"0e44006f-3a80-4bf2-aac6-8d5c664d0db8\") " pod="openstack/prometheus-metric-storage-0" Nov 29 05:43:30 crc kubenswrapper[4594]: I1129 05:43:30.802284 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0e44006f-3a80-4bf2-aac6-8d5c664d0db8-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"0e44006f-3a80-4bf2-aac6-8d5c664d0db8\") " pod="openstack/prometheus-metric-storage-0" Nov 29 05:43:30 crc kubenswrapper[4594]: I1129 05:43:30.802321 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-85b0c002-5ac7-4caa-8203-64f617b7cd7d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-85b0c002-5ac7-4caa-8203-64f617b7cd7d\") pod \"prometheus-metric-storage-0\" (UID: \"0e44006f-3a80-4bf2-aac6-8d5c664d0db8\") 
" pod="openstack/prometheus-metric-storage-0" Nov 29 05:43:30 crc kubenswrapper[4594]: I1129 05:43:30.802348 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e44006f-3a80-4bf2-aac6-8d5c664d0db8-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"0e44006f-3a80-4bf2-aac6-8d5c664d0db8\") " pod="openstack/prometheus-metric-storage-0" Nov 29 05:43:30 crc kubenswrapper[4594]: I1129 05:43:30.802405 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0e44006f-3a80-4bf2-aac6-8d5c664d0db8-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"0e44006f-3a80-4bf2-aac6-8d5c664d0db8\") " pod="openstack/prometheus-metric-storage-0" Nov 29 05:43:30 crc kubenswrapper[4594]: I1129 05:43:30.802432 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/0e44006f-3a80-4bf2-aac6-8d5c664d0db8-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"0e44006f-3a80-4bf2-aac6-8d5c664d0db8\") " pod="openstack/prometheus-metric-storage-0" Nov 29 05:43:30 crc kubenswrapper[4594]: I1129 05:43:30.802462 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0e44006f-3a80-4bf2-aac6-8d5c664d0db8-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"0e44006f-3a80-4bf2-aac6-8d5c664d0db8\") " pod="openstack/prometheus-metric-storage-0" Nov 29 05:43:30 crc kubenswrapper[4594]: I1129 05:43:30.802483 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0e44006f-3a80-4bf2-aac6-8d5c664d0db8-config\") pod 
\"prometheus-metric-storage-0\" (UID: \"0e44006f-3a80-4bf2-aac6-8d5c664d0db8\") " pod="openstack/prometheus-metric-storage-0" Nov 29 05:43:30 crc kubenswrapper[4594]: I1129 05:43:30.802501 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0e44006f-3a80-4bf2-aac6-8d5c664d0db8-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"0e44006f-3a80-4bf2-aac6-8d5c664d0db8\") " pod="openstack/prometheus-metric-storage-0" Nov 29 05:43:30 crc kubenswrapper[4594]: I1129 05:43:30.802516 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/0e44006f-3a80-4bf2-aac6-8d5c664d0db8-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"0e44006f-3a80-4bf2-aac6-8d5c664d0db8\") " pod="openstack/prometheus-metric-storage-0" Nov 29 05:43:30 crc kubenswrapper[4594]: I1129 05:43:30.803380 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0e44006f-3a80-4bf2-aac6-8d5c664d0db8-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"0e44006f-3a80-4bf2-aac6-8d5c664d0db8\") " pod="openstack/prometheus-metric-storage-0" Nov 29 05:43:30 crc kubenswrapper[4594]: I1129 05:43:30.812045 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/0e44006f-3a80-4bf2-aac6-8d5c664d0db8-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"0e44006f-3a80-4bf2-aac6-8d5c664d0db8\") " pod="openstack/prometheus-metric-storage-0" Nov 29 05:43:30 crc kubenswrapper[4594]: I1129 05:43:30.812095 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/0e44006f-3a80-4bf2-aac6-8d5c664d0db8-config\") 
pod \"prometheus-metric-storage-0\" (UID: \"0e44006f-3a80-4bf2-aac6-8d5c664d0db8\") " pod="openstack/prometheus-metric-storage-0" Nov 29 05:43:30 crc kubenswrapper[4594]: I1129 05:43:30.812988 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0e44006f-3a80-4bf2-aac6-8d5c664d0db8-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"0e44006f-3a80-4bf2-aac6-8d5c664d0db8\") " pod="openstack/prometheus-metric-storage-0" Nov 29 05:43:30 crc kubenswrapper[4594]: I1129 05:43:30.814763 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/0e44006f-3a80-4bf2-aac6-8d5c664d0db8-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"0e44006f-3a80-4bf2-aac6-8d5c664d0db8\") " pod="openstack/prometheus-metric-storage-0" Nov 29 05:43:30 crc kubenswrapper[4594]: I1129 05:43:30.815269 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/0e44006f-3a80-4bf2-aac6-8d5c664d0db8-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"0e44006f-3a80-4bf2-aac6-8d5c664d0db8\") " pod="openstack/prometheus-metric-storage-0" Nov 29 05:43:30 crc kubenswrapper[4594]: I1129 05:43:30.815294 4594 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Nov 29 05:43:30 crc kubenswrapper[4594]: I1129 05:43:30.815419 4594 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-85b0c002-5ac7-4caa-8203-64f617b7cd7d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-85b0c002-5ac7-4caa-8203-64f617b7cd7d\") pod \"prometheus-metric-storage-0\" (UID: \"0e44006f-3a80-4bf2-aac6-8d5c664d0db8\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/740def2cd7269ac7c6453fb83dbbc5807ffe59ef523e7a581c6b5220b8504e7f/globalmount\"" pod="openstack/prometheus-metric-storage-0" Nov 29 05:43:30 crc kubenswrapper[4594]: I1129 05:43:30.817889 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e44006f-3a80-4bf2-aac6-8d5c664d0db8-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"0e44006f-3a80-4bf2-aac6-8d5c664d0db8\") " pod="openstack/prometheus-metric-storage-0" Nov 29 05:43:30 crc kubenswrapper[4594]: I1129 05:43:30.823472 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j796b\" (UniqueName: \"kubernetes.io/projected/0e44006f-3a80-4bf2-aac6-8d5c664d0db8-kube-api-access-j796b\") pod \"prometheus-metric-storage-0\" (UID: \"0e44006f-3a80-4bf2-aac6-8d5c664d0db8\") " pod="openstack/prometheus-metric-storage-0" Nov 29 05:43:30 crc kubenswrapper[4594]: I1129 05:43:30.830418 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0e44006f-3a80-4bf2-aac6-8d5c664d0db8-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"0e44006f-3a80-4bf2-aac6-8d5c664d0db8\") " pod="openstack/prometheus-metric-storage-0" Nov 29 05:43:30 crc kubenswrapper[4594]: I1129 05:43:30.837694 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/0e44006f-3a80-4bf2-aac6-8d5c664d0db8-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"0e44006f-3a80-4bf2-aac6-8d5c664d0db8\") " pod="openstack/prometheus-metric-storage-0" Nov 29 05:43:30 crc kubenswrapper[4594]: I1129 05:43:30.862764 4594 scope.go:117] "RemoveContainer" containerID="5814a998d2de6d2998082fe37b44771b97b0f348fcca1915adbd53597116025a" Nov 29 05:43:30 crc kubenswrapper[4594]: I1129 05:43:30.911522 4594 scope.go:117] "RemoveContainer" containerID="6d1c4ce5f27e08c3ba707abeac8cf0a5481b8d43e0c65692bc7cc6019efdb31a" Nov 29 05:43:30 crc kubenswrapper[4594]: I1129 05:43:30.940642 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-85b0c002-5ac7-4caa-8203-64f617b7cd7d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-85b0c002-5ac7-4caa-8203-64f617b7cd7d\") pod \"prometheus-metric-storage-0\" (UID: \"0e44006f-3a80-4bf2-aac6-8d5c664d0db8\") " pod="openstack/prometheus-metric-storage-0" Nov 29 05:43:30 crc kubenswrapper[4594]: I1129 05:43:30.973309 4594 scope.go:117] "RemoveContainer" containerID="a16740da5b9f233ab8f5eceecfe70c451e89542cb502445ca3143bb7accd8e1e" Nov 29 05:43:31 crc kubenswrapper[4594]: I1129 05:43:31.004304 4594 scope.go:117] "RemoveContainer" containerID="a05f858d08ad1ac63b4c149b4ad984bfe6f13bdd53d00d2f44f4b2906b2e1430" Nov 29 05:43:31 crc kubenswrapper[4594]: E1129 05:43:31.004803 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a05f858d08ad1ac63b4c149b4ad984bfe6f13bdd53d00d2f44f4b2906b2e1430\": container with ID starting with a05f858d08ad1ac63b4c149b4ad984bfe6f13bdd53d00d2f44f4b2906b2e1430 not found: ID does not exist" containerID="a05f858d08ad1ac63b4c149b4ad984bfe6f13bdd53d00d2f44f4b2906b2e1430" Nov 29 05:43:31 crc kubenswrapper[4594]: I1129 05:43:31.004853 4594 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a05f858d08ad1ac63b4c149b4ad984bfe6f13bdd53d00d2f44f4b2906b2e1430"} err="failed to get container status \"a05f858d08ad1ac63b4c149b4ad984bfe6f13bdd53d00d2f44f4b2906b2e1430\": rpc error: code = NotFound desc = could not find container \"a05f858d08ad1ac63b4c149b4ad984bfe6f13bdd53d00d2f44f4b2906b2e1430\": container with ID starting with a05f858d08ad1ac63b4c149b4ad984bfe6f13bdd53d00d2f44f4b2906b2e1430 not found: ID does not exist" Nov 29 05:43:31 crc kubenswrapper[4594]: I1129 05:43:31.004886 4594 scope.go:117] "RemoveContainer" containerID="5814a998d2de6d2998082fe37b44771b97b0f348fcca1915adbd53597116025a" Nov 29 05:43:31 crc kubenswrapper[4594]: E1129 05:43:31.005349 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5814a998d2de6d2998082fe37b44771b97b0f348fcca1915adbd53597116025a\": container with ID starting with 5814a998d2de6d2998082fe37b44771b97b0f348fcca1915adbd53597116025a not found: ID does not exist" containerID="5814a998d2de6d2998082fe37b44771b97b0f348fcca1915adbd53597116025a" Nov 29 05:43:31 crc kubenswrapper[4594]: I1129 05:43:31.005389 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5814a998d2de6d2998082fe37b44771b97b0f348fcca1915adbd53597116025a"} err="failed to get container status \"5814a998d2de6d2998082fe37b44771b97b0f348fcca1915adbd53597116025a\": rpc error: code = NotFound desc = could not find container \"5814a998d2de6d2998082fe37b44771b97b0f348fcca1915adbd53597116025a\": container with ID starting with 5814a998d2de6d2998082fe37b44771b97b0f348fcca1915adbd53597116025a not found: ID does not exist" Nov 29 05:43:31 crc kubenswrapper[4594]: I1129 05:43:31.005406 4594 scope.go:117] "RemoveContainer" containerID="6d1c4ce5f27e08c3ba707abeac8cf0a5481b8d43e0c65692bc7cc6019efdb31a" Nov 29 05:43:31 crc kubenswrapper[4594]: E1129 05:43:31.005678 4594 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"6d1c4ce5f27e08c3ba707abeac8cf0a5481b8d43e0c65692bc7cc6019efdb31a\": container with ID starting with 6d1c4ce5f27e08c3ba707abeac8cf0a5481b8d43e0c65692bc7cc6019efdb31a not found: ID does not exist" containerID="6d1c4ce5f27e08c3ba707abeac8cf0a5481b8d43e0c65692bc7cc6019efdb31a" Nov 29 05:43:31 crc kubenswrapper[4594]: I1129 05:43:31.005708 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d1c4ce5f27e08c3ba707abeac8cf0a5481b8d43e0c65692bc7cc6019efdb31a"} err="failed to get container status \"6d1c4ce5f27e08c3ba707abeac8cf0a5481b8d43e0c65692bc7cc6019efdb31a\": rpc error: code = NotFound desc = could not find container \"6d1c4ce5f27e08c3ba707abeac8cf0a5481b8d43e0c65692bc7cc6019efdb31a\": container with ID starting with 6d1c4ce5f27e08c3ba707abeac8cf0a5481b8d43e0c65692bc7cc6019efdb31a not found: ID does not exist" Nov 29 05:43:31 crc kubenswrapper[4594]: I1129 05:43:31.005726 4594 scope.go:117] "RemoveContainer" containerID="a16740da5b9f233ab8f5eceecfe70c451e89542cb502445ca3143bb7accd8e1e" Nov 29 05:43:31 crc kubenswrapper[4594]: E1129 05:43:31.007500 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a16740da5b9f233ab8f5eceecfe70c451e89542cb502445ca3143bb7accd8e1e\": container with ID starting with a16740da5b9f233ab8f5eceecfe70c451e89542cb502445ca3143bb7accd8e1e not found: ID does not exist" containerID="a16740da5b9f233ab8f5eceecfe70c451e89542cb502445ca3143bb7accd8e1e" Nov 29 05:43:31 crc kubenswrapper[4594]: I1129 05:43:31.007541 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a16740da5b9f233ab8f5eceecfe70c451e89542cb502445ca3143bb7accd8e1e"} err="failed to get container status \"a16740da5b9f233ab8f5eceecfe70c451e89542cb502445ca3143bb7accd8e1e\": rpc error: code = NotFound desc = could not find container 
\"a16740da5b9f233ab8f5eceecfe70c451e89542cb502445ca3143bb7accd8e1e\": container with ID starting with a16740da5b9f233ab8f5eceecfe70c451e89542cb502445ca3143bb7accd8e1e not found: ID does not exist" Nov 29 05:43:31 crc kubenswrapper[4594]: I1129 05:43:31.007580 4594 scope.go:117] "RemoveContainer" containerID="a05f858d08ad1ac63b4c149b4ad984bfe6f13bdd53d00d2f44f4b2906b2e1430" Nov 29 05:43:31 crc kubenswrapper[4594]: I1129 05:43:31.007994 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a05f858d08ad1ac63b4c149b4ad984bfe6f13bdd53d00d2f44f4b2906b2e1430"} err="failed to get container status \"a05f858d08ad1ac63b4c149b4ad984bfe6f13bdd53d00d2f44f4b2906b2e1430\": rpc error: code = NotFound desc = could not find container \"a05f858d08ad1ac63b4c149b4ad984bfe6f13bdd53d00d2f44f4b2906b2e1430\": container with ID starting with a05f858d08ad1ac63b4c149b4ad984bfe6f13bdd53d00d2f44f4b2906b2e1430 not found: ID does not exist" Nov 29 05:43:31 crc kubenswrapper[4594]: I1129 05:43:31.008084 4594 scope.go:117] "RemoveContainer" containerID="5814a998d2de6d2998082fe37b44771b97b0f348fcca1915adbd53597116025a" Nov 29 05:43:31 crc kubenswrapper[4594]: I1129 05:43:31.008356 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5814a998d2de6d2998082fe37b44771b97b0f348fcca1915adbd53597116025a"} err="failed to get container status \"5814a998d2de6d2998082fe37b44771b97b0f348fcca1915adbd53597116025a\": rpc error: code = NotFound desc = could not find container \"5814a998d2de6d2998082fe37b44771b97b0f348fcca1915adbd53597116025a\": container with ID starting with 5814a998d2de6d2998082fe37b44771b97b0f348fcca1915adbd53597116025a not found: ID does not exist" Nov 29 05:43:31 crc kubenswrapper[4594]: I1129 05:43:31.008421 4594 scope.go:117] "RemoveContainer" containerID="6d1c4ce5f27e08c3ba707abeac8cf0a5481b8d43e0c65692bc7cc6019efdb31a" Nov 29 05:43:31 crc kubenswrapper[4594]: I1129 05:43:31.008937 4594 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d1c4ce5f27e08c3ba707abeac8cf0a5481b8d43e0c65692bc7cc6019efdb31a"} err="failed to get container status \"6d1c4ce5f27e08c3ba707abeac8cf0a5481b8d43e0c65692bc7cc6019efdb31a\": rpc error: code = NotFound desc = could not find container \"6d1c4ce5f27e08c3ba707abeac8cf0a5481b8d43e0c65692bc7cc6019efdb31a\": container with ID starting with 6d1c4ce5f27e08c3ba707abeac8cf0a5481b8d43e0c65692bc7cc6019efdb31a not found: ID does not exist" Nov 29 05:43:31 crc kubenswrapper[4594]: I1129 05:43:31.009008 4594 scope.go:117] "RemoveContainer" containerID="a16740da5b9f233ab8f5eceecfe70c451e89542cb502445ca3143bb7accd8e1e" Nov 29 05:43:31 crc kubenswrapper[4594]: I1129 05:43:31.009550 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a16740da5b9f233ab8f5eceecfe70c451e89542cb502445ca3143bb7accd8e1e"} err="failed to get container status \"a16740da5b9f233ab8f5eceecfe70c451e89542cb502445ca3143bb7accd8e1e\": rpc error: code = NotFound desc = could not find container \"a16740da5b9f233ab8f5eceecfe70c451e89542cb502445ca3143bb7accd8e1e\": container with ID starting with a16740da5b9f233ab8f5eceecfe70c451e89542cb502445ca3143bb7accd8e1e not found: ID does not exist" Nov 29 05:43:31 crc kubenswrapper[4594]: I1129 05:43:31.009585 4594 scope.go:117] "RemoveContainer" containerID="a05f858d08ad1ac63b4c149b4ad984bfe6f13bdd53d00d2f44f4b2906b2e1430" Nov 29 05:43:31 crc kubenswrapper[4594]: I1129 05:43:31.010052 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a05f858d08ad1ac63b4c149b4ad984bfe6f13bdd53d00d2f44f4b2906b2e1430"} err="failed to get container status \"a05f858d08ad1ac63b4c149b4ad984bfe6f13bdd53d00d2f44f4b2906b2e1430\": rpc error: code = NotFound desc = could not find container \"a05f858d08ad1ac63b4c149b4ad984bfe6f13bdd53d00d2f44f4b2906b2e1430\": container with ID starting with 
a05f858d08ad1ac63b4c149b4ad984bfe6f13bdd53d00d2f44f4b2906b2e1430 not found: ID does not exist" Nov 29 05:43:31 crc kubenswrapper[4594]: I1129 05:43:31.010072 4594 scope.go:117] "RemoveContainer" containerID="5814a998d2de6d2998082fe37b44771b97b0f348fcca1915adbd53597116025a" Nov 29 05:43:31 crc kubenswrapper[4594]: I1129 05:43:31.010383 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5814a998d2de6d2998082fe37b44771b97b0f348fcca1915adbd53597116025a"} err="failed to get container status \"5814a998d2de6d2998082fe37b44771b97b0f348fcca1915adbd53597116025a\": rpc error: code = NotFound desc = could not find container \"5814a998d2de6d2998082fe37b44771b97b0f348fcca1915adbd53597116025a\": container with ID starting with 5814a998d2de6d2998082fe37b44771b97b0f348fcca1915adbd53597116025a not found: ID does not exist" Nov 29 05:43:31 crc kubenswrapper[4594]: I1129 05:43:31.010415 4594 scope.go:117] "RemoveContainer" containerID="6d1c4ce5f27e08c3ba707abeac8cf0a5481b8d43e0c65692bc7cc6019efdb31a" Nov 29 05:43:31 crc kubenswrapper[4594]: I1129 05:43:31.010621 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d1c4ce5f27e08c3ba707abeac8cf0a5481b8d43e0c65692bc7cc6019efdb31a"} err="failed to get container status \"6d1c4ce5f27e08c3ba707abeac8cf0a5481b8d43e0c65692bc7cc6019efdb31a\": rpc error: code = NotFound desc = could not find container \"6d1c4ce5f27e08c3ba707abeac8cf0a5481b8d43e0c65692bc7cc6019efdb31a\": container with ID starting with 6d1c4ce5f27e08c3ba707abeac8cf0a5481b8d43e0c65692bc7cc6019efdb31a not found: ID does not exist" Nov 29 05:43:31 crc kubenswrapper[4594]: I1129 05:43:31.010642 4594 scope.go:117] "RemoveContainer" containerID="a16740da5b9f233ab8f5eceecfe70c451e89542cb502445ca3143bb7accd8e1e" Nov 29 05:43:31 crc kubenswrapper[4594]: I1129 05:43:31.010817 4594 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a16740da5b9f233ab8f5eceecfe70c451e89542cb502445ca3143bb7accd8e1e"} err="failed to get container status \"a16740da5b9f233ab8f5eceecfe70c451e89542cb502445ca3143bb7accd8e1e\": rpc error: code = NotFound desc = could not find container \"a16740da5b9f233ab8f5eceecfe70c451e89542cb502445ca3143bb7accd8e1e\": container with ID starting with a16740da5b9f233ab8f5eceecfe70c451e89542cb502445ca3143bb7accd8e1e not found: ID does not exist" Nov 29 05:43:31 crc kubenswrapper[4594]: I1129 05:43:31.172225 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Nov 29 05:43:31 crc kubenswrapper[4594]: I1129 05:43:31.508021 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"21284ba8-9492-4f6e-84c8-88d3844f386b","Type":"ContainerStarted","Data":"ea1b0a5132d5fcec20812000f81c1712c208f65675fa811f0bbc372599199359"} Nov 29 05:43:31 crc kubenswrapper[4594]: I1129 05:43:31.508367 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"21284ba8-9492-4f6e-84c8-88d3844f386b","Type":"ContainerStarted","Data":"584f3c8bc6c485d1b47d7b6f1d49351392431980e0ab9a2a3e207d2c0dab89a2"} Nov 29 05:43:31 crc kubenswrapper[4594]: I1129 05:43:31.521734 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=1.8541476270000001 podStartE2EDuration="4.521716533s" podCreationTimestamp="2025-11-29 05:43:27 +0000 UTC" firstStartedPulling="2025-11-29 05:43:28.243812908 +0000 UTC m=+932.484322128" lastFinishedPulling="2025-11-29 05:43:30.911381824 +0000 UTC m=+935.151891034" observedRunningTime="2025-11-29 05:43:31.521038235 +0000 UTC m=+935.761547456" watchObservedRunningTime="2025-11-29 05:43:31.521716533 +0000 UTC m=+935.762225752" Nov 29 05:43:31 crc kubenswrapper[4594]: I1129 05:43:31.570981 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/prometheus-metric-storage-0"] Nov 29 05:43:31 crc kubenswrapper[4594]: W1129 05:43:31.577085 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e44006f_3a80_4bf2_aac6_8d5c664d0db8.slice/crio-9a07a121ecff1d9e0616fe2b62eb67c4d6f3b7903061218ed0135e9f517fb32f WatchSource:0}: Error finding container 9a07a121ecff1d9e0616fe2b62eb67c4d6f3b7903061218ed0135e9f517fb32f: Status 404 returned error can't find the container with id 9a07a121ecff1d9e0616fe2b62eb67c4d6f3b7903061218ed0135e9f517fb32f Nov 29 05:43:31 crc kubenswrapper[4594]: I1129 05:43:31.624209 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/15a27bc9-a74a-4123-b693-baf16a0ed04d-etc-swift\") pod \"swift-storage-0\" (UID: \"15a27bc9-a74a-4123-b693-baf16a0ed04d\") " pod="openstack/swift-storage-0" Nov 29 05:43:31 crc kubenswrapper[4594]: I1129 05:43:31.629712 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/15a27bc9-a74a-4123-b693-baf16a0ed04d-etc-swift\") pod \"swift-storage-0\" (UID: \"15a27bc9-a74a-4123-b693-baf16a0ed04d\") " pod="openstack/swift-storage-0" Nov 29 05:43:31 crc kubenswrapper[4594]: I1129 05:43:31.714750 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Nov 29 05:43:31 crc kubenswrapper[4594]: I1129 05:43:31.773158 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-wtghb" Nov 29 05:43:31 crc kubenswrapper[4594]: I1129 05:43:31.832870 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f63cf943-e9c0-4f70-9f7b-8ecf859c92ae-combined-ca-bundle\") pod \"f63cf943-e9c0-4f70-9f7b-8ecf859c92ae\" (UID: \"f63cf943-e9c0-4f70-9f7b-8ecf859c92ae\") " Nov 29 05:43:31 crc kubenswrapper[4594]: I1129 05:43:31.832957 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f63cf943-e9c0-4f70-9f7b-8ecf859c92ae-etc-swift\") pod \"f63cf943-e9c0-4f70-9f7b-8ecf859c92ae\" (UID: \"f63cf943-e9c0-4f70-9f7b-8ecf859c92ae\") " Nov 29 05:43:31 crc kubenswrapper[4594]: I1129 05:43:31.833053 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-drqp9\" (UniqueName: \"kubernetes.io/projected/f63cf943-e9c0-4f70-9f7b-8ecf859c92ae-kube-api-access-drqp9\") pod \"f63cf943-e9c0-4f70-9f7b-8ecf859c92ae\" (UID: \"f63cf943-e9c0-4f70-9f7b-8ecf859c92ae\") " Nov 29 05:43:31 crc kubenswrapper[4594]: I1129 05:43:31.833096 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f63cf943-e9c0-4f70-9f7b-8ecf859c92ae-ring-data-devices\") pod \"f63cf943-e9c0-4f70-9f7b-8ecf859c92ae\" (UID: \"f63cf943-e9c0-4f70-9f7b-8ecf859c92ae\") " Nov 29 05:43:31 crc kubenswrapper[4594]: I1129 05:43:31.833150 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f63cf943-e9c0-4f70-9f7b-8ecf859c92ae-scripts\") pod \"f63cf943-e9c0-4f70-9f7b-8ecf859c92ae\" (UID: \"f63cf943-e9c0-4f70-9f7b-8ecf859c92ae\") " Nov 29 05:43:31 crc kubenswrapper[4594]: I1129 05:43:31.833205 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f63cf943-e9c0-4f70-9f7b-8ecf859c92ae-dispersionconf\") pod \"f63cf943-e9c0-4f70-9f7b-8ecf859c92ae\" (UID: \"f63cf943-e9c0-4f70-9f7b-8ecf859c92ae\") " Nov 29 05:43:31 crc kubenswrapper[4594]: I1129 05:43:31.833307 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f63cf943-e9c0-4f70-9f7b-8ecf859c92ae-swiftconf\") pod \"f63cf943-e9c0-4f70-9f7b-8ecf859c92ae\" (UID: \"f63cf943-e9c0-4f70-9f7b-8ecf859c92ae\") " Nov 29 05:43:31 crc kubenswrapper[4594]: I1129 05:43:31.833669 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f63cf943-e9c0-4f70-9f7b-8ecf859c92ae-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "f63cf943-e9c0-4f70-9f7b-8ecf859c92ae" (UID: "f63cf943-e9c0-4f70-9f7b-8ecf859c92ae"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:43:31 crc kubenswrapper[4594]: I1129 05:43:31.834103 4594 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f63cf943-e9c0-4f70-9f7b-8ecf859c92ae-ring-data-devices\") on node \"crc\" DevicePath \"\"" Nov 29 05:43:31 crc kubenswrapper[4594]: I1129 05:43:31.836100 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f63cf943-e9c0-4f70-9f7b-8ecf859c92ae-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "f63cf943-e9c0-4f70-9f7b-8ecf859c92ae" (UID: "f63cf943-e9c0-4f70-9f7b-8ecf859c92ae"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 05:43:31 crc kubenswrapper[4594]: I1129 05:43:31.838232 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f63cf943-e9c0-4f70-9f7b-8ecf859c92ae-kube-api-access-drqp9" (OuterVolumeSpecName: "kube-api-access-drqp9") pod "f63cf943-e9c0-4f70-9f7b-8ecf859c92ae" (UID: "f63cf943-e9c0-4f70-9f7b-8ecf859c92ae"). InnerVolumeSpecName "kube-api-access-drqp9". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:43:31 crc kubenswrapper[4594]: I1129 05:43:31.852182 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f63cf943-e9c0-4f70-9f7b-8ecf859c92ae-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "f63cf943-e9c0-4f70-9f7b-8ecf859c92ae" (UID: "f63cf943-e9c0-4f70-9f7b-8ecf859c92ae"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:43:31 crc kubenswrapper[4594]: I1129 05:43:31.856402 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f63cf943-e9c0-4f70-9f7b-8ecf859c92ae-scripts" (OuterVolumeSpecName: "scripts") pod "f63cf943-e9c0-4f70-9f7b-8ecf859c92ae" (UID: "f63cf943-e9c0-4f70-9f7b-8ecf859c92ae"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:43:31 crc kubenswrapper[4594]: I1129 05:43:31.861979 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f63cf943-e9c0-4f70-9f7b-8ecf859c92ae-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f63cf943-e9c0-4f70-9f7b-8ecf859c92ae" (UID: "f63cf943-e9c0-4f70-9f7b-8ecf859c92ae"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:43:31 crc kubenswrapper[4594]: I1129 05:43:31.863490 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f63cf943-e9c0-4f70-9f7b-8ecf859c92ae-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "f63cf943-e9c0-4f70-9f7b-8ecf859c92ae" (UID: "f63cf943-e9c0-4f70-9f7b-8ecf859c92ae"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:43:31 crc kubenswrapper[4594]: I1129 05:43:31.935976 4594 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f63cf943-e9c0-4f70-9f7b-8ecf859c92ae-swiftconf\") on node \"crc\" DevicePath \"\"" Nov 29 05:43:31 crc kubenswrapper[4594]: I1129 05:43:31.936008 4594 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f63cf943-e9c0-4f70-9f7b-8ecf859c92ae-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 05:43:31 crc kubenswrapper[4594]: I1129 05:43:31.936022 4594 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f63cf943-e9c0-4f70-9f7b-8ecf859c92ae-etc-swift\") on node \"crc\" DevicePath \"\"" Nov 29 05:43:31 crc kubenswrapper[4594]: I1129 05:43:31.936031 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-drqp9\" (UniqueName: \"kubernetes.io/projected/f63cf943-e9c0-4f70-9f7b-8ecf859c92ae-kube-api-access-drqp9\") on node \"crc\" DevicePath \"\"" Nov 29 05:43:31 crc kubenswrapper[4594]: I1129 05:43:31.936044 4594 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f63cf943-e9c0-4f70-9f7b-8ecf859c92ae-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 05:43:31 crc kubenswrapper[4594]: I1129 05:43:31.936052 4594 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/f63cf943-e9c0-4f70-9f7b-8ecf859c92ae-dispersionconf\") on node \"crc\" DevicePath \"\"" Nov 29 05:43:32 crc kubenswrapper[4594]: I1129 05:43:32.097279 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e4fe89a-3363-4724-89a8-a9b61fe6f039" path="/var/lib/kubelet/pods/4e4fe89a-3363-4724-89a8-a9b61fe6f039/volumes" Nov 29 05:43:32 crc kubenswrapper[4594]: I1129 05:43:32.118776 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Nov 29 05:43:32 crc kubenswrapper[4594]: I1129 05:43:32.118852 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Nov 29 05:43:32 crc kubenswrapper[4594]: I1129 05:43:32.238754 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Nov 29 05:43:32 crc kubenswrapper[4594]: I1129 05:43:32.260413 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Nov 29 05:43:32 crc kubenswrapper[4594]: I1129 05:43:32.515856 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"15a27bc9-a74a-4123-b693-baf16a0ed04d","Type":"ContainerStarted","Data":"7da06d8d9f45a6b4ce7a0a5c99ff82c04a711b14fd8a009e6b39ca645e30b55e"} Nov 29 05:43:32 crc kubenswrapper[4594]: I1129 05:43:32.517752 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-wtghb" event={"ID":"f63cf943-e9c0-4f70-9f7b-8ecf859c92ae","Type":"ContainerDied","Data":"649cbd3d1b8b7b0568a202886c02dc95a86206f3412bd2d390c8081a136114d7"} Nov 29 05:43:32 crc kubenswrapper[4594]: I1129 05:43:32.517777 4594 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="649cbd3d1b8b7b0568a202886c02dc95a86206f3412bd2d390c8081a136114d7" Nov 29 05:43:32 crc kubenswrapper[4594]: I1129 05:43:32.517833 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-wtghb" Nov 29 05:43:32 crc kubenswrapper[4594]: I1129 05:43:32.520489 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"0e44006f-3a80-4bf2-aac6-8d5c664d0db8","Type":"ContainerStarted","Data":"9a07a121ecff1d9e0616fe2b62eb67c4d6f3b7903061218ed0135e9f517fb32f"} Nov 29 05:43:32 crc kubenswrapper[4594]: I1129 05:43:32.520518 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Nov 29 05:43:32 crc kubenswrapper[4594]: I1129 05:43:32.626586 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Nov 29 05:43:33 crc kubenswrapper[4594]: I1129 05:43:33.753012 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-z4q76" Nov 29 05:43:33 crc kubenswrapper[4594]: I1129 05:43:33.827413 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-z4q76" Nov 29 05:43:34 crc kubenswrapper[4594]: I1129 05:43:34.541491 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"15a27bc9-a74a-4123-b693-baf16a0ed04d","Type":"ContainerStarted","Data":"9bcd82f0a185d068bb14fba8063db149b58aab53414316b762f33170be961a93"} Nov 29 05:43:34 crc kubenswrapper[4594]: I1129 05:43:34.541541 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"15a27bc9-a74a-4123-b693-baf16a0ed04d","Type":"ContainerStarted","Data":"5942bfe86fb650b4f303ea3638d80610de7689944a00b8fb1c84ca56e5dfcfd7"} Nov 29 05:43:34 crc kubenswrapper[4594]: I1129 05:43:34.541554 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"15a27bc9-a74a-4123-b693-baf16a0ed04d","Type":"ContainerStarted","Data":"902fa427bb4b25fa1af4742f476f01c6685e790d394aab368349626393657f9d"} Nov 29 05:43:34 crc kubenswrapper[4594]: I1129 05:43:34.544743 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"0e44006f-3a80-4bf2-aac6-8d5c664d0db8","Type":"ContainerStarted","Data":"e98a79c3762747cdacae623e61f5f0539a6ff41851db8e295ebe252008ae3d4e"} Nov 29 05:43:35 crc kubenswrapper[4594]: I1129 05:43:35.557223 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"15a27bc9-a74a-4123-b693-baf16a0ed04d","Type":"ContainerStarted","Data":"134d32c693665edbeafcfbfc1fd4616b62a562ff72b842f534cf95c765afbc00"} Nov 29 05:43:36 crc kubenswrapper[4594]: I1129 05:43:36.358781 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-z4q76"] Nov 29 05:43:36 crc kubenswrapper[4594]: I1129 05:43:36.359364 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-z4q76" podUID="3f256392-25b5-4ead-8ec1-8973b757b86b" containerName="registry-server" containerID="cri-o://6270eb6d93036d48cf1d38e80a5444a536fc0315b3782ae2ccddc15310197438" gracePeriod=2 Nov 29 05:43:36 crc kubenswrapper[4594]: I1129 05:43:36.568179 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"15a27bc9-a74a-4123-b693-baf16a0ed04d","Type":"ContainerStarted","Data":"90d310dbb2911b4fc71aacc49305fbc6cd87e8b0ae83d820dffd4890dfe96341"} Nov 29 05:43:36 crc kubenswrapper[4594]: I1129 05:43:36.574321 4594 generic.go:334] "Generic (PLEG): container finished" podID="3f256392-25b5-4ead-8ec1-8973b757b86b" containerID="6270eb6d93036d48cf1d38e80a5444a536fc0315b3782ae2ccddc15310197438" exitCode=0 Nov 29 05:43:36 crc kubenswrapper[4594]: I1129 05:43:36.574353 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-z4q76" event={"ID":"3f256392-25b5-4ead-8ec1-8973b757b86b","Type":"ContainerDied","Data":"6270eb6d93036d48cf1d38e80a5444a536fc0315b3782ae2ccddc15310197438"} Nov 29 05:43:36 crc kubenswrapper[4594]: I1129 05:43:36.844903 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z4q76" Nov 29 05:43:36 crc kubenswrapper[4594]: I1129 05:43:36.935537 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r5t5p\" (UniqueName: \"kubernetes.io/projected/3f256392-25b5-4ead-8ec1-8973b757b86b-kube-api-access-r5t5p\") pod \"3f256392-25b5-4ead-8ec1-8973b757b86b\" (UID: \"3f256392-25b5-4ead-8ec1-8973b757b86b\") " Nov 29 05:43:36 crc kubenswrapper[4594]: I1129 05:43:36.935621 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f256392-25b5-4ead-8ec1-8973b757b86b-utilities\") pod \"3f256392-25b5-4ead-8ec1-8973b757b86b\" (UID: \"3f256392-25b5-4ead-8ec1-8973b757b86b\") " Nov 29 05:43:36 crc kubenswrapper[4594]: I1129 05:43:36.935728 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f256392-25b5-4ead-8ec1-8973b757b86b-catalog-content\") pod \"3f256392-25b5-4ead-8ec1-8973b757b86b\" (UID: \"3f256392-25b5-4ead-8ec1-8973b757b86b\") " Nov 29 05:43:36 crc kubenswrapper[4594]: I1129 05:43:36.936646 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f256392-25b5-4ead-8ec1-8973b757b86b-utilities" (OuterVolumeSpecName: "utilities") pod "3f256392-25b5-4ead-8ec1-8973b757b86b" (UID: "3f256392-25b5-4ead-8ec1-8973b757b86b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 05:43:36 crc kubenswrapper[4594]: I1129 05:43:36.942085 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f256392-25b5-4ead-8ec1-8973b757b86b-kube-api-access-r5t5p" (OuterVolumeSpecName: "kube-api-access-r5t5p") pod "3f256392-25b5-4ead-8ec1-8973b757b86b" (UID: "3f256392-25b5-4ead-8ec1-8973b757b86b"). InnerVolumeSpecName "kube-api-access-r5t5p". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:43:36 crc kubenswrapper[4594]: I1129 05:43:36.951662 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f256392-25b5-4ead-8ec1-8973b757b86b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3f256392-25b5-4ead-8ec1-8973b757b86b" (UID: "3f256392-25b5-4ead-8ec1-8973b757b86b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 05:43:37 crc kubenswrapper[4594]: I1129 05:43:37.038221 4594 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f256392-25b5-4ead-8ec1-8973b757b86b-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 05:43:37 crc kubenswrapper[4594]: I1129 05:43:37.038271 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r5t5p\" (UniqueName: \"kubernetes.io/projected/3f256392-25b5-4ead-8ec1-8973b757b86b-kube-api-access-r5t5p\") on node \"crc\" DevicePath \"\"" Nov 29 05:43:37 crc kubenswrapper[4594]: I1129 05:43:37.038284 4594 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f256392-25b5-4ead-8ec1-8973b757b86b-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 05:43:37 crc kubenswrapper[4594]: I1129 05:43:37.585028 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"15a27bc9-a74a-4123-b693-baf16a0ed04d","Type":"ContainerStarted","Data":"fe9b4d0b495cc4b53b12be2432b8aa5a7fbed4d7f81b66fc2dcbbf5ff0ecd209"} Nov 29 05:43:37 crc kubenswrapper[4594]: I1129 05:43:37.585324 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"15a27bc9-a74a-4123-b693-baf16a0ed04d","Type":"ContainerStarted","Data":"6d505a30e642ed7aa7532292dcbe82878c131d3ebb001c73e87382e71699bed3"} Nov 29 05:43:37 crc kubenswrapper[4594]: I1129 05:43:37.585337 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"15a27bc9-a74a-4123-b693-baf16a0ed04d","Type":"ContainerStarted","Data":"68c75e927168f5bf4378a651d68044dd5fc1a632cde47155cab6428212d14a4e"} Nov 29 05:43:37 crc kubenswrapper[4594]: I1129 05:43:37.587412 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z4q76" event={"ID":"3f256392-25b5-4ead-8ec1-8973b757b86b","Type":"ContainerDied","Data":"216971c001115cb4f8e9a2456daba7d508d19ed91bc0499eef3080f713d22e6e"} Nov 29 05:43:37 crc kubenswrapper[4594]: I1129 05:43:37.587455 4594 scope.go:117] "RemoveContainer" containerID="6270eb6d93036d48cf1d38e80a5444a536fc0315b3782ae2ccddc15310197438" Nov 29 05:43:37 crc kubenswrapper[4594]: I1129 05:43:37.587464 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z4q76" Nov 29 05:43:37 crc kubenswrapper[4594]: I1129 05:43:37.607170 4594 scope.go:117] "RemoveContainer" containerID="18ba91f387c3fde05ea659104bc61c9a7bcada83017d4efe174b2f1652671c60" Nov 29 05:43:37 crc kubenswrapper[4594]: I1129 05:43:37.619209 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-z4q76"] Nov 29 05:43:37 crc kubenswrapper[4594]: I1129 05:43:37.623615 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-z4q76"] Nov 29 05:43:37 crc kubenswrapper[4594]: I1129 05:43:37.643206 4594 scope.go:117] "RemoveContainer" containerID="6b1a86eca226be1885e50285e0517144d8ad36cc8dc2787b3d7b86c25c715c00" Nov 29 05:43:38 crc kubenswrapper[4594]: I1129 05:43:38.093651 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f256392-25b5-4ead-8ec1-8973b757b86b" path="/var/lib/kubelet/pods/3f256392-25b5-4ead-8ec1-8973b757b86b/volumes" Nov 29 05:43:39 crc kubenswrapper[4594]: I1129 05:43:39.608285 4594 generic.go:334] "Generic (PLEG): container finished" podID="0e44006f-3a80-4bf2-aac6-8d5c664d0db8" containerID="e98a79c3762747cdacae623e61f5f0539a6ff41851db8e295ebe252008ae3d4e" exitCode=0 Nov 29 05:43:39 crc kubenswrapper[4594]: I1129 05:43:39.608371 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"0e44006f-3a80-4bf2-aac6-8d5c664d0db8","Type":"ContainerDied","Data":"e98a79c3762747cdacae623e61f5f0539a6ff41851db8e295ebe252008ae3d4e"} Nov 29 05:43:39 crc kubenswrapper[4594]: I1129 05:43:39.631690 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"15a27bc9-a74a-4123-b693-baf16a0ed04d","Type":"ContainerStarted","Data":"36817874d3dd59c73dfb218521370d380ce1b8f5177ed1f095bc4cb8be17e418"} Nov 29 05:43:39 crc kubenswrapper[4594]: I1129 05:43:39.631755 4594 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"15a27bc9-a74a-4123-b693-baf16a0ed04d","Type":"ContainerStarted","Data":"39aef6a491c23a54ab2b4bdf920b3f7d029a6d94191f5fae2c691c18f66457be"} Nov 29 05:43:39 crc kubenswrapper[4594]: I1129 05:43:39.631768 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"15a27bc9-a74a-4123-b693-baf16a0ed04d","Type":"ContainerStarted","Data":"be16b5bf00c7485178a4c6f1c8fbd677a73b44874afa22c1f1312937a65679a5"} Nov 29 05:43:39 crc kubenswrapper[4594]: I1129 05:43:39.631778 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"15a27bc9-a74a-4123-b693-baf16a0ed04d","Type":"ContainerStarted","Data":"0edabd05ec5fe3e5950795becf6970018544b30a94714ede8734f6b6afa6018e"} Nov 29 05:43:39 crc kubenswrapper[4594]: I1129 05:43:39.631789 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"15a27bc9-a74a-4123-b693-baf16a0ed04d","Type":"ContainerStarted","Data":"accd9b938d57a822f8da0e5a7bfd928c7898f977a5776ee7cb250d328c9a7c5a"} Nov 29 05:43:39 crc kubenswrapper[4594]: I1129 05:43:39.631818 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"15a27bc9-a74a-4123-b693-baf16a0ed04d","Type":"ContainerStarted","Data":"746558235e194385688155f9b0230ccd8e8e682d1c1c7fa04fc63e0e89c9db42"} Nov 29 05:43:39 crc kubenswrapper[4594]: I1129 05:43:39.631833 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"15a27bc9-a74a-4123-b693-baf16a0ed04d","Type":"ContainerStarted","Data":"0ad372690f6d1dff04a6aa907b5335fa5aa1dd47a9282752dc5d01099cd632f5"} Nov 29 05:43:39 crc kubenswrapper[4594]: I1129 05:43:39.690241 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=19.429714428 podStartE2EDuration="25.690223646s" podCreationTimestamp="2025-11-29 05:43:14 
+0000 UTC" firstStartedPulling="2025-11-29 05:43:32.26804488 +0000 UTC m=+936.508554100" lastFinishedPulling="2025-11-29 05:43:38.528554098 +0000 UTC m=+942.769063318" observedRunningTime="2025-11-29 05:43:39.679551792 +0000 UTC m=+943.920061012" watchObservedRunningTime="2025-11-29 05:43:39.690223646 +0000 UTC m=+943.930732866" Nov 29 05:43:39 crc kubenswrapper[4594]: I1129 05:43:39.923711 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55b99bf79c-72vg5"] Nov 29 05:43:39 crc kubenswrapper[4594]: E1129 05:43:39.924113 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f256392-25b5-4ead-8ec1-8973b757b86b" containerName="extract-utilities" Nov 29 05:43:39 crc kubenswrapper[4594]: I1129 05:43:39.924135 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f256392-25b5-4ead-8ec1-8973b757b86b" containerName="extract-utilities" Nov 29 05:43:39 crc kubenswrapper[4594]: E1129 05:43:39.924144 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f256392-25b5-4ead-8ec1-8973b757b86b" containerName="extract-content" Nov 29 05:43:39 crc kubenswrapper[4594]: I1129 05:43:39.924152 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f256392-25b5-4ead-8ec1-8973b757b86b" containerName="extract-content" Nov 29 05:43:39 crc kubenswrapper[4594]: E1129 05:43:39.924163 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f63cf943-e9c0-4f70-9f7b-8ecf859c92ae" containerName="swift-ring-rebalance" Nov 29 05:43:39 crc kubenswrapper[4594]: I1129 05:43:39.924170 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="f63cf943-e9c0-4f70-9f7b-8ecf859c92ae" containerName="swift-ring-rebalance" Nov 29 05:43:39 crc kubenswrapper[4594]: E1129 05:43:39.924183 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f256392-25b5-4ead-8ec1-8973b757b86b" containerName="registry-server" Nov 29 05:43:39 crc kubenswrapper[4594]: I1129 05:43:39.924189 4594 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="3f256392-25b5-4ead-8ec1-8973b757b86b" containerName="registry-server" Nov 29 05:43:39 crc kubenswrapper[4594]: I1129 05:43:39.924410 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="f63cf943-e9c0-4f70-9f7b-8ecf859c92ae" containerName="swift-ring-rebalance" Nov 29 05:43:39 crc kubenswrapper[4594]: I1129 05:43:39.924448 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f256392-25b5-4ead-8ec1-8973b757b86b" containerName="registry-server" Nov 29 05:43:39 crc kubenswrapper[4594]: I1129 05:43:39.925368 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55b99bf79c-72vg5" Nov 29 05:43:39 crc kubenswrapper[4594]: I1129 05:43:39.928276 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Nov 29 05:43:39 crc kubenswrapper[4594]: I1129 05:43:39.941006 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55b99bf79c-72vg5"] Nov 29 05:43:39 crc kubenswrapper[4594]: I1129 05:43:39.988535 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ff6b46fc-e420-41b4-b2f1-06ff16d73595-ovsdbserver-sb\") pod \"dnsmasq-dns-55b99bf79c-72vg5\" (UID: \"ff6b46fc-e420-41b4-b2f1-06ff16d73595\") " pod="openstack/dnsmasq-dns-55b99bf79c-72vg5" Nov 29 05:43:39 crc kubenswrapper[4594]: I1129 05:43:39.988602 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnw98\" (UniqueName: \"kubernetes.io/projected/ff6b46fc-e420-41b4-b2f1-06ff16d73595-kube-api-access-wnw98\") pod \"dnsmasq-dns-55b99bf79c-72vg5\" (UID: \"ff6b46fc-e420-41b4-b2f1-06ff16d73595\") " pod="openstack/dnsmasq-dns-55b99bf79c-72vg5" Nov 29 05:43:39 crc kubenswrapper[4594]: I1129 05:43:39.988639 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ff6b46fc-e420-41b4-b2f1-06ff16d73595-dns-swift-storage-0\") pod \"dnsmasq-dns-55b99bf79c-72vg5\" (UID: \"ff6b46fc-e420-41b4-b2f1-06ff16d73595\") " pod="openstack/dnsmasq-dns-55b99bf79c-72vg5" Nov 29 05:43:39 crc kubenswrapper[4594]: I1129 05:43:39.988670 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ff6b46fc-e420-41b4-b2f1-06ff16d73595-ovsdbserver-nb\") pod \"dnsmasq-dns-55b99bf79c-72vg5\" (UID: \"ff6b46fc-e420-41b4-b2f1-06ff16d73595\") " pod="openstack/dnsmasq-dns-55b99bf79c-72vg5" Nov 29 05:43:39 crc kubenswrapper[4594]: I1129 05:43:39.988692 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff6b46fc-e420-41b4-b2f1-06ff16d73595-dns-svc\") pod \"dnsmasq-dns-55b99bf79c-72vg5\" (UID: \"ff6b46fc-e420-41b4-b2f1-06ff16d73595\") " pod="openstack/dnsmasq-dns-55b99bf79c-72vg5" Nov 29 05:43:39 crc kubenswrapper[4594]: I1129 05:43:39.988759 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff6b46fc-e420-41b4-b2f1-06ff16d73595-config\") pod \"dnsmasq-dns-55b99bf79c-72vg5\" (UID: \"ff6b46fc-e420-41b4-b2f1-06ff16d73595\") " pod="openstack/dnsmasq-dns-55b99bf79c-72vg5" Nov 29 05:43:40 crc kubenswrapper[4594]: I1129 05:43:40.090795 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ff6b46fc-e420-41b4-b2f1-06ff16d73595-ovsdbserver-sb\") pod \"dnsmasq-dns-55b99bf79c-72vg5\" (UID: \"ff6b46fc-e420-41b4-b2f1-06ff16d73595\") " pod="openstack/dnsmasq-dns-55b99bf79c-72vg5" Nov 29 05:43:40 crc kubenswrapper[4594]: I1129 05:43:40.090870 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-wnw98\" (UniqueName: \"kubernetes.io/projected/ff6b46fc-e420-41b4-b2f1-06ff16d73595-kube-api-access-wnw98\") pod \"dnsmasq-dns-55b99bf79c-72vg5\" (UID: \"ff6b46fc-e420-41b4-b2f1-06ff16d73595\") " pod="openstack/dnsmasq-dns-55b99bf79c-72vg5" Nov 29 05:43:40 crc kubenswrapper[4594]: I1129 05:43:40.090910 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ff6b46fc-e420-41b4-b2f1-06ff16d73595-dns-swift-storage-0\") pod \"dnsmasq-dns-55b99bf79c-72vg5\" (UID: \"ff6b46fc-e420-41b4-b2f1-06ff16d73595\") " pod="openstack/dnsmasq-dns-55b99bf79c-72vg5" Nov 29 05:43:40 crc kubenswrapper[4594]: I1129 05:43:40.090951 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ff6b46fc-e420-41b4-b2f1-06ff16d73595-ovsdbserver-nb\") pod \"dnsmasq-dns-55b99bf79c-72vg5\" (UID: \"ff6b46fc-e420-41b4-b2f1-06ff16d73595\") " pod="openstack/dnsmasq-dns-55b99bf79c-72vg5" Nov 29 05:43:40 crc kubenswrapper[4594]: I1129 05:43:40.090972 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff6b46fc-e420-41b4-b2f1-06ff16d73595-dns-svc\") pod \"dnsmasq-dns-55b99bf79c-72vg5\" (UID: \"ff6b46fc-e420-41b4-b2f1-06ff16d73595\") " pod="openstack/dnsmasq-dns-55b99bf79c-72vg5" Nov 29 05:43:40 crc kubenswrapper[4594]: I1129 05:43:40.091044 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff6b46fc-e420-41b4-b2f1-06ff16d73595-config\") pod \"dnsmasq-dns-55b99bf79c-72vg5\" (UID: \"ff6b46fc-e420-41b4-b2f1-06ff16d73595\") " pod="openstack/dnsmasq-dns-55b99bf79c-72vg5" Nov 29 05:43:40 crc kubenswrapper[4594]: I1129 05:43:40.091862 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/ff6b46fc-e420-41b4-b2f1-06ff16d73595-ovsdbserver-sb\") pod \"dnsmasq-dns-55b99bf79c-72vg5\" (UID: \"ff6b46fc-e420-41b4-b2f1-06ff16d73595\") " pod="openstack/dnsmasq-dns-55b99bf79c-72vg5" Nov 29 05:43:40 crc kubenswrapper[4594]: I1129 05:43:40.092172 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ff6b46fc-e420-41b4-b2f1-06ff16d73595-ovsdbserver-nb\") pod \"dnsmasq-dns-55b99bf79c-72vg5\" (UID: \"ff6b46fc-e420-41b4-b2f1-06ff16d73595\") " pod="openstack/dnsmasq-dns-55b99bf79c-72vg5" Nov 29 05:43:40 crc kubenswrapper[4594]: I1129 05:43:40.092538 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ff6b46fc-e420-41b4-b2f1-06ff16d73595-dns-swift-storage-0\") pod \"dnsmasq-dns-55b99bf79c-72vg5\" (UID: \"ff6b46fc-e420-41b4-b2f1-06ff16d73595\") " pod="openstack/dnsmasq-dns-55b99bf79c-72vg5" Nov 29 05:43:40 crc kubenswrapper[4594]: I1129 05:43:40.093664 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff6b46fc-e420-41b4-b2f1-06ff16d73595-dns-svc\") pod \"dnsmasq-dns-55b99bf79c-72vg5\" (UID: \"ff6b46fc-e420-41b4-b2f1-06ff16d73595\") " pod="openstack/dnsmasq-dns-55b99bf79c-72vg5" Nov 29 05:43:40 crc kubenswrapper[4594]: I1129 05:43:40.094645 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff6b46fc-e420-41b4-b2f1-06ff16d73595-config\") pod \"dnsmasq-dns-55b99bf79c-72vg5\" (UID: \"ff6b46fc-e420-41b4-b2f1-06ff16d73595\") " pod="openstack/dnsmasq-dns-55b99bf79c-72vg5" Nov 29 05:43:40 crc kubenswrapper[4594]: I1129 05:43:40.108703 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnw98\" (UniqueName: \"kubernetes.io/projected/ff6b46fc-e420-41b4-b2f1-06ff16d73595-kube-api-access-wnw98\") pod 
\"dnsmasq-dns-55b99bf79c-72vg5\" (UID: \"ff6b46fc-e420-41b4-b2f1-06ff16d73595\") " pod="openstack/dnsmasq-dns-55b99bf79c-72vg5" Nov 29 05:43:40 crc kubenswrapper[4594]: I1129 05:43:40.247216 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55b99bf79c-72vg5" Nov 29 05:43:40 crc kubenswrapper[4594]: I1129 05:43:40.624447 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55b99bf79c-72vg5"] Nov 29 05:43:40 crc kubenswrapper[4594]: I1129 05:43:40.645596 4594 generic.go:334] "Generic (PLEG): container finished" podID="47b6950d-0d97-486d-aaec-2a9eaaf74027" containerID="18588c32fb9b6698f37b98a5ca9f655c67c03d703b3e049a536f58ced9b1e325" exitCode=0 Nov 29 05:43:40 crc kubenswrapper[4594]: I1129 05:43:40.645691 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"47b6950d-0d97-486d-aaec-2a9eaaf74027","Type":"ContainerDied","Data":"18588c32fb9b6698f37b98a5ca9f655c67c03d703b3e049a536f58ced9b1e325"} Nov 29 05:43:40 crc kubenswrapper[4594]: I1129 05:43:40.649514 4594 generic.go:334] "Generic (PLEG): container finished" podID="9667e68c-f715-4663-bddb-53c53d3a593d" containerID="7c1bd83b41e6b5f5154d21d1cf867e01c0a6b29abac556440a9d0952903f05fc" exitCode=0 Nov 29 05:43:40 crc kubenswrapper[4594]: I1129 05:43:40.649570 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-notifications-server-0" event={"ID":"9667e68c-f715-4663-bddb-53c53d3a593d","Type":"ContainerDied","Data":"7c1bd83b41e6b5f5154d21d1cf867e01c0a6b29abac556440a9d0952903f05fc"} Nov 29 05:43:40 crc kubenswrapper[4594]: I1129 05:43:40.653324 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"0e44006f-3a80-4bf2-aac6-8d5c664d0db8","Type":"ContainerStarted","Data":"166e5dee2f459757e2d43ad4d79253879c3974e95bd51eb52e72a8ff3d58434b"} Nov 29 05:43:41 crc kubenswrapper[4594]: I1129 05:43:41.665977 4594 
generic.go:334] "Generic (PLEG): container finished" podID="ff6b46fc-e420-41b4-b2f1-06ff16d73595" containerID="f628d58237ac93ac4654f52cca472d9e568a6dd0158ee585f6ff8558534a7d70" exitCode=0 Nov 29 05:43:41 crc kubenswrapper[4594]: I1129 05:43:41.666384 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55b99bf79c-72vg5" event={"ID":"ff6b46fc-e420-41b4-b2f1-06ff16d73595","Type":"ContainerDied","Data":"f628d58237ac93ac4654f52cca472d9e568a6dd0158ee585f6ff8558534a7d70"} Nov 29 05:43:41 crc kubenswrapper[4594]: I1129 05:43:41.666421 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55b99bf79c-72vg5" event={"ID":"ff6b46fc-e420-41b4-b2f1-06ff16d73595","Type":"ContainerStarted","Data":"e108da01ae5da191e328593e817fd401637521988ea5a13bb9b20b18cdf4ad1a"} Nov 29 05:43:41 crc kubenswrapper[4594]: I1129 05:43:41.670709 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"47b6950d-0d97-486d-aaec-2a9eaaf74027","Type":"ContainerStarted","Data":"8d71481afa04edc87a81fb7329f95a03dd0d6110ec5d8956af3e7a934aa4fb1c"} Nov 29 05:43:41 crc kubenswrapper[4594]: I1129 05:43:41.671177 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Nov 29 05:43:41 crc kubenswrapper[4594]: I1129 05:43:41.673501 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-notifications-server-0" event={"ID":"9667e68c-f715-4663-bddb-53c53d3a593d","Type":"ContainerStarted","Data":"8bbd3dffb295e3c378adbb75be7901413a3900e23fae2774fae4b8c8497de72a"} Nov 29 05:43:41 crc kubenswrapper[4594]: I1129 05:43:41.673934 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-notifications-server-0" Nov 29 05:43:41 crc kubenswrapper[4594]: I1129 05:43:41.716484 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" 
podStartSLOduration=37.724584039 podStartE2EDuration="1m24.716464565s" podCreationTimestamp="2025-11-29 05:42:17 +0000 UTC" firstStartedPulling="2025-11-29 05:42:19.237983913 +0000 UTC m=+863.478493133" lastFinishedPulling="2025-11-29 05:43:06.229864439 +0000 UTC m=+910.470373659" observedRunningTime="2025-11-29 05:43:41.711353677 +0000 UTC m=+945.951862896" watchObservedRunningTime="2025-11-29 05:43:41.716464565 +0000 UTC m=+945.956973786" Nov 29 05:43:41 crc kubenswrapper[4594]: I1129 05:43:41.739443 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-notifications-server-0" podStartSLOduration=37.47480146 podStartE2EDuration="1m24.739423184s" podCreationTimestamp="2025-11-29 05:42:17 +0000 UTC" firstStartedPulling="2025-11-29 05:42:19.298299977 +0000 UTC m=+863.538809197" lastFinishedPulling="2025-11-29 05:43:06.562921701 +0000 UTC m=+910.803430921" observedRunningTime="2025-11-29 05:43:41.736701101 +0000 UTC m=+945.977210310" watchObservedRunningTime="2025-11-29 05:43:41.739423184 +0000 UTC m=+945.979932403" Nov 29 05:43:42 crc kubenswrapper[4594]: I1129 05:43:42.684488 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55b99bf79c-72vg5" event={"ID":"ff6b46fc-e420-41b4-b2f1-06ff16d73595","Type":"ContainerStarted","Data":"5c4fa1739f79e3406180b7ae3f1b98d8a8d214f5d1ba328a9114a9710ee6e324"} Nov 29 05:43:42 crc kubenswrapper[4594]: I1129 05:43:42.685788 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55b99bf79c-72vg5" Nov 29 05:43:42 crc kubenswrapper[4594]: I1129 05:43:42.687728 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"0e44006f-3a80-4bf2-aac6-8d5c664d0db8","Type":"ContainerStarted","Data":"fc145f1edd82523cedcba03cf3914a13e3aa3771689236806684c1e3f0e0235a"} Nov 29 05:43:42 crc kubenswrapper[4594]: I1129 05:43:42.687868 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/prometheus-metric-storage-0" event={"ID":"0e44006f-3a80-4bf2-aac6-8d5c664d0db8","Type":"ContainerStarted","Data":"be4198c341771d572e7f8cb89a81015d353a914620d6952932c307cf4e696081"} Nov 29 05:43:42 crc kubenswrapper[4594]: I1129 05:43:42.706930 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55b99bf79c-72vg5" podStartSLOduration=3.7069099899999998 podStartE2EDuration="3.70690999s" podCreationTimestamp="2025-11-29 05:43:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:43:42.701412625 +0000 UTC m=+946.941921845" watchObservedRunningTime="2025-11-29 05:43:42.70690999 +0000 UTC m=+946.947419210" Nov 29 05:43:42 crc kubenswrapper[4594]: I1129 05:43:42.732319 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=12.732299423 podStartE2EDuration="12.732299423s" podCreationTimestamp="2025-11-29 05:43:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:43:42.725695885 +0000 UTC m=+946.966205105" watchObservedRunningTime="2025-11-29 05:43:42.732299423 +0000 UTC m=+946.972808643" Nov 29 05:43:42 crc kubenswrapper[4594]: I1129 05:43:42.861482 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Nov 29 05:43:42 crc kubenswrapper[4594]: I1129 05:43:42.894041 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-zvkw9" Nov 29 05:43:42 crc kubenswrapper[4594]: I1129 05:43:42.900772 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-zvkw9" Nov 29 05:43:43 crc kubenswrapper[4594]: I1129 05:43:43.116668 4594 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/ovn-controller-b9h9k-config-hdqq7"] Nov 29 05:43:43 crc kubenswrapper[4594]: I1129 05:43:43.118056 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-b9h9k-config-hdqq7" Nov 29 05:43:43 crc kubenswrapper[4594]: I1129 05:43:43.123545 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Nov 29 05:43:43 crc kubenswrapper[4594]: I1129 05:43:43.130409 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-b9h9k-config-hdqq7"] Nov 29 05:43:43 crc kubenswrapper[4594]: I1129 05:43:43.257343 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a6a00bf1-daa5-444e-9a21-be8ac0bb8c4a-var-run-ovn\") pod \"ovn-controller-b9h9k-config-hdqq7\" (UID: \"a6a00bf1-daa5-444e-9a21-be8ac0bb8c4a\") " pod="openstack/ovn-controller-b9h9k-config-hdqq7" Nov 29 05:43:43 crc kubenswrapper[4594]: I1129 05:43:43.257392 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a6a00bf1-daa5-444e-9a21-be8ac0bb8c4a-var-log-ovn\") pod \"ovn-controller-b9h9k-config-hdqq7\" (UID: \"a6a00bf1-daa5-444e-9a21-be8ac0bb8c4a\") " pod="openstack/ovn-controller-b9h9k-config-hdqq7" Nov 29 05:43:43 crc kubenswrapper[4594]: I1129 05:43:43.257526 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mz7nj\" (UniqueName: \"kubernetes.io/projected/a6a00bf1-daa5-444e-9a21-be8ac0bb8c4a-kube-api-access-mz7nj\") pod \"ovn-controller-b9h9k-config-hdqq7\" (UID: \"a6a00bf1-daa5-444e-9a21-be8ac0bb8c4a\") " pod="openstack/ovn-controller-b9h9k-config-hdqq7" Nov 29 05:43:43 crc kubenswrapper[4594]: I1129 05:43:43.257803 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a6a00bf1-daa5-444e-9a21-be8ac0bb8c4a-additional-scripts\") pod \"ovn-controller-b9h9k-config-hdqq7\" (UID: \"a6a00bf1-daa5-444e-9a21-be8ac0bb8c4a\") " pod="openstack/ovn-controller-b9h9k-config-hdqq7" Nov 29 05:43:43 crc kubenswrapper[4594]: I1129 05:43:43.257896 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a6a00bf1-daa5-444e-9a21-be8ac0bb8c4a-var-run\") pod \"ovn-controller-b9h9k-config-hdqq7\" (UID: \"a6a00bf1-daa5-444e-9a21-be8ac0bb8c4a\") " pod="openstack/ovn-controller-b9h9k-config-hdqq7" Nov 29 05:43:43 crc kubenswrapper[4594]: I1129 05:43:43.257968 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a6a00bf1-daa5-444e-9a21-be8ac0bb8c4a-scripts\") pod \"ovn-controller-b9h9k-config-hdqq7\" (UID: \"a6a00bf1-daa5-444e-9a21-be8ac0bb8c4a\") " pod="openstack/ovn-controller-b9h9k-config-hdqq7" Nov 29 05:43:43 crc kubenswrapper[4594]: I1129 05:43:43.360121 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mz7nj\" (UniqueName: \"kubernetes.io/projected/a6a00bf1-daa5-444e-9a21-be8ac0bb8c4a-kube-api-access-mz7nj\") pod \"ovn-controller-b9h9k-config-hdqq7\" (UID: \"a6a00bf1-daa5-444e-9a21-be8ac0bb8c4a\") " pod="openstack/ovn-controller-b9h9k-config-hdqq7" Nov 29 05:43:43 crc kubenswrapper[4594]: I1129 05:43:43.360368 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a6a00bf1-daa5-444e-9a21-be8ac0bb8c4a-additional-scripts\") pod \"ovn-controller-b9h9k-config-hdqq7\" (UID: \"a6a00bf1-daa5-444e-9a21-be8ac0bb8c4a\") " pod="openstack/ovn-controller-b9h9k-config-hdqq7" Nov 29 05:43:43 crc kubenswrapper[4594]: I1129 05:43:43.360438 4594 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a6a00bf1-daa5-444e-9a21-be8ac0bb8c4a-var-run\") pod \"ovn-controller-b9h9k-config-hdqq7\" (UID: \"a6a00bf1-daa5-444e-9a21-be8ac0bb8c4a\") " pod="openstack/ovn-controller-b9h9k-config-hdqq7" Nov 29 05:43:43 crc kubenswrapper[4594]: I1129 05:43:43.360509 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a6a00bf1-daa5-444e-9a21-be8ac0bb8c4a-scripts\") pod \"ovn-controller-b9h9k-config-hdqq7\" (UID: \"a6a00bf1-daa5-444e-9a21-be8ac0bb8c4a\") " pod="openstack/ovn-controller-b9h9k-config-hdqq7" Nov 29 05:43:43 crc kubenswrapper[4594]: I1129 05:43:43.360546 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a6a00bf1-daa5-444e-9a21-be8ac0bb8c4a-var-run-ovn\") pod \"ovn-controller-b9h9k-config-hdqq7\" (UID: \"a6a00bf1-daa5-444e-9a21-be8ac0bb8c4a\") " pod="openstack/ovn-controller-b9h9k-config-hdqq7" Nov 29 05:43:43 crc kubenswrapper[4594]: I1129 05:43:43.360602 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a6a00bf1-daa5-444e-9a21-be8ac0bb8c4a-var-log-ovn\") pod \"ovn-controller-b9h9k-config-hdqq7\" (UID: \"a6a00bf1-daa5-444e-9a21-be8ac0bb8c4a\") " pod="openstack/ovn-controller-b9h9k-config-hdqq7" Nov 29 05:43:43 crc kubenswrapper[4594]: I1129 05:43:43.360786 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a6a00bf1-daa5-444e-9a21-be8ac0bb8c4a-var-run\") pod \"ovn-controller-b9h9k-config-hdqq7\" (UID: \"a6a00bf1-daa5-444e-9a21-be8ac0bb8c4a\") " pod="openstack/ovn-controller-b9h9k-config-hdqq7" Nov 29 05:43:43 crc kubenswrapper[4594]: I1129 05:43:43.360824 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" 
(UniqueName: \"kubernetes.io/host-path/a6a00bf1-daa5-444e-9a21-be8ac0bb8c4a-var-log-ovn\") pod \"ovn-controller-b9h9k-config-hdqq7\" (UID: \"a6a00bf1-daa5-444e-9a21-be8ac0bb8c4a\") " pod="openstack/ovn-controller-b9h9k-config-hdqq7" Nov 29 05:43:43 crc kubenswrapper[4594]: I1129 05:43:43.360863 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a6a00bf1-daa5-444e-9a21-be8ac0bb8c4a-var-run-ovn\") pod \"ovn-controller-b9h9k-config-hdqq7\" (UID: \"a6a00bf1-daa5-444e-9a21-be8ac0bb8c4a\") " pod="openstack/ovn-controller-b9h9k-config-hdqq7" Nov 29 05:43:43 crc kubenswrapper[4594]: I1129 05:43:43.361499 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a6a00bf1-daa5-444e-9a21-be8ac0bb8c4a-additional-scripts\") pod \"ovn-controller-b9h9k-config-hdqq7\" (UID: \"a6a00bf1-daa5-444e-9a21-be8ac0bb8c4a\") " pod="openstack/ovn-controller-b9h9k-config-hdqq7" Nov 29 05:43:43 crc kubenswrapper[4594]: I1129 05:43:43.363179 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a6a00bf1-daa5-444e-9a21-be8ac0bb8c4a-scripts\") pod \"ovn-controller-b9h9k-config-hdqq7\" (UID: \"a6a00bf1-daa5-444e-9a21-be8ac0bb8c4a\") " pod="openstack/ovn-controller-b9h9k-config-hdqq7" Nov 29 05:43:43 crc kubenswrapper[4594]: I1129 05:43:43.379366 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mz7nj\" (UniqueName: \"kubernetes.io/projected/a6a00bf1-daa5-444e-9a21-be8ac0bb8c4a-kube-api-access-mz7nj\") pod \"ovn-controller-b9h9k-config-hdqq7\" (UID: \"a6a00bf1-daa5-444e-9a21-be8ac0bb8c4a\") " pod="openstack/ovn-controller-b9h9k-config-hdqq7" Nov 29 05:43:43 crc kubenswrapper[4594]: I1129 05:43:43.433969 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-b9h9k-config-hdqq7" Nov 29 05:43:43 crc kubenswrapper[4594]: I1129 05:43:43.896239 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-b9h9k-config-hdqq7"] Nov 29 05:43:44 crc kubenswrapper[4594]: I1129 05:43:44.706268 4594 generic.go:334] "Generic (PLEG): container finished" podID="a6a00bf1-daa5-444e-9a21-be8ac0bb8c4a" containerID="80e2351fc77071ab3107c08b054317392fa7aec8869a923516c2e7fc3d0ce807" exitCode=0 Nov 29 05:43:44 crc kubenswrapper[4594]: I1129 05:43:44.706400 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-b9h9k-config-hdqq7" event={"ID":"a6a00bf1-daa5-444e-9a21-be8ac0bb8c4a","Type":"ContainerDied","Data":"80e2351fc77071ab3107c08b054317392fa7aec8869a923516c2e7fc3d0ce807"} Nov 29 05:43:44 crc kubenswrapper[4594]: I1129 05:43:44.706795 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-b9h9k-config-hdqq7" event={"ID":"a6a00bf1-daa5-444e-9a21-be8ac0bb8c4a","Type":"ContainerStarted","Data":"f12bf97f5100747eb4c8fc6faeff0af5646bdd10444cc3436787f72f38b72e4f"} Nov 29 05:43:46 crc kubenswrapper[4594]: I1129 05:43:46.031387 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-b9h9k-config-hdqq7" Nov 29 05:43:46 crc kubenswrapper[4594]: I1129 05:43:46.117962 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a6a00bf1-daa5-444e-9a21-be8ac0bb8c4a-var-log-ovn\") pod \"a6a00bf1-daa5-444e-9a21-be8ac0bb8c4a\" (UID: \"a6a00bf1-daa5-444e-9a21-be8ac0bb8c4a\") " Nov 29 05:43:46 crc kubenswrapper[4594]: I1129 05:43:46.118092 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a6a00bf1-daa5-444e-9a21-be8ac0bb8c4a-var-run-ovn\") pod \"a6a00bf1-daa5-444e-9a21-be8ac0bb8c4a\" (UID: \"a6a00bf1-daa5-444e-9a21-be8ac0bb8c4a\") " Nov 29 05:43:46 crc kubenswrapper[4594]: I1129 05:43:46.118081 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a6a00bf1-daa5-444e-9a21-be8ac0bb8c4a-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "a6a00bf1-daa5-444e-9a21-be8ac0bb8c4a" (UID: "a6a00bf1-daa5-444e-9a21-be8ac0bb8c4a"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 05:43:46 crc kubenswrapper[4594]: I1129 05:43:46.118142 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a6a00bf1-daa5-444e-9a21-be8ac0bb8c4a-var-run\") pod \"a6a00bf1-daa5-444e-9a21-be8ac0bb8c4a\" (UID: \"a6a00bf1-daa5-444e-9a21-be8ac0bb8c4a\") " Nov 29 05:43:46 crc kubenswrapper[4594]: I1129 05:43:46.118194 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a6a00bf1-daa5-444e-9a21-be8ac0bb8c4a-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "a6a00bf1-daa5-444e-9a21-be8ac0bb8c4a" (UID: "a6a00bf1-daa5-444e-9a21-be8ac0bb8c4a"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 05:43:46 crc kubenswrapper[4594]: I1129 05:43:46.118266 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a6a00bf1-daa5-444e-9a21-be8ac0bb8c4a-scripts\") pod \"a6a00bf1-daa5-444e-9a21-be8ac0bb8c4a\" (UID: \"a6a00bf1-daa5-444e-9a21-be8ac0bb8c4a\") " Nov 29 05:43:46 crc kubenswrapper[4594]: I1129 05:43:46.118274 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a6a00bf1-daa5-444e-9a21-be8ac0bb8c4a-var-run" (OuterVolumeSpecName: "var-run") pod "a6a00bf1-daa5-444e-9a21-be8ac0bb8c4a" (UID: "a6a00bf1-daa5-444e-9a21-be8ac0bb8c4a"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 05:43:46 crc kubenswrapper[4594]: I1129 05:43:46.118372 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a6a00bf1-daa5-444e-9a21-be8ac0bb8c4a-additional-scripts\") pod \"a6a00bf1-daa5-444e-9a21-be8ac0bb8c4a\" (UID: \"a6a00bf1-daa5-444e-9a21-be8ac0bb8c4a\") " Nov 29 05:43:46 crc kubenswrapper[4594]: I1129 05:43:46.118434 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mz7nj\" (UniqueName: \"kubernetes.io/projected/a6a00bf1-daa5-444e-9a21-be8ac0bb8c4a-kube-api-access-mz7nj\") pod \"a6a00bf1-daa5-444e-9a21-be8ac0bb8c4a\" (UID: \"a6a00bf1-daa5-444e-9a21-be8ac0bb8c4a\") " Nov 29 05:43:46 crc kubenswrapper[4594]: I1129 05:43:46.119064 4594 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a6a00bf1-daa5-444e-9a21-be8ac0bb8c4a-var-log-ovn\") on node \"crc\" DevicePath \"\"" Nov 29 05:43:46 crc kubenswrapper[4594]: I1129 05:43:46.119083 4594 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/a6a00bf1-daa5-444e-9a21-be8ac0bb8c4a-var-run\") on node \"crc\" DevicePath \"\"" Nov 29 05:43:46 crc kubenswrapper[4594]: I1129 05:43:46.119093 4594 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a6a00bf1-daa5-444e-9a21-be8ac0bb8c4a-var-run-ovn\") on node \"crc\" DevicePath \"\"" Nov 29 05:43:46 crc kubenswrapper[4594]: I1129 05:43:46.119201 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6a00bf1-daa5-444e-9a21-be8ac0bb8c4a-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "a6a00bf1-daa5-444e-9a21-be8ac0bb8c4a" (UID: "a6a00bf1-daa5-444e-9a21-be8ac0bb8c4a"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:43:46 crc kubenswrapper[4594]: I1129 05:43:46.119456 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6a00bf1-daa5-444e-9a21-be8ac0bb8c4a-scripts" (OuterVolumeSpecName: "scripts") pod "a6a00bf1-daa5-444e-9a21-be8ac0bb8c4a" (UID: "a6a00bf1-daa5-444e-9a21-be8ac0bb8c4a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:43:46 crc kubenswrapper[4594]: I1129 05:43:46.124782 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6a00bf1-daa5-444e-9a21-be8ac0bb8c4a-kube-api-access-mz7nj" (OuterVolumeSpecName: "kube-api-access-mz7nj") pod "a6a00bf1-daa5-444e-9a21-be8ac0bb8c4a" (UID: "a6a00bf1-daa5-444e-9a21-be8ac0bb8c4a"). InnerVolumeSpecName "kube-api-access-mz7nj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:43:46 crc kubenswrapper[4594]: I1129 05:43:46.173278 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Nov 29 05:43:46 crc kubenswrapper[4594]: I1129 05:43:46.173349 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Nov 29 05:43:46 crc kubenswrapper[4594]: I1129 05:43:46.182912 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Nov 29 05:43:46 crc kubenswrapper[4594]: I1129 05:43:46.221371 4594 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a6a00bf1-daa5-444e-9a21-be8ac0bb8c4a-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 05:43:46 crc kubenswrapper[4594]: I1129 05:43:46.221415 4594 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a6a00bf1-daa5-444e-9a21-be8ac0bb8c4a-additional-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 05:43:46 crc kubenswrapper[4594]: I1129 05:43:46.221429 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mz7nj\" (UniqueName: \"kubernetes.io/projected/a6a00bf1-daa5-444e-9a21-be8ac0bb8c4a-kube-api-access-mz7nj\") on node \"crc\" DevicePath \"\"" Nov 29 05:43:46 crc kubenswrapper[4594]: I1129 05:43:46.725136 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-b9h9k-config-hdqq7" event={"ID":"a6a00bf1-daa5-444e-9a21-be8ac0bb8c4a","Type":"ContainerDied","Data":"f12bf97f5100747eb4c8fc6faeff0af5646bdd10444cc3436787f72f38b72e4f"} Nov 29 05:43:46 crc kubenswrapper[4594]: I1129 05:43:46.725609 4594 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f12bf97f5100747eb4c8fc6faeff0af5646bdd10444cc3436787f72f38b72e4f" Nov 29 05:43:46 crc kubenswrapper[4594]: I1129 
05:43:46.725278 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-b9h9k-config-hdqq7" Nov 29 05:43:46 crc kubenswrapper[4594]: I1129 05:43:46.729781 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Nov 29 05:43:47 crc kubenswrapper[4594]: I1129 05:43:47.160458 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-b9h9k-config-hdqq7"] Nov 29 05:43:47 crc kubenswrapper[4594]: I1129 05:43:47.183184 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-b9h9k-config-hdqq7"] Nov 29 05:43:48 crc kubenswrapper[4594]: I1129 05:43:48.099212 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6a00bf1-daa5-444e-9a21-be8ac0bb8c4a" path="/var/lib/kubelet/pods/a6a00bf1-daa5-444e-9a21-be8ac0bb8c4a/volumes" Nov 29 05:43:50 crc kubenswrapper[4594]: I1129 05:43:50.248472 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55b99bf79c-72vg5" Nov 29 05:43:50 crc kubenswrapper[4594]: I1129 05:43:50.307004 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76f9c4c8bc-hpsj2"] Nov 29 05:43:50 crc kubenswrapper[4594]: I1129 05:43:50.307352 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-76f9c4c8bc-hpsj2" podUID="e6036791-b766-491c-807f-cdb3f9616288" containerName="dnsmasq-dns" containerID="cri-o://666b7102786b97c74f0b151f2c001e0f396afe8041aa9ab8a907a15084474eb0" gracePeriod=10 Nov 29 05:43:50 crc kubenswrapper[4594]: I1129 05:43:50.777275 4594 generic.go:334] "Generic (PLEG): container finished" podID="e6036791-b766-491c-807f-cdb3f9616288" containerID="666b7102786b97c74f0b151f2c001e0f396afe8041aa9ab8a907a15084474eb0" exitCode=0 Nov 29 05:43:50 crc kubenswrapper[4594]: I1129 05:43:50.777541 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-76f9c4c8bc-hpsj2" event={"ID":"e6036791-b766-491c-807f-cdb3f9616288","Type":"ContainerDied","Data":"666b7102786b97c74f0b151f2c001e0f396afe8041aa9ab8a907a15084474eb0"} Nov 29 05:43:50 crc kubenswrapper[4594]: I1129 05:43:50.872397 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76f9c4c8bc-hpsj2" Nov 29 05:43:51 crc kubenswrapper[4594]: I1129 05:43:51.004210 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e6036791-b766-491c-807f-cdb3f9616288-ovsdbserver-nb\") pod \"e6036791-b766-491c-807f-cdb3f9616288\" (UID: \"e6036791-b766-491c-807f-cdb3f9616288\") " Nov 29 05:43:51 crc kubenswrapper[4594]: I1129 05:43:51.004406 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e6036791-b766-491c-807f-cdb3f9616288-dns-svc\") pod \"e6036791-b766-491c-807f-cdb3f9616288\" (UID: \"e6036791-b766-491c-807f-cdb3f9616288\") " Nov 29 05:43:51 crc kubenswrapper[4594]: I1129 05:43:51.004430 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6036791-b766-491c-807f-cdb3f9616288-config\") pod \"e6036791-b766-491c-807f-cdb3f9616288\" (UID: \"e6036791-b766-491c-807f-cdb3f9616288\") " Nov 29 05:43:51 crc kubenswrapper[4594]: I1129 05:43:51.005689 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e6036791-b766-491c-807f-cdb3f9616288-ovsdbserver-sb\") pod \"e6036791-b766-491c-807f-cdb3f9616288\" (UID: \"e6036791-b766-491c-807f-cdb3f9616288\") " Nov 29 05:43:51 crc kubenswrapper[4594]: I1129 05:43:51.005831 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vf2ch\" (UniqueName: 
\"kubernetes.io/projected/e6036791-b766-491c-807f-cdb3f9616288-kube-api-access-vf2ch\") pod \"e6036791-b766-491c-807f-cdb3f9616288\" (UID: \"e6036791-b766-491c-807f-cdb3f9616288\") " Nov 29 05:43:51 crc kubenswrapper[4594]: I1129 05:43:51.015639 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6036791-b766-491c-807f-cdb3f9616288-kube-api-access-vf2ch" (OuterVolumeSpecName: "kube-api-access-vf2ch") pod "e6036791-b766-491c-807f-cdb3f9616288" (UID: "e6036791-b766-491c-807f-cdb3f9616288"). InnerVolumeSpecName "kube-api-access-vf2ch". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:43:51 crc kubenswrapper[4594]: I1129 05:43:51.043278 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6036791-b766-491c-807f-cdb3f9616288-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e6036791-b766-491c-807f-cdb3f9616288" (UID: "e6036791-b766-491c-807f-cdb3f9616288"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:43:51 crc kubenswrapper[4594]: I1129 05:43:51.049062 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6036791-b766-491c-807f-cdb3f9616288-config" (OuterVolumeSpecName: "config") pod "e6036791-b766-491c-807f-cdb3f9616288" (UID: "e6036791-b766-491c-807f-cdb3f9616288"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:43:51 crc kubenswrapper[4594]: I1129 05:43:51.051126 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6036791-b766-491c-807f-cdb3f9616288-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e6036791-b766-491c-807f-cdb3f9616288" (UID: "e6036791-b766-491c-807f-cdb3f9616288"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:43:51 crc kubenswrapper[4594]: I1129 05:43:51.052614 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6036791-b766-491c-807f-cdb3f9616288-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e6036791-b766-491c-807f-cdb3f9616288" (UID: "e6036791-b766-491c-807f-cdb3f9616288"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:43:51 crc kubenswrapper[4594]: I1129 05:43:51.108341 4594 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e6036791-b766-491c-807f-cdb3f9616288-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 29 05:43:51 crc kubenswrapper[4594]: I1129 05:43:51.108384 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vf2ch\" (UniqueName: \"kubernetes.io/projected/e6036791-b766-491c-807f-cdb3f9616288-kube-api-access-vf2ch\") on node \"crc\" DevicePath \"\"" Nov 29 05:43:51 crc kubenswrapper[4594]: I1129 05:43:51.108398 4594 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e6036791-b766-491c-807f-cdb3f9616288-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 29 05:43:51 crc kubenswrapper[4594]: I1129 05:43:51.108410 4594 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e6036791-b766-491c-807f-cdb3f9616288-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 29 05:43:51 crc kubenswrapper[4594]: I1129 05:43:51.108422 4594 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6036791-b766-491c-807f-cdb3f9616288-config\") on node \"crc\" DevicePath \"\"" Nov 29 05:43:51 crc kubenswrapper[4594]: I1129 05:43:51.789652 4594 generic.go:334] "Generic (PLEG): container finished" podID="1ff736d8-8719-402e-95c9-1d790c1dff5e" 
containerID="3d435468534c16e70f94ac0b2adfa267064e52b83844f59cf3f55c599965f5f9" exitCode=0 Nov 29 05:43:51 crc kubenswrapper[4594]: I1129 05:43:51.789734 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1ff736d8-8719-402e-95c9-1d790c1dff5e","Type":"ContainerDied","Data":"3d435468534c16e70f94ac0b2adfa267064e52b83844f59cf3f55c599965f5f9"} Nov 29 05:43:51 crc kubenswrapper[4594]: I1129 05:43:51.794267 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76f9c4c8bc-hpsj2" event={"ID":"e6036791-b766-491c-807f-cdb3f9616288","Type":"ContainerDied","Data":"6af2187caa36a4b9ff3b01b2a3535ace323f7cfc86ca9fe73d3f926329c276e6"} Nov 29 05:43:51 crc kubenswrapper[4594]: I1129 05:43:51.794342 4594 scope.go:117] "RemoveContainer" containerID="666b7102786b97c74f0b151f2c001e0f396afe8041aa9ab8a907a15084474eb0" Nov 29 05:43:51 crc kubenswrapper[4594]: I1129 05:43:51.794501 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76f9c4c8bc-hpsj2" Nov 29 05:43:51 crc kubenswrapper[4594]: I1129 05:43:51.898204 4594 scope.go:117] "RemoveContainer" containerID="a435ac8d473ab7df525e6f724350839ba52ac1cb6a472b0f52c11182468a6985" Nov 29 05:43:51 crc kubenswrapper[4594]: I1129 05:43:51.929866 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76f9c4c8bc-hpsj2"] Nov 29 05:43:51 crc kubenswrapper[4594]: I1129 05:43:51.937618 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-76f9c4c8bc-hpsj2"] Nov 29 05:43:52 crc kubenswrapper[4594]: I1129 05:43:52.092341 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6036791-b766-491c-807f-cdb3f9616288" path="/var/lib/kubelet/pods/e6036791-b766-491c-807f-cdb3f9616288/volumes" Nov 29 05:43:52 crc kubenswrapper[4594]: I1129 05:43:52.811483 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1ff736d8-8719-402e-95c9-1d790c1dff5e","Type":"ContainerStarted","Data":"3a62ad72db403bb720849a88b9b2000a5a92e83e4cbb2f26b9b5510384634882"} Nov 29 05:43:52 crc kubenswrapper[4594]: I1129 05:43:52.812513 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Nov 29 05:43:52 crc kubenswrapper[4594]: I1129 05:43:52.830808 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=-9223371940.023981 podStartE2EDuration="1m36.830794229s" podCreationTimestamp="2025-11-29 05:42:16 +0000 UTC" firstStartedPulling="2025-11-29 05:42:18.641879105 +0000 UTC m=+862.882388315" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:43:52.828330802 +0000 UTC m=+957.068840022" watchObservedRunningTime="2025-11-29 05:43:52.830794229 +0000 UTC m=+957.071303448" Nov 29 05:43:52 crc kubenswrapper[4594]: I1129 05:43:52.871787 4594 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/ovn-controller-b9h9k" Nov 29 05:43:58 crc kubenswrapper[4594]: I1129 05:43:58.767313 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Nov 29 05:43:58 crc kubenswrapper[4594]: I1129 05:43:58.785475 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-notifications-server-0" Nov 29 05:43:59 crc kubenswrapper[4594]: I1129 05:43:59.868129 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-6m48g"] Nov 29 05:43:59 crc kubenswrapper[4594]: E1129 05:43:59.868882 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6a00bf1-daa5-444e-9a21-be8ac0bb8c4a" containerName="ovn-config" Nov 29 05:43:59 crc kubenswrapper[4594]: I1129 05:43:59.868902 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6a00bf1-daa5-444e-9a21-be8ac0bb8c4a" containerName="ovn-config" Nov 29 05:43:59 crc kubenswrapper[4594]: E1129 05:43:59.868921 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6036791-b766-491c-807f-cdb3f9616288" containerName="dnsmasq-dns" Nov 29 05:43:59 crc kubenswrapper[4594]: I1129 05:43:59.868928 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6036791-b766-491c-807f-cdb3f9616288" containerName="dnsmasq-dns" Nov 29 05:43:59 crc kubenswrapper[4594]: E1129 05:43:59.868948 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6036791-b766-491c-807f-cdb3f9616288" containerName="init" Nov 29 05:43:59 crc kubenswrapper[4594]: I1129 05:43:59.868955 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6036791-b766-491c-807f-cdb3f9616288" containerName="init" Nov 29 05:43:59 crc kubenswrapper[4594]: I1129 05:43:59.869144 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6a00bf1-daa5-444e-9a21-be8ac0bb8c4a" containerName="ovn-config" Nov 29 05:43:59 crc kubenswrapper[4594]: I1129 05:43:59.869169 
4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6036791-b766-491c-807f-cdb3f9616288" containerName="dnsmasq-dns" Nov 29 05:43:59 crc kubenswrapper[4594]: I1129 05:43:59.869893 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-6m48g" Nov 29 05:43:59 crc kubenswrapper[4594]: I1129 05:43:59.887221 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-6m48g"] Nov 29 05:43:59 crc kubenswrapper[4594]: I1129 05:43:59.960814 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-3c5d-account-create-update-kw8mm"] Nov 29 05:43:59 crc kubenswrapper[4594]: I1129 05:43:59.962016 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-3c5d-account-create-update-kw8mm" Nov 29 05:43:59 crc kubenswrapper[4594]: I1129 05:43:59.968350 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-3c5d-account-create-update-kw8mm"] Nov 29 05:43:59 crc kubenswrapper[4594]: I1129 05:43:59.968942 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Nov 29 05:43:59 crc kubenswrapper[4594]: I1129 05:43:59.972030 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d42jb\" (UniqueName: \"kubernetes.io/projected/7a14827c-5222-4670-bd36-2d0274d1e93e-kube-api-access-d42jb\") pod \"glance-db-create-6m48g\" (UID: \"7a14827c-5222-4670-bd36-2d0274d1e93e\") " pod="openstack/glance-db-create-6m48g" Nov 29 05:43:59 crc kubenswrapper[4594]: I1129 05:43:59.972201 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a14827c-5222-4670-bd36-2d0274d1e93e-operator-scripts\") pod \"glance-db-create-6m48g\" (UID: \"7a14827c-5222-4670-bd36-2d0274d1e93e\") " pod="openstack/glance-db-create-6m48g" Nov 29 
05:44:00 crc kubenswrapper[4594]: I1129 05:44:00.074496 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a14827c-5222-4670-bd36-2d0274d1e93e-operator-scripts\") pod \"glance-db-create-6m48g\" (UID: \"7a14827c-5222-4670-bd36-2d0274d1e93e\") " pod="openstack/glance-db-create-6m48g" Nov 29 05:44:00 crc kubenswrapper[4594]: I1129 05:44:00.074575 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zltw4\" (UniqueName: \"kubernetes.io/projected/5c601f48-3079-4e66-8298-65df2d8b111d-kube-api-access-zltw4\") pod \"glance-3c5d-account-create-update-kw8mm\" (UID: \"5c601f48-3079-4e66-8298-65df2d8b111d\") " pod="openstack/glance-3c5d-account-create-update-kw8mm" Nov 29 05:44:00 crc kubenswrapper[4594]: I1129 05:44:00.074665 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d42jb\" (UniqueName: \"kubernetes.io/projected/7a14827c-5222-4670-bd36-2d0274d1e93e-kube-api-access-d42jb\") pod \"glance-db-create-6m48g\" (UID: \"7a14827c-5222-4670-bd36-2d0274d1e93e\") " pod="openstack/glance-db-create-6m48g" Nov 29 05:44:00 crc kubenswrapper[4594]: I1129 05:44:00.074932 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c601f48-3079-4e66-8298-65df2d8b111d-operator-scripts\") pod \"glance-3c5d-account-create-update-kw8mm\" (UID: \"5c601f48-3079-4e66-8298-65df2d8b111d\") " pod="openstack/glance-3c5d-account-create-update-kw8mm" Nov 29 05:44:00 crc kubenswrapper[4594]: I1129 05:44:00.075355 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a14827c-5222-4670-bd36-2d0274d1e93e-operator-scripts\") pod \"glance-db-create-6m48g\" (UID: \"7a14827c-5222-4670-bd36-2d0274d1e93e\") " 
pod="openstack/glance-db-create-6m48g" Nov 29 05:44:00 crc kubenswrapper[4594]: I1129 05:44:00.098411 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d42jb\" (UniqueName: \"kubernetes.io/projected/7a14827c-5222-4670-bd36-2d0274d1e93e-kube-api-access-d42jb\") pod \"glance-db-create-6m48g\" (UID: \"7a14827c-5222-4670-bd36-2d0274d1e93e\") " pod="openstack/glance-db-create-6m48g" Nov 29 05:44:00 crc kubenswrapper[4594]: I1129 05:44:00.178300 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c601f48-3079-4e66-8298-65df2d8b111d-operator-scripts\") pod \"glance-3c5d-account-create-update-kw8mm\" (UID: \"5c601f48-3079-4e66-8298-65df2d8b111d\") " pod="openstack/glance-3c5d-account-create-update-kw8mm" Nov 29 05:44:00 crc kubenswrapper[4594]: I1129 05:44:00.178456 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zltw4\" (UniqueName: \"kubernetes.io/projected/5c601f48-3079-4e66-8298-65df2d8b111d-kube-api-access-zltw4\") pod \"glance-3c5d-account-create-update-kw8mm\" (UID: \"5c601f48-3079-4e66-8298-65df2d8b111d\") " pod="openstack/glance-3c5d-account-create-update-kw8mm" Nov 29 05:44:00 crc kubenswrapper[4594]: I1129 05:44:00.179299 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c601f48-3079-4e66-8298-65df2d8b111d-operator-scripts\") pod \"glance-3c5d-account-create-update-kw8mm\" (UID: \"5c601f48-3079-4e66-8298-65df2d8b111d\") " pod="openstack/glance-3c5d-account-create-update-kw8mm" Nov 29 05:44:00 crc kubenswrapper[4594]: I1129 05:44:00.192863 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-6m48g" Nov 29 05:44:00 crc kubenswrapper[4594]: I1129 05:44:00.195084 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zltw4\" (UniqueName: \"kubernetes.io/projected/5c601f48-3079-4e66-8298-65df2d8b111d-kube-api-access-zltw4\") pod \"glance-3c5d-account-create-update-kw8mm\" (UID: \"5c601f48-3079-4e66-8298-65df2d8b111d\") " pod="openstack/glance-3c5d-account-create-update-kw8mm" Nov 29 05:44:00 crc kubenswrapper[4594]: I1129 05:44:00.277804 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-3c5d-account-create-update-kw8mm" Nov 29 05:44:00 crc kubenswrapper[4594]: I1129 05:44:00.662987 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-6m48g"] Nov 29 05:44:00 crc kubenswrapper[4594]: W1129 05:44:00.667199 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a14827c_5222_4670_bd36_2d0274d1e93e.slice/crio-53ed4105b0a81707d851baa4e07fe4389c18bb3f4c2db6f907d806e49d62658d WatchSource:0}: Error finding container 53ed4105b0a81707d851baa4e07fe4389c18bb3f4c2db6f907d806e49d62658d: Status 404 returned error can't find the container with id 53ed4105b0a81707d851baa4e07fe4389c18bb3f4c2db6f907d806e49d62658d Nov 29 05:44:00 crc kubenswrapper[4594]: I1129 05:44:00.795295 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-3c5d-account-create-update-kw8mm"] Nov 29 05:44:00 crc kubenswrapper[4594]: W1129 05:44:00.795773 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5c601f48_3079_4e66_8298_65df2d8b111d.slice/crio-76c193729d607f60e3c106fac5fbfead5a601942df077be1f5ef2819bee15995 WatchSource:0}: Error finding container 76c193729d607f60e3c106fac5fbfead5a601942df077be1f5ef2819bee15995: Status 404 returned error can't 
find the container with id 76c193729d607f60e3c106fac5fbfead5a601942df077be1f5ef2819bee15995 Nov 29 05:44:00 crc kubenswrapper[4594]: I1129 05:44:00.890943 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-6m48g" event={"ID":"7a14827c-5222-4670-bd36-2d0274d1e93e","Type":"ContainerStarted","Data":"b6af7a25841c353e99beb65fa3333280ff18afcee460c1d7fda0fb4e69b7bc12"} Nov 29 05:44:00 crc kubenswrapper[4594]: I1129 05:44:00.891007 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-6m48g" event={"ID":"7a14827c-5222-4670-bd36-2d0274d1e93e","Type":"ContainerStarted","Data":"53ed4105b0a81707d851baa4e07fe4389c18bb3f4c2db6f907d806e49d62658d"} Nov 29 05:44:00 crc kubenswrapper[4594]: I1129 05:44:00.893206 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-3c5d-account-create-update-kw8mm" event={"ID":"5c601f48-3079-4e66-8298-65df2d8b111d","Type":"ContainerStarted","Data":"76c193729d607f60e3c106fac5fbfead5a601942df077be1f5ef2819bee15995"} Nov 29 05:44:00 crc kubenswrapper[4594]: I1129 05:44:00.918817 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-6m48g" podStartSLOduration=1.918801862 podStartE2EDuration="1.918801862s" podCreationTimestamp="2025-11-29 05:43:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:44:00.913430984 +0000 UTC m=+965.153940194" watchObservedRunningTime="2025-11-29 05:44:00.918801862 +0000 UTC m=+965.159311082" Nov 29 05:44:01 crc kubenswrapper[4594]: I1129 05:44:01.904489 4594 generic.go:334] "Generic (PLEG): container finished" podID="7a14827c-5222-4670-bd36-2d0274d1e93e" containerID="b6af7a25841c353e99beb65fa3333280ff18afcee460c1d7fda0fb4e69b7bc12" exitCode=0 Nov 29 05:44:01 crc kubenswrapper[4594]: I1129 05:44:01.904601 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-db-create-6m48g" event={"ID":"7a14827c-5222-4670-bd36-2d0274d1e93e","Type":"ContainerDied","Data":"b6af7a25841c353e99beb65fa3333280ff18afcee460c1d7fda0fb4e69b7bc12"} Nov 29 05:44:01 crc kubenswrapper[4594]: I1129 05:44:01.907672 4594 generic.go:334] "Generic (PLEG): container finished" podID="5c601f48-3079-4e66-8298-65df2d8b111d" containerID="eb33a743456d3042e4c9203629fa25282cf037e46fb47fbac94737b25b77d678" exitCode=0 Nov 29 05:44:01 crc kubenswrapper[4594]: I1129 05:44:01.907741 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-3c5d-account-create-update-kw8mm" event={"ID":"5c601f48-3079-4e66-8298-65df2d8b111d","Type":"ContainerDied","Data":"eb33a743456d3042e4c9203629fa25282cf037e46fb47fbac94737b25b77d678"} Nov 29 05:44:03 crc kubenswrapper[4594]: I1129 05:44:03.287847 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-3c5d-account-create-update-kw8mm" Nov 29 05:44:03 crc kubenswrapper[4594]: I1129 05:44:03.291283 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-6m48g" Nov 29 05:44:03 crc kubenswrapper[4594]: I1129 05:44:03.341445 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zltw4\" (UniqueName: \"kubernetes.io/projected/5c601f48-3079-4e66-8298-65df2d8b111d-kube-api-access-zltw4\") pod \"5c601f48-3079-4e66-8298-65df2d8b111d\" (UID: \"5c601f48-3079-4e66-8298-65df2d8b111d\") " Nov 29 05:44:03 crc kubenswrapper[4594]: I1129 05:44:03.341492 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d42jb\" (UniqueName: \"kubernetes.io/projected/7a14827c-5222-4670-bd36-2d0274d1e93e-kube-api-access-d42jb\") pod \"7a14827c-5222-4670-bd36-2d0274d1e93e\" (UID: \"7a14827c-5222-4670-bd36-2d0274d1e93e\") " Nov 29 05:44:03 crc kubenswrapper[4594]: I1129 05:44:03.341524 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c601f48-3079-4e66-8298-65df2d8b111d-operator-scripts\") pod \"5c601f48-3079-4e66-8298-65df2d8b111d\" (UID: \"5c601f48-3079-4e66-8298-65df2d8b111d\") " Nov 29 05:44:03 crc kubenswrapper[4594]: I1129 05:44:03.341547 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a14827c-5222-4670-bd36-2d0274d1e93e-operator-scripts\") pod \"7a14827c-5222-4670-bd36-2d0274d1e93e\" (UID: \"7a14827c-5222-4670-bd36-2d0274d1e93e\") " Nov 29 05:44:03 crc kubenswrapper[4594]: I1129 05:44:03.342955 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c601f48-3079-4e66-8298-65df2d8b111d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5c601f48-3079-4e66-8298-65df2d8b111d" (UID: "5c601f48-3079-4e66-8298-65df2d8b111d"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:44:03 crc kubenswrapper[4594]: I1129 05:44:03.343035 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a14827c-5222-4670-bd36-2d0274d1e93e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7a14827c-5222-4670-bd36-2d0274d1e93e" (UID: "7a14827c-5222-4670-bd36-2d0274d1e93e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:44:03 crc kubenswrapper[4594]: I1129 05:44:03.348600 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c601f48-3079-4e66-8298-65df2d8b111d-kube-api-access-zltw4" (OuterVolumeSpecName: "kube-api-access-zltw4") pod "5c601f48-3079-4e66-8298-65df2d8b111d" (UID: "5c601f48-3079-4e66-8298-65df2d8b111d"). InnerVolumeSpecName "kube-api-access-zltw4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:44:03 crc kubenswrapper[4594]: I1129 05:44:03.348802 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a14827c-5222-4670-bd36-2d0274d1e93e-kube-api-access-d42jb" (OuterVolumeSpecName: "kube-api-access-d42jb") pod "7a14827c-5222-4670-bd36-2d0274d1e93e" (UID: "7a14827c-5222-4670-bd36-2d0274d1e93e"). InnerVolumeSpecName "kube-api-access-d42jb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:44:03 crc kubenswrapper[4594]: I1129 05:44:03.444887 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zltw4\" (UniqueName: \"kubernetes.io/projected/5c601f48-3079-4e66-8298-65df2d8b111d-kube-api-access-zltw4\") on node \"crc\" DevicePath \"\"" Nov 29 05:44:03 crc kubenswrapper[4594]: I1129 05:44:03.444926 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d42jb\" (UniqueName: \"kubernetes.io/projected/7a14827c-5222-4670-bd36-2d0274d1e93e-kube-api-access-d42jb\") on node \"crc\" DevicePath \"\"" Nov 29 05:44:03 crc kubenswrapper[4594]: I1129 05:44:03.444940 4594 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c601f48-3079-4e66-8298-65df2d8b111d-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 05:44:03 crc kubenswrapper[4594]: I1129 05:44:03.444953 4594 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a14827c-5222-4670-bd36-2d0274d1e93e-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 05:44:03 crc kubenswrapper[4594]: I1129 05:44:03.927747 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-6m48g" Nov 29 05:44:03 crc kubenswrapper[4594]: I1129 05:44:03.927742 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-6m48g" event={"ID":"7a14827c-5222-4670-bd36-2d0274d1e93e","Type":"ContainerDied","Data":"53ed4105b0a81707d851baa4e07fe4389c18bb3f4c2db6f907d806e49d62658d"} Nov 29 05:44:03 crc kubenswrapper[4594]: I1129 05:44:03.927945 4594 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="53ed4105b0a81707d851baa4e07fe4389c18bb3f4c2db6f907d806e49d62658d" Nov 29 05:44:03 crc kubenswrapper[4594]: I1129 05:44:03.930485 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-3c5d-account-create-update-kw8mm" event={"ID":"5c601f48-3079-4e66-8298-65df2d8b111d","Type":"ContainerDied","Data":"76c193729d607f60e3c106fac5fbfead5a601942df077be1f5ef2819bee15995"} Nov 29 05:44:03 crc kubenswrapper[4594]: I1129 05:44:03.930540 4594 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76c193729d607f60e3c106fac5fbfead5a601942df077be1f5ef2819bee15995" Nov 29 05:44:03 crc kubenswrapper[4594]: I1129 05:44:03.930556 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-3c5d-account-create-update-kw8mm" Nov 29 05:44:05 crc kubenswrapper[4594]: I1129 05:44:05.086173 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-ww7dv"] Nov 29 05:44:05 crc kubenswrapper[4594]: E1129 05:44:05.086828 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c601f48-3079-4e66-8298-65df2d8b111d" containerName="mariadb-account-create-update" Nov 29 05:44:05 crc kubenswrapper[4594]: I1129 05:44:05.086845 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c601f48-3079-4e66-8298-65df2d8b111d" containerName="mariadb-account-create-update" Nov 29 05:44:05 crc kubenswrapper[4594]: E1129 05:44:05.086860 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a14827c-5222-4670-bd36-2d0274d1e93e" containerName="mariadb-database-create" Nov 29 05:44:05 crc kubenswrapper[4594]: I1129 05:44:05.086866 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a14827c-5222-4670-bd36-2d0274d1e93e" containerName="mariadb-database-create" Nov 29 05:44:05 crc kubenswrapper[4594]: I1129 05:44:05.087060 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a14827c-5222-4670-bd36-2d0274d1e93e" containerName="mariadb-database-create" Nov 29 05:44:05 crc kubenswrapper[4594]: I1129 05:44:05.087075 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c601f48-3079-4e66-8298-65df2d8b111d" containerName="mariadb-account-create-update" Nov 29 05:44:05 crc kubenswrapper[4594]: I1129 05:44:05.087731 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-ww7dv" Nov 29 05:44:05 crc kubenswrapper[4594]: I1129 05:44:05.089596 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-rtrkc" Nov 29 05:44:05 crc kubenswrapper[4594]: I1129 05:44:05.090807 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Nov 29 05:44:05 crc kubenswrapper[4594]: I1129 05:44:05.096043 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-ww7dv"] Nov 29 05:44:05 crc kubenswrapper[4594]: I1129 05:44:05.174897 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hf245\" (UniqueName: \"kubernetes.io/projected/98e4a7be-f230-4732-9a53-d258bf31954b-kube-api-access-hf245\") pod \"glance-db-sync-ww7dv\" (UID: \"98e4a7be-f230-4732-9a53-d258bf31954b\") " pod="openstack/glance-db-sync-ww7dv" Nov 29 05:44:05 crc kubenswrapper[4594]: I1129 05:44:05.174947 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98e4a7be-f230-4732-9a53-d258bf31954b-combined-ca-bundle\") pod \"glance-db-sync-ww7dv\" (UID: \"98e4a7be-f230-4732-9a53-d258bf31954b\") " pod="openstack/glance-db-sync-ww7dv" Nov 29 05:44:05 crc kubenswrapper[4594]: I1129 05:44:05.175340 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/98e4a7be-f230-4732-9a53-d258bf31954b-db-sync-config-data\") pod \"glance-db-sync-ww7dv\" (UID: \"98e4a7be-f230-4732-9a53-d258bf31954b\") " pod="openstack/glance-db-sync-ww7dv" Nov 29 05:44:05 crc kubenswrapper[4594]: I1129 05:44:05.175384 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/98e4a7be-f230-4732-9a53-d258bf31954b-config-data\") pod \"glance-db-sync-ww7dv\" (UID: \"98e4a7be-f230-4732-9a53-d258bf31954b\") " pod="openstack/glance-db-sync-ww7dv" Nov 29 05:44:05 crc kubenswrapper[4594]: I1129 05:44:05.277354 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98e4a7be-f230-4732-9a53-d258bf31954b-combined-ca-bundle\") pod \"glance-db-sync-ww7dv\" (UID: \"98e4a7be-f230-4732-9a53-d258bf31954b\") " pod="openstack/glance-db-sync-ww7dv" Nov 29 05:44:05 crc kubenswrapper[4594]: I1129 05:44:05.277598 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/98e4a7be-f230-4732-9a53-d258bf31954b-db-sync-config-data\") pod \"glance-db-sync-ww7dv\" (UID: \"98e4a7be-f230-4732-9a53-d258bf31954b\") " pod="openstack/glance-db-sync-ww7dv" Nov 29 05:44:05 crc kubenswrapper[4594]: I1129 05:44:05.277664 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98e4a7be-f230-4732-9a53-d258bf31954b-config-data\") pod \"glance-db-sync-ww7dv\" (UID: \"98e4a7be-f230-4732-9a53-d258bf31954b\") " pod="openstack/glance-db-sync-ww7dv" Nov 29 05:44:05 crc kubenswrapper[4594]: I1129 05:44:05.277824 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hf245\" (UniqueName: \"kubernetes.io/projected/98e4a7be-f230-4732-9a53-d258bf31954b-kube-api-access-hf245\") pod \"glance-db-sync-ww7dv\" (UID: \"98e4a7be-f230-4732-9a53-d258bf31954b\") " pod="openstack/glance-db-sync-ww7dv" Nov 29 05:44:05 crc kubenswrapper[4594]: I1129 05:44:05.283985 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98e4a7be-f230-4732-9a53-d258bf31954b-combined-ca-bundle\") pod \"glance-db-sync-ww7dv\" (UID: 
\"98e4a7be-f230-4732-9a53-d258bf31954b\") " pod="openstack/glance-db-sync-ww7dv" Nov 29 05:44:05 crc kubenswrapper[4594]: I1129 05:44:05.285407 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98e4a7be-f230-4732-9a53-d258bf31954b-config-data\") pod \"glance-db-sync-ww7dv\" (UID: \"98e4a7be-f230-4732-9a53-d258bf31954b\") " pod="openstack/glance-db-sync-ww7dv" Nov 29 05:44:05 crc kubenswrapper[4594]: I1129 05:44:05.287109 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/98e4a7be-f230-4732-9a53-d258bf31954b-db-sync-config-data\") pod \"glance-db-sync-ww7dv\" (UID: \"98e4a7be-f230-4732-9a53-d258bf31954b\") " pod="openstack/glance-db-sync-ww7dv" Nov 29 05:44:05 crc kubenswrapper[4594]: I1129 05:44:05.292682 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hf245\" (UniqueName: \"kubernetes.io/projected/98e4a7be-f230-4732-9a53-d258bf31954b-kube-api-access-hf245\") pod \"glance-db-sync-ww7dv\" (UID: \"98e4a7be-f230-4732-9a53-d258bf31954b\") " pod="openstack/glance-db-sync-ww7dv" Nov 29 05:44:05 crc kubenswrapper[4594]: I1129 05:44:05.405485 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-ww7dv" Nov 29 05:44:05 crc kubenswrapper[4594]: I1129 05:44:05.997623 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-ww7dv"] Nov 29 05:44:06 crc kubenswrapper[4594]: W1129 05:44:06.000450 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod98e4a7be_f230_4732_9a53_d258bf31954b.slice/crio-d3a2dd92d62aa3975946694c53c568cfbcdbffc5583766adc8039a73c99aef28 WatchSource:0}: Error finding container d3a2dd92d62aa3975946694c53c568cfbcdbffc5583766adc8039a73c99aef28: Status 404 returned error can't find the container with id d3a2dd92d62aa3975946694c53c568cfbcdbffc5583766adc8039a73c99aef28 Nov 29 05:44:06 crc kubenswrapper[4594]: I1129 05:44:06.961228 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-ww7dv" event={"ID":"98e4a7be-f230-4732-9a53-d258bf31954b","Type":"ContainerStarted","Data":"d3a2dd92d62aa3975946694c53c568cfbcdbffc5583766adc8039a73c99aef28"} Nov 29 05:44:08 crc kubenswrapper[4594]: I1129 05:44:08.232472 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Nov 29 05:44:08 crc kubenswrapper[4594]: I1129 05:44:08.543886 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-msskz"] Nov 29 05:44:08 crc kubenswrapper[4594]: I1129 05:44:08.545186 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-msskz" Nov 29 05:44:08 crc kubenswrapper[4594]: I1129 05:44:08.558297 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-msskz"] Nov 29 05:44:08 crc kubenswrapper[4594]: I1129 05:44:08.646142 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/111c210f-a9a4-46ea-93ce-e91bae7ef7a9-operator-scripts\") pod \"barbican-db-create-msskz\" (UID: \"111c210f-a9a4-46ea-93ce-e91bae7ef7a9\") " pod="openstack/barbican-db-create-msskz" Nov 29 05:44:08 crc kubenswrapper[4594]: I1129 05:44:08.646241 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2wg8\" (UniqueName: \"kubernetes.io/projected/111c210f-a9a4-46ea-93ce-e91bae7ef7a9-kube-api-access-d2wg8\") pod \"barbican-db-create-msskz\" (UID: \"111c210f-a9a4-46ea-93ce-e91bae7ef7a9\") " pod="openstack/barbican-db-create-msskz" Nov 29 05:44:08 crc kubenswrapper[4594]: I1129 05:44:08.650377 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-d3a2-account-create-update-qmq8q"] Nov 29 05:44:08 crc kubenswrapper[4594]: I1129 05:44:08.651452 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-d3a2-account-create-update-qmq8q" Nov 29 05:44:08 crc kubenswrapper[4594]: I1129 05:44:08.659796 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-5lm4l"] Nov 29 05:44:08 crc kubenswrapper[4594]: I1129 05:44:08.661099 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-5lm4l" Nov 29 05:44:08 crc kubenswrapper[4594]: I1129 05:44:08.667732 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Nov 29 05:44:08 crc kubenswrapper[4594]: I1129 05:44:08.677838 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-d3a2-account-create-update-qmq8q"] Nov 29 05:44:08 crc kubenswrapper[4594]: I1129 05:44:08.686356 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-5lm4l"] Nov 29 05:44:08 crc kubenswrapper[4594]: I1129 05:44:08.707828 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-db-sync-bg92f"] Nov 29 05:44:08 crc kubenswrapper[4594]: I1129 05:44:08.722825 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-bg92f" Nov 29 05:44:08 crc kubenswrapper[4594]: I1129 05:44:08.726291 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-config-data" Nov 29 05:44:08 crc kubenswrapper[4594]: I1129 05:44:08.731048 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-watcher-dockercfg-xzztn" Nov 29 05:44:08 crc kubenswrapper[4594]: I1129 05:44:08.746057 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-sync-bg92f"] Nov 29 05:44:08 crc kubenswrapper[4594]: I1129 05:44:08.751594 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a4a0787-c2fa-43c0-9c7e-98cbdb79134b-operator-scripts\") pod \"cinder-db-create-5lm4l\" (UID: \"8a4a0787-c2fa-43c0-9c7e-98cbdb79134b\") " pod="openstack/cinder-db-create-5lm4l" Nov 29 05:44:08 crc kubenswrapper[4594]: I1129 05:44:08.751645 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2wg8\" (UniqueName: 
\"kubernetes.io/projected/111c210f-a9a4-46ea-93ce-e91bae7ef7a9-kube-api-access-d2wg8\") pod \"barbican-db-create-msskz\" (UID: \"111c210f-a9a4-46ea-93ce-e91bae7ef7a9\") " pod="openstack/barbican-db-create-msskz" Nov 29 05:44:08 crc kubenswrapper[4594]: I1129 05:44:08.751688 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xt5wc\" (UniqueName: \"kubernetes.io/projected/11684277-b949-4e98-8e76-e0780cfb4d32-kube-api-access-xt5wc\") pod \"barbican-d3a2-account-create-update-qmq8q\" (UID: \"11684277-b949-4e98-8e76-e0780cfb4d32\") " pod="openstack/barbican-d3a2-account-create-update-qmq8q" Nov 29 05:44:08 crc kubenswrapper[4594]: I1129 05:44:08.751719 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jw2j2\" (UniqueName: \"kubernetes.io/projected/7474ba72-1534-4193-9590-e46dfc840403-kube-api-access-jw2j2\") pod \"watcher-db-sync-bg92f\" (UID: \"7474ba72-1534-4193-9590-e46dfc840403\") " pod="openstack/watcher-db-sync-bg92f" Nov 29 05:44:08 crc kubenswrapper[4594]: I1129 05:44:08.751748 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7474ba72-1534-4193-9590-e46dfc840403-combined-ca-bundle\") pod \"watcher-db-sync-bg92f\" (UID: \"7474ba72-1534-4193-9590-e46dfc840403\") " pod="openstack/watcher-db-sync-bg92f" Nov 29 05:44:08 crc kubenswrapper[4594]: I1129 05:44:08.751783 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7474ba72-1534-4193-9590-e46dfc840403-db-sync-config-data\") pod \"watcher-db-sync-bg92f\" (UID: \"7474ba72-1534-4193-9590-e46dfc840403\") " pod="openstack/watcher-db-sync-bg92f" Nov 29 05:44:08 crc kubenswrapper[4594]: I1129 05:44:08.751840 4594 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11684277-b949-4e98-8e76-e0780cfb4d32-operator-scripts\") pod \"barbican-d3a2-account-create-update-qmq8q\" (UID: \"11684277-b949-4e98-8e76-e0780cfb4d32\") " pod="openstack/barbican-d3a2-account-create-update-qmq8q" Nov 29 05:44:08 crc kubenswrapper[4594]: I1129 05:44:08.751899 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7474ba72-1534-4193-9590-e46dfc840403-config-data\") pod \"watcher-db-sync-bg92f\" (UID: \"7474ba72-1534-4193-9590-e46dfc840403\") " pod="openstack/watcher-db-sync-bg92f" Nov 29 05:44:08 crc kubenswrapper[4594]: I1129 05:44:08.751921 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/111c210f-a9a4-46ea-93ce-e91bae7ef7a9-operator-scripts\") pod \"barbican-db-create-msskz\" (UID: \"111c210f-a9a4-46ea-93ce-e91bae7ef7a9\") " pod="openstack/barbican-db-create-msskz" Nov 29 05:44:08 crc kubenswrapper[4594]: I1129 05:44:08.751965 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8gjs\" (UniqueName: \"kubernetes.io/projected/8a4a0787-c2fa-43c0-9c7e-98cbdb79134b-kube-api-access-g8gjs\") pod \"cinder-db-create-5lm4l\" (UID: \"8a4a0787-c2fa-43c0-9c7e-98cbdb79134b\") " pod="openstack/cinder-db-create-5lm4l" Nov 29 05:44:08 crc kubenswrapper[4594]: I1129 05:44:08.756804 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/111c210f-a9a4-46ea-93ce-e91bae7ef7a9-operator-scripts\") pod \"barbican-db-create-msskz\" (UID: \"111c210f-a9a4-46ea-93ce-e91bae7ef7a9\") " pod="openstack/barbican-db-create-msskz" Nov 29 05:44:08 crc kubenswrapper[4594]: I1129 05:44:08.787927 4594 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-d2wg8\" (UniqueName: \"kubernetes.io/projected/111c210f-a9a4-46ea-93ce-e91bae7ef7a9-kube-api-access-d2wg8\") pod \"barbican-db-create-msskz\" (UID: \"111c210f-a9a4-46ea-93ce-e91bae7ef7a9\") " pod="openstack/barbican-db-create-msskz" Nov 29 05:44:08 crc kubenswrapper[4594]: I1129 05:44:08.829688 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-67zkn"] Nov 29 05:44:08 crc kubenswrapper[4594]: I1129 05:44:08.831211 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-67zkn" Nov 29 05:44:08 crc kubenswrapper[4594]: I1129 05:44:08.833701 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Nov 29 05:44:08 crc kubenswrapper[4594]: I1129 05:44:08.835998 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Nov 29 05:44:08 crc kubenswrapper[4594]: I1129 05:44:08.836212 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Nov 29 05:44:08 crc kubenswrapper[4594]: I1129 05:44:08.836387 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-nz5kq" Nov 29 05:44:08 crc kubenswrapper[4594]: I1129 05:44:08.847431 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-67zkn"] Nov 29 05:44:08 crc kubenswrapper[4594]: I1129 05:44:08.857656 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7474ba72-1534-4193-9590-e46dfc840403-db-sync-config-data\") pod \"watcher-db-sync-bg92f\" (UID: \"7474ba72-1534-4193-9590-e46dfc840403\") " pod="openstack/watcher-db-sync-bg92f" Nov 29 05:44:08 crc kubenswrapper[4594]: I1129 05:44:08.857711 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/2382e225-37e5-4fa9-97df-0d581b97dd01-config-data\") pod \"keystone-db-sync-67zkn\" (UID: \"2382e225-37e5-4fa9-97df-0d581b97dd01\") " pod="openstack/keystone-db-sync-67zkn" Nov 29 05:44:08 crc kubenswrapper[4594]: I1129 05:44:08.857746 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11684277-b949-4e98-8e76-e0780cfb4d32-operator-scripts\") pod \"barbican-d3a2-account-create-update-qmq8q\" (UID: \"11684277-b949-4e98-8e76-e0780cfb4d32\") " pod="openstack/barbican-d3a2-account-create-update-qmq8q" Nov 29 05:44:08 crc kubenswrapper[4594]: I1129 05:44:08.857792 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7474ba72-1534-4193-9590-e46dfc840403-config-data\") pod \"watcher-db-sync-bg92f\" (UID: \"7474ba72-1534-4193-9590-e46dfc840403\") " pod="openstack/watcher-db-sync-bg92f" Nov 29 05:44:08 crc kubenswrapper[4594]: I1129 05:44:08.857825 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8gjs\" (UniqueName: \"kubernetes.io/projected/8a4a0787-c2fa-43c0-9c7e-98cbdb79134b-kube-api-access-g8gjs\") pod \"cinder-db-create-5lm4l\" (UID: \"8a4a0787-c2fa-43c0-9c7e-98cbdb79134b\") " pod="openstack/cinder-db-create-5lm4l" Nov 29 05:44:08 crc kubenswrapper[4594]: I1129 05:44:08.857857 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a4a0787-c2fa-43c0-9c7e-98cbdb79134b-operator-scripts\") pod \"cinder-db-create-5lm4l\" (UID: \"8a4a0787-c2fa-43c0-9c7e-98cbdb79134b\") " pod="openstack/cinder-db-create-5lm4l" Nov 29 05:44:08 crc kubenswrapper[4594]: I1129 05:44:08.857894 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78qks\" (UniqueName: 
\"kubernetes.io/projected/2382e225-37e5-4fa9-97df-0d581b97dd01-kube-api-access-78qks\") pod \"keystone-db-sync-67zkn\" (UID: \"2382e225-37e5-4fa9-97df-0d581b97dd01\") " pod="openstack/keystone-db-sync-67zkn" Nov 29 05:44:08 crc kubenswrapper[4594]: I1129 05:44:08.857912 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xt5wc\" (UniqueName: \"kubernetes.io/projected/11684277-b949-4e98-8e76-e0780cfb4d32-kube-api-access-xt5wc\") pod \"barbican-d3a2-account-create-update-qmq8q\" (UID: \"11684277-b949-4e98-8e76-e0780cfb4d32\") " pod="openstack/barbican-d3a2-account-create-update-qmq8q" Nov 29 05:44:08 crc kubenswrapper[4594]: I1129 05:44:08.857937 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jw2j2\" (UniqueName: \"kubernetes.io/projected/7474ba72-1534-4193-9590-e46dfc840403-kube-api-access-jw2j2\") pod \"watcher-db-sync-bg92f\" (UID: \"7474ba72-1534-4193-9590-e46dfc840403\") " pod="openstack/watcher-db-sync-bg92f" Nov 29 05:44:08 crc kubenswrapper[4594]: I1129 05:44:08.857953 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2382e225-37e5-4fa9-97df-0d581b97dd01-combined-ca-bundle\") pod \"keystone-db-sync-67zkn\" (UID: \"2382e225-37e5-4fa9-97df-0d581b97dd01\") " pod="openstack/keystone-db-sync-67zkn" Nov 29 05:44:08 crc kubenswrapper[4594]: I1129 05:44:08.857974 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7474ba72-1534-4193-9590-e46dfc840403-combined-ca-bundle\") pod \"watcher-db-sync-bg92f\" (UID: \"7474ba72-1534-4193-9590-e46dfc840403\") " pod="openstack/watcher-db-sync-bg92f" Nov 29 05:44:08 crc kubenswrapper[4594]: I1129 05:44:08.859977 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/8a4a0787-c2fa-43c0-9c7e-98cbdb79134b-operator-scripts\") pod \"cinder-db-create-5lm4l\" (UID: \"8a4a0787-c2fa-43c0-9c7e-98cbdb79134b\") " pod="openstack/cinder-db-create-5lm4l" Nov 29 05:44:08 crc kubenswrapper[4594]: I1129 05:44:08.861020 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11684277-b949-4e98-8e76-e0780cfb4d32-operator-scripts\") pod \"barbican-d3a2-account-create-update-qmq8q\" (UID: \"11684277-b949-4e98-8e76-e0780cfb4d32\") " pod="openstack/barbican-d3a2-account-create-update-qmq8q" Nov 29 05:44:08 crc kubenswrapper[4594]: I1129 05:44:08.861541 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7474ba72-1534-4193-9590-e46dfc840403-combined-ca-bundle\") pod \"watcher-db-sync-bg92f\" (UID: \"7474ba72-1534-4193-9590-e46dfc840403\") " pod="openstack/watcher-db-sync-bg92f" Nov 29 05:44:08 crc kubenswrapper[4594]: I1129 05:44:08.864205 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7474ba72-1534-4193-9590-e46dfc840403-db-sync-config-data\") pod \"watcher-db-sync-bg92f\" (UID: \"7474ba72-1534-4193-9590-e46dfc840403\") " pod="openstack/watcher-db-sync-bg92f" Nov 29 05:44:08 crc kubenswrapper[4594]: I1129 05:44:08.867784 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7474ba72-1534-4193-9590-e46dfc840403-config-data\") pod \"watcher-db-sync-bg92f\" (UID: \"7474ba72-1534-4193-9590-e46dfc840403\") " pod="openstack/watcher-db-sync-bg92f" Nov 29 05:44:08 crc kubenswrapper[4594]: I1129 05:44:08.869094 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-msskz" Nov 29 05:44:08 crc kubenswrapper[4594]: I1129 05:44:08.874909 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-b5e5-account-create-update-tqj9s"] Nov 29 05:44:08 crc kubenswrapper[4594]: I1129 05:44:08.876380 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-b5e5-account-create-update-tqj9s" Nov 29 05:44:08 crc kubenswrapper[4594]: I1129 05:44:08.880851 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Nov 29 05:44:08 crc kubenswrapper[4594]: I1129 05:44:08.881204 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8gjs\" (UniqueName: \"kubernetes.io/projected/8a4a0787-c2fa-43c0-9c7e-98cbdb79134b-kube-api-access-g8gjs\") pod \"cinder-db-create-5lm4l\" (UID: \"8a4a0787-c2fa-43c0-9c7e-98cbdb79134b\") " pod="openstack/cinder-db-create-5lm4l" Nov 29 05:44:08 crc kubenswrapper[4594]: I1129 05:44:08.881314 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-b5e5-account-create-update-tqj9s"] Nov 29 05:44:08 crc kubenswrapper[4594]: I1129 05:44:08.887939 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jw2j2\" (UniqueName: \"kubernetes.io/projected/7474ba72-1534-4193-9590-e46dfc840403-kube-api-access-jw2j2\") pod \"watcher-db-sync-bg92f\" (UID: \"7474ba72-1534-4193-9590-e46dfc840403\") " pod="openstack/watcher-db-sync-bg92f" Nov 29 05:44:08 crc kubenswrapper[4594]: I1129 05:44:08.888586 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xt5wc\" (UniqueName: \"kubernetes.io/projected/11684277-b949-4e98-8e76-e0780cfb4d32-kube-api-access-xt5wc\") pod \"barbican-d3a2-account-create-update-qmq8q\" (UID: \"11684277-b949-4e98-8e76-e0780cfb4d32\") " pod="openstack/barbican-d3a2-account-create-update-qmq8q" Nov 29 05:44:08 crc kubenswrapper[4594]: 
I1129 05:44:08.956474 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-2r9dc"] Nov 29 05:44:08 crc kubenswrapper[4594]: I1129 05:44:08.958049 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-2r9dc" Nov 29 05:44:08 crc kubenswrapper[4594]: I1129 05:44:08.969340 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-2r9dc"] Nov 29 05:44:08 crc kubenswrapper[4594]: I1129 05:44:08.970960 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtswl\" (UniqueName: \"kubernetes.io/projected/deb711bc-c899-4a09-b3f3-c1d256da7188-kube-api-access-rtswl\") pod \"cinder-b5e5-account-create-update-tqj9s\" (UID: \"deb711bc-c899-4a09-b3f3-c1d256da7188\") " pod="openstack/cinder-b5e5-account-create-update-tqj9s" Nov 29 05:44:08 crc kubenswrapper[4594]: I1129 05:44:08.971137 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78qks\" (UniqueName: \"kubernetes.io/projected/2382e225-37e5-4fa9-97df-0d581b97dd01-kube-api-access-78qks\") pod \"keystone-db-sync-67zkn\" (UID: \"2382e225-37e5-4fa9-97df-0d581b97dd01\") " pod="openstack/keystone-db-sync-67zkn" Nov 29 05:44:08 crc kubenswrapper[4594]: I1129 05:44:08.971205 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2382e225-37e5-4fa9-97df-0d581b97dd01-combined-ca-bundle\") pod \"keystone-db-sync-67zkn\" (UID: \"2382e225-37e5-4fa9-97df-0d581b97dd01\") " pod="openstack/keystone-db-sync-67zkn" Nov 29 05:44:08 crc kubenswrapper[4594]: I1129 05:44:08.971224 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/deb711bc-c899-4a09-b3f3-c1d256da7188-operator-scripts\") pod \"cinder-b5e5-account-create-update-tqj9s\" 
(UID: \"deb711bc-c899-4a09-b3f3-c1d256da7188\") " pod="openstack/cinder-b5e5-account-create-update-tqj9s" Nov 29 05:44:08 crc kubenswrapper[4594]: I1129 05:44:08.971358 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2382e225-37e5-4fa9-97df-0d581b97dd01-config-data\") pod \"keystone-db-sync-67zkn\" (UID: \"2382e225-37e5-4fa9-97df-0d581b97dd01\") " pod="openstack/keystone-db-sync-67zkn" Nov 29 05:44:08 crc kubenswrapper[4594]: I1129 05:44:08.972204 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-d3a2-account-create-update-qmq8q" Nov 29 05:44:08 crc kubenswrapper[4594]: I1129 05:44:08.979536 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-5lm4l" Nov 29 05:44:08 crc kubenswrapper[4594]: I1129 05:44:08.985461 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2382e225-37e5-4fa9-97df-0d581b97dd01-config-data\") pod \"keystone-db-sync-67zkn\" (UID: \"2382e225-37e5-4fa9-97df-0d581b97dd01\") " pod="openstack/keystone-db-sync-67zkn" Nov 29 05:44:08 crc kubenswrapper[4594]: I1129 05:44:08.989776 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78qks\" (UniqueName: \"kubernetes.io/projected/2382e225-37e5-4fa9-97df-0d581b97dd01-kube-api-access-78qks\") pod \"keystone-db-sync-67zkn\" (UID: \"2382e225-37e5-4fa9-97df-0d581b97dd01\") " pod="openstack/keystone-db-sync-67zkn" Nov 29 05:44:08 crc kubenswrapper[4594]: I1129 05:44:08.993898 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2382e225-37e5-4fa9-97df-0d581b97dd01-combined-ca-bundle\") pod \"keystone-db-sync-67zkn\" (UID: \"2382e225-37e5-4fa9-97df-0d581b97dd01\") " pod="openstack/keystone-db-sync-67zkn" Nov 29 05:44:09 crc 
kubenswrapper[4594]: I1129 05:44:09.050529 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-bg92f" Nov 29 05:44:09 crc kubenswrapper[4594]: I1129 05:44:09.069419 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-90ef-account-create-update-lmtdj"] Nov 29 05:44:09 crc kubenswrapper[4594]: I1129 05:44:09.070903 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-90ef-account-create-update-lmtdj" Nov 29 05:44:09 crc kubenswrapper[4594]: I1129 05:44:09.078646 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/deb711bc-c899-4a09-b3f3-c1d256da7188-operator-scripts\") pod \"cinder-b5e5-account-create-update-tqj9s\" (UID: \"deb711bc-c899-4a09-b3f3-c1d256da7188\") " pod="openstack/cinder-b5e5-account-create-update-tqj9s" Nov 29 05:44:09 crc kubenswrapper[4594]: I1129 05:44:09.078727 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twzgf\" (UniqueName: \"kubernetes.io/projected/ee64b887-a184-43e5-85ad-df3e76919044-kube-api-access-twzgf\") pod \"neutron-db-create-2r9dc\" (UID: \"ee64b887-a184-43e5-85ad-df3e76919044\") " pod="openstack/neutron-db-create-2r9dc" Nov 29 05:44:09 crc kubenswrapper[4594]: I1129 05:44:09.079480 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Nov 29 05:44:09 crc kubenswrapper[4594]: I1129 05:44:09.080546 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/deb711bc-c899-4a09-b3f3-c1d256da7188-operator-scripts\") pod \"cinder-b5e5-account-create-update-tqj9s\" (UID: \"deb711bc-c899-4a09-b3f3-c1d256da7188\") " pod="openstack/cinder-b5e5-account-create-update-tqj9s" Nov 29 05:44:09 crc kubenswrapper[4594]: I1129 05:44:09.083394 4594 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ee64b887-a184-43e5-85ad-df3e76919044-operator-scripts\") pod \"neutron-db-create-2r9dc\" (UID: \"ee64b887-a184-43e5-85ad-df3e76919044\") " pod="openstack/neutron-db-create-2r9dc" Nov 29 05:44:09 crc kubenswrapper[4594]: I1129 05:44:09.083441 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtswl\" (UniqueName: \"kubernetes.io/projected/deb711bc-c899-4a09-b3f3-c1d256da7188-kube-api-access-rtswl\") pod \"cinder-b5e5-account-create-update-tqj9s\" (UID: \"deb711bc-c899-4a09-b3f3-c1d256da7188\") " pod="openstack/cinder-b5e5-account-create-update-tqj9s" Nov 29 05:44:09 crc kubenswrapper[4594]: I1129 05:44:09.095511 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-90ef-account-create-update-lmtdj"] Nov 29 05:44:09 crc kubenswrapper[4594]: I1129 05:44:09.106476 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtswl\" (UniqueName: \"kubernetes.io/projected/deb711bc-c899-4a09-b3f3-c1d256da7188-kube-api-access-rtswl\") pod \"cinder-b5e5-account-create-update-tqj9s\" (UID: \"deb711bc-c899-4a09-b3f3-c1d256da7188\") " pod="openstack/cinder-b5e5-account-create-update-tqj9s" Nov 29 05:44:09 crc kubenswrapper[4594]: I1129 05:44:09.150751 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-67zkn" Nov 29 05:44:09 crc kubenswrapper[4594]: I1129 05:44:09.186873 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6jvp\" (UniqueName: \"kubernetes.io/projected/2e1365bf-4433-4641-910f-0d7a92be4fc1-kube-api-access-x6jvp\") pod \"neutron-90ef-account-create-update-lmtdj\" (UID: \"2e1365bf-4433-4641-910f-0d7a92be4fc1\") " pod="openstack/neutron-90ef-account-create-update-lmtdj" Nov 29 05:44:09 crc kubenswrapper[4594]: I1129 05:44:09.187046 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twzgf\" (UniqueName: \"kubernetes.io/projected/ee64b887-a184-43e5-85ad-df3e76919044-kube-api-access-twzgf\") pod \"neutron-db-create-2r9dc\" (UID: \"ee64b887-a184-43e5-85ad-df3e76919044\") " pod="openstack/neutron-db-create-2r9dc" Nov 29 05:44:09 crc kubenswrapper[4594]: I1129 05:44:09.187334 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2e1365bf-4433-4641-910f-0d7a92be4fc1-operator-scripts\") pod \"neutron-90ef-account-create-update-lmtdj\" (UID: \"2e1365bf-4433-4641-910f-0d7a92be4fc1\") " pod="openstack/neutron-90ef-account-create-update-lmtdj" Nov 29 05:44:09 crc kubenswrapper[4594]: I1129 05:44:09.187367 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ee64b887-a184-43e5-85ad-df3e76919044-operator-scripts\") pod \"neutron-db-create-2r9dc\" (UID: \"ee64b887-a184-43e5-85ad-df3e76919044\") " pod="openstack/neutron-db-create-2r9dc" Nov 29 05:44:09 crc kubenswrapper[4594]: I1129 05:44:09.188138 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ee64b887-a184-43e5-85ad-df3e76919044-operator-scripts\") pod 
\"neutron-db-create-2r9dc\" (UID: \"ee64b887-a184-43e5-85ad-df3e76919044\") " pod="openstack/neutron-db-create-2r9dc" Nov 29 05:44:09 crc kubenswrapper[4594]: I1129 05:44:09.208894 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twzgf\" (UniqueName: \"kubernetes.io/projected/ee64b887-a184-43e5-85ad-df3e76919044-kube-api-access-twzgf\") pod \"neutron-db-create-2r9dc\" (UID: \"ee64b887-a184-43e5-85ad-df3e76919044\") " pod="openstack/neutron-db-create-2r9dc" Nov 29 05:44:09 crc kubenswrapper[4594]: I1129 05:44:09.259588 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-b5e5-account-create-update-tqj9s" Nov 29 05:44:09 crc kubenswrapper[4594]: I1129 05:44:09.271089 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-2r9dc" Nov 29 05:44:09 crc kubenswrapper[4594]: I1129 05:44:09.289375 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2e1365bf-4433-4641-910f-0d7a92be4fc1-operator-scripts\") pod \"neutron-90ef-account-create-update-lmtdj\" (UID: \"2e1365bf-4433-4641-910f-0d7a92be4fc1\") " pod="openstack/neutron-90ef-account-create-update-lmtdj" Nov 29 05:44:09 crc kubenswrapper[4594]: I1129 05:44:09.289444 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6jvp\" (UniqueName: \"kubernetes.io/projected/2e1365bf-4433-4641-910f-0d7a92be4fc1-kube-api-access-x6jvp\") pod \"neutron-90ef-account-create-update-lmtdj\" (UID: \"2e1365bf-4433-4641-910f-0d7a92be4fc1\") " pod="openstack/neutron-90ef-account-create-update-lmtdj" Nov 29 05:44:09 crc kubenswrapper[4594]: I1129 05:44:09.290657 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2e1365bf-4433-4641-910f-0d7a92be4fc1-operator-scripts\") pod 
\"neutron-90ef-account-create-update-lmtdj\" (UID: \"2e1365bf-4433-4641-910f-0d7a92be4fc1\") " pod="openstack/neutron-90ef-account-create-update-lmtdj" Nov 29 05:44:09 crc kubenswrapper[4594]: I1129 05:44:09.304543 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6jvp\" (UniqueName: \"kubernetes.io/projected/2e1365bf-4433-4641-910f-0d7a92be4fc1-kube-api-access-x6jvp\") pod \"neutron-90ef-account-create-update-lmtdj\" (UID: \"2e1365bf-4433-4641-910f-0d7a92be4fc1\") " pod="openstack/neutron-90ef-account-create-update-lmtdj" Nov 29 05:44:09 crc kubenswrapper[4594]: I1129 05:44:09.403948 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-90ef-account-create-update-lmtdj" Nov 29 05:44:09 crc kubenswrapper[4594]: I1129 05:44:09.409133 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-msskz"] Nov 29 05:44:09 crc kubenswrapper[4594]: I1129 05:44:09.582320 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-d3a2-account-create-update-qmq8q"] Nov 29 05:44:09 crc kubenswrapper[4594]: W1129 05:44:09.583947 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod11684277_b949_4e98_8e76_e0780cfb4d32.slice/crio-f387b0cf1a2257fb8abc7ea52b9cae9f7ae3d8659ee54c15d503be3d632a05ab WatchSource:0}: Error finding container f387b0cf1a2257fb8abc7ea52b9cae9f7ae3d8659ee54c15d503be3d632a05ab: Status 404 returned error can't find the container with id f387b0cf1a2257fb8abc7ea52b9cae9f7ae3d8659ee54c15d503be3d632a05ab Nov 29 05:44:09 crc kubenswrapper[4594]: I1129 05:44:09.591684 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-5lm4l"] Nov 29 05:44:09 crc kubenswrapper[4594]: I1129 05:44:09.616109 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-sync-bg92f"] Nov 29 05:44:09 crc 
kubenswrapper[4594]: I1129 05:44:09.800121 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-67zkn"] Nov 29 05:44:09 crc kubenswrapper[4594]: W1129 05:44:09.827950 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2382e225_37e5_4fa9_97df_0d581b97dd01.slice/crio-7ca7f5ee5a5c1c516870a54b4562e5fd2911e8d91ac9e87c7342c35717e57447 WatchSource:0}: Error finding container 7ca7f5ee5a5c1c516870a54b4562e5fd2911e8d91ac9e87c7342c35717e57447: Status 404 returned error can't find the container with id 7ca7f5ee5a5c1c516870a54b4562e5fd2911e8d91ac9e87c7342c35717e57447 Nov 29 05:44:09 crc kubenswrapper[4594]: I1129 05:44:09.891869 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-b5e5-account-create-update-tqj9s"] Nov 29 05:44:09 crc kubenswrapper[4594]: I1129 05:44:09.901599 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-2r9dc"] Nov 29 05:44:09 crc kubenswrapper[4594]: W1129 05:44:09.961540 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddeb711bc_c899_4a09_b3f3_c1d256da7188.slice/crio-2c02e3209feebb3a5b57d744091e61856b5140bf956b85dd484eabcb576c3fc9 WatchSource:0}: Error finding container 2c02e3209feebb3a5b57d744091e61856b5140bf956b85dd484eabcb576c3fc9: Status 404 returned error can't find the container with id 2c02e3209feebb3a5b57d744091e61856b5140bf956b85dd484eabcb576c3fc9 Nov 29 05:44:09 crc kubenswrapper[4594]: W1129 05:44:09.961883 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podee64b887_a184_43e5_85ad_df3e76919044.slice/crio-b60ddeef2574e1646ecfb9708b1133ed95aece3e64a9741a41558f7701d29df0 WatchSource:0}: Error finding container b60ddeef2574e1646ecfb9708b1133ed95aece3e64a9741a41558f7701d29df0: Status 404 returned error 
can't find the container with id b60ddeef2574e1646ecfb9708b1133ed95aece3e64a9741a41558f7701d29df0 Nov 29 05:44:10 crc kubenswrapper[4594]: I1129 05:44:10.013230 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-d3a2-account-create-update-qmq8q" event={"ID":"11684277-b949-4e98-8e76-e0780cfb4d32","Type":"ContainerStarted","Data":"8f5ad23e2667ade8b9fce594fbe7e024ea939d2657932f8a218eb38cb301ee46"} Nov 29 05:44:10 crc kubenswrapper[4594]: I1129 05:44:10.013301 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-d3a2-account-create-update-qmq8q" event={"ID":"11684277-b949-4e98-8e76-e0780cfb4d32","Type":"ContainerStarted","Data":"f387b0cf1a2257fb8abc7ea52b9cae9f7ae3d8659ee54c15d503be3d632a05ab"} Nov 29 05:44:10 crc kubenswrapper[4594]: I1129 05:44:10.018140 4594 generic.go:334] "Generic (PLEG): container finished" podID="111c210f-a9a4-46ea-93ce-e91bae7ef7a9" containerID="07b381f8ed848bd7bd94f032034b8bfeafd332fe758a3e846ed44b735092d57c" exitCode=0 Nov 29 05:44:10 crc kubenswrapper[4594]: I1129 05:44:10.018190 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-msskz" event={"ID":"111c210f-a9a4-46ea-93ce-e91bae7ef7a9","Type":"ContainerDied","Data":"07b381f8ed848bd7bd94f032034b8bfeafd332fe758a3e846ed44b735092d57c"} Nov 29 05:44:10 crc kubenswrapper[4594]: I1129 05:44:10.018209 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-msskz" event={"ID":"111c210f-a9a4-46ea-93ce-e91bae7ef7a9","Type":"ContainerStarted","Data":"12d3cacc668b0e014690df7f67f3adad6e5755f78be3f4e9571250817a6122d8"} Nov 29 05:44:10 crc kubenswrapper[4594]: I1129 05:44:10.019379 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-bg92f" event={"ID":"7474ba72-1534-4193-9590-e46dfc840403","Type":"ContainerStarted","Data":"f90d6b0a8b94b2068be08d2eb73761e0cb0ecf7030b282151f887d69440e461f"} Nov 29 05:44:10 crc kubenswrapper[4594]: I1129 05:44:10.020705 
4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-5lm4l" event={"ID":"8a4a0787-c2fa-43c0-9c7e-98cbdb79134b","Type":"ContainerStarted","Data":"d3464520115e406ceb8d1ec77ff748787aba518899526e3b001a33e014afc141"} Nov 29 05:44:10 crc kubenswrapper[4594]: I1129 05:44:10.020728 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-5lm4l" event={"ID":"8a4a0787-c2fa-43c0-9c7e-98cbdb79134b","Type":"ContainerStarted","Data":"d283b8efa021aefce82963db2da60e6c7fe5082996bdb937c8ab653d8d8ca00a"} Nov 29 05:44:10 crc kubenswrapper[4594]: I1129 05:44:10.028243 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-67zkn" event={"ID":"2382e225-37e5-4fa9-97df-0d581b97dd01","Type":"ContainerStarted","Data":"7ca7f5ee5a5c1c516870a54b4562e5fd2911e8d91ac9e87c7342c35717e57447"} Nov 29 05:44:10 crc kubenswrapper[4594]: I1129 05:44:10.029724 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-2r9dc" event={"ID":"ee64b887-a184-43e5-85ad-df3e76919044","Type":"ContainerStarted","Data":"b60ddeef2574e1646ecfb9708b1133ed95aece3e64a9741a41558f7701d29df0"} Nov 29 05:44:10 crc kubenswrapper[4594]: I1129 05:44:10.032356 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b5e5-account-create-update-tqj9s" event={"ID":"deb711bc-c899-4a09-b3f3-c1d256da7188","Type":"ContainerStarted","Data":"2c02e3209feebb3a5b57d744091e61856b5140bf956b85dd484eabcb576c3fc9"} Nov 29 05:44:10 crc kubenswrapper[4594]: I1129 05:44:10.050135 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-d3a2-account-create-update-qmq8q" podStartSLOduration=2.050121535 podStartE2EDuration="2.050121535s" podCreationTimestamp="2025-11-29 05:44:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:44:10.036470495 +0000 UTC m=+974.276979715" 
watchObservedRunningTime="2025-11-29 05:44:10.050121535 +0000 UTC m=+974.290630755" Nov 29 05:44:10 crc kubenswrapper[4594]: I1129 05:44:10.063103 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-5lm4l" podStartSLOduration=2.063082018 podStartE2EDuration="2.063082018s" podCreationTimestamp="2025-11-29 05:44:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:44:10.055604024 +0000 UTC m=+974.296113244" watchObservedRunningTime="2025-11-29 05:44:10.063082018 +0000 UTC m=+974.303591238" Nov 29 05:44:10 crc kubenswrapper[4594]: I1129 05:44:10.073835 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-90ef-account-create-update-lmtdj"] Nov 29 05:44:10 crc kubenswrapper[4594]: W1129 05:44:10.075977 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2e1365bf_4433_4641_910f_0d7a92be4fc1.slice/crio-3f3ba5593f376b85391b2f49b57447a5cde89300f0c9abdd071bf147cd209c39 WatchSource:0}: Error finding container 3f3ba5593f376b85391b2f49b57447a5cde89300f0c9abdd071bf147cd209c39: Status 404 returned error can't find the container with id 3f3ba5593f376b85391b2f49b57447a5cde89300f0c9abdd071bf147cd209c39 Nov 29 05:44:11 crc kubenswrapper[4594]: I1129 05:44:11.050580 4594 generic.go:334] "Generic (PLEG): container finished" podID="deb711bc-c899-4a09-b3f3-c1d256da7188" containerID="bfeedd11925954273a192d98a96f9d06d46fddf7be21d81d23c71f509dca88ee" exitCode=0 Nov 29 05:44:11 crc kubenswrapper[4594]: I1129 05:44:11.051007 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b5e5-account-create-update-tqj9s" event={"ID":"deb711bc-c899-4a09-b3f3-c1d256da7188","Type":"ContainerDied","Data":"bfeedd11925954273a192d98a96f9d06d46fddf7be21d81d23c71f509dca88ee"} Nov 29 05:44:11 crc kubenswrapper[4594]: I1129 
05:44:11.054816 4594 generic.go:334] "Generic (PLEG): container finished" podID="11684277-b949-4e98-8e76-e0780cfb4d32" containerID="8f5ad23e2667ade8b9fce594fbe7e024ea939d2657932f8a218eb38cb301ee46" exitCode=0 Nov 29 05:44:11 crc kubenswrapper[4594]: I1129 05:44:11.054882 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-d3a2-account-create-update-qmq8q" event={"ID":"11684277-b949-4e98-8e76-e0780cfb4d32","Type":"ContainerDied","Data":"8f5ad23e2667ade8b9fce594fbe7e024ea939d2657932f8a218eb38cb301ee46"} Nov 29 05:44:11 crc kubenswrapper[4594]: I1129 05:44:11.058222 4594 generic.go:334] "Generic (PLEG): container finished" podID="8a4a0787-c2fa-43c0-9c7e-98cbdb79134b" containerID="d3464520115e406ceb8d1ec77ff748787aba518899526e3b001a33e014afc141" exitCode=0 Nov 29 05:44:11 crc kubenswrapper[4594]: I1129 05:44:11.058298 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-5lm4l" event={"ID":"8a4a0787-c2fa-43c0-9c7e-98cbdb79134b","Type":"ContainerDied","Data":"d3464520115e406ceb8d1ec77ff748787aba518899526e3b001a33e014afc141"} Nov 29 05:44:11 crc kubenswrapper[4594]: I1129 05:44:11.060287 4594 generic.go:334] "Generic (PLEG): container finished" podID="2e1365bf-4433-4641-910f-0d7a92be4fc1" containerID="118b5f8d8df52bab944cb59fc5a9529b4e4bb11e7b7fbd286616aaf7d6fa5c2b" exitCode=0 Nov 29 05:44:11 crc kubenswrapper[4594]: I1129 05:44:11.060336 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-90ef-account-create-update-lmtdj" event={"ID":"2e1365bf-4433-4641-910f-0d7a92be4fc1","Type":"ContainerDied","Data":"118b5f8d8df52bab944cb59fc5a9529b4e4bb11e7b7fbd286616aaf7d6fa5c2b"} Nov 29 05:44:11 crc kubenswrapper[4594]: I1129 05:44:11.060355 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-90ef-account-create-update-lmtdj" 
event={"ID":"2e1365bf-4433-4641-910f-0d7a92be4fc1","Type":"ContainerStarted","Data":"3f3ba5593f376b85391b2f49b57447a5cde89300f0c9abdd071bf147cd209c39"} Nov 29 05:44:11 crc kubenswrapper[4594]: I1129 05:44:11.067064 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-2r9dc" event={"ID":"ee64b887-a184-43e5-85ad-df3e76919044","Type":"ContainerDied","Data":"785d96da8704d2a5416ed7f48a6dd28009ebc072530965f5458eec8b998ca2ce"} Nov 29 05:44:11 crc kubenswrapper[4594]: I1129 05:44:11.067009 4594 generic.go:334] "Generic (PLEG): container finished" podID="ee64b887-a184-43e5-85ad-df3e76919044" containerID="785d96da8704d2a5416ed7f48a6dd28009ebc072530965f5458eec8b998ca2ce" exitCode=0 Nov 29 05:44:11 crc kubenswrapper[4594]: I1129 05:44:11.429417 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-msskz" Nov 29 05:44:11 crc kubenswrapper[4594]: I1129 05:44:11.574423 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/111c210f-a9a4-46ea-93ce-e91bae7ef7a9-operator-scripts\") pod \"111c210f-a9a4-46ea-93ce-e91bae7ef7a9\" (UID: \"111c210f-a9a4-46ea-93ce-e91bae7ef7a9\") " Nov 29 05:44:11 crc kubenswrapper[4594]: I1129 05:44:11.574772 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2wg8\" (UniqueName: \"kubernetes.io/projected/111c210f-a9a4-46ea-93ce-e91bae7ef7a9-kube-api-access-d2wg8\") pod \"111c210f-a9a4-46ea-93ce-e91bae7ef7a9\" (UID: \"111c210f-a9a4-46ea-93ce-e91bae7ef7a9\") " Nov 29 05:44:11 crc kubenswrapper[4594]: I1129 05:44:11.575558 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/111c210f-a9a4-46ea-93ce-e91bae7ef7a9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "111c210f-a9a4-46ea-93ce-e91bae7ef7a9" (UID: "111c210f-a9a4-46ea-93ce-e91bae7ef7a9"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:44:11 crc kubenswrapper[4594]: I1129 05:44:11.581965 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/111c210f-a9a4-46ea-93ce-e91bae7ef7a9-kube-api-access-d2wg8" (OuterVolumeSpecName: "kube-api-access-d2wg8") pod "111c210f-a9a4-46ea-93ce-e91bae7ef7a9" (UID: "111c210f-a9a4-46ea-93ce-e91bae7ef7a9"). InnerVolumeSpecName "kube-api-access-d2wg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:44:11 crc kubenswrapper[4594]: I1129 05:44:11.676777 4594 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/111c210f-a9a4-46ea-93ce-e91bae7ef7a9-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 05:44:11 crc kubenswrapper[4594]: I1129 05:44:11.676808 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d2wg8\" (UniqueName: \"kubernetes.io/projected/111c210f-a9a4-46ea-93ce-e91bae7ef7a9-kube-api-access-d2wg8\") on node \"crc\" DevicePath \"\"" Nov 29 05:44:12 crc kubenswrapper[4594]: I1129 05:44:12.082695 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-msskz" Nov 29 05:44:12 crc kubenswrapper[4594]: I1129 05:44:12.083620 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-msskz" event={"ID":"111c210f-a9a4-46ea-93ce-e91bae7ef7a9","Type":"ContainerDied","Data":"12d3cacc668b0e014690df7f67f3adad6e5755f78be3f4e9571250817a6122d8"} Nov 29 05:44:12 crc kubenswrapper[4594]: I1129 05:44:12.083705 4594 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="12d3cacc668b0e014690df7f67f3adad6e5755f78be3f4e9571250817a6122d8" Nov 29 05:44:12 crc kubenswrapper[4594]: I1129 05:44:12.498490 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-d3a2-account-create-update-qmq8q" Nov 29 05:44:12 crc kubenswrapper[4594]: I1129 05:44:12.600921 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11684277-b949-4e98-8e76-e0780cfb4d32-operator-scripts\") pod \"11684277-b949-4e98-8e76-e0780cfb4d32\" (UID: \"11684277-b949-4e98-8e76-e0780cfb4d32\") " Nov 29 05:44:12 crc kubenswrapper[4594]: I1129 05:44:12.601403 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xt5wc\" (UniqueName: \"kubernetes.io/projected/11684277-b949-4e98-8e76-e0780cfb4d32-kube-api-access-xt5wc\") pod \"11684277-b949-4e98-8e76-e0780cfb4d32\" (UID: \"11684277-b949-4e98-8e76-e0780cfb4d32\") " Nov 29 05:44:12 crc kubenswrapper[4594]: I1129 05:44:12.602735 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11684277-b949-4e98-8e76-e0780cfb4d32-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "11684277-b949-4e98-8e76-e0780cfb4d32" (UID: "11684277-b949-4e98-8e76-e0780cfb4d32"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:44:12 crc kubenswrapper[4594]: I1129 05:44:12.606836 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11684277-b949-4e98-8e76-e0780cfb4d32-kube-api-access-xt5wc" (OuterVolumeSpecName: "kube-api-access-xt5wc") pod "11684277-b949-4e98-8e76-e0780cfb4d32" (UID: "11684277-b949-4e98-8e76-e0780cfb4d32"). InnerVolumeSpecName "kube-api-access-xt5wc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:44:12 crc kubenswrapper[4594]: I1129 05:44:12.655413 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-2r9dc" Nov 29 05:44:12 crc kubenswrapper[4594]: I1129 05:44:12.671429 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-5lm4l" Nov 29 05:44:12 crc kubenswrapper[4594]: I1129 05:44:12.676620 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-90ef-account-create-update-lmtdj" Nov 29 05:44:12 crc kubenswrapper[4594]: I1129 05:44:12.694786 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-b5e5-account-create-update-tqj9s" Nov 29 05:44:12 crc kubenswrapper[4594]: I1129 05:44:12.704013 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xt5wc\" (UniqueName: \"kubernetes.io/projected/11684277-b949-4e98-8e76-e0780cfb4d32-kube-api-access-xt5wc\") on node \"crc\" DevicePath \"\"" Nov 29 05:44:12 crc kubenswrapper[4594]: I1129 05:44:12.704097 4594 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11684277-b949-4e98-8e76-e0780cfb4d32-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 05:44:12 crc kubenswrapper[4594]: I1129 05:44:12.805857 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x6jvp\" (UniqueName: \"kubernetes.io/projected/2e1365bf-4433-4641-910f-0d7a92be4fc1-kube-api-access-x6jvp\") pod \"2e1365bf-4433-4641-910f-0d7a92be4fc1\" (UID: \"2e1365bf-4433-4641-910f-0d7a92be4fc1\") " Nov 29 05:44:12 crc kubenswrapper[4594]: I1129 05:44:12.805932 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ee64b887-a184-43e5-85ad-df3e76919044-operator-scripts\") pod \"ee64b887-a184-43e5-85ad-df3e76919044\" (UID: \"ee64b887-a184-43e5-85ad-df3e76919044\") " Nov 29 05:44:12 crc kubenswrapper[4594]: I1129 05:44:12.805957 4594 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twzgf\" (UniqueName: \"kubernetes.io/projected/ee64b887-a184-43e5-85ad-df3e76919044-kube-api-access-twzgf\") pod \"ee64b887-a184-43e5-85ad-df3e76919044\" (UID: \"ee64b887-a184-43e5-85ad-df3e76919044\") " Nov 29 05:44:12 crc kubenswrapper[4594]: I1129 05:44:12.806024 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a4a0787-c2fa-43c0-9c7e-98cbdb79134b-operator-scripts\") pod \"8a4a0787-c2fa-43c0-9c7e-98cbdb79134b\" (UID: \"8a4a0787-c2fa-43c0-9c7e-98cbdb79134b\") " Nov 29 05:44:12 crc kubenswrapper[4594]: I1129 05:44:12.806100 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2e1365bf-4433-4641-910f-0d7a92be4fc1-operator-scripts\") pod \"2e1365bf-4433-4641-910f-0d7a92be4fc1\" (UID: \"2e1365bf-4433-4641-910f-0d7a92be4fc1\") " Nov 29 05:44:12 crc kubenswrapper[4594]: I1129 05:44:12.806163 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g8gjs\" (UniqueName: \"kubernetes.io/projected/8a4a0787-c2fa-43c0-9c7e-98cbdb79134b-kube-api-access-g8gjs\") pod \"8a4a0787-c2fa-43c0-9c7e-98cbdb79134b\" (UID: \"8a4a0787-c2fa-43c0-9c7e-98cbdb79134b\") " Nov 29 05:44:12 crc kubenswrapper[4594]: I1129 05:44:12.806244 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/deb711bc-c899-4a09-b3f3-c1d256da7188-operator-scripts\") pod \"deb711bc-c899-4a09-b3f3-c1d256da7188\" (UID: \"deb711bc-c899-4a09-b3f3-c1d256da7188\") " Nov 29 05:44:12 crc kubenswrapper[4594]: I1129 05:44:12.806302 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rtswl\" (UniqueName: 
\"kubernetes.io/projected/deb711bc-c899-4a09-b3f3-c1d256da7188-kube-api-access-rtswl\") pod \"deb711bc-c899-4a09-b3f3-c1d256da7188\" (UID: \"deb711bc-c899-4a09-b3f3-c1d256da7188\") " Nov 29 05:44:12 crc kubenswrapper[4594]: I1129 05:44:12.807703 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a4a0787-c2fa-43c0-9c7e-98cbdb79134b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8a4a0787-c2fa-43c0-9c7e-98cbdb79134b" (UID: "8a4a0787-c2fa-43c0-9c7e-98cbdb79134b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:44:12 crc kubenswrapper[4594]: I1129 05:44:12.807716 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee64b887-a184-43e5-85ad-df3e76919044-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ee64b887-a184-43e5-85ad-df3e76919044" (UID: "ee64b887-a184-43e5-85ad-df3e76919044"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:44:12 crc kubenswrapper[4594]: I1129 05:44:12.808171 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/deb711bc-c899-4a09-b3f3-c1d256da7188-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "deb711bc-c899-4a09-b3f3-c1d256da7188" (UID: "deb711bc-c899-4a09-b3f3-c1d256da7188"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:44:12 crc kubenswrapper[4594]: I1129 05:44:12.808557 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e1365bf-4433-4641-910f-0d7a92be4fc1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2e1365bf-4433-4641-910f-0d7a92be4fc1" (UID: "2e1365bf-4433-4641-910f-0d7a92be4fc1"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:44:12 crc kubenswrapper[4594]: I1129 05:44:12.811032 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee64b887-a184-43e5-85ad-df3e76919044-kube-api-access-twzgf" (OuterVolumeSpecName: "kube-api-access-twzgf") pod "ee64b887-a184-43e5-85ad-df3e76919044" (UID: "ee64b887-a184-43e5-85ad-df3e76919044"). InnerVolumeSpecName "kube-api-access-twzgf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:44:12 crc kubenswrapper[4594]: I1129 05:44:12.811462 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a4a0787-c2fa-43c0-9c7e-98cbdb79134b-kube-api-access-g8gjs" (OuterVolumeSpecName: "kube-api-access-g8gjs") pod "8a4a0787-c2fa-43c0-9c7e-98cbdb79134b" (UID: "8a4a0787-c2fa-43c0-9c7e-98cbdb79134b"). InnerVolumeSpecName "kube-api-access-g8gjs". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:44:12 crc kubenswrapper[4594]: I1129 05:44:12.815856 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/deb711bc-c899-4a09-b3f3-c1d256da7188-kube-api-access-rtswl" (OuterVolumeSpecName: "kube-api-access-rtswl") pod "deb711bc-c899-4a09-b3f3-c1d256da7188" (UID: "deb711bc-c899-4a09-b3f3-c1d256da7188"). InnerVolumeSpecName "kube-api-access-rtswl". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:44:12 crc kubenswrapper[4594]: I1129 05:44:12.817548 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e1365bf-4433-4641-910f-0d7a92be4fc1-kube-api-access-x6jvp" (OuterVolumeSpecName: "kube-api-access-x6jvp") pod "2e1365bf-4433-4641-910f-0d7a92be4fc1" (UID: "2e1365bf-4433-4641-910f-0d7a92be4fc1"). InnerVolumeSpecName "kube-api-access-x6jvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:44:12 crc kubenswrapper[4594]: I1129 05:44:12.908886 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g8gjs\" (UniqueName: \"kubernetes.io/projected/8a4a0787-c2fa-43c0-9c7e-98cbdb79134b-kube-api-access-g8gjs\") on node \"crc\" DevicePath \"\"" Nov 29 05:44:12 crc kubenswrapper[4594]: I1129 05:44:12.908924 4594 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/deb711bc-c899-4a09-b3f3-c1d256da7188-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 05:44:12 crc kubenswrapper[4594]: I1129 05:44:12.908935 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rtswl\" (UniqueName: \"kubernetes.io/projected/deb711bc-c899-4a09-b3f3-c1d256da7188-kube-api-access-rtswl\") on node \"crc\" DevicePath \"\"" Nov 29 05:44:12 crc kubenswrapper[4594]: I1129 05:44:12.908946 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x6jvp\" (UniqueName: \"kubernetes.io/projected/2e1365bf-4433-4641-910f-0d7a92be4fc1-kube-api-access-x6jvp\") on node \"crc\" DevicePath \"\"" Nov 29 05:44:12 crc kubenswrapper[4594]: I1129 05:44:12.908954 4594 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ee64b887-a184-43e5-85ad-df3e76919044-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 05:44:12 crc kubenswrapper[4594]: I1129 05:44:12.908964 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-twzgf\" (UniqueName: \"kubernetes.io/projected/ee64b887-a184-43e5-85ad-df3e76919044-kube-api-access-twzgf\") on node \"crc\" DevicePath \"\"" Nov 29 05:44:12 crc kubenswrapper[4594]: I1129 05:44:12.908972 4594 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a4a0787-c2fa-43c0-9c7e-98cbdb79134b-operator-scripts\") on node \"crc\" DevicePath \"\"" 
Nov 29 05:44:12 crc kubenswrapper[4594]: I1129 05:44:12.908981 4594 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2e1365bf-4433-4641-910f-0d7a92be4fc1-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 05:44:13 crc kubenswrapper[4594]: I1129 05:44:13.099316 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-d3a2-account-create-update-qmq8q" event={"ID":"11684277-b949-4e98-8e76-e0780cfb4d32","Type":"ContainerDied","Data":"f387b0cf1a2257fb8abc7ea52b9cae9f7ae3d8659ee54c15d503be3d632a05ab"} Nov 29 05:44:13 crc kubenswrapper[4594]: I1129 05:44:13.099360 4594 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f387b0cf1a2257fb8abc7ea52b9cae9f7ae3d8659ee54c15d503be3d632a05ab" Nov 29 05:44:13 crc kubenswrapper[4594]: I1129 05:44:13.099416 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-d3a2-account-create-update-qmq8q" Nov 29 05:44:13 crc kubenswrapper[4594]: I1129 05:44:13.102382 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-5lm4l" event={"ID":"8a4a0787-c2fa-43c0-9c7e-98cbdb79134b","Type":"ContainerDied","Data":"d283b8efa021aefce82963db2da60e6c7fe5082996bdb937c8ab653d8d8ca00a"} Nov 29 05:44:13 crc kubenswrapper[4594]: I1129 05:44:13.102417 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-5lm4l" Nov 29 05:44:13 crc kubenswrapper[4594]: I1129 05:44:13.102429 4594 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d283b8efa021aefce82963db2da60e6c7fe5082996bdb937c8ab653d8d8ca00a" Nov 29 05:44:13 crc kubenswrapper[4594]: I1129 05:44:13.104556 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-90ef-account-create-update-lmtdj" event={"ID":"2e1365bf-4433-4641-910f-0d7a92be4fc1","Type":"ContainerDied","Data":"3f3ba5593f376b85391b2f49b57447a5cde89300f0c9abdd071bf147cd209c39"} Nov 29 05:44:13 crc kubenswrapper[4594]: I1129 05:44:13.104579 4594 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f3ba5593f376b85391b2f49b57447a5cde89300f0c9abdd071bf147cd209c39" Nov 29 05:44:13 crc kubenswrapper[4594]: I1129 05:44:13.104608 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-90ef-account-create-update-lmtdj" Nov 29 05:44:13 crc kubenswrapper[4594]: I1129 05:44:13.106243 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-2r9dc" Nov 29 05:44:13 crc kubenswrapper[4594]: I1129 05:44:13.106232 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-2r9dc" event={"ID":"ee64b887-a184-43e5-85ad-df3e76919044","Type":"ContainerDied","Data":"b60ddeef2574e1646ecfb9708b1133ed95aece3e64a9741a41558f7701d29df0"} Nov 29 05:44:13 crc kubenswrapper[4594]: I1129 05:44:13.106395 4594 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b60ddeef2574e1646ecfb9708b1133ed95aece3e64a9741a41558f7701d29df0" Nov 29 05:44:13 crc kubenswrapper[4594]: I1129 05:44:13.107935 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b5e5-account-create-update-tqj9s" event={"ID":"deb711bc-c899-4a09-b3f3-c1d256da7188","Type":"ContainerDied","Data":"2c02e3209feebb3a5b57d744091e61856b5140bf956b85dd484eabcb576c3fc9"} Nov 29 05:44:13 crc kubenswrapper[4594]: I1129 05:44:13.107955 4594 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c02e3209feebb3a5b57d744091e61856b5140bf956b85dd484eabcb576c3fc9" Nov 29 05:44:13 crc kubenswrapper[4594]: I1129 05:44:13.108025 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-b5e5-account-create-update-tqj9s" Nov 29 05:44:27 crc kubenswrapper[4594]: E1129 05:44:27.223208 4594 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-watcher-api:current" Nov 29 05:44:27 crc kubenswrapper[4594]: E1129 05:44:27.223936 4594 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-watcher-api:current" Nov 29 05:44:27 crc kubenswrapper[4594]: E1129 05:44:27.224128 4594 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:watcher-db-sync,Image:quay.rdoproject.org/podified-master-centos10/openstack-watcher-api:current,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/watcher/watcher.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:watcher-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundl
e.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jw2j2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-db-sync-bg92f_openstack(7474ba72-1534-4193-9590-e46dfc840403): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 29 05:44:27 crc kubenswrapper[4594]: E1129 05:44:27.225391 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/watcher-db-sync-bg92f" podUID="7474ba72-1534-4193-9590-e46dfc840403" Nov 29 05:44:27 crc kubenswrapper[4594]: E1129 05:44:27.283872 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-watcher-api:current\\\"\"" pod="openstack/watcher-db-sync-bg92f" podUID="7474ba72-1534-4193-9590-e46dfc840403" Nov 29 05:44:28 crc kubenswrapper[4594]: I1129 05:44:28.291030 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-ww7dv" 
event={"ID":"98e4a7be-f230-4732-9a53-d258bf31954b","Type":"ContainerStarted","Data":"d30fbe70e618e541993ac133419e99711234f48de67a5c7d4638e5870e351808"} Nov 29 05:44:28 crc kubenswrapper[4594]: I1129 05:44:28.293425 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-67zkn" event={"ID":"2382e225-37e5-4fa9-97df-0d581b97dd01","Type":"ContainerStarted","Data":"71e9d608f105a521b8ad0af0cd233ee11eaf2e5e790481f8a6cb2dda0b9c95a3"} Nov 29 05:44:28 crc kubenswrapper[4594]: I1129 05:44:28.316299 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-ww7dv" podStartSLOduration=2.055297735 podStartE2EDuration="23.316287652s" podCreationTimestamp="2025-11-29 05:44:05 +0000 UTC" firstStartedPulling="2025-11-29 05:44:06.002488644 +0000 UTC m=+970.242997865" lastFinishedPulling="2025-11-29 05:44:27.263478562 +0000 UTC m=+991.503987782" observedRunningTime="2025-11-29 05:44:28.306943554 +0000 UTC m=+992.547452784" watchObservedRunningTime="2025-11-29 05:44:28.316287652 +0000 UTC m=+992.556796872" Nov 29 05:44:28 crc kubenswrapper[4594]: I1129 05:44:28.325454 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-67zkn" podStartSLOduration=2.931094132 podStartE2EDuration="20.325448736s" podCreationTimestamp="2025-11-29 05:44:08 +0000 UTC" firstStartedPulling="2025-11-29 05:44:09.84336995 +0000 UTC m=+974.083879170" lastFinishedPulling="2025-11-29 05:44:27.237724553 +0000 UTC m=+991.478233774" observedRunningTime="2025-11-29 05:44:28.323185278 +0000 UTC m=+992.563694498" watchObservedRunningTime="2025-11-29 05:44:28.325448736 +0000 UTC m=+992.565957956" Nov 29 05:44:30 crc kubenswrapper[4594]: I1129 05:44:30.311029 4594 generic.go:334] "Generic (PLEG): container finished" podID="2382e225-37e5-4fa9-97df-0d581b97dd01" containerID="71e9d608f105a521b8ad0af0cd233ee11eaf2e5e790481f8a6cb2dda0b9c95a3" exitCode=0 Nov 29 05:44:30 crc kubenswrapper[4594]: I1129 
05:44:30.311228 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-67zkn" event={"ID":"2382e225-37e5-4fa9-97df-0d581b97dd01","Type":"ContainerDied","Data":"71e9d608f105a521b8ad0af0cd233ee11eaf2e5e790481f8a6cb2dda0b9c95a3"} Nov 29 05:44:31 crc kubenswrapper[4594]: I1129 05:44:31.624786 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-67zkn" Nov 29 05:44:31 crc kubenswrapper[4594]: I1129 05:44:31.808899 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2382e225-37e5-4fa9-97df-0d581b97dd01-combined-ca-bundle\") pod \"2382e225-37e5-4fa9-97df-0d581b97dd01\" (UID: \"2382e225-37e5-4fa9-97df-0d581b97dd01\") " Nov 29 05:44:31 crc kubenswrapper[4594]: I1129 05:44:31.808955 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78qks\" (UniqueName: \"kubernetes.io/projected/2382e225-37e5-4fa9-97df-0d581b97dd01-kube-api-access-78qks\") pod \"2382e225-37e5-4fa9-97df-0d581b97dd01\" (UID: \"2382e225-37e5-4fa9-97df-0d581b97dd01\") " Nov 29 05:44:31 crc kubenswrapper[4594]: I1129 05:44:31.808990 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2382e225-37e5-4fa9-97df-0d581b97dd01-config-data\") pod \"2382e225-37e5-4fa9-97df-0d581b97dd01\" (UID: \"2382e225-37e5-4fa9-97df-0d581b97dd01\") " Nov 29 05:44:31 crc kubenswrapper[4594]: I1129 05:44:31.825052 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2382e225-37e5-4fa9-97df-0d581b97dd01-kube-api-access-78qks" (OuterVolumeSpecName: "kube-api-access-78qks") pod "2382e225-37e5-4fa9-97df-0d581b97dd01" (UID: "2382e225-37e5-4fa9-97df-0d581b97dd01"). InnerVolumeSpecName "kube-api-access-78qks". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:44:31 crc kubenswrapper[4594]: I1129 05:44:31.830666 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2382e225-37e5-4fa9-97df-0d581b97dd01-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2382e225-37e5-4fa9-97df-0d581b97dd01" (UID: "2382e225-37e5-4fa9-97df-0d581b97dd01"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:44:31 crc kubenswrapper[4594]: I1129 05:44:31.849151 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2382e225-37e5-4fa9-97df-0d581b97dd01-config-data" (OuterVolumeSpecName: "config-data") pod "2382e225-37e5-4fa9-97df-0d581b97dd01" (UID: "2382e225-37e5-4fa9-97df-0d581b97dd01"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:44:31 crc kubenswrapper[4594]: I1129 05:44:31.911483 4594 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2382e225-37e5-4fa9-97df-0d581b97dd01-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 05:44:31 crc kubenswrapper[4594]: I1129 05:44:31.911524 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78qks\" (UniqueName: \"kubernetes.io/projected/2382e225-37e5-4fa9-97df-0d581b97dd01-kube-api-access-78qks\") on node \"crc\" DevicePath \"\"" Nov 29 05:44:31 crc kubenswrapper[4594]: I1129 05:44:31.911540 4594 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2382e225-37e5-4fa9-97df-0d581b97dd01-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 05:44:32 crc kubenswrapper[4594]: I1129 05:44:32.330919 4594 generic.go:334] "Generic (PLEG): container finished" podID="98e4a7be-f230-4732-9a53-d258bf31954b" containerID="d30fbe70e618e541993ac133419e99711234f48de67a5c7d4638e5870e351808" 
exitCode=0 Nov 29 05:44:32 crc kubenswrapper[4594]: I1129 05:44:32.330989 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-ww7dv" event={"ID":"98e4a7be-f230-4732-9a53-d258bf31954b","Type":"ContainerDied","Data":"d30fbe70e618e541993ac133419e99711234f48de67a5c7d4638e5870e351808"} Nov 29 05:44:32 crc kubenswrapper[4594]: I1129 05:44:32.332919 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-67zkn" event={"ID":"2382e225-37e5-4fa9-97df-0d581b97dd01","Type":"ContainerDied","Data":"7ca7f5ee5a5c1c516870a54b4562e5fd2911e8d91ac9e87c7342c35717e57447"} Nov 29 05:44:32 crc kubenswrapper[4594]: I1129 05:44:32.332962 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-67zkn" Nov 29 05:44:32 crc kubenswrapper[4594]: I1129 05:44:32.332980 4594 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ca7f5ee5a5c1c516870a54b4562e5fd2911e8d91ac9e87c7342c35717e57447" Nov 29 05:44:32 crc kubenswrapper[4594]: I1129 05:44:32.896531 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-58bbf48b7f-xvwrv"] Nov 29 05:44:32 crc kubenswrapper[4594]: E1129 05:44:32.897087 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11684277-b949-4e98-8e76-e0780cfb4d32" containerName="mariadb-account-create-update" Nov 29 05:44:32 crc kubenswrapper[4594]: I1129 05:44:32.897106 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="11684277-b949-4e98-8e76-e0780cfb4d32" containerName="mariadb-account-create-update" Nov 29 05:44:32 crc kubenswrapper[4594]: E1129 05:44:32.897124 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="deb711bc-c899-4a09-b3f3-c1d256da7188" containerName="mariadb-account-create-update" Nov 29 05:44:32 crc kubenswrapper[4594]: I1129 05:44:32.897131 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="deb711bc-c899-4a09-b3f3-c1d256da7188" 
containerName="mariadb-account-create-update" Nov 29 05:44:32 crc kubenswrapper[4594]: E1129 05:44:32.897146 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2382e225-37e5-4fa9-97df-0d581b97dd01" containerName="keystone-db-sync" Nov 29 05:44:32 crc kubenswrapper[4594]: I1129 05:44:32.897153 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="2382e225-37e5-4fa9-97df-0d581b97dd01" containerName="keystone-db-sync" Nov 29 05:44:32 crc kubenswrapper[4594]: E1129 05:44:32.897170 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee64b887-a184-43e5-85ad-df3e76919044" containerName="mariadb-database-create" Nov 29 05:44:32 crc kubenswrapper[4594]: I1129 05:44:32.897177 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee64b887-a184-43e5-85ad-df3e76919044" containerName="mariadb-database-create" Nov 29 05:44:32 crc kubenswrapper[4594]: E1129 05:44:32.897183 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e1365bf-4433-4641-910f-0d7a92be4fc1" containerName="mariadb-account-create-update" Nov 29 05:44:32 crc kubenswrapper[4594]: I1129 05:44:32.897191 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e1365bf-4433-4641-910f-0d7a92be4fc1" containerName="mariadb-account-create-update" Nov 29 05:44:32 crc kubenswrapper[4594]: E1129 05:44:32.897200 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a4a0787-c2fa-43c0-9c7e-98cbdb79134b" containerName="mariadb-database-create" Nov 29 05:44:32 crc kubenswrapper[4594]: I1129 05:44:32.897205 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a4a0787-c2fa-43c0-9c7e-98cbdb79134b" containerName="mariadb-database-create" Nov 29 05:44:32 crc kubenswrapper[4594]: E1129 05:44:32.897214 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="111c210f-a9a4-46ea-93ce-e91bae7ef7a9" containerName="mariadb-database-create" Nov 29 05:44:32 crc kubenswrapper[4594]: I1129 05:44:32.897220 4594 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="111c210f-a9a4-46ea-93ce-e91bae7ef7a9" containerName="mariadb-database-create" Nov 29 05:44:32 crc kubenswrapper[4594]: I1129 05:44:32.897447 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a4a0787-c2fa-43c0-9c7e-98cbdb79134b" containerName="mariadb-database-create" Nov 29 05:44:32 crc kubenswrapper[4594]: I1129 05:44:32.897466 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="111c210f-a9a4-46ea-93ce-e91bae7ef7a9" containerName="mariadb-database-create" Nov 29 05:44:32 crc kubenswrapper[4594]: I1129 05:44:32.897477 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="deb711bc-c899-4a09-b3f3-c1d256da7188" containerName="mariadb-account-create-update" Nov 29 05:44:32 crc kubenswrapper[4594]: I1129 05:44:32.897485 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="2382e225-37e5-4fa9-97df-0d581b97dd01" containerName="keystone-db-sync" Nov 29 05:44:32 crc kubenswrapper[4594]: I1129 05:44:32.897495 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="11684277-b949-4e98-8e76-e0780cfb4d32" containerName="mariadb-account-create-update" Nov 29 05:44:32 crc kubenswrapper[4594]: I1129 05:44:32.897505 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee64b887-a184-43e5-85ad-df3e76919044" containerName="mariadb-database-create" Nov 29 05:44:32 crc kubenswrapper[4594]: I1129 05:44:32.897515 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e1365bf-4433-4641-910f-0d7a92be4fc1" containerName="mariadb-account-create-update" Nov 29 05:44:32 crc kubenswrapper[4594]: I1129 05:44:32.902030 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58bbf48b7f-xvwrv" Nov 29 05:44:32 crc kubenswrapper[4594]: I1129 05:44:32.927943 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58bbf48b7f-xvwrv"] Nov 29 05:44:32 crc kubenswrapper[4594]: I1129 05:44:32.951305 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-749tk"] Nov 29 05:44:32 crc kubenswrapper[4594]: I1129 05:44:32.953074 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-749tk" Nov 29 05:44:32 crc kubenswrapper[4594]: I1129 05:44:32.956631 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Nov 29 05:44:32 crc kubenswrapper[4594]: I1129 05:44:32.956829 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Nov 29 05:44:32 crc kubenswrapper[4594]: I1129 05:44:32.956836 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-nz5kq" Nov 29 05:44:32 crc kubenswrapper[4594]: I1129 05:44:32.957103 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Nov 29 05:44:32 crc kubenswrapper[4594]: I1129 05:44:32.957241 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Nov 29 05:44:32 crc kubenswrapper[4594]: I1129 05:44:32.991327 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-749tk"] Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.035497 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c5f9dc4-23c7-4223-8c63-44a15905edf5-dns-svc\") pod \"dnsmasq-dns-58bbf48b7f-xvwrv\" (UID: \"6c5f9dc4-23c7-4223-8c63-44a15905edf5\") " pod="openstack/dnsmasq-dns-58bbf48b7f-xvwrv" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.035610 
4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zv5zw\" (UniqueName: \"kubernetes.io/projected/6c5f9dc4-23c7-4223-8c63-44a15905edf5-kube-api-access-zv5zw\") pod \"dnsmasq-dns-58bbf48b7f-xvwrv\" (UID: \"6c5f9dc4-23c7-4223-8c63-44a15905edf5\") " pod="openstack/dnsmasq-dns-58bbf48b7f-xvwrv" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.035727 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6c5f9dc4-23c7-4223-8c63-44a15905edf5-ovsdbserver-sb\") pod \"dnsmasq-dns-58bbf48b7f-xvwrv\" (UID: \"6c5f9dc4-23c7-4223-8c63-44a15905edf5\") " pod="openstack/dnsmasq-dns-58bbf48b7f-xvwrv" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.035852 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6c5f9dc4-23c7-4223-8c63-44a15905edf5-ovsdbserver-nb\") pod \"dnsmasq-dns-58bbf48b7f-xvwrv\" (UID: \"6c5f9dc4-23c7-4223-8c63-44a15905edf5\") " pod="openstack/dnsmasq-dns-58bbf48b7f-xvwrv" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.035880 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c5f9dc4-23c7-4223-8c63-44a15905edf5-config\") pod \"dnsmasq-dns-58bbf48b7f-xvwrv\" (UID: \"6c5f9dc4-23c7-4223-8c63-44a15905edf5\") " pod="openstack/dnsmasq-dns-58bbf48b7f-xvwrv" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.035900 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6c5f9dc4-23c7-4223-8c63-44a15905edf5-dns-swift-storage-0\") pod \"dnsmasq-dns-58bbf48b7f-xvwrv\" (UID: \"6c5f9dc4-23c7-4223-8c63-44a15905edf5\") " pod="openstack/dnsmasq-dns-58bbf48b7f-xvwrv" Nov 29 
05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.114359 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-7b4h2"] Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.116042 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-7b4h2" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.122295 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.123032 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.123294 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-8hbsc" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.138210 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc29fae8-4b90-4702-8c68-4cec263fd4c9-scripts\") pod \"keystone-bootstrap-749tk\" (UID: \"fc29fae8-4b90-4702-8c68-4cec263fd4c9\") " pod="openstack/keystone-bootstrap-749tk" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.138461 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc29fae8-4b90-4702-8c68-4cec263fd4c9-config-data\") pod \"keystone-bootstrap-749tk\" (UID: \"fc29fae8-4b90-4702-8c68-4cec263fd4c9\") " pod="openstack/keystone-bootstrap-749tk" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.138512 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c5f9dc4-23c7-4223-8c63-44a15905edf5-config\") pod \"dnsmasq-dns-58bbf48b7f-xvwrv\" (UID: \"6c5f9dc4-23c7-4223-8c63-44a15905edf5\") " pod="openstack/dnsmasq-dns-58bbf48b7f-xvwrv" Nov 29 05:44:33 crc 
kubenswrapper[4594]: I1129 05:44:33.138532 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6c5f9dc4-23c7-4223-8c63-44a15905edf5-ovsdbserver-nb\") pod \"dnsmasq-dns-58bbf48b7f-xvwrv\" (UID: \"6c5f9dc4-23c7-4223-8c63-44a15905edf5\") " pod="openstack/dnsmasq-dns-58bbf48b7f-xvwrv" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.138553 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6c5f9dc4-23c7-4223-8c63-44a15905edf5-dns-swift-storage-0\") pod \"dnsmasq-dns-58bbf48b7f-xvwrv\" (UID: \"6c5f9dc4-23c7-4223-8c63-44a15905edf5\") " pod="openstack/dnsmasq-dns-58bbf48b7f-xvwrv" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.138596 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsjmh\" (UniqueName: \"kubernetes.io/projected/fc29fae8-4b90-4702-8c68-4cec263fd4c9-kube-api-access-zsjmh\") pod \"keystone-bootstrap-749tk\" (UID: \"fc29fae8-4b90-4702-8c68-4cec263fd4c9\") " pod="openstack/keystone-bootstrap-749tk" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.138644 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc29fae8-4b90-4702-8c68-4cec263fd4c9-combined-ca-bundle\") pod \"keystone-bootstrap-749tk\" (UID: \"fc29fae8-4b90-4702-8c68-4cec263fd4c9\") " pod="openstack/keystone-bootstrap-749tk" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.138678 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fc29fae8-4b90-4702-8c68-4cec263fd4c9-credential-keys\") pod \"keystone-bootstrap-749tk\" (UID: \"fc29fae8-4b90-4702-8c68-4cec263fd4c9\") " pod="openstack/keystone-bootstrap-749tk" Nov 29 
05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.138717 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c5f9dc4-23c7-4223-8c63-44a15905edf5-dns-svc\") pod \"dnsmasq-dns-58bbf48b7f-xvwrv\" (UID: \"6c5f9dc4-23c7-4223-8c63-44a15905edf5\") " pod="openstack/dnsmasq-dns-58bbf48b7f-xvwrv" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.138787 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zv5zw\" (UniqueName: \"kubernetes.io/projected/6c5f9dc4-23c7-4223-8c63-44a15905edf5-kube-api-access-zv5zw\") pod \"dnsmasq-dns-58bbf48b7f-xvwrv\" (UID: \"6c5f9dc4-23c7-4223-8c63-44a15905edf5\") " pod="openstack/dnsmasq-dns-58bbf48b7f-xvwrv" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.138833 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fc29fae8-4b90-4702-8c68-4cec263fd4c9-fernet-keys\") pod \"keystone-bootstrap-749tk\" (UID: \"fc29fae8-4b90-4702-8c68-4cec263fd4c9\") " pod="openstack/keystone-bootstrap-749tk" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.138880 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6c5f9dc4-23c7-4223-8c63-44a15905edf5-ovsdbserver-sb\") pod \"dnsmasq-dns-58bbf48b7f-xvwrv\" (UID: \"6c5f9dc4-23c7-4223-8c63-44a15905edf5\") " pod="openstack/dnsmasq-dns-58bbf48b7f-xvwrv" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.139474 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c5f9dc4-23c7-4223-8c63-44a15905edf5-config\") pod \"dnsmasq-dns-58bbf48b7f-xvwrv\" (UID: \"6c5f9dc4-23c7-4223-8c63-44a15905edf5\") " pod="openstack/dnsmasq-dns-58bbf48b7f-xvwrv" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.144975 4594 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c5f9dc4-23c7-4223-8c63-44a15905edf5-dns-svc\") pod \"dnsmasq-dns-58bbf48b7f-xvwrv\" (UID: \"6c5f9dc4-23c7-4223-8c63-44a15905edf5\") " pod="openstack/dnsmasq-dns-58bbf48b7f-xvwrv" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.145034 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6c5f9dc4-23c7-4223-8c63-44a15905edf5-ovsdbserver-sb\") pod \"dnsmasq-dns-58bbf48b7f-xvwrv\" (UID: \"6c5f9dc4-23c7-4223-8c63-44a15905edf5\") " pod="openstack/dnsmasq-dns-58bbf48b7f-xvwrv" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.145829 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6c5f9dc4-23c7-4223-8c63-44a15905edf5-ovsdbserver-nb\") pod \"dnsmasq-dns-58bbf48b7f-xvwrv\" (UID: \"6c5f9dc4-23c7-4223-8c63-44a15905edf5\") " pod="openstack/dnsmasq-dns-58bbf48b7f-xvwrv" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.145885 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7bf6f9df7f-knsk7"] Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.146030 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6c5f9dc4-23c7-4223-8c63-44a15905edf5-dns-swift-storage-0\") pod \"dnsmasq-dns-58bbf48b7f-xvwrv\" (UID: \"6c5f9dc4-23c7-4223-8c63-44a15905edf5\") " pod="openstack/dnsmasq-dns-58bbf48b7f-xvwrv" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.147498 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7bf6f9df7f-knsk7" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.154582 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.154771 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.154986 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.155206 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-g227t" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.207637 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-7b4h2"] Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.244216 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc29fae8-4b90-4702-8c68-4cec263fd4c9-config-data\") pod \"keystone-bootstrap-749tk\" (UID: \"fc29fae8-4b90-4702-8c68-4cec263fd4c9\") " pod="openstack/keystone-bootstrap-749tk" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.256404 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsjmh\" (UniqueName: \"kubernetes.io/projected/fc29fae8-4b90-4702-8c68-4cec263fd4c9-kube-api-access-zsjmh\") pod \"keystone-bootstrap-749tk\" (UID: \"fc29fae8-4b90-4702-8c68-4cec263fd4c9\") " pod="openstack/keystone-bootstrap-749tk" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.256492 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/968317f6-c4d9-4647-a166-0cadc0fa57f2-config\") pod \"neutron-db-sync-7b4h2\" (UID: \"968317f6-c4d9-4647-a166-0cadc0fa57f2\") " 
pod="openstack/neutron-db-sync-7b4h2" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.256519 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc29fae8-4b90-4702-8c68-4cec263fd4c9-combined-ca-bundle\") pod \"keystone-bootstrap-749tk\" (UID: \"fc29fae8-4b90-4702-8c68-4cec263fd4c9\") " pod="openstack/keystone-bootstrap-749tk" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.256553 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fc29fae8-4b90-4702-8c68-4cec263fd4c9-credential-keys\") pod \"keystone-bootstrap-749tk\" (UID: \"fc29fae8-4b90-4702-8c68-4cec263fd4c9\") " pod="openstack/keystone-bootstrap-749tk" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.256752 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fc29fae8-4b90-4702-8c68-4cec263fd4c9-fernet-keys\") pod \"keystone-bootstrap-749tk\" (UID: \"fc29fae8-4b90-4702-8c68-4cec263fd4c9\") " pod="openstack/keystone-bootstrap-749tk" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.256853 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tp9pp\" (UniqueName: \"kubernetes.io/projected/968317f6-c4d9-4647-a166-0cadc0fa57f2-kube-api-access-tp9pp\") pod \"neutron-db-sync-7b4h2\" (UID: \"968317f6-c4d9-4647-a166-0cadc0fa57f2\") " pod="openstack/neutron-db-sync-7b4h2" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.256950 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc29fae8-4b90-4702-8c68-4cec263fd4c9-scripts\") pod \"keystone-bootstrap-749tk\" (UID: \"fc29fae8-4b90-4702-8c68-4cec263fd4c9\") " pod="openstack/keystone-bootstrap-749tk" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 
05:44:33.257011 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/968317f6-c4d9-4647-a166-0cadc0fa57f2-combined-ca-bundle\") pod \"neutron-db-sync-7b4h2\" (UID: \"968317f6-c4d9-4647-a166-0cadc0fa57f2\") " pod="openstack/neutron-db-sync-7b4h2" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.260227 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zv5zw\" (UniqueName: \"kubernetes.io/projected/6c5f9dc4-23c7-4223-8c63-44a15905edf5-kube-api-access-zv5zw\") pod \"dnsmasq-dns-58bbf48b7f-xvwrv\" (UID: \"6c5f9dc4-23c7-4223-8c63-44a15905edf5\") " pod="openstack/dnsmasq-dns-58bbf48b7f-xvwrv" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.277150 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc29fae8-4b90-4702-8c68-4cec263fd4c9-config-data\") pod \"keystone-bootstrap-749tk\" (UID: \"fc29fae8-4b90-4702-8c68-4cec263fd4c9\") " pod="openstack/keystone-bootstrap-749tk" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.284476 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7bf6f9df7f-knsk7"] Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.285233 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc29fae8-4b90-4702-8c68-4cec263fd4c9-combined-ca-bundle\") pod \"keystone-bootstrap-749tk\" (UID: \"fc29fae8-4b90-4702-8c68-4cec263fd4c9\") " pod="openstack/keystone-bootstrap-749tk" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.287823 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fc29fae8-4b90-4702-8c68-4cec263fd4c9-fernet-keys\") pod \"keystone-bootstrap-749tk\" (UID: \"fc29fae8-4b90-4702-8c68-4cec263fd4c9\") " 
pod="openstack/keystone-bootstrap-749tk" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.320690 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-9rddc"] Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.325572 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc29fae8-4b90-4702-8c68-4cec263fd4c9-scripts\") pod \"keystone-bootstrap-749tk\" (UID: \"fc29fae8-4b90-4702-8c68-4cec263fd4c9\") " pod="openstack/keystone-bootstrap-749tk" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.328714 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fc29fae8-4b90-4702-8c68-4cec263fd4c9-credential-keys\") pod \"keystone-bootstrap-749tk\" (UID: \"fc29fae8-4b90-4702-8c68-4cec263fd4c9\") " pod="openstack/keystone-bootstrap-749tk" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.339440 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-9rddc" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.360236 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/968317f6-c4d9-4647-a166-0cadc0fa57f2-config\") pod \"neutron-db-sync-7b4h2\" (UID: \"968317f6-c4d9-4647-a166-0cadc0fa57f2\") " pod="openstack/neutron-db-sync-7b4h2" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.360293 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9c4f4aa8-f25f-4684-8707-3bc0eb168954-scripts\") pod \"horizon-7bf6f9df7f-knsk7\" (UID: \"9c4f4aa8-f25f-4684-8707-3bc0eb168954\") " pod="openstack/horizon-7bf6f9df7f-knsk7" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.360345 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nbzs\" (UniqueName: \"kubernetes.io/projected/9c4f4aa8-f25f-4684-8707-3bc0eb168954-kube-api-access-6nbzs\") pod \"horizon-7bf6f9df7f-knsk7\" (UID: \"9c4f4aa8-f25f-4684-8707-3bc0eb168954\") " pod="openstack/horizon-7bf6f9df7f-knsk7" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.360399 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9c4f4aa8-f25f-4684-8707-3bc0eb168954-horizon-secret-key\") pod \"horizon-7bf6f9df7f-knsk7\" (UID: \"9c4f4aa8-f25f-4684-8707-3bc0eb168954\") " pod="openstack/horizon-7bf6f9df7f-knsk7" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.360485 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9c4f4aa8-f25f-4684-8707-3bc0eb168954-logs\") pod \"horizon-7bf6f9df7f-knsk7\" (UID: \"9c4f4aa8-f25f-4684-8707-3bc0eb168954\") " 
pod="openstack/horizon-7bf6f9df7f-knsk7" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.361338 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tp9pp\" (UniqueName: \"kubernetes.io/projected/968317f6-c4d9-4647-a166-0cadc0fa57f2-kube-api-access-tp9pp\") pod \"neutron-db-sync-7b4h2\" (UID: \"968317f6-c4d9-4647-a166-0cadc0fa57f2\") " pod="openstack/neutron-db-sync-7b4h2" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.361492 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9c4f4aa8-f25f-4684-8707-3bc0eb168954-config-data\") pod \"horizon-7bf6f9df7f-knsk7\" (UID: \"9c4f4aa8-f25f-4684-8707-3bc0eb168954\") " pod="openstack/horizon-7bf6f9df7f-knsk7" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.361542 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/968317f6-c4d9-4647-a166-0cadc0fa57f2-combined-ca-bundle\") pod \"neutron-db-sync-7b4h2\" (UID: \"968317f6-c4d9-4647-a166-0cadc0fa57f2\") " pod="openstack/neutron-db-sync-7b4h2" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.371930 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/968317f6-c4d9-4647-a166-0cadc0fa57f2-config\") pod \"neutron-db-sync-7b4h2\" (UID: \"968317f6-c4d9-4647-a166-0cadc0fa57f2\") " pod="openstack/neutron-db-sync-7b4h2" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.372914 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/968317f6-c4d9-4647-a166-0cadc0fa57f2-combined-ca-bundle\") pod \"neutron-db-sync-7b4h2\" (UID: \"968317f6-c4d9-4647-a166-0cadc0fa57f2\") " pod="openstack/neutron-db-sync-7b4h2" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.373505 4594 
reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.373716 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-v5t9v" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.384463 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-9rddc"] Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.390815 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.414011 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsjmh\" (UniqueName: \"kubernetes.io/projected/fc29fae8-4b90-4702-8c68-4cec263fd4c9-kube-api-access-zsjmh\") pod \"keystone-bootstrap-749tk\" (UID: \"fc29fae8-4b90-4702-8c68-4cec263fd4c9\") " pod="openstack/keystone-bootstrap-749tk" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.431769 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tp9pp\" (UniqueName: \"kubernetes.io/projected/968317f6-c4d9-4647-a166-0cadc0fa57f2-kube-api-access-tp9pp\") pod \"neutron-db-sync-7b4h2\" (UID: \"968317f6-c4d9-4647-a166-0cadc0fa57f2\") " pod="openstack/neutron-db-sync-7b4h2" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.442271 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.444519 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.447795 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.447806 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-7b4h2" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.448640 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.465308 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.468035 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9c4f4aa8-f25f-4684-8707-3bc0eb168954-scripts\") pod \"horizon-7bf6f9df7f-knsk7\" (UID: \"9c4f4aa8-f25f-4684-8707-3bc0eb168954\") " pod="openstack/horizon-7bf6f9df7f-knsk7" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.468102 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nbzs\" (UniqueName: \"kubernetes.io/projected/9c4f4aa8-f25f-4684-8707-3bc0eb168954-kube-api-access-6nbzs\") pod \"horizon-7bf6f9df7f-knsk7\" (UID: \"9c4f4aa8-f25f-4684-8707-3bc0eb168954\") " pod="openstack/horizon-7bf6f9df7f-knsk7" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.468129 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b46224c-7874-4c4a-abb6-f1cbef3a8462-combined-ca-bundle\") pod \"cinder-db-sync-9rddc\" (UID: \"4b46224c-7874-4c4a-abb6-f1cbef3a8462\") " pod="openstack/cinder-db-sync-9rddc" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.468172 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26m6r\" (UniqueName: \"kubernetes.io/projected/4b46224c-7874-4c4a-abb6-f1cbef3a8462-kube-api-access-26m6r\") pod \"cinder-db-sync-9rddc\" (UID: \"4b46224c-7874-4c4a-abb6-f1cbef3a8462\") " pod="openstack/cinder-db-sync-9rddc" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 
05:44:33.468195 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9c4f4aa8-f25f-4684-8707-3bc0eb168954-horizon-secret-key\") pod \"horizon-7bf6f9df7f-knsk7\" (UID: \"9c4f4aa8-f25f-4684-8707-3bc0eb168954\") " pod="openstack/horizon-7bf6f9df7f-knsk7" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.468215 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4b46224c-7874-4c4a-abb6-f1cbef3a8462-db-sync-config-data\") pod \"cinder-db-sync-9rddc\" (UID: \"4b46224c-7874-4c4a-abb6-f1cbef3a8462\") " pod="openstack/cinder-db-sync-9rddc" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.468278 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b46224c-7874-4c4a-abb6-f1cbef3a8462-scripts\") pod \"cinder-db-sync-9rddc\" (UID: \"4b46224c-7874-4c4a-abb6-f1cbef3a8462\") " pod="openstack/cinder-db-sync-9rddc" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.468322 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9c4f4aa8-f25f-4684-8707-3bc0eb168954-logs\") pod \"horizon-7bf6f9df7f-knsk7\" (UID: \"9c4f4aa8-f25f-4684-8707-3bc0eb168954\") " pod="openstack/horizon-7bf6f9df7f-knsk7" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.468355 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b46224c-7874-4c4a-abb6-f1cbef3a8462-config-data\") pod \"cinder-db-sync-9rddc\" (UID: \"4b46224c-7874-4c4a-abb6-f1cbef3a8462\") " pod="openstack/cinder-db-sync-9rddc" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.468409 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9c4f4aa8-f25f-4684-8707-3bc0eb168954-config-data\") pod \"horizon-7bf6f9df7f-knsk7\" (UID: \"9c4f4aa8-f25f-4684-8707-3bc0eb168954\") " pod="openstack/horizon-7bf6f9df7f-knsk7" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.468446 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4b46224c-7874-4c4a-abb6-f1cbef3a8462-etc-machine-id\") pod \"cinder-db-sync-9rddc\" (UID: \"4b46224c-7874-4c4a-abb6-f1cbef3a8462\") " pod="openstack/cinder-db-sync-9rddc" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.469158 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9c4f4aa8-f25f-4684-8707-3bc0eb168954-scripts\") pod \"horizon-7bf6f9df7f-knsk7\" (UID: \"9c4f4aa8-f25f-4684-8707-3bc0eb168954\") " pod="openstack/horizon-7bf6f9df7f-knsk7" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.469982 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9c4f4aa8-f25f-4684-8707-3bc0eb168954-logs\") pod \"horizon-7bf6f9df7f-knsk7\" (UID: \"9c4f4aa8-f25f-4684-8707-3bc0eb168954\") " pod="openstack/horizon-7bf6f9df7f-knsk7" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.475436 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9c4f4aa8-f25f-4684-8707-3bc0eb168954-config-data\") pod \"horizon-7bf6f9df7f-knsk7\" (UID: \"9c4f4aa8-f25f-4684-8707-3bc0eb168954\") " pod="openstack/horizon-7bf6f9df7f-knsk7" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.478307 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58bbf48b7f-xvwrv"] Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.479128 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58bbf48b7f-xvwrv" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.484290 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-8jkv9"] Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.485558 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-8jkv9" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.491020 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-578598f949-zv6mw"] Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.499079 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9c4f4aa8-f25f-4684-8707-3bc0eb168954-horizon-secret-key\") pod \"horizon-7bf6f9df7f-knsk7\" (UID: \"9c4f4aa8-f25f-4684-8707-3bc0eb168954\") " pod="openstack/horizon-7bf6f9df7f-knsk7" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.494925 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-m8j6t" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.495078 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.495110 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.517909 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-658d8f8767-q5w5m"] Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.519437 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-578598f949-zv6mw" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.521297 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nbzs\" (UniqueName: \"kubernetes.io/projected/9c4f4aa8-f25f-4684-8707-3bc0eb168954-kube-api-access-6nbzs\") pod \"horizon-7bf6f9df7f-knsk7\" (UID: \"9c4f4aa8-f25f-4684-8707-3bc0eb168954\") " pod="openstack/horizon-7bf6f9df7f-knsk7" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.534069 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-8jkv9"] Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.534182 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-578598f949-zv6mw"] Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.534826 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-658d8f8767-q5w5m" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.570048 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b46224c-7874-4c4a-abb6-f1cbef3a8462-config-data\") pod \"cinder-db-sync-9rddc\" (UID: \"4b46224c-7874-4c4a-abb6-f1cbef3a8462\") " pod="openstack/cinder-db-sync-9rddc" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.570098 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dfe84a63-fea5-455e-95d3-523a091b976f-run-httpd\") pod \"ceilometer-0\" (UID: \"dfe84a63-fea5-455e-95d3-523a091b976f\") " pod="openstack/ceilometer-0" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.570147 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dfe84a63-fea5-455e-95d3-523a091b976f-log-httpd\") pod \"ceilometer-0\" (UID: 
\"dfe84a63-fea5-455e-95d3-523a091b976f\") " pod="openstack/ceilometer-0" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.570186 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfe84a63-fea5-455e-95d3-523a091b976f-config-data\") pod \"ceilometer-0\" (UID: \"dfe84a63-fea5-455e-95d3-523a091b976f\") " pod="openstack/ceilometer-0" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.570202 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dfe84a63-fea5-455e-95d3-523a091b976f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dfe84a63-fea5-455e-95d3-523a091b976f\") " pod="openstack/ceilometer-0" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.570231 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4b46224c-7874-4c4a-abb6-f1cbef3a8462-etc-machine-id\") pod \"cinder-db-sync-9rddc\" (UID: \"4b46224c-7874-4c4a-abb6-f1cbef3a8462\") " pod="openstack/cinder-db-sync-9rddc" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.570287 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfe84a63-fea5-455e-95d3-523a091b976f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dfe84a63-fea5-455e-95d3-523a091b976f\") " pod="openstack/ceilometer-0" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.570313 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69498abf-8b80-4b7f-901a-4c6a4bdede2f-scripts\") pod \"placement-db-sync-8jkv9\" (UID: \"69498abf-8b80-4b7f-901a-4c6a4bdede2f\") " pod="openstack/placement-db-sync-8jkv9" Nov 29 05:44:33 crc kubenswrapper[4594]: 
I1129 05:44:33.570377 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dfe84a63-fea5-455e-95d3-523a091b976f-scripts\") pod \"ceilometer-0\" (UID: \"dfe84a63-fea5-455e-95d3-523a091b976f\") " pod="openstack/ceilometer-0" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.570397 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69498abf-8b80-4b7f-901a-4c6a4bdede2f-config-data\") pod \"placement-db-sync-8jkv9\" (UID: \"69498abf-8b80-4b7f-901a-4c6a4bdede2f\") " pod="openstack/placement-db-sync-8jkv9" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.570424 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b46224c-7874-4c4a-abb6-f1cbef3a8462-combined-ca-bundle\") pod \"cinder-db-sync-9rddc\" (UID: \"4b46224c-7874-4c4a-abb6-f1cbef3a8462\") " pod="openstack/cinder-db-sync-9rddc" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.570465 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26m6r\" (UniqueName: \"kubernetes.io/projected/4b46224c-7874-4c4a-abb6-f1cbef3a8462-kube-api-access-26m6r\") pod \"cinder-db-sync-9rddc\" (UID: \"4b46224c-7874-4c4a-abb6-f1cbef3a8462\") " pod="openstack/cinder-db-sync-9rddc" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.570497 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69498abf-8b80-4b7f-901a-4c6a4bdede2f-logs\") pod \"placement-db-sync-8jkv9\" (UID: \"69498abf-8b80-4b7f-901a-4c6a4bdede2f\") " pod="openstack/placement-db-sync-8jkv9" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.570515 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69498abf-8b80-4b7f-901a-4c6a4bdede2f-combined-ca-bundle\") pod \"placement-db-sync-8jkv9\" (UID: \"69498abf-8b80-4b7f-901a-4c6a4bdede2f\") " pod="openstack/placement-db-sync-8jkv9" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.570539 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4b46224c-7874-4c4a-abb6-f1cbef3a8462-db-sync-config-data\") pod \"cinder-db-sync-9rddc\" (UID: \"4b46224c-7874-4c4a-abb6-f1cbef3a8462\") " pod="openstack/cinder-db-sync-9rddc" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.570560 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnlgg\" (UniqueName: \"kubernetes.io/projected/dfe84a63-fea5-455e-95d3-523a091b976f-kube-api-access-fnlgg\") pod \"ceilometer-0\" (UID: \"dfe84a63-fea5-455e-95d3-523a091b976f\") " pod="openstack/ceilometer-0" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.570600 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wk2n4\" (UniqueName: \"kubernetes.io/projected/69498abf-8b80-4b7f-901a-4c6a4bdede2f-kube-api-access-wk2n4\") pod \"placement-db-sync-8jkv9\" (UID: \"69498abf-8b80-4b7f-901a-4c6a4bdede2f\") " pod="openstack/placement-db-sync-8jkv9" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.570629 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b46224c-7874-4c4a-abb6-f1cbef3a8462-scripts\") pod \"cinder-db-sync-9rddc\" (UID: \"4b46224c-7874-4c4a-abb6-f1cbef3a8462\") " pod="openstack/cinder-db-sync-9rddc" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.577196 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/4b46224c-7874-4c4a-abb6-f1cbef3a8462-scripts\") pod \"cinder-db-sync-9rddc\" (UID: \"4b46224c-7874-4c4a-abb6-f1cbef3a8462\") " pod="openstack/cinder-db-sync-9rddc" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.580117 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4b46224c-7874-4c4a-abb6-f1cbef3a8462-etc-machine-id\") pod \"cinder-db-sync-9rddc\" (UID: \"4b46224c-7874-4c4a-abb6-f1cbef3a8462\") " pod="openstack/cinder-db-sync-9rddc" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.581891 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4b46224c-7874-4c4a-abb6-f1cbef3a8462-db-sync-config-data\") pod \"cinder-db-sync-9rddc\" (UID: \"4b46224c-7874-4c4a-abb6-f1cbef3a8462\") " pod="openstack/cinder-db-sync-9rddc" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.585034 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b46224c-7874-4c4a-abb6-f1cbef3a8462-config-data\") pod \"cinder-db-sync-9rddc\" (UID: \"4b46224c-7874-4c4a-abb6-f1cbef3a8462\") " pod="openstack/cinder-db-sync-9rddc" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.585514 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b46224c-7874-4c4a-abb6-f1cbef3a8462-combined-ca-bundle\") pod \"cinder-db-sync-9rddc\" (UID: \"4b46224c-7874-4c4a-abb6-f1cbef3a8462\") " pod="openstack/cinder-db-sync-9rddc" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.591309 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-658d8f8767-q5w5m"] Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.600488 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-749tk" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.612981 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-mvvfl"] Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.616477 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-mvvfl" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.620726 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.625954 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26m6r\" (UniqueName: \"kubernetes.io/projected/4b46224c-7874-4c4a-abb6-f1cbef3a8462-kube-api-access-26m6r\") pod \"cinder-db-sync-9rddc\" (UID: \"4b46224c-7874-4c4a-abb6-f1cbef3a8462\") " pod="openstack/cinder-db-sync-9rddc" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.627891 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-wl49q" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.633969 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-mvvfl"] Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.671814 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b0de6056-30e4-4acc-bfbc-199c27065ff6-dns-svc\") pod \"dnsmasq-dns-578598f949-zv6mw\" (UID: \"b0de6056-30e4-4acc-bfbc-199c27065ff6\") " pod="openstack/dnsmasq-dns-578598f949-zv6mw" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.671858 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dfe84a63-fea5-455e-95d3-523a091b976f-log-httpd\") pod \"ceilometer-0\" (UID: \"dfe84a63-fea5-455e-95d3-523a091b976f\") " 
pod="openstack/ceilometer-0" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.671884 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfe84a63-fea5-455e-95d3-523a091b976f-config-data\") pod \"ceilometer-0\" (UID: \"dfe84a63-fea5-455e-95d3-523a091b976f\") " pod="openstack/ceilometer-0" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.671902 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dfe84a63-fea5-455e-95d3-523a091b976f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dfe84a63-fea5-455e-95d3-523a091b976f\") " pod="openstack/ceilometer-0" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.673717 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dfe84a63-fea5-455e-95d3-523a091b976f-log-httpd\") pod \"ceilometer-0\" (UID: \"dfe84a63-fea5-455e-95d3-523a091b976f\") " pod="openstack/ceilometer-0" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.674124 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e786ace8-0d46-488a-941c-2325002c5edc-horizon-secret-key\") pod \"horizon-658d8f8767-q5w5m\" (UID: \"e786ace8-0d46-488a-941c-2325002c5edc\") " pod="openstack/horizon-658d8f8767-q5w5m" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.674178 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b0de6056-30e4-4acc-bfbc-199c27065ff6-ovsdbserver-sb\") pod \"dnsmasq-dns-578598f949-zv6mw\" (UID: \"b0de6056-30e4-4acc-bfbc-199c27065ff6\") " pod="openstack/dnsmasq-dns-578598f949-zv6mw" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.674235 4594 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfe84a63-fea5-455e-95d3-523a091b976f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dfe84a63-fea5-455e-95d3-523a091b976f\") " pod="openstack/ceilometer-0" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.674275 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69498abf-8b80-4b7f-901a-4c6a4bdede2f-scripts\") pod \"placement-db-sync-8jkv9\" (UID: \"69498abf-8b80-4b7f-901a-4c6a4bdede2f\") " pod="openstack/placement-db-sync-8jkv9" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.674296 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b0de6056-30e4-4acc-bfbc-199c27065ff6-dns-swift-storage-0\") pod \"dnsmasq-dns-578598f949-zv6mw\" (UID: \"b0de6056-30e4-4acc-bfbc-199c27065ff6\") " pod="openstack/dnsmasq-dns-578598f949-zv6mw" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.674316 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ztlj\" (UniqueName: \"kubernetes.io/projected/e786ace8-0d46-488a-941c-2325002c5edc-kube-api-access-2ztlj\") pod \"horizon-658d8f8767-q5w5m\" (UID: \"e786ace8-0d46-488a-941c-2325002c5edc\") " pod="openstack/horizon-658d8f8767-q5w5m" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.674364 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dfe84a63-fea5-455e-95d3-523a091b976f-scripts\") pod \"ceilometer-0\" (UID: \"dfe84a63-fea5-455e-95d3-523a091b976f\") " pod="openstack/ceilometer-0" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.674386 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/69498abf-8b80-4b7f-901a-4c6a4bdede2f-config-data\") pod \"placement-db-sync-8jkv9\" (UID: \"69498abf-8b80-4b7f-901a-4c6a4bdede2f\") " pod="openstack/placement-db-sync-8jkv9" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.674413 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e786ace8-0d46-488a-941c-2325002c5edc-scripts\") pod \"horizon-658d8f8767-q5w5m\" (UID: \"e786ace8-0d46-488a-941c-2325002c5edc\") " pod="openstack/horizon-658d8f8767-q5w5m" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.674445 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e786ace8-0d46-488a-941c-2325002c5edc-config-data\") pod \"horizon-658d8f8767-q5w5m\" (UID: \"e786ace8-0d46-488a-941c-2325002c5edc\") " pod="openstack/horizon-658d8f8767-q5w5m" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.674490 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69498abf-8b80-4b7f-901a-4c6a4bdede2f-logs\") pod \"placement-db-sync-8jkv9\" (UID: \"69498abf-8b80-4b7f-901a-4c6a4bdede2f\") " pod="openstack/placement-db-sync-8jkv9" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.674507 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69498abf-8b80-4b7f-901a-4c6a4bdede2f-combined-ca-bundle\") pod \"placement-db-sync-8jkv9\" (UID: \"69498abf-8b80-4b7f-901a-4c6a4bdede2f\") " pod="openstack/placement-db-sync-8jkv9" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.674526 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6v7td\" (UniqueName: 
\"kubernetes.io/projected/b0de6056-30e4-4acc-bfbc-199c27065ff6-kube-api-access-6v7td\") pod \"dnsmasq-dns-578598f949-zv6mw\" (UID: \"b0de6056-30e4-4acc-bfbc-199c27065ff6\") " pod="openstack/dnsmasq-dns-578598f949-zv6mw" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.674554 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnlgg\" (UniqueName: \"kubernetes.io/projected/dfe84a63-fea5-455e-95d3-523a091b976f-kube-api-access-fnlgg\") pod \"ceilometer-0\" (UID: \"dfe84a63-fea5-455e-95d3-523a091b976f\") " pod="openstack/ceilometer-0" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.674592 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wk2n4\" (UniqueName: \"kubernetes.io/projected/69498abf-8b80-4b7f-901a-4c6a4bdede2f-kube-api-access-wk2n4\") pod \"placement-db-sync-8jkv9\" (UID: \"69498abf-8b80-4b7f-901a-4c6a4bdede2f\") " pod="openstack/placement-db-sync-8jkv9" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.674641 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b0de6056-30e4-4acc-bfbc-199c27065ff6-ovsdbserver-nb\") pod \"dnsmasq-dns-578598f949-zv6mw\" (UID: \"b0de6056-30e4-4acc-bfbc-199c27065ff6\") " pod="openstack/dnsmasq-dns-578598f949-zv6mw" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.674677 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0de6056-30e4-4acc-bfbc-199c27065ff6-config\") pod \"dnsmasq-dns-578598f949-zv6mw\" (UID: \"b0de6056-30e4-4acc-bfbc-199c27065ff6\") " pod="openstack/dnsmasq-dns-578598f949-zv6mw" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.674693 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/e786ace8-0d46-488a-941c-2325002c5edc-logs\") pod \"horizon-658d8f8767-q5w5m\" (UID: \"e786ace8-0d46-488a-941c-2325002c5edc\") " pod="openstack/horizon-658d8f8767-q5w5m" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.674719 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dfe84a63-fea5-455e-95d3-523a091b976f-run-httpd\") pod \"ceilometer-0\" (UID: \"dfe84a63-fea5-455e-95d3-523a091b976f\") " pod="openstack/ceilometer-0" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.675153 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dfe84a63-fea5-455e-95d3-523a091b976f-run-httpd\") pod \"ceilometer-0\" (UID: \"dfe84a63-fea5-455e-95d3-523a091b976f\") " pod="openstack/ceilometer-0" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.675945 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69498abf-8b80-4b7f-901a-4c6a4bdede2f-logs\") pod \"placement-db-sync-8jkv9\" (UID: \"69498abf-8b80-4b7f-901a-4c6a4bdede2f\") " pod="openstack/placement-db-sync-8jkv9" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.677428 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfe84a63-fea5-455e-95d3-523a091b976f-config-data\") pod \"ceilometer-0\" (UID: \"dfe84a63-fea5-455e-95d3-523a091b976f\") " pod="openstack/ceilometer-0" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.689118 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69498abf-8b80-4b7f-901a-4c6a4bdede2f-scripts\") pod \"placement-db-sync-8jkv9\" (UID: \"69498abf-8b80-4b7f-901a-4c6a4bdede2f\") " pod="openstack/placement-db-sync-8jkv9" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.702647 4594 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnlgg\" (UniqueName: \"kubernetes.io/projected/dfe84a63-fea5-455e-95d3-523a091b976f-kube-api-access-fnlgg\") pod \"ceilometer-0\" (UID: \"dfe84a63-fea5-455e-95d3-523a091b976f\") " pod="openstack/ceilometer-0" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.705868 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69498abf-8b80-4b7f-901a-4c6a4bdede2f-combined-ca-bundle\") pod \"placement-db-sync-8jkv9\" (UID: \"69498abf-8b80-4b7f-901a-4c6a4bdede2f\") " pod="openstack/placement-db-sync-8jkv9" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.705932 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dfe84a63-fea5-455e-95d3-523a091b976f-scripts\") pod \"ceilometer-0\" (UID: \"dfe84a63-fea5-455e-95d3-523a091b976f\") " pod="openstack/ceilometer-0" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.706001 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfe84a63-fea5-455e-95d3-523a091b976f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dfe84a63-fea5-455e-95d3-523a091b976f\") " pod="openstack/ceilometer-0" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.707870 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dfe84a63-fea5-455e-95d3-523a091b976f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dfe84a63-fea5-455e-95d3-523a091b976f\") " pod="openstack/ceilometer-0" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.707921 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69498abf-8b80-4b7f-901a-4c6a4bdede2f-config-data\") pod \"placement-db-sync-8jkv9\" (UID: 
\"69498abf-8b80-4b7f-901a-4c6a4bdede2f\") " pod="openstack/placement-db-sync-8jkv9" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.712419 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wk2n4\" (UniqueName: \"kubernetes.io/projected/69498abf-8b80-4b7f-901a-4c6a4bdede2f-kube-api-access-wk2n4\") pod \"placement-db-sync-8jkv9\" (UID: \"69498abf-8b80-4b7f-901a-4c6a4bdede2f\") " pod="openstack/placement-db-sync-8jkv9" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.734422 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-9rddc" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.767071 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.773880 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7bf6f9df7f-knsk7" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.776297 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6v7td\" (UniqueName: \"kubernetes.io/projected/b0de6056-30e4-4acc-bfbc-199c27065ff6-kube-api-access-6v7td\") pod \"dnsmasq-dns-578598f949-zv6mw\" (UID: \"b0de6056-30e4-4acc-bfbc-199c27065ff6\") " pod="openstack/dnsmasq-dns-578598f949-zv6mw" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.776361 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b0de6056-30e4-4acc-bfbc-199c27065ff6-ovsdbserver-nb\") pod \"dnsmasq-dns-578598f949-zv6mw\" (UID: \"b0de6056-30e4-4acc-bfbc-199c27065ff6\") " pod="openstack/dnsmasq-dns-578598f949-zv6mw" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.776390 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/b0de6056-30e4-4acc-bfbc-199c27065ff6-config\") pod \"dnsmasq-dns-578598f949-zv6mw\" (UID: \"b0de6056-30e4-4acc-bfbc-199c27065ff6\") " pod="openstack/dnsmasq-dns-578598f949-zv6mw" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.776405 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e786ace8-0d46-488a-941c-2325002c5edc-logs\") pod \"horizon-658d8f8767-q5w5m\" (UID: \"e786ace8-0d46-488a-941c-2325002c5edc\") " pod="openstack/horizon-658d8f8767-q5w5m" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.776433 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b0de6056-30e4-4acc-bfbc-199c27065ff6-dns-svc\") pod \"dnsmasq-dns-578598f949-zv6mw\" (UID: \"b0de6056-30e4-4acc-bfbc-199c27065ff6\") " pod="openstack/dnsmasq-dns-578598f949-zv6mw" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.776468 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e786ace8-0d46-488a-941c-2325002c5edc-horizon-secret-key\") pod \"horizon-658d8f8767-q5w5m\" (UID: \"e786ace8-0d46-488a-941c-2325002c5edc\") " pod="openstack/horizon-658d8f8767-q5w5m" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.776482 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b0de6056-30e4-4acc-bfbc-199c27065ff6-ovsdbserver-sb\") pod \"dnsmasq-dns-578598f949-zv6mw\" (UID: \"b0de6056-30e4-4acc-bfbc-199c27065ff6\") " pod="openstack/dnsmasq-dns-578598f949-zv6mw" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.776516 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/72edd67c-f301-4e33-8d71-74f9bc7b99c5-db-sync-config-data\") 
pod \"barbican-db-sync-mvvfl\" (UID: \"72edd67c-f301-4e33-8d71-74f9bc7b99c5\") " pod="openstack/barbican-db-sync-mvvfl" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.776543 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b0de6056-30e4-4acc-bfbc-199c27065ff6-dns-swift-storage-0\") pod \"dnsmasq-dns-578598f949-zv6mw\" (UID: \"b0de6056-30e4-4acc-bfbc-199c27065ff6\") " pod="openstack/dnsmasq-dns-578598f949-zv6mw" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.776560 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ztlj\" (UniqueName: \"kubernetes.io/projected/e786ace8-0d46-488a-941c-2325002c5edc-kube-api-access-2ztlj\") pod \"horizon-658d8f8767-q5w5m\" (UID: \"e786ace8-0d46-488a-941c-2325002c5edc\") " pod="openstack/horizon-658d8f8767-q5w5m" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.776599 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72edd67c-f301-4e33-8d71-74f9bc7b99c5-combined-ca-bundle\") pod \"barbican-db-sync-mvvfl\" (UID: \"72edd67c-f301-4e33-8d71-74f9bc7b99c5\") " pod="openstack/barbican-db-sync-mvvfl" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.776616 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdcpx\" (UniqueName: \"kubernetes.io/projected/72edd67c-f301-4e33-8d71-74f9bc7b99c5-kube-api-access-gdcpx\") pod \"barbican-db-sync-mvvfl\" (UID: \"72edd67c-f301-4e33-8d71-74f9bc7b99c5\") " pod="openstack/barbican-db-sync-mvvfl" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.776636 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e786ace8-0d46-488a-941c-2325002c5edc-config-data\") pod 
\"horizon-658d8f8767-q5w5m\" (UID: \"e786ace8-0d46-488a-941c-2325002c5edc\") " pod="openstack/horizon-658d8f8767-q5w5m" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.776649 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e786ace8-0d46-488a-941c-2325002c5edc-scripts\") pod \"horizon-658d8f8767-q5w5m\" (UID: \"e786ace8-0d46-488a-941c-2325002c5edc\") " pod="openstack/horizon-658d8f8767-q5w5m" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.778347 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e786ace8-0d46-488a-941c-2325002c5edc-scripts\") pod \"horizon-658d8f8767-q5w5m\" (UID: \"e786ace8-0d46-488a-941c-2325002c5edc\") " pod="openstack/horizon-658d8f8767-q5w5m" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.778637 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b0de6056-30e4-4acc-bfbc-199c27065ff6-ovsdbserver-sb\") pod \"dnsmasq-dns-578598f949-zv6mw\" (UID: \"b0de6056-30e4-4acc-bfbc-199c27065ff6\") " pod="openstack/dnsmasq-dns-578598f949-zv6mw" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.781701 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e786ace8-0d46-488a-941c-2325002c5edc-logs\") pod \"horizon-658d8f8767-q5w5m\" (UID: \"e786ace8-0d46-488a-941c-2325002c5edc\") " pod="openstack/horizon-658d8f8767-q5w5m" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.784748 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e786ace8-0d46-488a-941c-2325002c5edc-horizon-secret-key\") pod \"horizon-658d8f8767-q5w5m\" (UID: \"e786ace8-0d46-488a-941c-2325002c5edc\") " pod="openstack/horizon-658d8f8767-q5w5m" Nov 29 05:44:33 crc kubenswrapper[4594]: 
I1129 05:44:33.786727 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b0de6056-30e4-4acc-bfbc-199c27065ff6-ovsdbserver-nb\") pod \"dnsmasq-dns-578598f949-zv6mw\" (UID: \"b0de6056-30e4-4acc-bfbc-199c27065ff6\") " pod="openstack/dnsmasq-dns-578598f949-zv6mw" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.786960 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0de6056-30e4-4acc-bfbc-199c27065ff6-config\") pod \"dnsmasq-dns-578598f949-zv6mw\" (UID: \"b0de6056-30e4-4acc-bfbc-199c27065ff6\") " pod="openstack/dnsmasq-dns-578598f949-zv6mw" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.787645 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e786ace8-0d46-488a-941c-2325002c5edc-config-data\") pod \"horizon-658d8f8767-q5w5m\" (UID: \"e786ace8-0d46-488a-941c-2325002c5edc\") " pod="openstack/horizon-658d8f8767-q5w5m" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.790550 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b0de6056-30e4-4acc-bfbc-199c27065ff6-dns-swift-storage-0\") pod \"dnsmasq-dns-578598f949-zv6mw\" (UID: \"b0de6056-30e4-4acc-bfbc-199c27065ff6\") " pod="openstack/dnsmasq-dns-578598f949-zv6mw" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.790835 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b0de6056-30e4-4acc-bfbc-199c27065ff6-dns-svc\") pod \"dnsmasq-dns-578598f949-zv6mw\" (UID: \"b0de6056-30e4-4acc-bfbc-199c27065ff6\") " pod="openstack/dnsmasq-dns-578598f949-zv6mw" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.801593 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ztlj\" 
(UniqueName: \"kubernetes.io/projected/e786ace8-0d46-488a-941c-2325002c5edc-kube-api-access-2ztlj\") pod \"horizon-658d8f8767-q5w5m\" (UID: \"e786ace8-0d46-488a-941c-2325002c5edc\") " pod="openstack/horizon-658d8f8767-q5w5m" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.811059 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6v7td\" (UniqueName: \"kubernetes.io/projected/b0de6056-30e4-4acc-bfbc-199c27065ff6-kube-api-access-6v7td\") pod \"dnsmasq-dns-578598f949-zv6mw\" (UID: \"b0de6056-30e4-4acc-bfbc-199c27065ff6\") " pod="openstack/dnsmasq-dns-578598f949-zv6mw" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.823966 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-8jkv9" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.873078 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-578598f949-zv6mw" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.879465 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/72edd67c-f301-4e33-8d71-74f9bc7b99c5-db-sync-config-data\") pod \"barbican-db-sync-mvvfl\" (UID: \"72edd67c-f301-4e33-8d71-74f9bc7b99c5\") " pod="openstack/barbican-db-sync-mvvfl" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.879529 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72edd67c-f301-4e33-8d71-74f9bc7b99c5-combined-ca-bundle\") pod \"barbican-db-sync-mvvfl\" (UID: \"72edd67c-f301-4e33-8d71-74f9bc7b99c5\") " pod="openstack/barbican-db-sync-mvvfl" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.879556 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdcpx\" (UniqueName: 
\"kubernetes.io/projected/72edd67c-f301-4e33-8d71-74f9bc7b99c5-kube-api-access-gdcpx\") pod \"barbican-db-sync-mvvfl\" (UID: \"72edd67c-f301-4e33-8d71-74f9bc7b99c5\") " pod="openstack/barbican-db-sync-mvvfl" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.893337 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-658d8f8767-q5w5m" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.893554 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72edd67c-f301-4e33-8d71-74f9bc7b99c5-combined-ca-bundle\") pod \"barbican-db-sync-mvvfl\" (UID: \"72edd67c-f301-4e33-8d71-74f9bc7b99c5\") " pod="openstack/barbican-db-sync-mvvfl" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.898603 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/72edd67c-f301-4e33-8d71-74f9bc7b99c5-db-sync-config-data\") pod \"barbican-db-sync-mvvfl\" (UID: \"72edd67c-f301-4e33-8d71-74f9bc7b99c5\") " pod="openstack/barbican-db-sync-mvvfl" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.902431 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdcpx\" (UniqueName: \"kubernetes.io/projected/72edd67c-f301-4e33-8d71-74f9bc7b99c5-kube-api-access-gdcpx\") pod \"barbican-db-sync-mvvfl\" (UID: \"72edd67c-f301-4e33-8d71-74f9bc7b99c5\") " pod="openstack/barbican-db-sync-mvvfl" Nov 29 05:44:33 crc kubenswrapper[4594]: I1129 05:44:33.934819 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-mvvfl" Nov 29 05:44:34 crc kubenswrapper[4594]: I1129 05:44:34.057243 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-ww7dv" Nov 29 05:44:34 crc kubenswrapper[4594]: I1129 05:44:34.188287 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98e4a7be-f230-4732-9a53-d258bf31954b-combined-ca-bundle\") pod \"98e4a7be-f230-4732-9a53-d258bf31954b\" (UID: \"98e4a7be-f230-4732-9a53-d258bf31954b\") " Nov 29 05:44:34 crc kubenswrapper[4594]: I1129 05:44:34.188737 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98e4a7be-f230-4732-9a53-d258bf31954b-config-data\") pod \"98e4a7be-f230-4732-9a53-d258bf31954b\" (UID: \"98e4a7be-f230-4732-9a53-d258bf31954b\") " Nov 29 05:44:34 crc kubenswrapper[4594]: I1129 05:44:34.188876 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/98e4a7be-f230-4732-9a53-d258bf31954b-db-sync-config-data\") pod \"98e4a7be-f230-4732-9a53-d258bf31954b\" (UID: \"98e4a7be-f230-4732-9a53-d258bf31954b\") " Nov 29 05:44:34 crc kubenswrapper[4594]: I1129 05:44:34.188907 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hf245\" (UniqueName: \"kubernetes.io/projected/98e4a7be-f230-4732-9a53-d258bf31954b-kube-api-access-hf245\") pod \"98e4a7be-f230-4732-9a53-d258bf31954b\" (UID: \"98e4a7be-f230-4732-9a53-d258bf31954b\") " Nov 29 05:44:34 crc kubenswrapper[4594]: I1129 05:44:34.194029 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98e4a7be-f230-4732-9a53-d258bf31954b-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "98e4a7be-f230-4732-9a53-d258bf31954b" (UID: "98e4a7be-f230-4732-9a53-d258bf31954b"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:44:34 crc kubenswrapper[4594]: I1129 05:44:34.196149 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98e4a7be-f230-4732-9a53-d258bf31954b-kube-api-access-hf245" (OuterVolumeSpecName: "kube-api-access-hf245") pod "98e4a7be-f230-4732-9a53-d258bf31954b" (UID: "98e4a7be-f230-4732-9a53-d258bf31954b"). InnerVolumeSpecName "kube-api-access-hf245". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:44:34 crc kubenswrapper[4594]: I1129 05:44:34.213996 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98e4a7be-f230-4732-9a53-d258bf31954b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "98e4a7be-f230-4732-9a53-d258bf31954b" (UID: "98e4a7be-f230-4732-9a53-d258bf31954b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:44:34 crc kubenswrapper[4594]: I1129 05:44:34.228447 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98e4a7be-f230-4732-9a53-d258bf31954b-config-data" (OuterVolumeSpecName: "config-data") pod "98e4a7be-f230-4732-9a53-d258bf31954b" (UID: "98e4a7be-f230-4732-9a53-d258bf31954b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:44:34 crc kubenswrapper[4594]: I1129 05:44:34.291812 4594 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98e4a7be-f230-4732-9a53-d258bf31954b-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 05:44:34 crc kubenswrapper[4594]: I1129 05:44:34.291840 4594 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/98e4a7be-f230-4732-9a53-d258bf31954b-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 05:44:34 crc kubenswrapper[4594]: I1129 05:44:34.291852 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hf245\" (UniqueName: \"kubernetes.io/projected/98e4a7be-f230-4732-9a53-d258bf31954b-kube-api-access-hf245\") on node \"crc\" DevicePath \"\"" Nov 29 05:44:34 crc kubenswrapper[4594]: I1129 05:44:34.291860 4594 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98e4a7be-f230-4732-9a53-d258bf31954b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 05:44:34 crc kubenswrapper[4594]: I1129 05:44:34.358775 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-7b4h2"] Nov 29 05:44:34 crc kubenswrapper[4594]: I1129 05:44:34.386315 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-ww7dv" event={"ID":"98e4a7be-f230-4732-9a53-d258bf31954b","Type":"ContainerDied","Data":"d3a2dd92d62aa3975946694c53c568cfbcdbffc5583766adc8039a73c99aef28"} Nov 29 05:44:34 crc kubenswrapper[4594]: I1129 05:44:34.386432 4594 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d3a2dd92d62aa3975946694c53c568cfbcdbffc5583766adc8039a73c99aef28" Nov 29 05:44:34 crc kubenswrapper[4594]: I1129 05:44:34.386548 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-ww7dv" Nov 29 05:44:34 crc kubenswrapper[4594]: I1129 05:44:34.399299 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58bbf48b7f-xvwrv"] Nov 29 05:44:34 crc kubenswrapper[4594]: I1129 05:44:34.411942 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 29 05:44:34 crc kubenswrapper[4594]: I1129 05:44:34.570278 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-749tk"] Nov 29 05:44:34 crc kubenswrapper[4594]: W1129 05:44:34.584585 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfc29fae8_4b90_4702_8c68_4cec263fd4c9.slice/crio-b91318f59f3088d4e9b1654661b4c8d2c69090ccb915c08b7fccaebd4de83d4c WatchSource:0}: Error finding container b91318f59f3088d4e9b1654661b4c8d2c69090ccb915c08b7fccaebd4de83d4c: Status 404 returned error can't find the container with id b91318f59f3088d4e9b1654661b4c8d2c69090ccb915c08b7fccaebd4de83d4c Nov 29 05:44:34 crc kubenswrapper[4594]: I1129 05:44:34.626315 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7bf6f9df7f-knsk7"] Nov 29 05:44:34 crc kubenswrapper[4594]: I1129 05:44:34.650290 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-8jkv9"] Nov 29 05:44:34 crc kubenswrapper[4594]: I1129 05:44:34.682767 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-9rddc"] Nov 29 05:44:34 crc kubenswrapper[4594]: I1129 05:44:34.800346 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-mvvfl"] Nov 29 05:44:34 crc kubenswrapper[4594]: I1129 05:44:34.819509 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-658d8f8767-q5w5m"] Nov 29 05:44:34 crc kubenswrapper[4594]: I1129 05:44:34.836879 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/dnsmasq-dns-578598f949-zv6mw"] Nov 29 05:44:34 crc kubenswrapper[4594]: I1129 05:44:34.854968 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-578598f949-zv6mw"] Nov 29 05:44:34 crc kubenswrapper[4594]: I1129 05:44:34.876289 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7cf77b4997-stk57"] Nov 29 05:44:34 crc kubenswrapper[4594]: E1129 05:44:34.876789 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98e4a7be-f230-4732-9a53-d258bf31954b" containerName="glance-db-sync" Nov 29 05:44:34 crc kubenswrapper[4594]: I1129 05:44:34.876805 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="98e4a7be-f230-4732-9a53-d258bf31954b" containerName="glance-db-sync" Nov 29 05:44:34 crc kubenswrapper[4594]: I1129 05:44:34.876999 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="98e4a7be-f230-4732-9a53-d258bf31954b" containerName="glance-db-sync" Nov 29 05:44:34 crc kubenswrapper[4594]: I1129 05:44:34.878040 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7cf77b4997-stk57" Nov 29 05:44:34 crc kubenswrapper[4594]: I1129 05:44:34.885638 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7cf77b4997-stk57"] Nov 29 05:44:34 crc kubenswrapper[4594]: I1129 05:44:34.936700 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3536563a-1e9e-458e-ace5-acbe9e090404-ovsdbserver-sb\") pod \"dnsmasq-dns-7cf77b4997-stk57\" (UID: \"3536563a-1e9e-458e-ace5-acbe9e090404\") " pod="openstack/dnsmasq-dns-7cf77b4997-stk57" Nov 29 05:44:34 crc kubenswrapper[4594]: I1129 05:44:34.936787 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3536563a-1e9e-458e-ace5-acbe9e090404-dns-svc\") pod \"dnsmasq-dns-7cf77b4997-stk57\" (UID: \"3536563a-1e9e-458e-ace5-acbe9e090404\") " pod="openstack/dnsmasq-dns-7cf77b4997-stk57" Nov 29 05:44:34 crc kubenswrapper[4594]: I1129 05:44:34.936831 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3536563a-1e9e-458e-ace5-acbe9e090404-dns-swift-storage-0\") pod \"dnsmasq-dns-7cf77b4997-stk57\" (UID: \"3536563a-1e9e-458e-ace5-acbe9e090404\") " pod="openstack/dnsmasq-dns-7cf77b4997-stk57" Nov 29 05:44:34 crc kubenswrapper[4594]: I1129 05:44:34.936915 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3536563a-1e9e-458e-ace5-acbe9e090404-ovsdbserver-nb\") pod \"dnsmasq-dns-7cf77b4997-stk57\" (UID: \"3536563a-1e9e-458e-ace5-acbe9e090404\") " pod="openstack/dnsmasq-dns-7cf77b4997-stk57" Nov 29 05:44:34 crc kubenswrapper[4594]: I1129 05:44:34.936942 4594 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3536563a-1e9e-458e-ace5-acbe9e090404-config\") pod \"dnsmasq-dns-7cf77b4997-stk57\" (UID: \"3536563a-1e9e-458e-ace5-acbe9e090404\") " pod="openstack/dnsmasq-dns-7cf77b4997-stk57" Nov 29 05:44:34 crc kubenswrapper[4594]: I1129 05:44:34.937025 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7l2r\" (UniqueName: \"kubernetes.io/projected/3536563a-1e9e-458e-ace5-acbe9e090404-kube-api-access-g7l2r\") pod \"dnsmasq-dns-7cf77b4997-stk57\" (UID: \"3536563a-1e9e-458e-ace5-acbe9e090404\") " pod="openstack/dnsmasq-dns-7cf77b4997-stk57" Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.040856 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3536563a-1e9e-458e-ace5-acbe9e090404-dns-swift-storage-0\") pod \"dnsmasq-dns-7cf77b4997-stk57\" (UID: \"3536563a-1e9e-458e-ace5-acbe9e090404\") " pod="openstack/dnsmasq-dns-7cf77b4997-stk57" Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.040965 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3536563a-1e9e-458e-ace5-acbe9e090404-ovsdbserver-nb\") pod \"dnsmasq-dns-7cf77b4997-stk57\" (UID: \"3536563a-1e9e-458e-ace5-acbe9e090404\") " pod="openstack/dnsmasq-dns-7cf77b4997-stk57" Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.040996 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3536563a-1e9e-458e-ace5-acbe9e090404-config\") pod \"dnsmasq-dns-7cf77b4997-stk57\" (UID: \"3536563a-1e9e-458e-ace5-acbe9e090404\") " pod="openstack/dnsmasq-dns-7cf77b4997-stk57" Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.041121 4594 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-g7l2r\" (UniqueName: \"kubernetes.io/projected/3536563a-1e9e-458e-ace5-acbe9e090404-kube-api-access-g7l2r\") pod \"dnsmasq-dns-7cf77b4997-stk57\" (UID: \"3536563a-1e9e-458e-ace5-acbe9e090404\") " pod="openstack/dnsmasq-dns-7cf77b4997-stk57" Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.041290 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3536563a-1e9e-458e-ace5-acbe9e090404-ovsdbserver-sb\") pod \"dnsmasq-dns-7cf77b4997-stk57\" (UID: \"3536563a-1e9e-458e-ace5-acbe9e090404\") " pod="openstack/dnsmasq-dns-7cf77b4997-stk57" Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.041373 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3536563a-1e9e-458e-ace5-acbe9e090404-dns-svc\") pod \"dnsmasq-dns-7cf77b4997-stk57\" (UID: \"3536563a-1e9e-458e-ace5-acbe9e090404\") " pod="openstack/dnsmasq-dns-7cf77b4997-stk57" Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.041777 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3536563a-1e9e-458e-ace5-acbe9e090404-ovsdbserver-nb\") pod \"dnsmasq-dns-7cf77b4997-stk57\" (UID: \"3536563a-1e9e-458e-ace5-acbe9e090404\") " pod="openstack/dnsmasq-dns-7cf77b4997-stk57" Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.042054 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3536563a-1e9e-458e-ace5-acbe9e090404-config\") pod \"dnsmasq-dns-7cf77b4997-stk57\" (UID: \"3536563a-1e9e-458e-ace5-acbe9e090404\") " pod="openstack/dnsmasq-dns-7cf77b4997-stk57" Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.046101 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/3536563a-1e9e-458e-ace5-acbe9e090404-dns-svc\") pod \"dnsmasq-dns-7cf77b4997-stk57\" (UID: \"3536563a-1e9e-458e-ace5-acbe9e090404\") " pod="openstack/dnsmasq-dns-7cf77b4997-stk57" Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.047741 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3536563a-1e9e-458e-ace5-acbe9e090404-ovsdbserver-sb\") pod \"dnsmasq-dns-7cf77b4997-stk57\" (UID: \"3536563a-1e9e-458e-ace5-acbe9e090404\") " pod="openstack/dnsmasq-dns-7cf77b4997-stk57" Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.048235 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3536563a-1e9e-458e-ace5-acbe9e090404-dns-swift-storage-0\") pod \"dnsmasq-dns-7cf77b4997-stk57\" (UID: \"3536563a-1e9e-458e-ace5-acbe9e090404\") " pod="openstack/dnsmasq-dns-7cf77b4997-stk57" Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.075343 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7l2r\" (UniqueName: \"kubernetes.io/projected/3536563a-1e9e-458e-ace5-acbe9e090404-kube-api-access-g7l2r\") pod \"dnsmasq-dns-7cf77b4997-stk57\" (UID: \"3536563a-1e9e-458e-ace5-acbe9e090404\") " pod="openstack/dnsmasq-dns-7cf77b4997-stk57" Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.198223 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7cf77b4997-stk57" Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.329939 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-658d8f8767-q5w5m"] Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.370171 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.396932 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6cf4cc7b4f-hg8dz"] Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.398505 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6cf4cc7b4f-hg8dz" Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.412658 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6cf4cc7b4f-hg8dz"] Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.453025 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-8jkv9" event={"ID":"69498abf-8b80-4b7f-901a-4c6a4bdede2f","Type":"ContainerStarted","Data":"cd5efaf7de51bc1e57db373e421f8c0b47d6f4f9afeabb3b08a725191e5f4601"} Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.462479 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fvwr\" (UniqueName: \"kubernetes.io/projected/9ac3735b-d8a9-4723-8576-dc16c7af5756-kube-api-access-6fvwr\") pod \"horizon-6cf4cc7b4f-hg8dz\" (UID: \"9ac3735b-d8a9-4723-8576-dc16c7af5756\") " pod="openstack/horizon-6cf4cc7b4f-hg8dz" Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.462522 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9ac3735b-d8a9-4723-8576-dc16c7af5756-config-data\") pod \"horizon-6cf4cc7b4f-hg8dz\" (UID: \"9ac3735b-d8a9-4723-8576-dc16c7af5756\") " pod="openstack/horizon-6cf4cc7b4f-hg8dz" Nov 29 
05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.462708 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9ac3735b-d8a9-4723-8576-dc16c7af5756-horizon-secret-key\") pod \"horizon-6cf4cc7b4f-hg8dz\" (UID: \"9ac3735b-d8a9-4723-8576-dc16c7af5756\") " pod="openstack/horizon-6cf4cc7b4f-hg8dz" Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.462785 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9ac3735b-d8a9-4723-8576-dc16c7af5756-scripts\") pod \"horizon-6cf4cc7b4f-hg8dz\" (UID: \"9ac3735b-d8a9-4723-8576-dc16c7af5756\") " pod="openstack/horizon-6cf4cc7b4f-hg8dz" Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.462902 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ac3735b-d8a9-4723-8576-dc16c7af5756-logs\") pod \"horizon-6cf4cc7b4f-hg8dz\" (UID: \"9ac3735b-d8a9-4723-8576-dc16c7af5756\") " pod="openstack/horizon-6cf4cc7b4f-hg8dz" Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.485525 4594 generic.go:334] "Generic (PLEG): container finished" podID="6c5f9dc4-23c7-4223-8c63-44a15905edf5" containerID="d6bedcedf64006c5845032e8eb9885ce6a2a466ae24a3a81ac1e1321871a3e2d" exitCode=0 Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.486690 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58bbf48b7f-xvwrv" event={"ID":"6c5f9dc4-23c7-4223-8c63-44a15905edf5","Type":"ContainerDied","Data":"d6bedcedf64006c5845032e8eb9885ce6a2a466ae24a3a81ac1e1321871a3e2d"} Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.486742 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58bbf48b7f-xvwrv" 
event={"ID":"6c5f9dc4-23c7-4223-8c63-44a15905edf5","Type":"ContainerStarted","Data":"337f8b76f90f6ae57e4dba84110cedf977c45360b5e061b6d1be9b468dc4d9df"} Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.526829 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.528585 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.555009 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-rtrkc" Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.555311 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.557377 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-7b4h2" event={"ID":"968317f6-c4d9-4647-a166-0cadc0fa57f2","Type":"ContainerStarted","Data":"fa4d2c563b6a5c3ba1aea016672f64a1203e1046e169fdafa873d6718fac82e8"} Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.557414 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-7b4h2" event={"ID":"968317f6-c4d9-4647-a166-0cadc0fa57f2","Type":"ContainerStarted","Data":"963d0315079b92d420e2feaa6f31a0549cd2377f92c8e120cf226885fc66651a"} Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.562290 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.568269 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9ac3735b-d8a9-4723-8576-dc16c7af5756-horizon-secret-key\") pod \"horizon-6cf4cc7b4f-hg8dz\" (UID: \"9ac3735b-d8a9-4723-8576-dc16c7af5756\") " 
pod="openstack/horizon-6cf4cc7b4f-hg8dz" Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.568338 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9ac3735b-d8a9-4723-8576-dc16c7af5756-scripts\") pod \"horizon-6cf4cc7b4f-hg8dz\" (UID: \"9ac3735b-d8a9-4723-8576-dc16c7af5756\") " pod="openstack/horizon-6cf4cc7b4f-hg8dz" Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.568439 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ac3735b-d8a9-4723-8576-dc16c7af5756-logs\") pod \"horizon-6cf4cc7b4f-hg8dz\" (UID: \"9ac3735b-d8a9-4723-8576-dc16c7af5756\") " pod="openstack/horizon-6cf4cc7b4f-hg8dz" Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.568473 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fvwr\" (UniqueName: \"kubernetes.io/projected/9ac3735b-d8a9-4723-8576-dc16c7af5756-kube-api-access-6fvwr\") pod \"horizon-6cf4cc7b4f-hg8dz\" (UID: \"9ac3735b-d8a9-4723-8576-dc16c7af5756\") " pod="openstack/horizon-6cf4cc7b4f-hg8dz" Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.568492 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9ac3735b-d8a9-4723-8576-dc16c7af5756-config-data\") pod \"horizon-6cf4cc7b4f-hg8dz\" (UID: \"9ac3735b-d8a9-4723-8576-dc16c7af5756\") " pod="openstack/horizon-6cf4cc7b4f-hg8dz" Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.569577 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9ac3735b-d8a9-4723-8576-dc16c7af5756-config-data\") pod \"horizon-6cf4cc7b4f-hg8dz\" (UID: \"9ac3735b-d8a9-4723-8576-dc16c7af5756\") " pod="openstack/horizon-6cf4cc7b4f-hg8dz" Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.579964 4594 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.580279 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ac3735b-d8a9-4723-8576-dc16c7af5756-logs\") pod \"horizon-6cf4cc7b4f-hg8dz\" (UID: \"9ac3735b-d8a9-4723-8576-dc16c7af5756\") " pod="openstack/horizon-6cf4cc7b4f-hg8dz" Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.580540 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9ac3735b-d8a9-4723-8576-dc16c7af5756-scripts\") pod \"horizon-6cf4cc7b4f-hg8dz\" (UID: \"9ac3735b-d8a9-4723-8576-dc16c7af5756\") " pod="openstack/horizon-6cf4cc7b4f-hg8dz" Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.607329 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-658d8f8767-q5w5m" event={"ID":"e786ace8-0d46-488a-941c-2325002c5edc","Type":"ContainerStarted","Data":"9051a9b51962ea17a7205fa4a01dffed74a141ef5047be3a558c4852eb32d254"} Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.607489 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9ac3735b-d8a9-4723-8576-dc16c7af5756-horizon-secret-key\") pod \"horizon-6cf4cc7b4f-hg8dz\" (UID: \"9ac3735b-d8a9-4723-8576-dc16c7af5756\") " pod="openstack/horizon-6cf4cc7b4f-hg8dz" Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.617158 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fvwr\" (UniqueName: \"kubernetes.io/projected/9ac3735b-d8a9-4723-8576-dc16c7af5756-kube-api-access-6fvwr\") pod \"horizon-6cf4cc7b4f-hg8dz\" (UID: \"9ac3735b-d8a9-4723-8576-dc16c7af5756\") " pod="openstack/horizon-6cf4cc7b4f-hg8dz" Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.632359 4594 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/glance-default-internal-api-0"] Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.636229 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.642621 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dfe84a63-fea5-455e-95d3-523a091b976f","Type":"ContainerStarted","Data":"9bc0dd9c240581d8a29dee5bc3e11733a20ba6a528c2347a613cb60b099e44e6"} Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.646702 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.656403 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.660930 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-7b4h2" podStartSLOduration=2.6609089089999998 podStartE2EDuration="2.660908909s" podCreationTimestamp="2025-11-29 05:44:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:44:35.637584711 +0000 UTC m=+999.878093931" watchObservedRunningTime="2025-11-29 05:44:35.660908909 +0000 UTC m=+999.901418128" Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.665571 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-749tk" event={"ID":"fc29fae8-4b90-4702-8c68-4cec263fd4c9","Type":"ContainerStarted","Data":"5bef5fd16ac3e60fef2c472e9c97a102398cb34978c0ec31423c236aaa1eca2b"} Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.665610 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-749tk" 
event={"ID":"fc29fae8-4b90-4702-8c68-4cec263fd4c9","Type":"ContainerStarted","Data":"b91318f59f3088d4e9b1654661b4c8d2c69090ccb915c08b7fccaebd4de83d4c"} Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.671150 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b204dff1-e3d8-467f-b95f-372c0abbb1de-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b204dff1-e3d8-467f-b95f-372c0abbb1de\") " pod="openstack/glance-default-external-api-0" Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.671275 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4qw9\" (UniqueName: \"kubernetes.io/projected/b204dff1-e3d8-467f-b95f-372c0abbb1de-kube-api-access-n4qw9\") pod \"glance-default-external-api-0\" (UID: \"b204dff1-e3d8-467f-b95f-372c0abbb1de\") " pod="openstack/glance-default-external-api-0" Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.671382 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b204dff1-e3d8-467f-b95f-372c0abbb1de-config-data\") pod \"glance-default-external-api-0\" (UID: \"b204dff1-e3d8-467f-b95f-372c0abbb1de\") " pod="openstack/glance-default-external-api-0" Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.671552 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b204dff1-e3d8-467f-b95f-372c0abbb1de-logs\") pod \"glance-default-external-api-0\" (UID: \"b204dff1-e3d8-467f-b95f-372c0abbb1de\") " pod="openstack/glance-default-external-api-0" Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.671591 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/b204dff1-e3d8-467f-b95f-372c0abbb1de-scripts\") pod \"glance-default-external-api-0\" (UID: \"b204dff1-e3d8-467f-b95f-372c0abbb1de\") " pod="openstack/glance-default-external-api-0" Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.671718 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b204dff1-e3d8-467f-b95f-372c0abbb1de-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b204dff1-e3d8-467f-b95f-372c0abbb1de\") " pod="openstack/glance-default-external-api-0" Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.672098 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"b204dff1-e3d8-467f-b95f-372c0abbb1de\") " pod="openstack/glance-default-external-api-0" Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.698988 4594 generic.go:334] "Generic (PLEG): container finished" podID="b0de6056-30e4-4acc-bfbc-199c27065ff6" containerID="abd06106b64c817022fb5eb3a9edfe51ad0a79acd3cff4fc1da4fcc3581df70e" exitCode=0 Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.699092 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-578598f949-zv6mw" event={"ID":"b0de6056-30e4-4acc-bfbc-199c27065ff6","Type":"ContainerDied","Data":"abd06106b64c817022fb5eb3a9edfe51ad0a79acd3cff4fc1da4fcc3581df70e"} Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.699121 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-578598f949-zv6mw" event={"ID":"b0de6056-30e4-4acc-bfbc-199c27065ff6","Type":"ContainerStarted","Data":"2420748f7578897c79ae18d52bd25430a7e3e5b61773d8f9c23d1a9bdab796b3"} Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.706721 4594 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/cinder-db-sync-9rddc" event={"ID":"4b46224c-7874-4c4a-abb6-f1cbef3a8462","Type":"ContainerStarted","Data":"ec150e75644d27f26f84fef87057b1ec0dd6566f679884e2c1d94cf46eec4bc4"} Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.720489 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6cf4cc7b4f-hg8dz" Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.724804 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-749tk" podStartSLOduration=3.724779234 podStartE2EDuration="3.724779234s" podCreationTimestamp="2025-11-29 05:44:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:44:35.707221314 +0000 UTC m=+999.947730533" watchObservedRunningTime="2025-11-29 05:44:35.724779234 +0000 UTC m=+999.965288454" Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.735749 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-mvvfl" event={"ID":"72edd67c-f301-4e33-8d71-74f9bc7b99c5","Type":"ContainerStarted","Data":"6b4757af3b10e2441691669b25adccf3ac4e99040c80dd8a9c795dade6cd4e41"} Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.737673 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7bf6f9df7f-knsk7" event={"ID":"9c4f4aa8-f25f-4684-8707-3bc0eb168954","Type":"ContainerStarted","Data":"2a056b1493fe63bc74b73500cafb49fbcff476a094a21907d9cf42bfbd83e8a2"} Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.794575 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7abfaea-4592-4e64-bcbc-5fa0331587a8-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d7abfaea-4592-4e64-bcbc-5fa0331587a8\") " pod="openstack/glance-default-internal-api-0" Nov 29 05:44:35 crc 
kubenswrapper[4594]: I1129 05:44:35.794648 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d7abfaea-4592-4e64-bcbc-5fa0331587a8-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d7abfaea-4592-4e64-bcbc-5fa0331587a8\") " pod="openstack/glance-default-internal-api-0" Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.794689 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7abfaea-4592-4e64-bcbc-5fa0331587a8-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d7abfaea-4592-4e64-bcbc-5fa0331587a8\") " pod="openstack/glance-default-internal-api-0" Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.794716 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckhmx\" (UniqueName: \"kubernetes.io/projected/d7abfaea-4592-4e64-bcbc-5fa0331587a8-kube-api-access-ckhmx\") pod \"glance-default-internal-api-0\" (UID: \"d7abfaea-4592-4e64-bcbc-5fa0331587a8\") " pod="openstack/glance-default-internal-api-0" Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.794760 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b204dff1-e3d8-467f-b95f-372c0abbb1de-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b204dff1-e3d8-467f-b95f-372c0abbb1de\") " pod="openstack/glance-default-external-api-0" Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.794790 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4qw9\" (UniqueName: \"kubernetes.io/projected/b204dff1-e3d8-467f-b95f-372c0abbb1de-kube-api-access-n4qw9\") pod \"glance-default-external-api-0\" (UID: \"b204dff1-e3d8-467f-b95f-372c0abbb1de\") " pod="openstack/glance-default-external-api-0" Nov 29 
05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.794842 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b204dff1-e3d8-467f-b95f-372c0abbb1de-config-data\") pod \"glance-default-external-api-0\" (UID: \"b204dff1-e3d8-467f-b95f-372c0abbb1de\") " pod="openstack/glance-default-external-api-0" Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.794908 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"d7abfaea-4592-4e64-bcbc-5fa0331587a8\") " pod="openstack/glance-default-internal-api-0" Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.794979 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b204dff1-e3d8-467f-b95f-372c0abbb1de-logs\") pod \"glance-default-external-api-0\" (UID: \"b204dff1-e3d8-467f-b95f-372c0abbb1de\") " pod="openstack/glance-default-external-api-0" Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.795004 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b204dff1-e3d8-467f-b95f-372c0abbb1de-scripts\") pod \"glance-default-external-api-0\" (UID: \"b204dff1-e3d8-467f-b95f-372c0abbb1de\") " pod="openstack/glance-default-external-api-0" Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.795071 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7abfaea-4592-4e64-bcbc-5fa0331587a8-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d7abfaea-4592-4e64-bcbc-5fa0331587a8\") " pod="openstack/glance-default-internal-api-0" Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.795114 4594 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b204dff1-e3d8-467f-b95f-372c0abbb1de-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b204dff1-e3d8-467f-b95f-372c0abbb1de\") " pod="openstack/glance-default-external-api-0" Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.795204 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7abfaea-4592-4e64-bcbc-5fa0331587a8-logs\") pod \"glance-default-internal-api-0\" (UID: \"d7abfaea-4592-4e64-bcbc-5fa0331587a8\") " pod="openstack/glance-default-internal-api-0" Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.795240 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"b204dff1-e3d8-467f-b95f-372c0abbb1de\") " pod="openstack/glance-default-external-api-0" Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.796386 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b204dff1-e3d8-467f-b95f-372c0abbb1de-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b204dff1-e3d8-467f-b95f-372c0abbb1de\") " pod="openstack/glance-default-external-api-0" Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.798950 4594 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"b204dff1-e3d8-467f-b95f-372c0abbb1de\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-external-api-0" Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.799175 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/b204dff1-e3d8-467f-b95f-372c0abbb1de-logs\") pod \"glance-default-external-api-0\" (UID: \"b204dff1-e3d8-467f-b95f-372c0abbb1de\") " pod="openstack/glance-default-external-api-0" Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.809670 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b204dff1-e3d8-467f-b95f-372c0abbb1de-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b204dff1-e3d8-467f-b95f-372c0abbb1de\") " pod="openstack/glance-default-external-api-0" Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.811386 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b204dff1-e3d8-467f-b95f-372c0abbb1de-config-data\") pod \"glance-default-external-api-0\" (UID: \"b204dff1-e3d8-467f-b95f-372c0abbb1de\") " pod="openstack/glance-default-external-api-0" Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.820926 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b204dff1-e3d8-467f-b95f-372c0abbb1de-scripts\") pod \"glance-default-external-api-0\" (UID: \"b204dff1-e3d8-467f-b95f-372c0abbb1de\") " pod="openstack/glance-default-external-api-0" Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.833546 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4qw9\" (UniqueName: \"kubernetes.io/projected/b204dff1-e3d8-467f-b95f-372c0abbb1de-kube-api-access-n4qw9\") pod \"glance-default-external-api-0\" (UID: \"b204dff1-e3d8-467f-b95f-372c0abbb1de\") " pod="openstack/glance-default-external-api-0" Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.848915 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: 
\"b204dff1-e3d8-467f-b95f-372c0abbb1de\") " pod="openstack/glance-default-external-api-0" Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.905989 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7abfaea-4592-4e64-bcbc-5fa0331587a8-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d7abfaea-4592-4e64-bcbc-5fa0331587a8\") " pod="openstack/glance-default-internal-api-0" Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.906073 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d7abfaea-4592-4e64-bcbc-5fa0331587a8-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d7abfaea-4592-4e64-bcbc-5fa0331587a8\") " pod="openstack/glance-default-internal-api-0" Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.906106 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7abfaea-4592-4e64-bcbc-5fa0331587a8-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d7abfaea-4592-4e64-bcbc-5fa0331587a8\") " pod="openstack/glance-default-internal-api-0" Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.906139 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckhmx\" (UniqueName: \"kubernetes.io/projected/d7abfaea-4592-4e64-bcbc-5fa0331587a8-kube-api-access-ckhmx\") pod \"glance-default-internal-api-0\" (UID: \"d7abfaea-4592-4e64-bcbc-5fa0331587a8\") " pod="openstack/glance-default-internal-api-0" Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.906933 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d7abfaea-4592-4e64-bcbc-5fa0331587a8-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d7abfaea-4592-4e64-bcbc-5fa0331587a8\") " 
pod="openstack/glance-default-internal-api-0" Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.907370 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"d7abfaea-4592-4e64-bcbc-5fa0331587a8\") " pod="openstack/glance-default-internal-api-0" Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.907731 4594 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"d7abfaea-4592-4e64-bcbc-5fa0331587a8\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-internal-api-0" Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.908103 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7abfaea-4592-4e64-bcbc-5fa0331587a8-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d7abfaea-4592-4e64-bcbc-5fa0331587a8\") " pod="openstack/glance-default-internal-api-0" Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.908310 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7abfaea-4592-4e64-bcbc-5fa0331587a8-logs\") pod \"glance-default-internal-api-0\" (UID: \"d7abfaea-4592-4e64-bcbc-5fa0331587a8\") " pod="openstack/glance-default-internal-api-0" Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.912692 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7abfaea-4592-4e64-bcbc-5fa0331587a8-logs\") pod \"glance-default-internal-api-0\" (UID: \"d7abfaea-4592-4e64-bcbc-5fa0331587a8\") " pod="openstack/glance-default-internal-api-0" Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.913276 4594 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7abfaea-4592-4e64-bcbc-5fa0331587a8-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d7abfaea-4592-4e64-bcbc-5fa0331587a8\") " pod="openstack/glance-default-internal-api-0" Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.915997 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7abfaea-4592-4e64-bcbc-5fa0331587a8-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d7abfaea-4592-4e64-bcbc-5fa0331587a8\") " pod="openstack/glance-default-internal-api-0" Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.930305 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckhmx\" (UniqueName: \"kubernetes.io/projected/d7abfaea-4592-4e64-bcbc-5fa0331587a8-kube-api-access-ckhmx\") pod \"glance-default-internal-api-0\" (UID: \"d7abfaea-4592-4e64-bcbc-5fa0331587a8\") " pod="openstack/glance-default-internal-api-0" Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.931235 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7abfaea-4592-4e64-bcbc-5fa0331587a8-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d7abfaea-4592-4e64-bcbc-5fa0331587a8\") " pod="openstack/glance-default-internal-api-0" Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.948429 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"d7abfaea-4592-4e64-bcbc-5fa0331587a8\") " pod="openstack/glance-default-internal-api-0" Nov 29 05:44:35 crc kubenswrapper[4594]: I1129 05:44:35.976669 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 29 05:44:36 crc kubenswrapper[4594]: I1129 05:44:36.007535 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 29 05:44:36 crc kubenswrapper[4594]: I1129 05:44:36.231557 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58bbf48b7f-xvwrv" Nov 29 05:44:36 crc kubenswrapper[4594]: I1129 05:44:36.239846 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7cf77b4997-stk57"] Nov 29 05:44:36 crc kubenswrapper[4594]: I1129 05:44:36.323446 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6c5f9dc4-23c7-4223-8c63-44a15905edf5-ovsdbserver-sb\") pod \"6c5f9dc4-23c7-4223-8c63-44a15905edf5\" (UID: \"6c5f9dc4-23c7-4223-8c63-44a15905edf5\") " Nov 29 05:44:36 crc kubenswrapper[4594]: I1129 05:44:36.323609 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6c5f9dc4-23c7-4223-8c63-44a15905edf5-ovsdbserver-nb\") pod \"6c5f9dc4-23c7-4223-8c63-44a15905edf5\" (UID: \"6c5f9dc4-23c7-4223-8c63-44a15905edf5\") " Nov 29 05:44:36 crc kubenswrapper[4594]: I1129 05:44:36.323749 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c5f9dc4-23c7-4223-8c63-44a15905edf5-config\") pod \"6c5f9dc4-23c7-4223-8c63-44a15905edf5\" (UID: \"6c5f9dc4-23c7-4223-8c63-44a15905edf5\") " Nov 29 05:44:36 crc kubenswrapper[4594]: I1129 05:44:36.323794 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c5f9dc4-23c7-4223-8c63-44a15905edf5-dns-svc\") pod \"6c5f9dc4-23c7-4223-8c63-44a15905edf5\" (UID: \"6c5f9dc4-23c7-4223-8c63-44a15905edf5\") " Nov 
29 05:44:36 crc kubenswrapper[4594]: I1129 05:44:36.323834 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6c5f9dc4-23c7-4223-8c63-44a15905edf5-dns-swift-storage-0\") pod \"6c5f9dc4-23c7-4223-8c63-44a15905edf5\" (UID: \"6c5f9dc4-23c7-4223-8c63-44a15905edf5\") " Nov 29 05:44:36 crc kubenswrapper[4594]: I1129 05:44:36.323865 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zv5zw\" (UniqueName: \"kubernetes.io/projected/6c5f9dc4-23c7-4223-8c63-44a15905edf5-kube-api-access-zv5zw\") pod \"6c5f9dc4-23c7-4223-8c63-44a15905edf5\" (UID: \"6c5f9dc4-23c7-4223-8c63-44a15905edf5\") " Nov 29 05:44:36 crc kubenswrapper[4594]: I1129 05:44:36.331613 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c5f9dc4-23c7-4223-8c63-44a15905edf5-kube-api-access-zv5zw" (OuterVolumeSpecName: "kube-api-access-zv5zw") pod "6c5f9dc4-23c7-4223-8c63-44a15905edf5" (UID: "6c5f9dc4-23c7-4223-8c63-44a15905edf5"). InnerVolumeSpecName "kube-api-access-zv5zw". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:44:36 crc kubenswrapper[4594]: I1129 05:44:36.389112 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c5f9dc4-23c7-4223-8c63-44a15905edf5-config" (OuterVolumeSpecName: "config") pod "6c5f9dc4-23c7-4223-8c63-44a15905edf5" (UID: "6c5f9dc4-23c7-4223-8c63-44a15905edf5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:44:36 crc kubenswrapper[4594]: I1129 05:44:36.399299 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c5f9dc4-23c7-4223-8c63-44a15905edf5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6c5f9dc4-23c7-4223-8c63-44a15905edf5" (UID: "6c5f9dc4-23c7-4223-8c63-44a15905edf5"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:44:36 crc kubenswrapper[4594]: I1129 05:44:36.407856 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c5f9dc4-23c7-4223-8c63-44a15905edf5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6c5f9dc4-23c7-4223-8c63-44a15905edf5" (UID: "6c5f9dc4-23c7-4223-8c63-44a15905edf5"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:44:36 crc kubenswrapper[4594]: I1129 05:44:36.418630 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c5f9dc4-23c7-4223-8c63-44a15905edf5-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6c5f9dc4-23c7-4223-8c63-44a15905edf5" (UID: "6c5f9dc4-23c7-4223-8c63-44a15905edf5"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:44:36 crc kubenswrapper[4594]: I1129 05:44:36.423172 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c5f9dc4-23c7-4223-8c63-44a15905edf5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6c5f9dc4-23c7-4223-8c63-44a15905edf5" (UID: "6c5f9dc4-23c7-4223-8c63-44a15905edf5"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:44:36 crc kubenswrapper[4594]: I1129 05:44:36.428108 4594 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c5f9dc4-23c7-4223-8c63-44a15905edf5-config\") on node \"crc\" DevicePath \"\"" Nov 29 05:44:36 crc kubenswrapper[4594]: I1129 05:44:36.428136 4594 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c5f9dc4-23c7-4223-8c63-44a15905edf5-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 29 05:44:36 crc kubenswrapper[4594]: I1129 05:44:36.428146 4594 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6c5f9dc4-23c7-4223-8c63-44a15905edf5-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 29 05:44:36 crc kubenswrapper[4594]: I1129 05:44:36.428158 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zv5zw\" (UniqueName: \"kubernetes.io/projected/6c5f9dc4-23c7-4223-8c63-44a15905edf5-kube-api-access-zv5zw\") on node \"crc\" DevicePath \"\"" Nov 29 05:44:36 crc kubenswrapper[4594]: I1129 05:44:36.428168 4594 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6c5f9dc4-23c7-4223-8c63-44a15905edf5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 29 05:44:36 crc kubenswrapper[4594]: I1129 05:44:36.428178 4594 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6c5f9dc4-23c7-4223-8c63-44a15905edf5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 29 05:44:36 crc kubenswrapper[4594]: I1129 05:44:36.434838 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6cf4cc7b4f-hg8dz"] Nov 29 05:44:36 crc kubenswrapper[4594]: I1129 05:44:36.522027 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-578598f949-zv6mw" Nov 29 05:44:36 crc kubenswrapper[4594]: I1129 05:44:36.634942 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b0de6056-30e4-4acc-bfbc-199c27065ff6-dns-swift-storage-0\") pod \"b0de6056-30e4-4acc-bfbc-199c27065ff6\" (UID: \"b0de6056-30e4-4acc-bfbc-199c27065ff6\") " Nov 29 05:44:36 crc kubenswrapper[4594]: I1129 05:44:36.635522 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0de6056-30e4-4acc-bfbc-199c27065ff6-config\") pod \"b0de6056-30e4-4acc-bfbc-199c27065ff6\" (UID: \"b0de6056-30e4-4acc-bfbc-199c27065ff6\") " Nov 29 05:44:36 crc kubenswrapper[4594]: I1129 05:44:36.636024 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6v7td\" (UniqueName: \"kubernetes.io/projected/b0de6056-30e4-4acc-bfbc-199c27065ff6-kube-api-access-6v7td\") pod \"b0de6056-30e4-4acc-bfbc-199c27065ff6\" (UID: \"b0de6056-30e4-4acc-bfbc-199c27065ff6\") " Nov 29 05:44:36 crc kubenswrapper[4594]: I1129 05:44:36.636089 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b0de6056-30e4-4acc-bfbc-199c27065ff6-ovsdbserver-sb\") pod \"b0de6056-30e4-4acc-bfbc-199c27065ff6\" (UID: \"b0de6056-30e4-4acc-bfbc-199c27065ff6\") " Nov 29 05:44:36 crc kubenswrapper[4594]: I1129 05:44:36.636637 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b0de6056-30e4-4acc-bfbc-199c27065ff6-ovsdbserver-nb\") pod \"b0de6056-30e4-4acc-bfbc-199c27065ff6\" (UID: \"b0de6056-30e4-4acc-bfbc-199c27065ff6\") " Nov 29 05:44:36 crc kubenswrapper[4594]: I1129 05:44:36.636726 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b0de6056-30e4-4acc-bfbc-199c27065ff6-dns-svc\") pod \"b0de6056-30e4-4acc-bfbc-199c27065ff6\" (UID: \"b0de6056-30e4-4acc-bfbc-199c27065ff6\") " Nov 29 05:44:36 crc kubenswrapper[4594]: I1129 05:44:36.649529 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0de6056-30e4-4acc-bfbc-199c27065ff6-kube-api-access-6v7td" (OuterVolumeSpecName: "kube-api-access-6v7td") pod "b0de6056-30e4-4acc-bfbc-199c27065ff6" (UID: "b0de6056-30e4-4acc-bfbc-199c27065ff6"). InnerVolumeSpecName "kube-api-access-6v7td". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:44:36 crc kubenswrapper[4594]: I1129 05:44:36.664130 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0de6056-30e4-4acc-bfbc-199c27065ff6-config" (OuterVolumeSpecName: "config") pod "b0de6056-30e4-4acc-bfbc-199c27065ff6" (UID: "b0de6056-30e4-4acc-bfbc-199c27065ff6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:44:36 crc kubenswrapper[4594]: I1129 05:44:36.668840 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0de6056-30e4-4acc-bfbc-199c27065ff6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b0de6056-30e4-4acc-bfbc-199c27065ff6" (UID: "b0de6056-30e4-4acc-bfbc-199c27065ff6"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:44:36 crc kubenswrapper[4594]: I1129 05:44:36.684045 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0de6056-30e4-4acc-bfbc-199c27065ff6-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b0de6056-30e4-4acc-bfbc-199c27065ff6" (UID: "b0de6056-30e4-4acc-bfbc-199c27065ff6"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:44:36 crc kubenswrapper[4594]: I1129 05:44:36.695229 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0de6056-30e4-4acc-bfbc-199c27065ff6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b0de6056-30e4-4acc-bfbc-199c27065ff6" (UID: "b0de6056-30e4-4acc-bfbc-199c27065ff6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:44:36 crc kubenswrapper[4594]: I1129 05:44:36.705393 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0de6056-30e4-4acc-bfbc-199c27065ff6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b0de6056-30e4-4acc-bfbc-199c27065ff6" (UID: "b0de6056-30e4-4acc-bfbc-199c27065ff6"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:44:36 crc kubenswrapper[4594]: I1129 05:44:36.719792 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 29 05:44:36 crc kubenswrapper[4594]: W1129 05:44:36.726611 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd7abfaea_4592_4e64_bcbc_5fa0331587a8.slice/crio-c76e9cc7836bb036f6c1a4e5fabc0cc8c3b8e1c6fb6cf668de4297cfdc868695 WatchSource:0}: Error finding container c76e9cc7836bb036f6c1a4e5fabc0cc8c3b8e1c6fb6cf668de4297cfdc868695: Status 404 returned error can't find the container with id c76e9cc7836bb036f6c1a4e5fabc0cc8c3b8e1c6fb6cf668de4297cfdc868695 Nov 29 05:44:36 crc kubenswrapper[4594]: I1129 05:44:36.749447 4594 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0de6056-30e4-4acc-bfbc-199c27065ff6-config\") on node \"crc\" DevicePath \"\"" Nov 29 05:44:36 crc kubenswrapper[4594]: I1129 05:44:36.749475 4594 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-6v7td\" (UniqueName: \"kubernetes.io/projected/b0de6056-30e4-4acc-bfbc-199c27065ff6-kube-api-access-6v7td\") on node \"crc\" DevicePath \"\"" Nov 29 05:44:36 crc kubenswrapper[4594]: I1129 05:44:36.749488 4594 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b0de6056-30e4-4acc-bfbc-199c27065ff6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 29 05:44:36 crc kubenswrapper[4594]: I1129 05:44:36.749497 4594 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b0de6056-30e4-4acc-bfbc-199c27065ff6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 29 05:44:36 crc kubenswrapper[4594]: I1129 05:44:36.749506 4594 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b0de6056-30e4-4acc-bfbc-199c27065ff6-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 29 05:44:36 crc kubenswrapper[4594]: I1129 05:44:36.749514 4594 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b0de6056-30e4-4acc-bfbc-199c27065ff6-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 29 05:44:36 crc kubenswrapper[4594]: I1129 05:44:36.753789 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cf77b4997-stk57" event={"ID":"3536563a-1e9e-458e-ace5-acbe9e090404","Type":"ContainerStarted","Data":"8ec6438c2adbca016dc01629766d1303cc0f931ddf942b6080e48699d5838efa"} Nov 29 05:44:36 crc kubenswrapper[4594]: I1129 05:44:36.753842 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cf77b4997-stk57" event={"ID":"3536563a-1e9e-458e-ace5-acbe9e090404","Type":"ContainerStarted","Data":"4647490bcb1fe66d1bf64b2c7f0d79c59d2fcbe98515b2d752767c1691db3797"} Nov 29 05:44:36 crc kubenswrapper[4594]: I1129 05:44:36.759699 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-internal-api-0" event={"ID":"d7abfaea-4592-4e64-bcbc-5fa0331587a8","Type":"ContainerStarted","Data":"c76e9cc7836bb036f6c1a4e5fabc0cc8c3b8e1c6fb6cf668de4297cfdc868695"} Nov 29 05:44:36 crc kubenswrapper[4594]: I1129 05:44:36.766080 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6cf4cc7b4f-hg8dz" event={"ID":"9ac3735b-d8a9-4723-8576-dc16c7af5756","Type":"ContainerStarted","Data":"bbb875a88884e4d67cb7992573aef55f60d72dbd03fcd968d00f19a3d3da277e"} Nov 29 05:44:36 crc kubenswrapper[4594]: I1129 05:44:36.778763 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-578598f949-zv6mw" Nov 29 05:44:36 crc kubenswrapper[4594]: I1129 05:44:36.779153 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-578598f949-zv6mw" event={"ID":"b0de6056-30e4-4acc-bfbc-199c27065ff6","Type":"ContainerDied","Data":"2420748f7578897c79ae18d52bd25430a7e3e5b61773d8f9c23d1a9bdab796b3"} Nov 29 05:44:36 crc kubenswrapper[4594]: I1129 05:44:36.779226 4594 scope.go:117] "RemoveContainer" containerID="abd06106b64c817022fb5eb3a9edfe51ad0a79acd3cff4fc1da4fcc3581df70e" Nov 29 05:44:36 crc kubenswrapper[4594]: I1129 05:44:36.783352 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58bbf48b7f-xvwrv" Nov 29 05:44:36 crc kubenswrapper[4594]: I1129 05:44:36.783496 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58bbf48b7f-xvwrv" event={"ID":"6c5f9dc4-23c7-4223-8c63-44a15905edf5","Type":"ContainerDied","Data":"337f8b76f90f6ae57e4dba84110cedf977c45360b5e061b6d1be9b468dc4d9df"} Nov 29 05:44:36 crc kubenswrapper[4594]: I1129 05:44:36.895840 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-578598f949-zv6mw"] Nov 29 05:44:36 crc kubenswrapper[4594]: I1129 05:44:36.911020 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-578598f949-zv6mw"] Nov 29 05:44:36 crc kubenswrapper[4594]: I1129 05:44:36.913741 4594 scope.go:117] "RemoveContainer" containerID="d6bedcedf64006c5845032e8eb9885ce6a2a466ae24a3a81ac1e1321871a3e2d" Nov 29 05:44:36 crc kubenswrapper[4594]: W1129 05:44:36.937263 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb204dff1_e3d8_467f_b95f_372c0abbb1de.slice/crio-0bbae0e131888cf6e6ed7ec8293dc4a7bf442330c68e4fa02923f67f776d0088 WatchSource:0}: Error finding container 0bbae0e131888cf6e6ed7ec8293dc4a7bf442330c68e4fa02923f67f776d0088: Status 404 returned error can't find the container with id 0bbae0e131888cf6e6ed7ec8293dc4a7bf442330c68e4fa02923f67f776d0088 Nov 29 05:44:36 crc kubenswrapper[4594]: I1129 05:44:36.969830 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58bbf48b7f-xvwrv"] Nov 29 05:44:36 crc kubenswrapper[4594]: I1129 05:44:36.999675 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-58bbf48b7f-xvwrv"] Nov 29 05:44:37 crc kubenswrapper[4594]: I1129 05:44:37.020926 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 29 05:44:37 crc kubenswrapper[4594]: I1129 05:44:37.836141 4594 
generic.go:334] "Generic (PLEG): container finished" podID="3536563a-1e9e-458e-ace5-acbe9e090404" containerID="8ec6438c2adbca016dc01629766d1303cc0f931ddf942b6080e48699d5838efa" exitCode=0 Nov 29 05:44:37 crc kubenswrapper[4594]: I1129 05:44:37.837396 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cf77b4997-stk57" event={"ID":"3536563a-1e9e-458e-ace5-acbe9e090404","Type":"ContainerDied","Data":"8ec6438c2adbca016dc01629766d1303cc0f931ddf942b6080e48699d5838efa"} Nov 29 05:44:37 crc kubenswrapper[4594]: I1129 05:44:37.852531 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d7abfaea-4592-4e64-bcbc-5fa0331587a8","Type":"ContainerStarted","Data":"b030cad5054f07cf573e32c26cb8febee42aa8b7d06fdec89c7ef15bb8ab4474"} Nov 29 05:44:37 crc kubenswrapper[4594]: I1129 05:44:37.866173 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b204dff1-e3d8-467f-b95f-372c0abbb1de","Type":"ContainerStarted","Data":"0bbae0e131888cf6e6ed7ec8293dc4a7bf442330c68e4fa02923f67f776d0088"} Nov 29 05:44:38 crc kubenswrapper[4594]: I1129 05:44:38.128173 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c5f9dc4-23c7-4223-8c63-44a15905edf5" path="/var/lib/kubelet/pods/6c5f9dc4-23c7-4223-8c63-44a15905edf5/volumes" Nov 29 05:44:38 crc kubenswrapper[4594]: I1129 05:44:38.128995 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0de6056-30e4-4acc-bfbc-199c27065ff6" path="/var/lib/kubelet/pods/b0de6056-30e4-4acc-bfbc-199c27065ff6/volumes" Nov 29 05:44:38 crc kubenswrapper[4594]: I1129 05:44:38.883521 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d7abfaea-4592-4e64-bcbc-5fa0331587a8","Type":"ContainerStarted","Data":"7b64eb26edf2e1f199277eafed0d901a457c9203a5c6a9996c35e7ec77b12704"} Nov 29 05:44:38 crc kubenswrapper[4594]: I1129 
05:44:38.890130 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b204dff1-e3d8-467f-b95f-372c0abbb1de","Type":"ContainerStarted","Data":"0ca4986760b3a8419ae6b79414ba7dd532de677c2b5b56b8bd320d52fa808769"}
Nov 29 05:44:38 crc kubenswrapper[4594]: I1129 05:44:38.892886 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cf77b4997-stk57" event={"ID":"3536563a-1e9e-458e-ace5-acbe9e090404","Type":"ContainerStarted","Data":"a864656a788effdf9b2da5c2af82abd08d8dfc00247e49a48b84b882074bec97"}
Nov 29 05:44:38 crc kubenswrapper[4594]: I1129 05:44:38.894481 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7cf77b4997-stk57"
Nov 29 05:44:38 crc kubenswrapper[4594]: I1129 05:44:38.932238 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.932214317 podStartE2EDuration="3.932214317s" podCreationTimestamp="2025-11-29 05:44:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:44:38.923872925 +0000 UTC m=+1003.164382145" watchObservedRunningTime="2025-11-29 05:44:38.932214317 +0000 UTC m=+1003.172723537"
Nov 29 05:44:38 crc kubenswrapper[4594]: I1129 05:44:38.945047 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7cf77b4997-stk57" podStartSLOduration=4.945030667 podStartE2EDuration="4.945030667s" podCreationTimestamp="2025-11-29 05:44:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:44:38.941129448 +0000 UTC m=+1003.181638668" watchObservedRunningTime="2025-11-29 05:44:38.945030667 +0000 UTC m=+1003.185539887"
Nov 29 05:44:39 crc kubenswrapper[4594]: I1129 05:44:39.907029 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b204dff1-e3d8-467f-b95f-372c0abbb1de","Type":"ContainerStarted","Data":"712a83c1e61460790ca8a7f445e196684c977353eb35973b0848a93715bc43e5"}
Nov 29 05:44:39 crc kubenswrapper[4594]: I1129 05:44:39.937454 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.937436094 podStartE2EDuration="4.937436094s" podCreationTimestamp="2025-11-29 05:44:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:44:39.931640882 +0000 UTC m=+1004.172150112" watchObservedRunningTime="2025-11-29 05:44:39.937436094 +0000 UTC m=+1004.177945314"
Nov 29 05:44:40 crc kubenswrapper[4594]: I1129 05:44:40.922661 4594 generic.go:334] "Generic (PLEG): container finished" podID="fc29fae8-4b90-4702-8c68-4cec263fd4c9" containerID="5bef5fd16ac3e60fef2c472e9c97a102398cb34978c0ec31423c236aaa1eca2b" exitCode=0
Nov 29 05:44:40 crc kubenswrapper[4594]: I1129 05:44:40.922877 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-749tk" event={"ID":"fc29fae8-4b90-4702-8c68-4cec263fd4c9","Type":"ContainerDied","Data":"5bef5fd16ac3e60fef2c472e9c97a102398cb34978c0ec31423c236aaa1eca2b"}
Nov 29 05:44:44 crc kubenswrapper[4594]: I1129 05:44:44.977517 4594 generic.go:334] "Generic (PLEG): container finished" podID="968317f6-c4d9-4647-a166-0cadc0fa57f2" containerID="fa4d2c563b6a5c3ba1aea016672f64a1203e1046e169fdafa873d6718fac82e8" exitCode=0
Nov 29 05:44:44 crc kubenswrapper[4594]: I1129 05:44:44.977651 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-7b4h2" event={"ID":"968317f6-c4d9-4647-a166-0cadc0fa57f2","Type":"ContainerDied","Data":"fa4d2c563b6a5c3ba1aea016672f64a1203e1046e169fdafa873d6718fac82e8"}
Nov 29 05:44:45 crc kubenswrapper[4594]: I1129 05:44:45.200470 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7cf77b4997-stk57"
Nov 29 05:44:45 crc kubenswrapper[4594]: I1129 05:44:45.273609 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55b99bf79c-72vg5"]
Nov 29 05:44:45 crc kubenswrapper[4594]: I1129 05:44:45.273889 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-55b99bf79c-72vg5" podUID="ff6b46fc-e420-41b4-b2f1-06ff16d73595" containerName="dnsmasq-dns" containerID="cri-o://5c4fa1739f79e3406180b7ae3f1b98d8a8d214f5d1ba328a9114a9710ee6e324" gracePeriod=10
Nov 29 05:44:45 crc kubenswrapper[4594]: I1129 05:44:45.328359 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Nov 29 05:44:45 crc kubenswrapper[4594]: I1129 05:44:45.328702 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="b204dff1-e3d8-467f-b95f-372c0abbb1de" containerName="glance-log" containerID="cri-o://0ca4986760b3a8419ae6b79414ba7dd532de677c2b5b56b8bd320d52fa808769" gracePeriod=30
Nov 29 05:44:45 crc kubenswrapper[4594]: I1129 05:44:45.329040 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="b204dff1-e3d8-467f-b95f-372c0abbb1de" containerName="glance-httpd" containerID="cri-o://712a83c1e61460790ca8a7f445e196684c977353eb35973b0848a93715bc43e5" gracePeriod=30
Nov 29 05:44:45 crc kubenswrapper[4594]: I1129 05:44:45.405131 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Nov 29 05:44:45 crc kubenswrapper[4594]: I1129 05:44:45.405754 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="d7abfaea-4592-4e64-bcbc-5fa0331587a8" containerName="glance-log" containerID="cri-o://b030cad5054f07cf573e32c26cb8febee42aa8b7d06fdec89c7ef15bb8ab4474" gracePeriod=30
Nov 29 05:44:45 crc kubenswrapper[4594]: I1129 05:44:45.405879 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="d7abfaea-4592-4e64-bcbc-5fa0331587a8" containerName="glance-httpd" containerID="cri-o://7b64eb26edf2e1f199277eafed0d901a457c9203a5c6a9996c35e7ec77b12704" gracePeriod=30
Nov 29 05:44:45 crc kubenswrapper[4594]: I1129 05:44:45.800408 4594 patch_prober.go:28] interesting pod/machine-config-daemon-ggz4n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 29 05:44:45 crc kubenswrapper[4594]: I1129 05:44:45.800473 4594 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 29 05:44:45 crc kubenswrapper[4594]: I1129 05:44:45.808406 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7bf6f9df7f-knsk7"]
Nov 29 05:44:45 crc kubenswrapper[4594]: I1129 05:44:45.841111 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7fb8849f48-9rr52"]
Nov 29 05:44:45 crc kubenswrapper[4594]: E1129 05:44:45.841633 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0de6056-30e4-4acc-bfbc-199c27065ff6" containerName="init"
Nov 29 05:44:45 crc kubenswrapper[4594]: I1129 05:44:45.841655 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0de6056-30e4-4acc-bfbc-199c27065ff6" containerName="init"
Nov 29 05:44:45 crc kubenswrapper[4594]: E1129 05:44:45.841684 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c5f9dc4-23c7-4223-8c63-44a15905edf5" containerName="init"
Nov 29 05:44:45 crc kubenswrapper[4594]: I1129 05:44:45.841693 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c5f9dc4-23c7-4223-8c63-44a15905edf5" containerName="init"
Nov 29 05:44:45 crc kubenswrapper[4594]: I1129 05:44:45.841907 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0de6056-30e4-4acc-bfbc-199c27065ff6" containerName="init"
Nov 29 05:44:45 crc kubenswrapper[4594]: I1129 05:44:45.841928 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c5f9dc4-23c7-4223-8c63-44a15905edf5" containerName="init"
Nov 29 05:44:45 crc kubenswrapper[4594]: I1129 05:44:45.842975 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7fb8849f48-9rr52"
Nov 29 05:44:45 crc kubenswrapper[4594]: I1129 05:44:45.847702 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc"
Nov 29 05:44:45 crc kubenswrapper[4594]: I1129 05:44:45.852386 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7fb8849f48-9rr52"]
Nov 29 05:44:45 crc kubenswrapper[4594]: I1129 05:44:45.868535 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e8d01e45-1d76-464f-93e6-965d65a055fc-horizon-secret-key\") pod \"horizon-7fb8849f48-9rr52\" (UID: \"e8d01e45-1d76-464f-93e6-965d65a055fc\") " pod="openstack/horizon-7fb8849f48-9rr52"
Nov 29 05:44:45 crc kubenswrapper[4594]: I1129 05:44:45.868585 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8d01e45-1d76-464f-93e6-965d65a055fc-horizon-tls-certs\") pod \"horizon-7fb8849f48-9rr52\" (UID: \"e8d01e45-1d76-464f-93e6-965d65a055fc\") " pod="openstack/horizon-7fb8849f48-9rr52"
Nov 29 05:44:45 crc kubenswrapper[4594]: I1129 05:44:45.868611 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e8d01e45-1d76-464f-93e6-965d65a055fc-config-data\") pod \"horizon-7fb8849f48-9rr52\" (UID: \"e8d01e45-1d76-464f-93e6-965d65a055fc\") " pod="openstack/horizon-7fb8849f48-9rr52"
Nov 29 05:44:45 crc kubenswrapper[4594]: I1129 05:44:45.868690 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e8d01e45-1d76-464f-93e6-965d65a055fc-scripts\") pod \"horizon-7fb8849f48-9rr52\" (UID: \"e8d01e45-1d76-464f-93e6-965d65a055fc\") " pod="openstack/horizon-7fb8849f48-9rr52"
Nov 29 05:44:45 crc kubenswrapper[4594]: I1129 05:44:45.868745 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8d01e45-1d76-464f-93e6-965d65a055fc-combined-ca-bundle\") pod \"horizon-7fb8849f48-9rr52\" (UID: \"e8d01e45-1d76-464f-93e6-965d65a055fc\") " pod="openstack/horizon-7fb8849f48-9rr52"
Nov 29 05:44:45 crc kubenswrapper[4594]: I1129 05:44:45.868820 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7qgf\" (UniqueName: \"kubernetes.io/projected/e8d01e45-1d76-464f-93e6-965d65a055fc-kube-api-access-l7qgf\") pod \"horizon-7fb8849f48-9rr52\" (UID: \"e8d01e45-1d76-464f-93e6-965d65a055fc\") " pod="openstack/horizon-7fb8849f48-9rr52"
Nov 29 05:44:45 crc kubenswrapper[4594]: I1129 05:44:45.868884 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8d01e45-1d76-464f-93e6-965d65a055fc-logs\") pod \"horizon-7fb8849f48-9rr52\" (UID: \"e8d01e45-1d76-464f-93e6-965d65a055fc\") " pod="openstack/horizon-7fb8849f48-9rr52"
Nov 29 05:44:45 crc kubenswrapper[4594]: I1129 05:44:45.909037 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6cf4cc7b4f-hg8dz"]
Nov 29 05:44:45 crc kubenswrapper[4594]: I1129 05:44:45.921447 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-558d4b85cb-k5j98"]
Nov 29 05:44:45 crc kubenswrapper[4594]: I1129 05:44:45.923168 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-558d4b85cb-k5j98"
Nov 29 05:44:45 crc kubenswrapper[4594]: I1129 05:44:45.938604 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-558d4b85cb-k5j98"]
Nov 29 05:44:45 crc kubenswrapper[4594]: I1129 05:44:45.970659 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7qgf\" (UniqueName: \"kubernetes.io/projected/e8d01e45-1d76-464f-93e6-965d65a055fc-kube-api-access-l7qgf\") pod \"horizon-7fb8849f48-9rr52\" (UID: \"e8d01e45-1d76-464f-93e6-965d65a055fc\") " pod="openstack/horizon-7fb8849f48-9rr52"
Nov 29 05:44:45 crc kubenswrapper[4594]: I1129 05:44:45.970740 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8d01e45-1d76-464f-93e6-965d65a055fc-logs\") pod \"horizon-7fb8849f48-9rr52\" (UID: \"e8d01e45-1d76-464f-93e6-965d65a055fc\") " pod="openstack/horizon-7fb8849f48-9rr52"
Nov 29 05:44:45 crc kubenswrapper[4594]: I1129 05:44:45.970767 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/19dfde1f-d770-45ec-8735-78549b8fcb90-logs\") pod \"horizon-558d4b85cb-k5j98\" (UID: \"19dfde1f-d770-45ec-8735-78549b8fcb90\") " pod="openstack/horizon-558d4b85cb-k5j98"
Nov 29 05:44:45 crc kubenswrapper[4594]: I1129 05:44:45.970806 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e8d01e45-1d76-464f-93e6-965d65a055fc-horizon-secret-key\") pod \"horizon-7fb8849f48-9rr52\" (UID: \"e8d01e45-1d76-464f-93e6-965d65a055fc\") " pod="openstack/horizon-7fb8849f48-9rr52"
Nov 29 05:44:45 crc kubenswrapper[4594]: I1129 05:44:45.970827 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/19dfde1f-d770-45ec-8735-78549b8fcb90-horizon-secret-key\") pod \"horizon-558d4b85cb-k5j98\" (UID: \"19dfde1f-d770-45ec-8735-78549b8fcb90\") " pod="openstack/horizon-558d4b85cb-k5j98"
Nov 29 05:44:45 crc kubenswrapper[4594]: I1129 05:44:45.970846 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8d01e45-1d76-464f-93e6-965d65a055fc-horizon-tls-certs\") pod \"horizon-7fb8849f48-9rr52\" (UID: \"e8d01e45-1d76-464f-93e6-965d65a055fc\") " pod="openstack/horizon-7fb8849f48-9rr52"
Nov 29 05:44:45 crc kubenswrapper[4594]: I1129 05:44:45.970862 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e8d01e45-1d76-464f-93e6-965d65a055fc-config-data\") pod \"horizon-7fb8849f48-9rr52\" (UID: \"e8d01e45-1d76-464f-93e6-965d65a055fc\") " pod="openstack/horizon-7fb8849f48-9rr52"
Nov 29 05:44:45 crc kubenswrapper[4594]: I1129 05:44:45.970889 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5r4b\" (UniqueName: \"kubernetes.io/projected/19dfde1f-d770-45ec-8735-78549b8fcb90-kube-api-access-h5r4b\") pod \"horizon-558d4b85cb-k5j98\" (UID: \"19dfde1f-d770-45ec-8735-78549b8fcb90\") " pod="openstack/horizon-558d4b85cb-k5j98"
Nov 29 05:44:45 crc kubenswrapper[4594]: I1129 05:44:45.970924 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e8d01e45-1d76-464f-93e6-965d65a055fc-scripts\") pod \"horizon-7fb8849f48-9rr52\" (UID: \"e8d01e45-1d76-464f-93e6-965d65a055fc\") " pod="openstack/horizon-7fb8849f48-9rr52"
Nov 29 05:44:45 crc kubenswrapper[4594]: I1129 05:44:45.970943 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/19dfde1f-d770-45ec-8735-78549b8fcb90-scripts\") pod \"horizon-558d4b85cb-k5j98\" (UID: \"19dfde1f-d770-45ec-8735-78549b8fcb90\") " pod="openstack/horizon-558d4b85cb-k5j98"
Nov 29 05:44:45 crc kubenswrapper[4594]: I1129 05:44:45.970970 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/19dfde1f-d770-45ec-8735-78549b8fcb90-horizon-tls-certs\") pod \"horizon-558d4b85cb-k5j98\" (UID: \"19dfde1f-d770-45ec-8735-78549b8fcb90\") " pod="openstack/horizon-558d4b85cb-k5j98"
Nov 29 05:44:45 crc kubenswrapper[4594]: I1129 05:44:45.970995 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8d01e45-1d76-464f-93e6-965d65a055fc-combined-ca-bundle\") pod \"horizon-7fb8849f48-9rr52\" (UID: \"e8d01e45-1d76-464f-93e6-965d65a055fc\") " pod="openstack/horizon-7fb8849f48-9rr52"
Nov 29 05:44:45 crc kubenswrapper[4594]: I1129 05:44:45.971016 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/19dfde1f-d770-45ec-8735-78549b8fcb90-config-data\") pod \"horizon-558d4b85cb-k5j98\" (UID: \"19dfde1f-d770-45ec-8735-78549b8fcb90\") " pod="openstack/horizon-558d4b85cb-k5j98"
Nov 29 05:44:45 crc kubenswrapper[4594]: I1129 05:44:45.971034 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19dfde1f-d770-45ec-8735-78549b8fcb90-combined-ca-bundle\") pod \"horizon-558d4b85cb-k5j98\" (UID: \"19dfde1f-d770-45ec-8735-78549b8fcb90\") " pod="openstack/horizon-558d4b85cb-k5j98"
Nov 29 05:44:45 crc kubenswrapper[4594]: I1129 05:44:45.971524 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8d01e45-1d76-464f-93e6-965d65a055fc-logs\") pod \"horizon-7fb8849f48-9rr52\" (UID: \"e8d01e45-1d76-464f-93e6-965d65a055fc\") " pod="openstack/horizon-7fb8849f48-9rr52"
Nov 29 05:44:45 crc kubenswrapper[4594]: I1129 05:44:45.972040 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e8d01e45-1d76-464f-93e6-965d65a055fc-scripts\") pod \"horizon-7fb8849f48-9rr52\" (UID: \"e8d01e45-1d76-464f-93e6-965d65a055fc\") " pod="openstack/horizon-7fb8849f48-9rr52"
Nov 29 05:44:45 crc kubenswrapper[4594]: I1129 05:44:45.975718 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e8d01e45-1d76-464f-93e6-965d65a055fc-config-data\") pod \"horizon-7fb8849f48-9rr52\" (UID: \"e8d01e45-1d76-464f-93e6-965d65a055fc\") " pod="openstack/horizon-7fb8849f48-9rr52"
Nov 29 05:44:45 crc kubenswrapper[4594]: I1129 05:44:45.978830 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8d01e45-1d76-464f-93e6-965d65a055fc-combined-ca-bundle\") pod \"horizon-7fb8849f48-9rr52\" (UID: \"e8d01e45-1d76-464f-93e6-965d65a055fc\") " pod="openstack/horizon-7fb8849f48-9rr52"
Nov 29 05:44:45 crc kubenswrapper[4594]: I1129 05:44:45.979206 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e8d01e45-1d76-464f-93e6-965d65a055fc-horizon-secret-key\") pod \"horizon-7fb8849f48-9rr52\" (UID: \"e8d01e45-1d76-464f-93e6-965d65a055fc\") " pod="openstack/horizon-7fb8849f48-9rr52"
Nov 29 05:44:45 crc kubenswrapper[4594]: I1129 05:44:45.980856 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8d01e45-1d76-464f-93e6-965d65a055fc-horizon-tls-certs\") pod \"horizon-7fb8849f48-9rr52\" (UID: \"e8d01e45-1d76-464f-93e6-965d65a055fc\") " pod="openstack/horizon-7fb8849f48-9rr52"
Nov 29 05:44:45 crc kubenswrapper[4594]: I1129 05:44:45.990364 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7qgf\" (UniqueName: \"kubernetes.io/projected/e8d01e45-1d76-464f-93e6-965d65a055fc-kube-api-access-l7qgf\") pod \"horizon-7fb8849f48-9rr52\" (UID: \"e8d01e45-1d76-464f-93e6-965d65a055fc\") " pod="openstack/horizon-7fb8849f48-9rr52"
Nov 29 05:44:46 crc kubenswrapper[4594]: I1129 05:44:46.003681 4594 generic.go:334] "Generic (PLEG): container finished" podID="ff6b46fc-e420-41b4-b2f1-06ff16d73595" containerID="5c4fa1739f79e3406180b7ae3f1b98d8a8d214f5d1ba328a9114a9710ee6e324" exitCode=0
Nov 29 05:44:46 crc kubenswrapper[4594]: I1129 05:44:46.003822 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55b99bf79c-72vg5" event={"ID":"ff6b46fc-e420-41b4-b2f1-06ff16d73595","Type":"ContainerDied","Data":"5c4fa1739f79e3406180b7ae3f1b98d8a8d214f5d1ba328a9114a9710ee6e324"}
Nov 29 05:44:46 crc kubenswrapper[4594]: I1129 05:44:46.009625 4594 generic.go:334] "Generic (PLEG): container finished" podID="d7abfaea-4592-4e64-bcbc-5fa0331587a8" containerID="b030cad5054f07cf573e32c26cb8febee42aa8b7d06fdec89c7ef15bb8ab4474" exitCode=143
Nov 29 05:44:46 crc kubenswrapper[4594]: I1129 05:44:46.009704 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d7abfaea-4592-4e64-bcbc-5fa0331587a8","Type":"ContainerDied","Data":"b030cad5054f07cf573e32c26cb8febee42aa8b7d06fdec89c7ef15bb8ab4474"}
Nov 29 05:44:46 crc kubenswrapper[4594]: I1129 05:44:46.013431 4594 generic.go:334] "Generic (PLEG): container finished" podID="b204dff1-e3d8-467f-b95f-372c0abbb1de" containerID="712a83c1e61460790ca8a7f445e196684c977353eb35973b0848a93715bc43e5" exitCode=0
Nov 29 05:44:46 crc kubenswrapper[4594]: I1129 05:44:46.013459 4594 generic.go:334] "Generic (PLEG): container finished" podID="b204dff1-e3d8-467f-b95f-372c0abbb1de" containerID="0ca4986760b3a8419ae6b79414ba7dd532de677c2b5b56b8bd320d52fa808769" exitCode=143
Nov 29 05:44:46 crc kubenswrapper[4594]: I1129 05:44:46.013635 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b204dff1-e3d8-467f-b95f-372c0abbb1de","Type":"ContainerDied","Data":"712a83c1e61460790ca8a7f445e196684c977353eb35973b0848a93715bc43e5"}
Nov 29 05:44:46 crc kubenswrapper[4594]: I1129 05:44:46.013680 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b204dff1-e3d8-467f-b95f-372c0abbb1de","Type":"ContainerDied","Data":"0ca4986760b3a8419ae6b79414ba7dd532de677c2b5b56b8bd320d52fa808769"}
Nov 29 05:44:46 crc kubenswrapper[4594]: I1129 05:44:46.072656 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19dfde1f-d770-45ec-8735-78549b8fcb90-combined-ca-bundle\") pod \"horizon-558d4b85cb-k5j98\" (UID: \"19dfde1f-d770-45ec-8735-78549b8fcb90\") " pod="openstack/horizon-558d4b85cb-k5j98"
Nov 29 05:44:46 crc kubenswrapper[4594]: I1129 05:44:46.072756 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/19dfde1f-d770-45ec-8735-78549b8fcb90-logs\") pod \"horizon-558d4b85cb-k5j98\" (UID: \"19dfde1f-d770-45ec-8735-78549b8fcb90\") " pod="openstack/horizon-558d4b85cb-k5j98"
Nov 29 05:44:46 crc kubenswrapper[4594]: I1129 05:44:46.072802 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/19dfde1f-d770-45ec-8735-78549b8fcb90-horizon-secret-key\") pod \"horizon-558d4b85cb-k5j98\" (UID: \"19dfde1f-d770-45ec-8735-78549b8fcb90\") " pod="openstack/horizon-558d4b85cb-k5j98"
Nov 29 05:44:46 crc kubenswrapper[4594]: I1129 05:44:46.072837 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5r4b\" (UniqueName: \"kubernetes.io/projected/19dfde1f-d770-45ec-8735-78549b8fcb90-kube-api-access-h5r4b\") pod \"horizon-558d4b85cb-k5j98\" (UID: \"19dfde1f-d770-45ec-8735-78549b8fcb90\") " pod="openstack/horizon-558d4b85cb-k5j98"
Nov 29 05:44:46 crc kubenswrapper[4594]: I1129 05:44:46.072874 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/19dfde1f-d770-45ec-8735-78549b8fcb90-scripts\") pod \"horizon-558d4b85cb-k5j98\" (UID: \"19dfde1f-d770-45ec-8735-78549b8fcb90\") " pod="openstack/horizon-558d4b85cb-k5j98"
Nov 29 05:44:46 crc kubenswrapper[4594]: I1129 05:44:46.072904 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/19dfde1f-d770-45ec-8735-78549b8fcb90-horizon-tls-certs\") pod \"horizon-558d4b85cb-k5j98\" (UID: \"19dfde1f-d770-45ec-8735-78549b8fcb90\") " pod="openstack/horizon-558d4b85cb-k5j98"
Nov 29 05:44:46 crc kubenswrapper[4594]: I1129 05:44:46.072939 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/19dfde1f-d770-45ec-8735-78549b8fcb90-config-data\") pod \"horizon-558d4b85cb-k5j98\" (UID: \"19dfde1f-d770-45ec-8735-78549b8fcb90\") " pod="openstack/horizon-558d4b85cb-k5j98"
Nov 29 05:44:46 crc kubenswrapper[4594]: I1129 05:44:46.074839 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/19dfde1f-d770-45ec-8735-78549b8fcb90-config-data\") pod \"horizon-558d4b85cb-k5j98\" (UID: \"19dfde1f-d770-45ec-8735-78549b8fcb90\") " pod="openstack/horizon-558d4b85cb-k5j98"
Nov 29 05:44:46 crc kubenswrapper[4594]: I1129 05:44:46.075290 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/19dfde1f-d770-45ec-8735-78549b8fcb90-logs\") pod \"horizon-558d4b85cb-k5j98\" (UID: \"19dfde1f-d770-45ec-8735-78549b8fcb90\") " pod="openstack/horizon-558d4b85cb-k5j98"
Nov 29 05:44:46 crc kubenswrapper[4594]: I1129 05:44:46.075877 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/19dfde1f-d770-45ec-8735-78549b8fcb90-scripts\") pod \"horizon-558d4b85cb-k5j98\" (UID: \"19dfde1f-d770-45ec-8735-78549b8fcb90\") " pod="openstack/horizon-558d4b85cb-k5j98"
Nov 29 05:44:46 crc kubenswrapper[4594]: I1129 05:44:46.081801 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19dfde1f-d770-45ec-8735-78549b8fcb90-combined-ca-bundle\") pod \"horizon-558d4b85cb-k5j98\" (UID: \"19dfde1f-d770-45ec-8735-78549b8fcb90\") " pod="openstack/horizon-558d4b85cb-k5j98"
Nov 29 05:44:46 crc kubenswrapper[4594]: I1129 05:44:46.082211 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/19dfde1f-d770-45ec-8735-78549b8fcb90-horizon-tls-certs\") pod \"horizon-558d4b85cb-k5j98\" (UID: \"19dfde1f-d770-45ec-8735-78549b8fcb90\") " pod="openstack/horizon-558d4b85cb-k5j98"
Nov 29 05:44:46 crc kubenswrapper[4594]: I1129 05:44:46.083646 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/19dfde1f-d770-45ec-8735-78549b8fcb90-horizon-secret-key\") pod \"horizon-558d4b85cb-k5j98\" (UID: \"19dfde1f-d770-45ec-8735-78549b8fcb90\") " pod="openstack/horizon-558d4b85cb-k5j98"
Nov 29 05:44:46 crc kubenswrapper[4594]: I1129 05:44:46.092420 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5r4b\" (UniqueName: \"kubernetes.io/projected/19dfde1f-d770-45ec-8735-78549b8fcb90-kube-api-access-h5r4b\") pod \"horizon-558d4b85cb-k5j98\" (UID: \"19dfde1f-d770-45ec-8735-78549b8fcb90\") " pod="openstack/horizon-558d4b85cb-k5j98"
Nov 29 05:44:46 crc kubenswrapper[4594]: I1129 05:44:46.166717 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7fb8849f48-9rr52"
Nov 29 05:44:46 crc kubenswrapper[4594]: I1129 05:44:46.242433 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-558d4b85cb-k5j98"
Nov 29 05:44:47 crc kubenswrapper[4594]: I1129 05:44:47.031921 4594 generic.go:334] "Generic (PLEG): container finished" podID="d7abfaea-4592-4e64-bcbc-5fa0331587a8" containerID="7b64eb26edf2e1f199277eafed0d901a457c9203a5c6a9996c35e7ec77b12704" exitCode=0
Nov 29 05:44:47 crc kubenswrapper[4594]: I1129 05:44:47.031974 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d7abfaea-4592-4e64-bcbc-5fa0331587a8","Type":"ContainerDied","Data":"7b64eb26edf2e1f199277eafed0d901a457c9203a5c6a9996c35e7ec77b12704"}
Nov 29 05:44:50 crc kubenswrapper[4594]: I1129 05:44:50.248036 4594 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-55b99bf79c-72vg5" podUID="ff6b46fc-e420-41b4-b2f1-06ff16d73595" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.133:5353: connect: connection refused"
Nov 29 05:44:51 crc kubenswrapper[4594]: E1129 05:44:51.007781 4594 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-placement-api:current"
Nov 29 05:44:51 crc kubenswrapper[4594]: E1129 05:44:51.008082 4594 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-placement-api:current"
Nov 29 05:44:51 crc kubenswrapper[4594]: E1129 05:44:51.008232 4594 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:placement-db-sync,Image:quay.rdoproject.org/podified-master-centos10/openstack-placement-api:current,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wk2n4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-db-sync-8jkv9_openstack(69498abf-8b80-4b7f-901a-4c6a4bdede2f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Nov 29 05:44:51 crc kubenswrapper[4594]: E1129 05:44:51.009423 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-8jkv9" podUID="69498abf-8b80-4b7f-901a-4c6a4bdede2f"
Nov 29 05:44:51 crc kubenswrapper[4594]: I1129 05:44:51.073772 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-749tk" event={"ID":"fc29fae8-4b90-4702-8c68-4cec263fd4c9","Type":"ContainerDied","Data":"b91318f59f3088d4e9b1654661b4c8d2c69090ccb915c08b7fccaebd4de83d4c"}
Nov 29 05:44:51 crc kubenswrapper[4594]: I1129 05:44:51.073818 4594 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b91318f59f3088d4e9b1654661b4c8d2c69090ccb915c08b7fccaebd4de83d4c"
Nov 29 05:44:51 crc kubenswrapper[4594]: E1129 05:44:51.081044 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-placement-api:current\\\"\"" pod="openstack/placement-db-sync-8jkv9" podUID="69498abf-8b80-4b7f-901a-4c6a4bdede2f"
Nov 29 05:44:51 crc kubenswrapper[4594]: I1129 05:44:51.095029 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-749tk"
Nov 29 05:44:51 crc kubenswrapper[4594]: I1129 05:44:51.276869 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fc29fae8-4b90-4702-8c68-4cec263fd4c9-credential-keys\") pod \"fc29fae8-4b90-4702-8c68-4cec263fd4c9\" (UID: \"fc29fae8-4b90-4702-8c68-4cec263fd4c9\") "
Nov 29 05:44:51 crc kubenswrapper[4594]: I1129 05:44:51.277013 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc29fae8-4b90-4702-8c68-4cec263fd4c9-combined-ca-bundle\") pod \"fc29fae8-4b90-4702-8c68-4cec263fd4c9\" (UID: \"fc29fae8-4b90-4702-8c68-4cec263fd4c9\") "
Nov 29 05:44:51 crc kubenswrapper[4594]: I1129 05:44:51.277100 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc29fae8-4b90-4702-8c68-4cec263fd4c9-scripts\") pod \"fc29fae8-4b90-4702-8c68-4cec263fd4c9\" (UID: \"fc29fae8-4b90-4702-8c68-4cec263fd4c9\") "
Nov 29 05:44:51 crc kubenswrapper[4594]: I1129 05:44:51.277182 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc29fae8-4b90-4702-8c68-4cec263fd4c9-config-data\") pod \"fc29fae8-4b90-4702-8c68-4cec263fd4c9\" (UID: \"fc29fae8-4b90-4702-8c68-4cec263fd4c9\") "
Nov 29 05:44:51 crc kubenswrapper[4594]: I1129 05:44:51.277292 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fc29fae8-4b90-4702-8c68-4cec263fd4c9-fernet-keys\") pod \"fc29fae8-4b90-4702-8c68-4cec263fd4c9\" (UID: \"fc29fae8-4b90-4702-8c68-4cec263fd4c9\") "
Nov 29 05:44:51 crc kubenswrapper[4594]: I1129 05:44:51.277324 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zsjmh\" (UniqueName: \"kubernetes.io/projected/fc29fae8-4b90-4702-8c68-4cec263fd4c9-kube-api-access-zsjmh\") pod \"fc29fae8-4b90-4702-8c68-4cec263fd4c9\" (UID: \"fc29fae8-4b90-4702-8c68-4cec263fd4c9\") "
Nov 29 05:44:51 crc kubenswrapper[4594]: I1129 05:44:51.283918 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc29fae8-4b90-4702-8c68-4cec263fd4c9-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "fc29fae8-4b90-4702-8c68-4cec263fd4c9" (UID: "fc29fae8-4b90-4702-8c68-4cec263fd4c9"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 05:44:51 crc kubenswrapper[4594]: I1129 05:44:51.283948 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc29fae8-4b90-4702-8c68-4cec263fd4c9-scripts" (OuterVolumeSpecName: "scripts") pod "fc29fae8-4b90-4702-8c68-4cec263fd4c9" (UID: "fc29fae8-4b90-4702-8c68-4cec263fd4c9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 05:44:51 crc kubenswrapper[4594]: I1129 05:44:51.284373 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc29fae8-4b90-4702-8c68-4cec263fd4c9-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "fc29fae8-4b90-4702-8c68-4cec263fd4c9" (UID: "fc29fae8-4b90-4702-8c68-4cec263fd4c9"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 05:44:51 crc kubenswrapper[4594]: I1129 05:44:51.284405 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc29fae8-4b90-4702-8c68-4cec263fd4c9-kube-api-access-zsjmh" (OuterVolumeSpecName: "kube-api-access-zsjmh") pod "fc29fae8-4b90-4702-8c68-4cec263fd4c9" (UID: "fc29fae8-4b90-4702-8c68-4cec263fd4c9"). InnerVolumeSpecName "kube-api-access-zsjmh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 05:44:51 crc kubenswrapper[4594]: I1129 05:44:51.302813 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc29fae8-4b90-4702-8c68-4cec263fd4c9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fc29fae8-4b90-4702-8c68-4cec263fd4c9" (UID: "fc29fae8-4b90-4702-8c68-4cec263fd4c9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 05:44:51 crc kubenswrapper[4594]: I1129 05:44:51.304903 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc29fae8-4b90-4702-8c68-4cec263fd4c9-config-data" (OuterVolumeSpecName: "config-data") pod "fc29fae8-4b90-4702-8c68-4cec263fd4c9" (UID: "fc29fae8-4b90-4702-8c68-4cec263fd4c9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 05:44:51 crc kubenswrapper[4594]: I1129 05:44:51.380516 4594 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fc29fae8-4b90-4702-8c68-4cec263fd4c9-credential-keys\") on node \"crc\" DevicePath \"\""
Nov 29 05:44:51 crc kubenswrapper[4594]: I1129 05:44:51.380544 4594 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc29fae8-4b90-4702-8c68-4cec263fd4c9-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 29 05:44:51 crc kubenswrapper[4594]: I1129 05:44:51.380557 4594 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc29fae8-4b90-4702-8c68-4cec263fd4c9-scripts\") on node \"crc\" DevicePath \"\""
Nov 29 05:44:51 crc kubenswrapper[4594]: I1129 05:44:51.380566 4594 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc29fae8-4b90-4702-8c68-4cec263fd4c9-config-data\") on node \"crc\" DevicePath \"\""
Nov 29 05:44:51 crc kubenswrapper[4594]: I1129 05:44:51.380574 4594 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fc29fae8-4b90-4702-8c68-4cec263fd4c9-fernet-keys\") on node \"crc\" DevicePath \"\""
Nov 29 05:44:51 crc kubenswrapper[4594]: I1129 05:44:51.380584 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zsjmh\" (UniqueName: \"kubernetes.io/projected/fc29fae8-4b90-4702-8c68-4cec263fd4c9-kube-api-access-zsjmh\") on node \"crc\" DevicePath \"\""
Nov 29 05:44:51 crc kubenswrapper[4594]: E1129 05:44:51.463970 4594 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-barbican-api:current"
Nov 29 05:44:51 crc kubenswrapper[4594]: E1129 05:44:51.464043 4594
kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-barbican-api:current" Nov 29 05:44:51 crc kubenswrapper[4594]: E1129 05:44:51.464342 4594 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.rdoproject.org/podified-master-centos10/openstack-barbican-api:current,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gdcpx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
barbican-db-sync-mvvfl_openstack(72edd67c-f301-4e33-8d71-74f9bc7b99c5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 29 05:44:51 crc kubenswrapper[4594]: E1129 05:44:51.465546 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-mvvfl" podUID="72edd67c-f301-4e33-8d71-74f9bc7b99c5" Nov 29 05:44:51 crc kubenswrapper[4594]: I1129 05:44:51.515013 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-7b4h2" Nov 29 05:44:51 crc kubenswrapper[4594]: I1129 05:44:51.686917 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tp9pp\" (UniqueName: \"kubernetes.io/projected/968317f6-c4d9-4647-a166-0cadc0fa57f2-kube-api-access-tp9pp\") pod \"968317f6-c4d9-4647-a166-0cadc0fa57f2\" (UID: \"968317f6-c4d9-4647-a166-0cadc0fa57f2\") " Nov 29 05:44:51 crc kubenswrapper[4594]: I1129 05:44:51.687213 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/968317f6-c4d9-4647-a166-0cadc0fa57f2-config\") pod \"968317f6-c4d9-4647-a166-0cadc0fa57f2\" (UID: \"968317f6-c4d9-4647-a166-0cadc0fa57f2\") " Nov 29 05:44:51 crc kubenswrapper[4594]: I1129 05:44:51.687382 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/968317f6-c4d9-4647-a166-0cadc0fa57f2-combined-ca-bundle\") pod \"968317f6-c4d9-4647-a166-0cadc0fa57f2\" (UID: \"968317f6-c4d9-4647-a166-0cadc0fa57f2\") " Nov 29 05:44:51 crc kubenswrapper[4594]: I1129 05:44:51.692920 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/968317f6-c4d9-4647-a166-0cadc0fa57f2-kube-api-access-tp9pp" (OuterVolumeSpecName: "kube-api-access-tp9pp") pod "968317f6-c4d9-4647-a166-0cadc0fa57f2" (UID: "968317f6-c4d9-4647-a166-0cadc0fa57f2"). InnerVolumeSpecName "kube-api-access-tp9pp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:44:51 crc kubenswrapper[4594]: I1129 05:44:51.721464 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/968317f6-c4d9-4647-a166-0cadc0fa57f2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "968317f6-c4d9-4647-a166-0cadc0fa57f2" (UID: "968317f6-c4d9-4647-a166-0cadc0fa57f2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:44:51 crc kubenswrapper[4594]: I1129 05:44:51.724599 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/968317f6-c4d9-4647-a166-0cadc0fa57f2-config" (OuterVolumeSpecName: "config") pod "968317f6-c4d9-4647-a166-0cadc0fa57f2" (UID: "968317f6-c4d9-4647-a166-0cadc0fa57f2"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:44:51 crc kubenswrapper[4594]: I1129 05:44:51.789535 4594 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/968317f6-c4d9-4647-a166-0cadc0fa57f2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 05:44:51 crc kubenswrapper[4594]: I1129 05:44:51.789560 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tp9pp\" (UniqueName: \"kubernetes.io/projected/968317f6-c4d9-4647-a166-0cadc0fa57f2-kube-api-access-tp9pp\") on node \"crc\" DevicePath \"\"" Nov 29 05:44:51 crc kubenswrapper[4594]: I1129 05:44:51.789588 4594 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/968317f6-c4d9-4647-a166-0cadc0fa57f2-config\") on node \"crc\" DevicePath \"\"" Nov 29 05:44:51 crc kubenswrapper[4594]: I1129 05:44:51.899197 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55b99bf79c-72vg5" Nov 29 05:44:51 crc kubenswrapper[4594]: I1129 05:44:51.939510 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 29 05:44:51 crc kubenswrapper[4594]: I1129 05:44:51.993730 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 29 05:44:51 crc kubenswrapper[4594]: I1129 05:44:51.998347 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"d7abfaea-4592-4e64-bcbc-5fa0331587a8\" (UID: \"d7abfaea-4592-4e64-bcbc-5fa0331587a8\") " Nov 29 05:44:51 crc kubenswrapper[4594]: I1129 05:44:51.998407 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b204dff1-e3d8-467f-b95f-372c0abbb1de-config-data\") pod \"b204dff1-e3d8-467f-b95f-372c0abbb1de\" (UID: \"b204dff1-e3d8-467f-b95f-372c0abbb1de\") " Nov 29 05:44:51 crc kubenswrapper[4594]: I1129 05:44:51.998457 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"b204dff1-e3d8-467f-b95f-372c0abbb1de\" (UID: \"b204dff1-e3d8-467f-b95f-372c0abbb1de\") " Nov 29 05:44:51 crc kubenswrapper[4594]: I1129 05:44:51.998510 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ff6b46fc-e420-41b4-b2f1-06ff16d73595-ovsdbserver-nb\") pod \"ff6b46fc-e420-41b4-b2f1-06ff16d73595\" (UID: \"ff6b46fc-e420-41b4-b2f1-06ff16d73595\") " Nov 29 05:44:51 crc kubenswrapper[4594]: I1129 05:44:51.998593 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wnw98\" (UniqueName: \"kubernetes.io/projected/ff6b46fc-e420-41b4-b2f1-06ff16d73595-kube-api-access-wnw98\") pod \"ff6b46fc-e420-41b4-b2f1-06ff16d73595\" (UID: \"ff6b46fc-e420-41b4-b2f1-06ff16d73595\") " Nov 29 05:44:51 crc kubenswrapper[4594]: I1129 05:44:51.998617 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/ff6b46fc-e420-41b4-b2f1-06ff16d73595-ovsdbserver-sb\") pod \"ff6b46fc-e420-41b4-b2f1-06ff16d73595\" (UID: \"ff6b46fc-e420-41b4-b2f1-06ff16d73595\") " Nov 29 05:44:51 crc kubenswrapper[4594]: I1129 05:44:51.998646 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b204dff1-e3d8-467f-b95f-372c0abbb1de-logs\") pod \"b204dff1-e3d8-467f-b95f-372c0abbb1de\" (UID: \"b204dff1-e3d8-467f-b95f-372c0abbb1de\") " Nov 29 05:44:51 crc kubenswrapper[4594]: I1129 05:44:51.998665 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff6b46fc-e420-41b4-b2f1-06ff16d73595-config\") pod \"ff6b46fc-e420-41b4-b2f1-06ff16d73595\" (UID: \"ff6b46fc-e420-41b4-b2f1-06ff16d73595\") " Nov 29 05:44:51 crc kubenswrapper[4594]: I1129 05:44:51.998727 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b204dff1-e3d8-467f-b95f-372c0abbb1de-scripts\") pod \"b204dff1-e3d8-467f-b95f-372c0abbb1de\" (UID: \"b204dff1-e3d8-467f-b95f-372c0abbb1de\") " Nov 29 05:44:51 crc kubenswrapper[4594]: I1129 05:44:51.998755 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7abfaea-4592-4e64-bcbc-5fa0331587a8-logs\") pod \"d7abfaea-4592-4e64-bcbc-5fa0331587a8\" (UID: \"d7abfaea-4592-4e64-bcbc-5fa0331587a8\") " Nov 29 05:44:51 crc kubenswrapper[4594]: I1129 05:44:51.998777 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff6b46fc-e420-41b4-b2f1-06ff16d73595-dns-svc\") pod \"ff6b46fc-e420-41b4-b2f1-06ff16d73595\" (UID: \"ff6b46fc-e420-41b4-b2f1-06ff16d73595\") " Nov 29 05:44:51 crc kubenswrapper[4594]: I1129 05:44:51.998808 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b204dff1-e3d8-467f-b95f-372c0abbb1de-combined-ca-bundle\") pod \"b204dff1-e3d8-467f-b95f-372c0abbb1de\" (UID: \"b204dff1-e3d8-467f-b95f-372c0abbb1de\") " Nov 29 05:44:51 crc kubenswrapper[4594]: I1129 05:44:51.998844 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n4qw9\" (UniqueName: \"kubernetes.io/projected/b204dff1-e3d8-467f-b95f-372c0abbb1de-kube-api-access-n4qw9\") pod \"b204dff1-e3d8-467f-b95f-372c0abbb1de\" (UID: \"b204dff1-e3d8-467f-b95f-372c0abbb1de\") " Nov 29 05:44:51 crc kubenswrapper[4594]: I1129 05:44:51.999285 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b204dff1-e3d8-467f-b95f-372c0abbb1de-logs" (OuterVolumeSpecName: "logs") pod "b204dff1-e3d8-467f-b95f-372c0abbb1de" (UID: "b204dff1-e3d8-467f-b95f-372c0abbb1de"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 05:44:51 crc kubenswrapper[4594]: I1129 05:44:51.999548 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7abfaea-4592-4e64-bcbc-5fa0331587a8-scripts\") pod \"d7abfaea-4592-4e64-bcbc-5fa0331587a8\" (UID: \"d7abfaea-4592-4e64-bcbc-5fa0331587a8\") " Nov 29 05:44:51 crc kubenswrapper[4594]: I1129 05:44:51.999580 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ff6b46fc-e420-41b4-b2f1-06ff16d73595-dns-swift-storage-0\") pod \"ff6b46fc-e420-41b4-b2f1-06ff16d73595\" (UID: \"ff6b46fc-e420-41b4-b2f1-06ff16d73595\") " Nov 29 05:44:51 crc kubenswrapper[4594]: I1129 05:44:51.999599 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7abfaea-4592-4e64-bcbc-5fa0331587a8-config-data\") pod \"d7abfaea-4592-4e64-bcbc-5fa0331587a8\" 
(UID: \"d7abfaea-4592-4e64-bcbc-5fa0331587a8\") " Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.000416 4594 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b204dff1-e3d8-467f-b95f-372c0abbb1de-logs\") on node \"crc\" DevicePath \"\"" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.005050 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7abfaea-4592-4e64-bcbc-5fa0331587a8-logs" (OuterVolumeSpecName: "logs") pod "d7abfaea-4592-4e64-bcbc-5fa0331587a8" (UID: "d7abfaea-4592-4e64-bcbc-5fa0331587a8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.034688 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "b204dff1-e3d8-467f-b95f-372c0abbb1de" (UID: "b204dff1-e3d8-467f-b95f-372c0abbb1de"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.040660 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "d7abfaea-4592-4e64-bcbc-5fa0331587a8" (UID: "d7abfaea-4592-4e64-bcbc-5fa0331587a8"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.042952 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff6b46fc-e420-41b4-b2f1-06ff16d73595-kube-api-access-wnw98" (OuterVolumeSpecName: "kube-api-access-wnw98") pod "ff6b46fc-e420-41b4-b2f1-06ff16d73595" (UID: "ff6b46fc-e420-41b4-b2f1-06ff16d73595"). InnerVolumeSpecName "kube-api-access-wnw98". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.044990 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7abfaea-4592-4e64-bcbc-5fa0331587a8-scripts" (OuterVolumeSpecName: "scripts") pod "d7abfaea-4592-4e64-bcbc-5fa0331587a8" (UID: "d7abfaea-4592-4e64-bcbc-5fa0331587a8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.047638 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b204dff1-e3d8-467f-b95f-372c0abbb1de-scripts" (OuterVolumeSpecName: "scripts") pod "b204dff1-e3d8-467f-b95f-372c0abbb1de" (UID: "b204dff1-e3d8-467f-b95f-372c0abbb1de"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.049418 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b204dff1-e3d8-467f-b95f-372c0abbb1de-kube-api-access-n4qw9" (OuterVolumeSpecName: "kube-api-access-n4qw9") pod "b204dff1-e3d8-467f-b95f-372c0abbb1de" (UID: "b204dff1-e3d8-467f-b95f-372c0abbb1de"). InnerVolumeSpecName "kube-api-access-n4qw9". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.085085 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55b99bf79c-72vg5" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.088645 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.090991 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.092405 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-749tk" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.092404 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-7b4h2" Nov 29 05:44:52 crc kubenswrapper[4594]: E1129 05:44:52.094199 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-barbican-api:current\\\"\"" pod="openstack/barbican-db-sync-mvvfl" podUID="72edd67c-f301-4e33-8d71-74f9bc7b99c5" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.101556 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b204dff1-e3d8-467f-b95f-372c0abbb1de-httpd-run\") pod \"b204dff1-e3d8-467f-b95f-372c0abbb1de\" (UID: \"b204dff1-e3d8-467f-b95f-372c0abbb1de\") " Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.101628 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7abfaea-4592-4e64-bcbc-5fa0331587a8-combined-ca-bundle\") pod \"d7abfaea-4592-4e64-bcbc-5fa0331587a8\" (UID: \"d7abfaea-4592-4e64-bcbc-5fa0331587a8\") " Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.101658 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckhmx\" (UniqueName: \"kubernetes.io/projected/d7abfaea-4592-4e64-bcbc-5fa0331587a8-kube-api-access-ckhmx\") pod \"d7abfaea-4592-4e64-bcbc-5fa0331587a8\" (UID: \"d7abfaea-4592-4e64-bcbc-5fa0331587a8\") " Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.101688 4594 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d7abfaea-4592-4e64-bcbc-5fa0331587a8-httpd-run\") pod \"d7abfaea-4592-4e64-bcbc-5fa0331587a8\" (UID: \"d7abfaea-4592-4e64-bcbc-5fa0331587a8\") " Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.101862 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b204dff1-e3d8-467f-b95f-372c0abbb1de-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "b204dff1-e3d8-467f-b95f-372c0abbb1de" (UID: "b204dff1-e3d8-467f-b95f-372c0abbb1de"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.102500 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wnw98\" (UniqueName: \"kubernetes.io/projected/ff6b46fc-e420-41b4-b2f1-06ff16d73595-kube-api-access-wnw98\") on node \"crc\" DevicePath \"\"" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.102524 4594 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b204dff1-e3d8-467f-b95f-372c0abbb1de-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.102535 4594 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7abfaea-4592-4e64-bcbc-5fa0331587a8-logs\") on node \"crc\" DevicePath \"\"" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.102545 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n4qw9\" (UniqueName: \"kubernetes.io/projected/b204dff1-e3d8-467f-b95f-372c0abbb1de-kube-api-access-n4qw9\") on node \"crc\" DevicePath \"\"" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.102552 4594 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7abfaea-4592-4e64-bcbc-5fa0331587a8-scripts\") on node \"crc\" 
DevicePath \"\"" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.102562 4594 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b204dff1-e3d8-467f-b95f-372c0abbb1de-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.102584 4594 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.102596 4594 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.108018 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7abfaea-4592-4e64-bcbc-5fa0331587a8-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "d7abfaea-4592-4e64-bcbc-5fa0331587a8" (UID: "d7abfaea-4592-4e64-bcbc-5fa0331587a8"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.146343 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7abfaea-4592-4e64-bcbc-5fa0331587a8-kube-api-access-ckhmx" (OuterVolumeSpecName: "kube-api-access-ckhmx") pod "d7abfaea-4592-4e64-bcbc-5fa0331587a8" (UID: "d7abfaea-4592-4e64-bcbc-5fa0331587a8"). InnerVolumeSpecName "kube-api-access-ckhmx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.211023 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckhmx\" (UniqueName: \"kubernetes.io/projected/d7abfaea-4592-4e64-bcbc-5fa0331587a8-kube-api-access-ckhmx\") on node \"crc\" DevicePath \"\"" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.211363 4594 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d7abfaea-4592-4e64-bcbc-5fa0331587a8-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.244711 4594 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.320021 4594 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Nov 29 05:44:52 crc kubenswrapper[4594]: W1129 05:44:52.372559 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod19dfde1f_d770_45ec_8735_78549b8fcb90.slice/crio-db9f8bd15a5d52b0f7ae29465c7098ef2cba54adf91ddbe0733aaa0be54756ed WatchSource:0}: Error finding container db9f8bd15a5d52b0f7ae29465c7098ef2cba54adf91ddbe0733aaa0be54756ed: Status 404 returned error can't find the container with id db9f8bd15a5d52b0f7ae29465c7098ef2cba54adf91ddbe0733aaa0be54756ed Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.529104 4594 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.613609 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/ff6b46fc-e420-41b4-b2f1-06ff16d73595-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ff6b46fc-e420-41b4-b2f1-06ff16d73595" (UID: "ff6b46fc-e420-41b4-b2f1-06ff16d73595"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.619963 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b204dff1-e3d8-467f-b95f-372c0abbb1de-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b204dff1-e3d8-467f-b95f-372c0abbb1de" (UID: "b204dff1-e3d8-467f-b95f-372c0abbb1de"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.629691 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7abfaea-4592-4e64-bcbc-5fa0331587a8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d7abfaea-4592-4e64-bcbc-5fa0331587a8" (UID: "d7abfaea-4592-4e64-bcbc-5fa0331587a8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.630918 4594 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7abfaea-4592-4e64-bcbc-5fa0331587a8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.630958 4594 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.630968 4594 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff6b46fc-e420-41b4-b2f1-06ff16d73595-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.630977 4594 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b204dff1-e3d8-467f-b95f-372c0abbb1de-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.633552 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff6b46fc-e420-41b4-b2f1-06ff16d73595-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ff6b46fc-e420-41b4-b2f1-06ff16d73595" (UID: "ff6b46fc-e420-41b4-b2f1-06ff16d73595"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.642600 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff6b46fc-e420-41b4-b2f1-06ff16d73595-config" (OuterVolumeSpecName: "config") pod "ff6b46fc-e420-41b4-b2f1-06ff16d73595" (UID: "ff6b46fc-e420-41b4-b2f1-06ff16d73595"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.647576 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff6b46fc-e420-41b4-b2f1-06ff16d73595-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ff6b46fc-e420-41b4-b2f1-06ff16d73595" (UID: "ff6b46fc-e420-41b4-b2f1-06ff16d73595"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.655590 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff6b46fc-e420-41b4-b2f1-06ff16d73595-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ff6b46fc-e420-41b4-b2f1-06ff16d73595" (UID: "ff6b46fc-e420-41b4-b2f1-06ff16d73595"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.660201 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7abfaea-4592-4e64-bcbc-5fa0331587a8-config-data" (OuterVolumeSpecName: "config-data") pod "d7abfaea-4592-4e64-bcbc-5fa0331587a8" (UID: "d7abfaea-4592-4e64-bcbc-5fa0331587a8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.664621 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b204dff1-e3d8-467f-b95f-372c0abbb1de-config-data" (OuterVolumeSpecName: "config-data") pod "b204dff1-e3d8-467f-b95f-372c0abbb1de" (UID: "b204dff1-e3d8-467f-b95f-372c0abbb1de"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.729029 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7fb8849f48-9rr52"] Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.729063 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55b99bf79c-72vg5" event={"ID":"ff6b46fc-e420-41b4-b2f1-06ff16d73595","Type":"ContainerDied","Data":"e108da01ae5da191e328593e817fd401637521988ea5a13bb9b20b18cdf4ad1a"} Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.729089 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d7abfaea-4592-4e64-bcbc-5fa0331587a8","Type":"ContainerDied","Data":"c76e9cc7836bb036f6c1a4e5fabc0cc8c3b8e1c6fb6cf668de4297cfdc868695"} Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.729105 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b204dff1-e3d8-467f-b95f-372c0abbb1de","Type":"ContainerDied","Data":"0bbae0e131888cf6e6ed7ec8293dc4a7bf442330c68e4fa02923f67f776d0088"} Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.729118 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-7b4h2" event={"ID":"968317f6-c4d9-4647-a166-0cadc0fa57f2","Type":"ContainerDied","Data":"963d0315079b92d420e2feaa6f31a0549cd2377f92c8e120cf226885fc66651a"} Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.729134 4594 scope.go:117] "RemoveContainer" containerID="5c4fa1739f79e3406180b7ae3f1b98d8a8d214f5d1ba328a9114a9710ee6e324" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.729543 4594 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="963d0315079b92d420e2feaa6f31a0549cd2377f92c8e120cf226885fc66651a" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.729583 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/keystone-bootstrap-749tk"] Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.729604 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-749tk"] Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.729622 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-gkrjf"] Nov 29 05:44:52 crc kubenswrapper[4594]: E1129 05:44:52.729934 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7abfaea-4592-4e64-bcbc-5fa0331587a8" containerName="glance-httpd" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.729952 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7abfaea-4592-4e64-bcbc-5fa0331587a8" containerName="glance-httpd" Nov 29 05:44:52 crc kubenswrapper[4594]: E1129 05:44:52.729965 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc29fae8-4b90-4702-8c68-4cec263fd4c9" containerName="keystone-bootstrap" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.729972 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc29fae8-4b90-4702-8c68-4cec263fd4c9" containerName="keystone-bootstrap" Nov 29 05:44:52 crc kubenswrapper[4594]: E1129 05:44:52.729982 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="968317f6-c4d9-4647-a166-0cadc0fa57f2" containerName="neutron-db-sync" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.729990 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="968317f6-c4d9-4647-a166-0cadc0fa57f2" containerName="neutron-db-sync" Nov 29 05:44:52 crc kubenswrapper[4594]: E1129 05:44:52.730005 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b204dff1-e3d8-467f-b95f-372c0abbb1de" containerName="glance-httpd" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.730011 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="b204dff1-e3d8-467f-b95f-372c0abbb1de" containerName="glance-httpd" Nov 29 05:44:52 crc kubenswrapper[4594]: E1129 
05:44:52.730020 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7abfaea-4592-4e64-bcbc-5fa0331587a8" containerName="glance-log" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.730026 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7abfaea-4592-4e64-bcbc-5fa0331587a8" containerName="glance-log" Nov 29 05:44:52 crc kubenswrapper[4594]: E1129 05:44:52.730040 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff6b46fc-e420-41b4-b2f1-06ff16d73595" containerName="init" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.730048 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff6b46fc-e420-41b4-b2f1-06ff16d73595" containerName="init" Nov 29 05:44:52 crc kubenswrapper[4594]: E1129 05:44:52.730058 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b204dff1-e3d8-467f-b95f-372c0abbb1de" containerName="glance-log" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.730063 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="b204dff1-e3d8-467f-b95f-372c0abbb1de" containerName="glance-log" Nov 29 05:44:52 crc kubenswrapper[4594]: E1129 05:44:52.730073 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff6b46fc-e420-41b4-b2f1-06ff16d73595" containerName="dnsmasq-dns" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.730078 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff6b46fc-e420-41b4-b2f1-06ff16d73595" containerName="dnsmasq-dns" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.730235 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7abfaea-4592-4e64-bcbc-5fa0331587a8" containerName="glance-httpd" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.730246 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="b204dff1-e3d8-467f-b95f-372c0abbb1de" containerName="glance-httpd" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.730278 4594 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="d7abfaea-4592-4e64-bcbc-5fa0331587a8" containerName="glance-log" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.730292 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="b204dff1-e3d8-467f-b95f-372c0abbb1de" containerName="glance-log" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.730306 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff6b46fc-e420-41b4-b2f1-06ff16d73595" containerName="dnsmasq-dns" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.730316 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="968317f6-c4d9-4647-a166-0cadc0fa57f2" containerName="neutron-db-sync" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.730327 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc29fae8-4b90-4702-8c68-4cec263fd4c9" containerName="keystone-bootstrap" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.730893 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-gkrjf"] Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.730913 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-558d4b85cb-k5j98"] Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.730992 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-gkrjf" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.739797 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.739985 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.740087 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.741795 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.742829 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-nz5kq" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.745158 4594 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b204dff1-e3d8-467f-b95f-372c0abbb1de-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.745182 4594 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ff6b46fc-e420-41b4-b2f1-06ff16d73595-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.745191 4594 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ff6b46fc-e420-41b4-b2f1-06ff16d73595-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.745200 4594 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff6b46fc-e420-41b4-b2f1-06ff16d73595-config\") on node \"crc\" DevicePath \"\"" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.745208 4594 
reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ff6b46fc-e420-41b4-b2f1-06ff16d73595-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.745218 4594 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7abfaea-4592-4e64-bcbc-5fa0331587a8-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.774498 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-67b86667d5-d25cf"] Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.776594 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67b86667d5-d25cf" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.804170 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67b86667d5-d25cf"] Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.848189 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-66bbc5d8dd-fzbhl"] Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.849770 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qv4b\" (UniqueName: \"kubernetes.io/projected/d7cc6b30-a981-4486-9d8a-e926167f001b-kube-api-access-7qv4b\") pod \"keystone-bootstrap-gkrjf\" (UID: \"d7cc6b30-a981-4486-9d8a-e926167f001b\") " pod="openstack/keystone-bootstrap-gkrjf" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.849805 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d7cc6b30-a981-4486-9d8a-e926167f001b-fernet-keys\") pod \"keystone-bootstrap-gkrjf\" (UID: \"d7cc6b30-a981-4486-9d8a-e926167f001b\") " pod="openstack/keystone-bootstrap-gkrjf" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 
05:44:52.849836 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f39fdd2d-1f5e-440f-996c-08ae2e749d3e-config\") pod \"dnsmasq-dns-67b86667d5-d25cf\" (UID: \"f39fdd2d-1f5e-440f-996c-08ae2e749d3e\") " pod="openstack/dnsmasq-dns-67b86667d5-d25cf" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.849876 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7cc6b30-a981-4486-9d8a-e926167f001b-combined-ca-bundle\") pod \"keystone-bootstrap-gkrjf\" (UID: \"d7cc6b30-a981-4486-9d8a-e926167f001b\") " pod="openstack/keystone-bootstrap-gkrjf" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.849891 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f39fdd2d-1f5e-440f-996c-08ae2e749d3e-dns-swift-storage-0\") pod \"dnsmasq-dns-67b86667d5-d25cf\" (UID: \"f39fdd2d-1f5e-440f-996c-08ae2e749d3e\") " pod="openstack/dnsmasq-dns-67b86667d5-d25cf" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.849929 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f39fdd2d-1f5e-440f-996c-08ae2e749d3e-ovsdbserver-sb\") pod \"dnsmasq-dns-67b86667d5-d25cf\" (UID: \"f39fdd2d-1f5e-440f-996c-08ae2e749d3e\") " pod="openstack/dnsmasq-dns-67b86667d5-d25cf" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.849962 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d7cc6b30-a981-4486-9d8a-e926167f001b-credential-keys\") pod \"keystone-bootstrap-gkrjf\" (UID: \"d7cc6b30-a981-4486-9d8a-e926167f001b\") " pod="openstack/keystone-bootstrap-gkrjf" Nov 29 05:44:52 crc 
kubenswrapper[4594]: I1129 05:44:52.849982 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f39fdd2d-1f5e-440f-996c-08ae2e749d3e-dns-svc\") pod \"dnsmasq-dns-67b86667d5-d25cf\" (UID: \"f39fdd2d-1f5e-440f-996c-08ae2e749d3e\") " pod="openstack/dnsmasq-dns-67b86667d5-d25cf" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.849997 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtgh7\" (UniqueName: \"kubernetes.io/projected/f39fdd2d-1f5e-440f-996c-08ae2e749d3e-kube-api-access-mtgh7\") pod \"dnsmasq-dns-67b86667d5-d25cf\" (UID: \"f39fdd2d-1f5e-440f-996c-08ae2e749d3e\") " pod="openstack/dnsmasq-dns-67b86667d5-d25cf" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.850021 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7cc6b30-a981-4486-9d8a-e926167f001b-config-data\") pod \"keystone-bootstrap-gkrjf\" (UID: \"d7cc6b30-a981-4486-9d8a-e926167f001b\") " pod="openstack/keystone-bootstrap-gkrjf" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.850059 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7cc6b30-a981-4486-9d8a-e926167f001b-scripts\") pod \"keystone-bootstrap-gkrjf\" (UID: \"d7cc6b30-a981-4486-9d8a-e926167f001b\") " pod="openstack/keystone-bootstrap-gkrjf" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.850090 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f39fdd2d-1f5e-440f-996c-08ae2e749d3e-ovsdbserver-nb\") pod \"dnsmasq-dns-67b86667d5-d25cf\" (UID: \"f39fdd2d-1f5e-440f-996c-08ae2e749d3e\") " pod="openstack/dnsmasq-dns-67b86667d5-d25cf" Nov 29 05:44:52 crc 
kubenswrapper[4594]: I1129 05:44:52.855037 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-66bbc5d8dd-fzbhl" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.864746 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.865087 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.865217 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-8hbsc" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.866336 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.882919 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-66bbc5d8dd-fzbhl"] Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.964173 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f39fdd2d-1f5e-440f-996c-08ae2e749d3e-ovsdbserver-sb\") pod \"dnsmasq-dns-67b86667d5-d25cf\" (UID: \"f39fdd2d-1f5e-440f-996c-08ae2e749d3e\") " pod="openstack/dnsmasq-dns-67b86667d5-d25cf" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.964442 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/860e92d3-234a-46aa-b6e1-ea0ffbb8a4be-config\") pod \"neutron-66bbc5d8dd-fzbhl\" (UID: \"860e92d3-234a-46aa-b6e1-ea0ffbb8a4be\") " pod="openstack/neutron-66bbc5d8dd-fzbhl" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.964489 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/d7cc6b30-a981-4486-9d8a-e926167f001b-credential-keys\") pod \"keystone-bootstrap-gkrjf\" (UID: \"d7cc6b30-a981-4486-9d8a-e926167f001b\") " pod="openstack/keystone-bootstrap-gkrjf" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.964514 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/860e92d3-234a-46aa-b6e1-ea0ffbb8a4be-httpd-config\") pod \"neutron-66bbc5d8dd-fzbhl\" (UID: \"860e92d3-234a-46aa-b6e1-ea0ffbb8a4be\") " pod="openstack/neutron-66bbc5d8dd-fzbhl" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.964534 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f39fdd2d-1f5e-440f-996c-08ae2e749d3e-dns-svc\") pod \"dnsmasq-dns-67b86667d5-d25cf\" (UID: \"f39fdd2d-1f5e-440f-996c-08ae2e749d3e\") " pod="openstack/dnsmasq-dns-67b86667d5-d25cf" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.964553 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtgh7\" (UniqueName: \"kubernetes.io/projected/f39fdd2d-1f5e-440f-996c-08ae2e749d3e-kube-api-access-mtgh7\") pod \"dnsmasq-dns-67b86667d5-d25cf\" (UID: \"f39fdd2d-1f5e-440f-996c-08ae2e749d3e\") " pod="openstack/dnsmasq-dns-67b86667d5-d25cf" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.964593 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7cc6b30-a981-4486-9d8a-e926167f001b-config-data\") pod \"keystone-bootstrap-gkrjf\" (UID: \"d7cc6b30-a981-4486-9d8a-e926167f001b\") " pod="openstack/keystone-bootstrap-gkrjf" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.964646 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7cc6b30-a981-4486-9d8a-e926167f001b-scripts\") pod 
\"keystone-bootstrap-gkrjf\" (UID: \"d7cc6b30-a981-4486-9d8a-e926167f001b\") " pod="openstack/keystone-bootstrap-gkrjf" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.964699 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f39fdd2d-1f5e-440f-996c-08ae2e749d3e-ovsdbserver-nb\") pod \"dnsmasq-dns-67b86667d5-d25cf\" (UID: \"f39fdd2d-1f5e-440f-996c-08ae2e749d3e\") " pod="openstack/dnsmasq-dns-67b86667d5-d25cf" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.964716 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/860e92d3-234a-46aa-b6e1-ea0ffbb8a4be-combined-ca-bundle\") pod \"neutron-66bbc5d8dd-fzbhl\" (UID: \"860e92d3-234a-46aa-b6e1-ea0ffbb8a4be\") " pod="openstack/neutron-66bbc5d8dd-fzbhl" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.964755 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6zrw\" (UniqueName: \"kubernetes.io/projected/860e92d3-234a-46aa-b6e1-ea0ffbb8a4be-kube-api-access-x6zrw\") pod \"neutron-66bbc5d8dd-fzbhl\" (UID: \"860e92d3-234a-46aa-b6e1-ea0ffbb8a4be\") " pod="openstack/neutron-66bbc5d8dd-fzbhl" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.964817 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/860e92d3-234a-46aa-b6e1-ea0ffbb8a4be-ovndb-tls-certs\") pod \"neutron-66bbc5d8dd-fzbhl\" (UID: \"860e92d3-234a-46aa-b6e1-ea0ffbb8a4be\") " pod="openstack/neutron-66bbc5d8dd-fzbhl" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.964884 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qv4b\" (UniqueName: \"kubernetes.io/projected/d7cc6b30-a981-4486-9d8a-e926167f001b-kube-api-access-7qv4b\") 
pod \"keystone-bootstrap-gkrjf\" (UID: \"d7cc6b30-a981-4486-9d8a-e926167f001b\") " pod="openstack/keystone-bootstrap-gkrjf" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.964912 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d7cc6b30-a981-4486-9d8a-e926167f001b-fernet-keys\") pod \"keystone-bootstrap-gkrjf\" (UID: \"d7cc6b30-a981-4486-9d8a-e926167f001b\") " pod="openstack/keystone-bootstrap-gkrjf" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.964945 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f39fdd2d-1f5e-440f-996c-08ae2e749d3e-config\") pod \"dnsmasq-dns-67b86667d5-d25cf\" (UID: \"f39fdd2d-1f5e-440f-996c-08ae2e749d3e\") " pod="openstack/dnsmasq-dns-67b86667d5-d25cf" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.965006 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7cc6b30-a981-4486-9d8a-e926167f001b-combined-ca-bundle\") pod \"keystone-bootstrap-gkrjf\" (UID: \"d7cc6b30-a981-4486-9d8a-e926167f001b\") " pod="openstack/keystone-bootstrap-gkrjf" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.965023 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f39fdd2d-1f5e-440f-996c-08ae2e749d3e-dns-swift-storage-0\") pod \"dnsmasq-dns-67b86667d5-d25cf\" (UID: \"f39fdd2d-1f5e-440f-996c-08ae2e749d3e\") " pod="openstack/dnsmasq-dns-67b86667d5-d25cf" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.965949 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f39fdd2d-1f5e-440f-996c-08ae2e749d3e-dns-swift-storage-0\") pod \"dnsmasq-dns-67b86667d5-d25cf\" (UID: \"f39fdd2d-1f5e-440f-996c-08ae2e749d3e\") " 
pod="openstack/dnsmasq-dns-67b86667d5-d25cf" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.968563 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f39fdd2d-1f5e-440f-996c-08ae2e749d3e-ovsdbserver-nb\") pod \"dnsmasq-dns-67b86667d5-d25cf\" (UID: \"f39fdd2d-1f5e-440f-996c-08ae2e749d3e\") " pod="openstack/dnsmasq-dns-67b86667d5-d25cf" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.969964 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f39fdd2d-1f5e-440f-996c-08ae2e749d3e-ovsdbserver-sb\") pod \"dnsmasq-dns-67b86667d5-d25cf\" (UID: \"f39fdd2d-1f5e-440f-996c-08ae2e749d3e\") " pod="openstack/dnsmasq-dns-67b86667d5-d25cf" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.971280 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f39fdd2d-1f5e-440f-996c-08ae2e749d3e-config\") pod \"dnsmasq-dns-67b86667d5-d25cf\" (UID: \"f39fdd2d-1f5e-440f-996c-08ae2e749d3e\") " pod="openstack/dnsmasq-dns-67b86667d5-d25cf" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.971506 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f39fdd2d-1f5e-440f-996c-08ae2e749d3e-dns-svc\") pod \"dnsmasq-dns-67b86667d5-d25cf\" (UID: \"f39fdd2d-1f5e-440f-996c-08ae2e749d3e\") " pod="openstack/dnsmasq-dns-67b86667d5-d25cf" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.974821 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7cc6b30-a981-4486-9d8a-e926167f001b-config-data\") pod \"keystone-bootstrap-gkrjf\" (UID: \"d7cc6b30-a981-4486-9d8a-e926167f001b\") " pod="openstack/keystone-bootstrap-gkrjf" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.976837 4594 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7cc6b30-a981-4486-9d8a-e926167f001b-combined-ca-bundle\") pod \"keystone-bootstrap-gkrjf\" (UID: \"d7cc6b30-a981-4486-9d8a-e926167f001b\") " pod="openstack/keystone-bootstrap-gkrjf" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.987817 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d7cc6b30-a981-4486-9d8a-e926167f001b-credential-keys\") pod \"keystone-bootstrap-gkrjf\" (UID: \"d7cc6b30-a981-4486-9d8a-e926167f001b\") " pod="openstack/keystone-bootstrap-gkrjf" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.989391 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7cc6b30-a981-4486-9d8a-e926167f001b-scripts\") pod \"keystone-bootstrap-gkrjf\" (UID: \"d7cc6b30-a981-4486-9d8a-e926167f001b\") " pod="openstack/keystone-bootstrap-gkrjf" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.991622 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtgh7\" (UniqueName: \"kubernetes.io/projected/f39fdd2d-1f5e-440f-996c-08ae2e749d3e-kube-api-access-mtgh7\") pod \"dnsmasq-dns-67b86667d5-d25cf\" (UID: \"f39fdd2d-1f5e-440f-996c-08ae2e749d3e\") " pod="openstack/dnsmasq-dns-67b86667d5-d25cf" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.993564 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d7cc6b30-a981-4486-9d8a-e926167f001b-fernet-keys\") pod \"keystone-bootstrap-gkrjf\" (UID: \"d7cc6b30-a981-4486-9d8a-e926167f001b\") " pod="openstack/keystone-bootstrap-gkrjf" Nov 29 05:44:52 crc kubenswrapper[4594]: I1129 05:44:52.995552 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qv4b\" (UniqueName: 
\"kubernetes.io/projected/d7cc6b30-a981-4486-9d8a-e926167f001b-kube-api-access-7qv4b\") pod \"keystone-bootstrap-gkrjf\" (UID: \"d7cc6b30-a981-4486-9d8a-e926167f001b\") " pod="openstack/keystone-bootstrap-gkrjf" Nov 29 05:44:53 crc kubenswrapper[4594]: I1129 05:44:53.003325 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55b99bf79c-72vg5"] Nov 29 05:44:53 crc kubenswrapper[4594]: I1129 05:44:53.035148 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55b99bf79c-72vg5"] Nov 29 05:44:53 crc kubenswrapper[4594]: I1129 05:44:53.068977 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 29 05:44:53 crc kubenswrapper[4594]: I1129 05:44:53.072236 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/860e92d3-234a-46aa-b6e1-ea0ffbb8a4be-httpd-config\") pod \"neutron-66bbc5d8dd-fzbhl\" (UID: \"860e92d3-234a-46aa-b6e1-ea0ffbb8a4be\") " pod="openstack/neutron-66bbc5d8dd-fzbhl" Nov 29 05:44:53 crc kubenswrapper[4594]: I1129 05:44:53.072456 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/860e92d3-234a-46aa-b6e1-ea0ffbb8a4be-combined-ca-bundle\") pod \"neutron-66bbc5d8dd-fzbhl\" (UID: \"860e92d3-234a-46aa-b6e1-ea0ffbb8a4be\") " pod="openstack/neutron-66bbc5d8dd-fzbhl" Nov 29 05:44:53 crc kubenswrapper[4594]: I1129 05:44:53.072505 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6zrw\" (UniqueName: \"kubernetes.io/projected/860e92d3-234a-46aa-b6e1-ea0ffbb8a4be-kube-api-access-x6zrw\") pod \"neutron-66bbc5d8dd-fzbhl\" (UID: \"860e92d3-234a-46aa-b6e1-ea0ffbb8a4be\") " pod="openstack/neutron-66bbc5d8dd-fzbhl" Nov 29 05:44:53 crc kubenswrapper[4594]: I1129 05:44:53.072591 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/860e92d3-234a-46aa-b6e1-ea0ffbb8a4be-ovndb-tls-certs\") pod \"neutron-66bbc5d8dd-fzbhl\" (UID: \"860e92d3-234a-46aa-b6e1-ea0ffbb8a4be\") " pod="openstack/neutron-66bbc5d8dd-fzbhl" Nov 29 05:44:53 crc kubenswrapper[4594]: I1129 05:44:53.072830 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/860e92d3-234a-46aa-b6e1-ea0ffbb8a4be-config\") pod \"neutron-66bbc5d8dd-fzbhl\" (UID: \"860e92d3-234a-46aa-b6e1-ea0ffbb8a4be\") " pod="openstack/neutron-66bbc5d8dd-fzbhl" Nov 29 05:44:53 crc kubenswrapper[4594]: I1129 05:44:53.078621 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 29 05:44:53 crc kubenswrapper[4594]: I1129 05:44:53.081399 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 29 05:44:53 crc kubenswrapper[4594]: I1129 05:44:53.082672 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/860e92d3-234a-46aa-b6e1-ea0ffbb8a4be-config\") pod \"neutron-66bbc5d8dd-fzbhl\" (UID: \"860e92d3-234a-46aa-b6e1-ea0ffbb8a4be\") " pod="openstack/neutron-66bbc5d8dd-fzbhl" Nov 29 05:44:53 crc kubenswrapper[4594]: I1129 05:44:53.083091 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/860e92d3-234a-46aa-b6e1-ea0ffbb8a4be-httpd-config\") pod \"neutron-66bbc5d8dd-fzbhl\" (UID: \"860e92d3-234a-46aa-b6e1-ea0ffbb8a4be\") " pod="openstack/neutron-66bbc5d8dd-fzbhl" Nov 29 05:44:53 crc kubenswrapper[4594]: I1129 05:44:53.091416 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/860e92d3-234a-46aa-b6e1-ea0ffbb8a4be-ovndb-tls-certs\") pod \"neutron-66bbc5d8dd-fzbhl\" (UID: \"860e92d3-234a-46aa-b6e1-ea0ffbb8a4be\") " 
pod="openstack/neutron-66bbc5d8dd-fzbhl" Nov 29 05:44:53 crc kubenswrapper[4594]: I1129 05:44:53.093134 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6zrw\" (UniqueName: \"kubernetes.io/projected/860e92d3-234a-46aa-b6e1-ea0ffbb8a4be-kube-api-access-x6zrw\") pod \"neutron-66bbc5d8dd-fzbhl\" (UID: \"860e92d3-234a-46aa-b6e1-ea0ffbb8a4be\") " pod="openstack/neutron-66bbc5d8dd-fzbhl" Nov 29 05:44:53 crc kubenswrapper[4594]: I1129 05:44:53.094102 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/860e92d3-234a-46aa-b6e1-ea0ffbb8a4be-combined-ca-bundle\") pod \"neutron-66bbc5d8dd-fzbhl\" (UID: \"860e92d3-234a-46aa-b6e1-ea0ffbb8a4be\") " pod="openstack/neutron-66bbc5d8dd-fzbhl" Nov 29 05:44:53 crc kubenswrapper[4594]: I1129 05:44:53.094207 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 29 05:44:53 crc kubenswrapper[4594]: I1129 05:44:53.103977 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Nov 29 05:44:53 crc kubenswrapper[4594]: I1129 05:44:53.106993 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 29 05:44:53 crc kubenswrapper[4594]: I1129 05:44:53.125703 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-rtrkc" Nov 29 05:44:53 crc kubenswrapper[4594]: I1129 05:44:53.126022 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Nov 29 05:44:53 crc kubenswrapper[4594]: I1129 05:44:53.126184 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Nov 29 05:44:53 crc kubenswrapper[4594]: I1129 05:44:53.126874 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Nov 29 05:44:53 crc kubenswrapper[4594]: I1129 05:44:53.136532 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-558d4b85cb-k5j98" event={"ID":"19dfde1f-d770-45ec-8735-78549b8fcb90","Type":"ContainerStarted","Data":"db9f8bd15a5d52b0f7ae29465c7098ef2cba54adf91ddbe0733aaa0be54756ed"} Nov 29 05:44:53 crc kubenswrapper[4594]: I1129 05:44:53.149463 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dfe84a63-fea5-455e-95d3-523a091b976f","Type":"ContainerStarted","Data":"8b910ba656191f153188fed85a2fe961876dddee8993298e48bca1621bd41c04"} Nov 29 05:44:53 crc kubenswrapper[4594]: I1129 05:44:53.161524 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 29 05:44:53 crc kubenswrapper[4594]: I1129 05:44:53.176123 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/315c99c4-4c5f-4bff-9ff0-116d4e7bf846-logs\") pod \"glance-default-external-api-0\" (UID: \"315c99c4-4c5f-4bff-9ff0-116d4e7bf846\") " pod="openstack/glance-default-external-api-0" Nov 29 05:44:53 crc kubenswrapper[4594]: I1129 05:44:53.176192 4594 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/315c99c4-4c5f-4bff-9ff0-116d4e7bf846-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"315c99c4-4c5f-4bff-9ff0-116d4e7bf846\") " pod="openstack/glance-default-external-api-0" Nov 29 05:44:53 crc kubenswrapper[4594]: I1129 05:44:53.176217 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"315c99c4-4c5f-4bff-9ff0-116d4e7bf846\") " pod="openstack/glance-default-external-api-0" Nov 29 05:44:53 crc kubenswrapper[4594]: I1129 05:44:53.176269 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/315c99c4-4c5f-4bff-9ff0-116d4e7bf846-config-data\") pod \"glance-default-external-api-0\" (UID: \"315c99c4-4c5f-4bff-9ff0-116d4e7bf846\") " pod="openstack/glance-default-external-api-0" Nov 29 05:44:53 crc kubenswrapper[4594]: I1129 05:44:53.176311 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/315c99c4-4c5f-4bff-9ff0-116d4e7bf846-scripts\") pod \"glance-default-external-api-0\" (UID: \"315c99c4-4c5f-4bff-9ff0-116d4e7bf846\") " pod="openstack/glance-default-external-api-0" Nov 29 05:44:53 crc kubenswrapper[4594]: I1129 05:44:53.176546 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/315c99c4-4c5f-4bff-9ff0-116d4e7bf846-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"315c99c4-4c5f-4bff-9ff0-116d4e7bf846\") " pod="openstack/glance-default-external-api-0" Nov 29 05:44:53 crc kubenswrapper[4594]: I1129 
05:44:53.176598 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/315c99c4-4c5f-4bff-9ff0-116d4e7bf846-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"315c99c4-4c5f-4bff-9ff0-116d4e7bf846\") " pod="openstack/glance-default-external-api-0" Nov 29 05:44:53 crc kubenswrapper[4594]: I1129 05:44:53.176625 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blm49\" (UniqueName: \"kubernetes.io/projected/315c99c4-4c5f-4bff-9ff0-116d4e7bf846-kube-api-access-blm49\") pod \"glance-default-external-api-0\" (UID: \"315c99c4-4c5f-4bff-9ff0-116d4e7bf846\") " pod="openstack/glance-default-external-api-0" Nov 29 05:44:53 crc kubenswrapper[4594]: I1129 05:44:53.178061 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-bg92f" event={"ID":"7474ba72-1534-4193-9590-e46dfc840403","Type":"ContainerStarted","Data":"420c1c005e74da6ed5d7566f41328c8fed77e3abf21045a59554138235cc373e"} Nov 29 05:44:53 crc kubenswrapper[4594]: I1129 05:44:53.188426 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 29 05:44:53 crc kubenswrapper[4594]: I1129 05:44:53.200532 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 29 05:44:53 crc kubenswrapper[4594]: I1129 05:44:53.200651 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 29 05:44:53 crc kubenswrapper[4594]: I1129 05:44:53.206673 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Nov 29 05:44:53 crc kubenswrapper[4594]: I1129 05:44:53.207774 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Nov 29 05:44:53 crc kubenswrapper[4594]: I1129 05:44:53.208280 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-gkrjf" Nov 29 05:44:53 crc kubenswrapper[4594]: I1129 05:44:53.210133 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7bf6f9df7f-knsk7" event={"ID":"9c4f4aa8-f25f-4684-8707-3bc0eb168954","Type":"ContainerStarted","Data":"c63f69a44e3afa1d1ac33136814d6b792d5e9273aa32f43e16c6771170da44e1"} Nov 29 05:44:53 crc kubenswrapper[4594]: I1129 05:44:53.212887 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-db-sync-bg92f" podStartSLOduration=3.18999153 podStartE2EDuration="45.212874346s" podCreationTimestamp="2025-11-29 05:44:08 +0000 UTC" firstStartedPulling="2025-11-29 05:44:09.616045005 +0000 UTC m=+973.856554225" lastFinishedPulling="2025-11-29 05:44:51.638927821 +0000 UTC m=+1015.879437041" observedRunningTime="2025-11-29 05:44:53.19614789 +0000 UTC m=+1017.436657110" watchObservedRunningTime="2025-11-29 05:44:53.212874346 +0000 UTC m=+1017.453383567" Nov 29 05:44:53 crc kubenswrapper[4594]: I1129 05:44:53.216705 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67b86667d5-d25cf" Nov 29 05:44:53 crc kubenswrapper[4594]: I1129 05:44:53.213819 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6cf4cc7b4f-hg8dz" event={"ID":"9ac3735b-d8a9-4723-8576-dc16c7af5756","Type":"ContainerStarted","Data":"cdd6b473e7211578bb12623c83ba72829a586f190d64dd6207737e5e807240b7"} Nov 29 05:44:53 crc kubenswrapper[4594]: I1129 05:44:53.223925 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-66bbc5d8dd-fzbhl" Nov 29 05:44:53 crc kubenswrapper[4594]: I1129 05:44:53.230079 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7fb8849f48-9rr52" event={"ID":"e8d01e45-1d76-464f-93e6-965d65a055fc","Type":"ContainerStarted","Data":"cbea952840713cbdc824cef609595cd34ec189700a5184841e64c76a84b7656b"} Nov 29 05:44:53 crc kubenswrapper[4594]: I1129 05:44:53.235052 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-658d8f8767-q5w5m" event={"ID":"e786ace8-0d46-488a-941c-2325002c5edc","Type":"ContainerStarted","Data":"5ebe1b1187197d5f07df1ccb9f78604659450fccdeb6a5cc9fda337a0746d245"} Nov 29 05:44:53 crc kubenswrapper[4594]: I1129 05:44:53.279764 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/315c99c4-4c5f-4bff-9ff0-116d4e7bf846-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"315c99c4-4c5f-4bff-9ff0-116d4e7bf846\") " pod="openstack/glance-default-external-api-0" Nov 29 05:44:53 crc kubenswrapper[4594]: I1129 05:44:53.279859 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"075b6980-e8ff-4248-ad5e-fb33fd7199d3\") " pod="openstack/glance-default-internal-api-0" Nov 29 05:44:53 crc 
kubenswrapper[4594]: I1129 05:44:53.279884 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/315c99c4-4c5f-4bff-9ff0-116d4e7bf846-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"315c99c4-4c5f-4bff-9ff0-116d4e7bf846\") " pod="openstack/glance-default-external-api-0" Nov 29 05:44:53 crc kubenswrapper[4594]: I1129 05:44:53.279912 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blm49\" (UniqueName: \"kubernetes.io/projected/315c99c4-4c5f-4bff-9ff0-116d4e7bf846-kube-api-access-blm49\") pod \"glance-default-external-api-0\" (UID: \"315c99c4-4c5f-4bff-9ff0-116d4e7bf846\") " pod="openstack/glance-default-external-api-0" Nov 29 05:44:53 crc kubenswrapper[4594]: I1129 05:44:53.280824 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vl7r2\" (UniqueName: \"kubernetes.io/projected/075b6980-e8ff-4248-ad5e-fb33fd7199d3-kube-api-access-vl7r2\") pod \"glance-default-internal-api-0\" (UID: \"075b6980-e8ff-4248-ad5e-fb33fd7199d3\") " pod="openstack/glance-default-internal-api-0" Nov 29 05:44:53 crc kubenswrapper[4594]: I1129 05:44:53.280934 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/315c99c4-4c5f-4bff-9ff0-116d4e7bf846-logs\") pod \"glance-default-external-api-0\" (UID: \"315c99c4-4c5f-4bff-9ff0-116d4e7bf846\") " pod="openstack/glance-default-external-api-0" Nov 29 05:44:53 crc kubenswrapper[4594]: I1129 05:44:53.280958 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/075b6980-e8ff-4248-ad5e-fb33fd7199d3-logs\") pod \"glance-default-internal-api-0\" (UID: \"075b6980-e8ff-4248-ad5e-fb33fd7199d3\") " pod="openstack/glance-default-internal-api-0" Nov 29 05:44:53 crc kubenswrapper[4594]: I1129 
05:44:53.281027 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/315c99c4-4c5f-4bff-9ff0-116d4e7bf846-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"315c99c4-4c5f-4bff-9ff0-116d4e7bf846\") " pod="openstack/glance-default-external-api-0" Nov 29 05:44:53 crc kubenswrapper[4594]: I1129 05:44:53.281043 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/075b6980-e8ff-4248-ad5e-fb33fd7199d3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"075b6980-e8ff-4248-ad5e-fb33fd7199d3\") " pod="openstack/glance-default-internal-api-0" Nov 29 05:44:53 crc kubenswrapper[4594]: I1129 05:44:53.281200 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"315c99c4-4c5f-4bff-9ff0-116d4e7bf846\") " pod="openstack/glance-default-external-api-0" Nov 29 05:44:53 crc kubenswrapper[4594]: I1129 05:44:53.281306 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/315c99c4-4c5f-4bff-9ff0-116d4e7bf846-config-data\") pod \"glance-default-external-api-0\" (UID: \"315c99c4-4c5f-4bff-9ff0-116d4e7bf846\") " pod="openstack/glance-default-external-api-0" Nov 29 05:44:53 crc kubenswrapper[4594]: I1129 05:44:53.281336 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/075b6980-e8ff-4248-ad5e-fb33fd7199d3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"075b6980-e8ff-4248-ad5e-fb33fd7199d3\") " pod="openstack/glance-default-internal-api-0" Nov 29 05:44:53 crc kubenswrapper[4594]: I1129 05:44:53.281356 4594 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/075b6980-e8ff-4248-ad5e-fb33fd7199d3-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"075b6980-e8ff-4248-ad5e-fb33fd7199d3\") " pod="openstack/glance-default-internal-api-0" Nov 29 05:44:53 crc kubenswrapper[4594]: I1129 05:44:53.281377 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/075b6980-e8ff-4248-ad5e-fb33fd7199d3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"075b6980-e8ff-4248-ad5e-fb33fd7199d3\") " pod="openstack/glance-default-internal-api-0" Nov 29 05:44:53 crc kubenswrapper[4594]: I1129 05:44:53.281447 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/315c99c4-4c5f-4bff-9ff0-116d4e7bf846-scripts\") pod \"glance-default-external-api-0\" (UID: \"315c99c4-4c5f-4bff-9ff0-116d4e7bf846\") " pod="openstack/glance-default-external-api-0" Nov 29 05:44:53 crc kubenswrapper[4594]: I1129 05:44:53.282975 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/315c99c4-4c5f-4bff-9ff0-116d4e7bf846-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"315c99c4-4c5f-4bff-9ff0-116d4e7bf846\") " pod="openstack/glance-default-external-api-0" Nov 29 05:44:53 crc kubenswrapper[4594]: I1129 05:44:53.283508 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/315c99c4-4c5f-4bff-9ff0-116d4e7bf846-logs\") pod \"glance-default-external-api-0\" (UID: \"315c99c4-4c5f-4bff-9ff0-116d4e7bf846\") " pod="openstack/glance-default-external-api-0" Nov 29 05:44:53 crc kubenswrapper[4594]: I1129 05:44:53.283754 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/075b6980-e8ff-4248-ad5e-fb33fd7199d3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"075b6980-e8ff-4248-ad5e-fb33fd7199d3\") " pod="openstack/glance-default-internal-api-0" Nov 29 05:44:53 crc kubenswrapper[4594]: I1129 05:44:53.283975 4594 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"315c99c4-4c5f-4bff-9ff0-116d4e7bf846\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-external-api-0" Nov 29 05:44:53 crc kubenswrapper[4594]: I1129 05:44:53.288105 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/315c99c4-4c5f-4bff-9ff0-116d4e7bf846-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"315c99c4-4c5f-4bff-9ff0-116d4e7bf846\") " pod="openstack/glance-default-external-api-0" Nov 29 05:44:53 crc kubenswrapper[4594]: I1129 05:44:53.288797 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/315c99c4-4c5f-4bff-9ff0-116d4e7bf846-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"315c99c4-4c5f-4bff-9ff0-116d4e7bf846\") " pod="openstack/glance-default-external-api-0" Nov 29 05:44:53 crc kubenswrapper[4594]: I1129 05:44:53.299953 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blm49\" (UniqueName: \"kubernetes.io/projected/315c99c4-4c5f-4bff-9ff0-116d4e7bf846-kube-api-access-blm49\") pod \"glance-default-external-api-0\" (UID: \"315c99c4-4c5f-4bff-9ff0-116d4e7bf846\") " pod="openstack/glance-default-external-api-0" Nov 29 05:44:53 crc kubenswrapper[4594]: I1129 05:44:53.307788 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/315c99c4-4c5f-4bff-9ff0-116d4e7bf846-scripts\") pod \"glance-default-external-api-0\" (UID: \"315c99c4-4c5f-4bff-9ff0-116d4e7bf846\") " pod="openstack/glance-default-external-api-0" Nov 29 05:44:53 crc kubenswrapper[4594]: I1129 05:44:53.308800 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/315c99c4-4c5f-4bff-9ff0-116d4e7bf846-config-data\") pod \"glance-default-external-api-0\" (UID: \"315c99c4-4c5f-4bff-9ff0-116d4e7bf846\") " pod="openstack/glance-default-external-api-0" Nov 29 05:44:53 crc kubenswrapper[4594]: I1129 05:44:53.310540 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"315c99c4-4c5f-4bff-9ff0-116d4e7bf846\") " pod="openstack/glance-default-external-api-0" Nov 29 05:44:53 crc kubenswrapper[4594]: I1129 05:44:53.386131 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/075b6980-e8ff-4248-ad5e-fb33fd7199d3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"075b6980-e8ff-4248-ad5e-fb33fd7199d3\") " pod="openstack/glance-default-internal-api-0" Nov 29 05:44:53 crc kubenswrapper[4594]: I1129 05:44:53.386208 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/075b6980-e8ff-4248-ad5e-fb33fd7199d3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"075b6980-e8ff-4248-ad5e-fb33fd7199d3\") " pod="openstack/glance-default-internal-api-0" Nov 29 05:44:53 crc kubenswrapper[4594]: I1129 05:44:53.386274 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: 
\"075b6980-e8ff-4248-ad5e-fb33fd7199d3\") " pod="openstack/glance-default-internal-api-0" Nov 29 05:44:53 crc kubenswrapper[4594]: I1129 05:44:53.386371 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vl7r2\" (UniqueName: \"kubernetes.io/projected/075b6980-e8ff-4248-ad5e-fb33fd7199d3-kube-api-access-vl7r2\") pod \"glance-default-internal-api-0\" (UID: \"075b6980-e8ff-4248-ad5e-fb33fd7199d3\") " pod="openstack/glance-default-internal-api-0" Nov 29 05:44:53 crc kubenswrapper[4594]: I1129 05:44:53.386409 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/075b6980-e8ff-4248-ad5e-fb33fd7199d3-logs\") pod \"glance-default-internal-api-0\" (UID: \"075b6980-e8ff-4248-ad5e-fb33fd7199d3\") " pod="openstack/glance-default-internal-api-0" Nov 29 05:44:53 crc kubenswrapper[4594]: I1129 05:44:53.386458 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/075b6980-e8ff-4248-ad5e-fb33fd7199d3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"075b6980-e8ff-4248-ad5e-fb33fd7199d3\") " pod="openstack/glance-default-internal-api-0" Nov 29 05:44:53 crc kubenswrapper[4594]: I1129 05:44:53.386497 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/075b6980-e8ff-4248-ad5e-fb33fd7199d3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"075b6980-e8ff-4248-ad5e-fb33fd7199d3\") " pod="openstack/glance-default-internal-api-0" Nov 29 05:44:53 crc kubenswrapper[4594]: I1129 05:44:53.386515 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/075b6980-e8ff-4248-ad5e-fb33fd7199d3-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"075b6980-e8ff-4248-ad5e-fb33fd7199d3\") " 
pod="openstack/glance-default-internal-api-0" Nov 29 05:44:53 crc kubenswrapper[4594]: I1129 05:44:53.386779 4594 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"075b6980-e8ff-4248-ad5e-fb33fd7199d3\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-internal-api-0" Nov 29 05:44:53 crc kubenswrapper[4594]: I1129 05:44:53.388242 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/075b6980-e8ff-4248-ad5e-fb33fd7199d3-logs\") pod \"glance-default-internal-api-0\" (UID: \"075b6980-e8ff-4248-ad5e-fb33fd7199d3\") " pod="openstack/glance-default-internal-api-0" Nov 29 05:44:53 crc kubenswrapper[4594]: I1129 05:44:53.388422 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/075b6980-e8ff-4248-ad5e-fb33fd7199d3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"075b6980-e8ff-4248-ad5e-fb33fd7199d3\") " pod="openstack/glance-default-internal-api-0" Nov 29 05:44:53 crc kubenswrapper[4594]: I1129 05:44:53.410722 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/075b6980-e8ff-4248-ad5e-fb33fd7199d3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"075b6980-e8ff-4248-ad5e-fb33fd7199d3\") " pod="openstack/glance-default-internal-api-0" Nov 29 05:44:53 crc kubenswrapper[4594]: I1129 05:44:53.416753 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/075b6980-e8ff-4248-ad5e-fb33fd7199d3-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"075b6980-e8ff-4248-ad5e-fb33fd7199d3\") " pod="openstack/glance-default-internal-api-0" Nov 29 05:44:53 crc 
kubenswrapper[4594]: I1129 05:44:53.419030 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/075b6980-e8ff-4248-ad5e-fb33fd7199d3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"075b6980-e8ff-4248-ad5e-fb33fd7199d3\") " pod="openstack/glance-default-internal-api-0" Nov 29 05:44:53 crc kubenswrapper[4594]: I1129 05:44:53.420236 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/075b6980-e8ff-4248-ad5e-fb33fd7199d3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"075b6980-e8ff-4248-ad5e-fb33fd7199d3\") " pod="openstack/glance-default-internal-api-0" Nov 29 05:44:53 crc kubenswrapper[4594]: I1129 05:44:53.422034 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vl7r2\" (UniqueName: \"kubernetes.io/projected/075b6980-e8ff-4248-ad5e-fb33fd7199d3-kube-api-access-vl7r2\") pod \"glance-default-internal-api-0\" (UID: \"075b6980-e8ff-4248-ad5e-fb33fd7199d3\") " pod="openstack/glance-default-internal-api-0" Nov 29 05:44:53 crc kubenswrapper[4594]: I1129 05:44:53.434651 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"075b6980-e8ff-4248-ad5e-fb33fd7199d3\") " pod="openstack/glance-default-internal-api-0" Nov 29 05:44:53 crc kubenswrapper[4594]: I1129 05:44:53.530194 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 29 05:44:53 crc kubenswrapper[4594]: I1129 05:44:53.535621 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 29 05:44:54 crc kubenswrapper[4594]: I1129 05:44:54.093460 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b204dff1-e3d8-467f-b95f-372c0abbb1de" path="/var/lib/kubelet/pods/b204dff1-e3d8-467f-b95f-372c0abbb1de/volumes" Nov 29 05:44:54 crc kubenswrapper[4594]: I1129 05:44:54.095083 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7abfaea-4592-4e64-bcbc-5fa0331587a8" path="/var/lib/kubelet/pods/d7abfaea-4592-4e64-bcbc-5fa0331587a8/volumes" Nov 29 05:44:54 crc kubenswrapper[4594]: I1129 05:44:54.096401 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc29fae8-4b90-4702-8c68-4cec263fd4c9" path="/var/lib/kubelet/pods/fc29fae8-4b90-4702-8c68-4cec263fd4c9/volumes" Nov 29 05:44:54 crc kubenswrapper[4594]: I1129 05:44:54.097097 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff6b46fc-e420-41b4-b2f1-06ff16d73595" path="/var/lib/kubelet/pods/ff6b46fc-e420-41b4-b2f1-06ff16d73595/volumes" Nov 29 05:44:54 crc kubenswrapper[4594]: I1129 05:44:54.244113 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7fb8849f48-9rr52" event={"ID":"e8d01e45-1d76-464f-93e6-965d65a055fc","Type":"ContainerStarted","Data":"5ac36bd5450c58ccb44f66fa2fc8a56f6b7dd613a59651336f4d79e162161576"} Nov 29 05:44:54 crc kubenswrapper[4594]: I1129 05:44:54.245542 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-558d4b85cb-k5j98" event={"ID":"19dfde1f-d770-45ec-8735-78549b8fcb90","Type":"ContainerStarted","Data":"85c768cf1d5a78371cc74518076e00a13d6c0268223796b3735eca1156c9e871"} Nov 29 05:44:55 crc kubenswrapper[4594]: I1129 05:44:55.402489 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5c7df66d69-hd8nh"] Nov 29 05:44:55 crc kubenswrapper[4594]: I1129 05:44:55.405789 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5c7df66d69-hd8nh" Nov 29 05:44:55 crc kubenswrapper[4594]: I1129 05:44:55.407519 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Nov 29 05:44:55 crc kubenswrapper[4594]: I1129 05:44:55.407758 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Nov 29 05:44:55 crc kubenswrapper[4594]: I1129 05:44:55.414130 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5c7df66d69-hd8nh"] Nov 29 05:44:55 crc kubenswrapper[4594]: I1129 05:44:55.438438 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/23f6e7de-b25b-4522-8368-cd17f44dc109-public-tls-certs\") pod \"neutron-5c7df66d69-hd8nh\" (UID: \"23f6e7de-b25b-4522-8368-cd17f44dc109\") " pod="openstack/neutron-5c7df66d69-hd8nh" Nov 29 05:44:55 crc kubenswrapper[4594]: I1129 05:44:55.438495 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7np9\" (UniqueName: \"kubernetes.io/projected/23f6e7de-b25b-4522-8368-cd17f44dc109-kube-api-access-m7np9\") pod \"neutron-5c7df66d69-hd8nh\" (UID: \"23f6e7de-b25b-4522-8368-cd17f44dc109\") " pod="openstack/neutron-5c7df66d69-hd8nh" Nov 29 05:44:55 crc kubenswrapper[4594]: I1129 05:44:55.438703 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23f6e7de-b25b-4522-8368-cd17f44dc109-combined-ca-bundle\") pod \"neutron-5c7df66d69-hd8nh\" (UID: \"23f6e7de-b25b-4522-8368-cd17f44dc109\") " pod="openstack/neutron-5c7df66d69-hd8nh" Nov 29 05:44:55 crc kubenswrapper[4594]: I1129 05:44:55.438732 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/23f6e7de-b25b-4522-8368-cd17f44dc109-config\") pod \"neutron-5c7df66d69-hd8nh\" (UID: \"23f6e7de-b25b-4522-8368-cd17f44dc109\") " pod="openstack/neutron-5c7df66d69-hd8nh" Nov 29 05:44:55 crc kubenswrapper[4594]: I1129 05:44:55.438784 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/23f6e7de-b25b-4522-8368-cd17f44dc109-httpd-config\") pod \"neutron-5c7df66d69-hd8nh\" (UID: \"23f6e7de-b25b-4522-8368-cd17f44dc109\") " pod="openstack/neutron-5c7df66d69-hd8nh" Nov 29 05:44:55 crc kubenswrapper[4594]: I1129 05:44:55.438854 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/23f6e7de-b25b-4522-8368-cd17f44dc109-ovndb-tls-certs\") pod \"neutron-5c7df66d69-hd8nh\" (UID: \"23f6e7de-b25b-4522-8368-cd17f44dc109\") " pod="openstack/neutron-5c7df66d69-hd8nh" Nov 29 05:44:55 crc kubenswrapper[4594]: I1129 05:44:55.438936 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/23f6e7de-b25b-4522-8368-cd17f44dc109-internal-tls-certs\") pod \"neutron-5c7df66d69-hd8nh\" (UID: \"23f6e7de-b25b-4522-8368-cd17f44dc109\") " pod="openstack/neutron-5c7df66d69-hd8nh" Nov 29 05:44:55 crc kubenswrapper[4594]: I1129 05:44:55.540401 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/23f6e7de-b25b-4522-8368-cd17f44dc109-public-tls-certs\") pod \"neutron-5c7df66d69-hd8nh\" (UID: \"23f6e7de-b25b-4522-8368-cd17f44dc109\") " pod="openstack/neutron-5c7df66d69-hd8nh" Nov 29 05:44:55 crc kubenswrapper[4594]: I1129 05:44:55.540443 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7np9\" (UniqueName: 
\"kubernetes.io/projected/23f6e7de-b25b-4522-8368-cd17f44dc109-kube-api-access-m7np9\") pod \"neutron-5c7df66d69-hd8nh\" (UID: \"23f6e7de-b25b-4522-8368-cd17f44dc109\") " pod="openstack/neutron-5c7df66d69-hd8nh" Nov 29 05:44:55 crc kubenswrapper[4594]: I1129 05:44:55.540537 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23f6e7de-b25b-4522-8368-cd17f44dc109-combined-ca-bundle\") pod \"neutron-5c7df66d69-hd8nh\" (UID: \"23f6e7de-b25b-4522-8368-cd17f44dc109\") " pod="openstack/neutron-5c7df66d69-hd8nh" Nov 29 05:44:55 crc kubenswrapper[4594]: I1129 05:44:55.540559 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/23f6e7de-b25b-4522-8368-cd17f44dc109-config\") pod \"neutron-5c7df66d69-hd8nh\" (UID: \"23f6e7de-b25b-4522-8368-cd17f44dc109\") " pod="openstack/neutron-5c7df66d69-hd8nh" Nov 29 05:44:55 crc kubenswrapper[4594]: I1129 05:44:55.540589 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/23f6e7de-b25b-4522-8368-cd17f44dc109-httpd-config\") pod \"neutron-5c7df66d69-hd8nh\" (UID: \"23f6e7de-b25b-4522-8368-cd17f44dc109\") " pod="openstack/neutron-5c7df66d69-hd8nh" Nov 29 05:44:55 crc kubenswrapper[4594]: I1129 05:44:55.540622 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/23f6e7de-b25b-4522-8368-cd17f44dc109-ovndb-tls-certs\") pod \"neutron-5c7df66d69-hd8nh\" (UID: \"23f6e7de-b25b-4522-8368-cd17f44dc109\") " pod="openstack/neutron-5c7df66d69-hd8nh" Nov 29 05:44:55 crc kubenswrapper[4594]: I1129 05:44:55.540668 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/23f6e7de-b25b-4522-8368-cd17f44dc109-internal-tls-certs\") pod 
\"neutron-5c7df66d69-hd8nh\" (UID: \"23f6e7de-b25b-4522-8368-cd17f44dc109\") " pod="openstack/neutron-5c7df66d69-hd8nh" Nov 29 05:44:55 crc kubenswrapper[4594]: I1129 05:44:55.547241 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/23f6e7de-b25b-4522-8368-cd17f44dc109-internal-tls-certs\") pod \"neutron-5c7df66d69-hd8nh\" (UID: \"23f6e7de-b25b-4522-8368-cd17f44dc109\") " pod="openstack/neutron-5c7df66d69-hd8nh" Nov 29 05:44:55 crc kubenswrapper[4594]: I1129 05:44:55.547671 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/23f6e7de-b25b-4522-8368-cd17f44dc109-public-tls-certs\") pod \"neutron-5c7df66d69-hd8nh\" (UID: \"23f6e7de-b25b-4522-8368-cd17f44dc109\") " pod="openstack/neutron-5c7df66d69-hd8nh" Nov 29 05:44:55 crc kubenswrapper[4594]: I1129 05:44:55.549895 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/23f6e7de-b25b-4522-8368-cd17f44dc109-config\") pod \"neutron-5c7df66d69-hd8nh\" (UID: \"23f6e7de-b25b-4522-8368-cd17f44dc109\") " pod="openstack/neutron-5c7df66d69-hd8nh" Nov 29 05:44:55 crc kubenswrapper[4594]: I1129 05:44:55.551708 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/23f6e7de-b25b-4522-8368-cd17f44dc109-httpd-config\") pod \"neutron-5c7df66d69-hd8nh\" (UID: \"23f6e7de-b25b-4522-8368-cd17f44dc109\") " pod="openstack/neutron-5c7df66d69-hd8nh" Nov 29 05:44:55 crc kubenswrapper[4594]: I1129 05:44:55.553740 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/23f6e7de-b25b-4522-8368-cd17f44dc109-ovndb-tls-certs\") pod \"neutron-5c7df66d69-hd8nh\" (UID: \"23f6e7de-b25b-4522-8368-cd17f44dc109\") " pod="openstack/neutron-5c7df66d69-hd8nh" Nov 29 05:44:55 crc 
kubenswrapper[4594]: I1129 05:44:55.557612 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23f6e7de-b25b-4522-8368-cd17f44dc109-combined-ca-bundle\") pod \"neutron-5c7df66d69-hd8nh\" (UID: \"23f6e7de-b25b-4522-8368-cd17f44dc109\") " pod="openstack/neutron-5c7df66d69-hd8nh" Nov 29 05:44:55 crc kubenswrapper[4594]: I1129 05:44:55.562842 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7np9\" (UniqueName: \"kubernetes.io/projected/23f6e7de-b25b-4522-8368-cd17f44dc109-kube-api-access-m7np9\") pod \"neutron-5c7df66d69-hd8nh\" (UID: \"23f6e7de-b25b-4522-8368-cd17f44dc109\") " pod="openstack/neutron-5c7df66d69-hd8nh" Nov 29 05:44:55 crc kubenswrapper[4594]: I1129 05:44:55.723159 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5c7df66d69-hd8nh" Nov 29 05:44:56 crc kubenswrapper[4594]: I1129 05:44:56.264282 4594 generic.go:334] "Generic (PLEG): container finished" podID="7474ba72-1534-4193-9590-e46dfc840403" containerID="420c1c005e74da6ed5d7566f41328c8fed77e3abf21045a59554138235cc373e" exitCode=0 Nov 29 05:44:56 crc kubenswrapper[4594]: I1129 05:44:56.264526 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-bg92f" event={"ID":"7474ba72-1534-4193-9590-e46dfc840403","Type":"ContainerDied","Data":"420c1c005e74da6ed5d7566f41328c8fed77e3abf21045a59554138235cc373e"} Nov 29 05:45:00 crc kubenswrapper[4594]: I1129 05:45:00.150429 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29406585-kz8m2"] Nov 29 05:45:00 crc kubenswrapper[4594]: I1129 05:45:00.153911 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29406585-kz8m2" Nov 29 05:45:00 crc kubenswrapper[4594]: I1129 05:45:00.157305 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 29 05:45:00 crc kubenswrapper[4594]: I1129 05:45:00.159515 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 29 05:45:00 crc kubenswrapper[4594]: I1129 05:45:00.159693 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29406585-kz8m2"] Nov 29 05:45:00 crc kubenswrapper[4594]: I1129 05:45:00.230142 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/17adfde2-389b-491b-8c00-8293e37021b4-secret-volume\") pod \"collect-profiles-29406585-kz8m2\" (UID: \"17adfde2-389b-491b-8c00-8293e37021b4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406585-kz8m2" Nov 29 05:45:00 crc kubenswrapper[4594]: I1129 05:45:00.230501 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xw4rl\" (UniqueName: \"kubernetes.io/projected/17adfde2-389b-491b-8c00-8293e37021b4-kube-api-access-xw4rl\") pod \"collect-profiles-29406585-kz8m2\" (UID: \"17adfde2-389b-491b-8c00-8293e37021b4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406585-kz8m2" Nov 29 05:45:00 crc kubenswrapper[4594]: I1129 05:45:00.230547 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/17adfde2-389b-491b-8c00-8293e37021b4-config-volume\") pod \"collect-profiles-29406585-kz8m2\" (UID: \"17adfde2-389b-491b-8c00-8293e37021b4\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29406585-kz8m2" Nov 29 05:45:00 crc kubenswrapper[4594]: I1129 05:45:00.331537 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/17adfde2-389b-491b-8c00-8293e37021b4-secret-volume\") pod \"collect-profiles-29406585-kz8m2\" (UID: \"17adfde2-389b-491b-8c00-8293e37021b4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406585-kz8m2" Nov 29 05:45:00 crc kubenswrapper[4594]: I1129 05:45:00.331602 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xw4rl\" (UniqueName: \"kubernetes.io/projected/17adfde2-389b-491b-8c00-8293e37021b4-kube-api-access-xw4rl\") pod \"collect-profiles-29406585-kz8m2\" (UID: \"17adfde2-389b-491b-8c00-8293e37021b4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406585-kz8m2" Nov 29 05:45:00 crc kubenswrapper[4594]: I1129 05:45:00.331634 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/17adfde2-389b-491b-8c00-8293e37021b4-config-volume\") pod \"collect-profiles-29406585-kz8m2\" (UID: \"17adfde2-389b-491b-8c00-8293e37021b4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406585-kz8m2" Nov 29 05:45:00 crc kubenswrapper[4594]: I1129 05:45:00.332576 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/17adfde2-389b-491b-8c00-8293e37021b4-config-volume\") pod \"collect-profiles-29406585-kz8m2\" (UID: \"17adfde2-389b-491b-8c00-8293e37021b4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406585-kz8m2" Nov 29 05:45:00 crc kubenswrapper[4594]: I1129 05:45:00.338347 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/17adfde2-389b-491b-8c00-8293e37021b4-secret-volume\") pod \"collect-profiles-29406585-kz8m2\" (UID: \"17adfde2-389b-491b-8c00-8293e37021b4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406585-kz8m2" Nov 29 05:45:00 crc kubenswrapper[4594]: I1129 05:45:00.347002 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xw4rl\" (UniqueName: \"kubernetes.io/projected/17adfde2-389b-491b-8c00-8293e37021b4-kube-api-access-xw4rl\") pod \"collect-profiles-29406585-kz8m2\" (UID: \"17adfde2-389b-491b-8c00-8293e37021b4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406585-kz8m2" Nov 29 05:45:00 crc kubenswrapper[4594]: I1129 05:45:00.477150 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29406585-kz8m2" Nov 29 05:45:02 crc kubenswrapper[4594]: I1129 05:45:02.659818 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-sync-bg92f" Nov 29 05:45:02 crc kubenswrapper[4594]: I1129 05:45:02.788322 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7474ba72-1534-4193-9590-e46dfc840403-db-sync-config-data\") pod \"7474ba72-1534-4193-9590-e46dfc840403\" (UID: \"7474ba72-1534-4193-9590-e46dfc840403\") " Nov 29 05:45:02 crc kubenswrapper[4594]: I1129 05:45:02.788617 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7474ba72-1534-4193-9590-e46dfc840403-combined-ca-bundle\") pod \"7474ba72-1534-4193-9590-e46dfc840403\" (UID: \"7474ba72-1534-4193-9590-e46dfc840403\") " Nov 29 05:45:02 crc kubenswrapper[4594]: I1129 05:45:02.788788 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jw2j2\" (UniqueName: \"kubernetes.io/projected/7474ba72-1534-4193-9590-e46dfc840403-kube-api-access-jw2j2\") pod \"7474ba72-1534-4193-9590-e46dfc840403\" (UID: \"7474ba72-1534-4193-9590-e46dfc840403\") " Nov 29 05:45:02 crc kubenswrapper[4594]: I1129 05:45:02.788905 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7474ba72-1534-4193-9590-e46dfc840403-config-data\") pod \"7474ba72-1534-4193-9590-e46dfc840403\" (UID: \"7474ba72-1534-4193-9590-e46dfc840403\") " Nov 29 05:45:02 crc kubenswrapper[4594]: I1129 05:45:02.794568 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7474ba72-1534-4193-9590-e46dfc840403-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "7474ba72-1534-4193-9590-e46dfc840403" (UID: "7474ba72-1534-4193-9590-e46dfc840403"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:45:02 crc kubenswrapper[4594]: I1129 05:45:02.809673 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7474ba72-1534-4193-9590-e46dfc840403-kube-api-access-jw2j2" (OuterVolumeSpecName: "kube-api-access-jw2j2") pod "7474ba72-1534-4193-9590-e46dfc840403" (UID: "7474ba72-1534-4193-9590-e46dfc840403"). InnerVolumeSpecName "kube-api-access-jw2j2". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:45:02 crc kubenswrapper[4594]: I1129 05:45:02.826074 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7474ba72-1534-4193-9590-e46dfc840403-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7474ba72-1534-4193-9590-e46dfc840403" (UID: "7474ba72-1534-4193-9590-e46dfc840403"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:45:02 crc kubenswrapper[4594]: I1129 05:45:02.832228 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7474ba72-1534-4193-9590-e46dfc840403-config-data" (OuterVolumeSpecName: "config-data") pod "7474ba72-1534-4193-9590-e46dfc840403" (UID: "7474ba72-1534-4193-9590-e46dfc840403"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:45:02 crc kubenswrapper[4594]: I1129 05:45:02.894528 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jw2j2\" (UniqueName: \"kubernetes.io/projected/7474ba72-1534-4193-9590-e46dfc840403-kube-api-access-jw2j2\") on node \"crc\" DevicePath \"\"" Nov 29 05:45:02 crc kubenswrapper[4594]: I1129 05:45:02.894559 4594 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7474ba72-1534-4193-9590-e46dfc840403-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 05:45:02 crc kubenswrapper[4594]: I1129 05:45:02.894571 4594 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7474ba72-1534-4193-9590-e46dfc840403-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 05:45:02 crc kubenswrapper[4594]: I1129 05:45:02.894579 4594 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7474ba72-1534-4193-9590-e46dfc840403-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 05:45:03 crc kubenswrapper[4594]: I1129 05:45:03.369357 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6cf4cc7b4f-hg8dz" podUID="9ac3735b-d8a9-4723-8576-dc16c7af5756" containerName="horizon-log" containerID="cri-o://cdd6b473e7211578bb12623c83ba72829a586f190d64dd6207737e5e807240b7" gracePeriod=30 Nov 29 05:45:03 crc kubenswrapper[4594]: I1129 05:45:03.369434 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6cf4cc7b4f-hg8dz" podUID="9ac3735b-d8a9-4723-8576-dc16c7af5756" containerName="horizon" containerID="cri-o://c779ec66fec66188c65a57a98fb7502d621a31fb4a19980fd4945d5ba4c967e1" gracePeriod=30 Nov 29 05:45:03 crc kubenswrapper[4594]: I1129 05:45:03.369762 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/horizon-6cf4cc7b4f-hg8dz" event={"ID":"9ac3735b-d8a9-4723-8576-dc16c7af5756","Type":"ContainerStarted","Data":"c779ec66fec66188c65a57a98fb7502d621a31fb4a19980fd4945d5ba4c967e1"} Nov 29 05:45:03 crc kubenswrapper[4594]: I1129 05:45:03.374140 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-bg92f" event={"ID":"7474ba72-1534-4193-9590-e46dfc840403","Type":"ContainerDied","Data":"f90d6b0a8b94b2068be08d2eb73761e0cb0ecf7030b282151f887d69440e461f"} Nov 29 05:45:03 crc kubenswrapper[4594]: I1129 05:45:03.374178 4594 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f90d6b0a8b94b2068be08d2eb73761e0cb0ecf7030b282151f887d69440e461f" Nov 29 05:45:03 crc kubenswrapper[4594]: I1129 05:45:03.374245 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-bg92f" Nov 29 05:45:03 crc kubenswrapper[4594]: I1129 05:45:03.404766 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6cf4cc7b4f-hg8dz" podStartSLOduration=13.340536337 podStartE2EDuration="28.404754404s" podCreationTimestamp="2025-11-29 05:44:35 +0000 UTC" firstStartedPulling="2025-11-29 05:44:36.577768768 +0000 UTC m=+1000.818277988" lastFinishedPulling="2025-11-29 05:44:51.641986835 +0000 UTC m=+1015.882496055" observedRunningTime="2025-11-29 05:45:03.396455783 +0000 UTC m=+1027.636965004" watchObservedRunningTime="2025-11-29 05:45:03.404754404 +0000 UTC m=+1027.645263625" Nov 29 05:45:03 crc kubenswrapper[4594]: I1129 05:45:03.685371 4594 scope.go:117] "RemoveContainer" containerID="f628d58237ac93ac4654f52cca472d9e568a6dd0158ee585f6ff8558534a7d70" Nov 29 05:45:03 crc kubenswrapper[4594]: E1129 05:45:03.716630 4594 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-cinder-api:current" Nov 29 05:45:03 crc 
kubenswrapper[4594]: E1129 05:45:03.716684 4594 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-cinder-api:current" Nov 29 05:45:03 crc kubenswrapper[4594]: E1129 05:45:03.716832 4594 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.rdoproject.org/podified-master-centos10/openstack-cinder-api:current,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMou
nt{Name:kube-api-access-26m6r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-9rddc_openstack(4b46224c-7874-4c4a-abb6-f1cbef3a8462): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 29 05:45:03 crc kubenswrapper[4594]: E1129 05:45:03.718536 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-9rddc" podUID="4b46224c-7874-4c4a-abb6-f1cbef3a8462" Nov 29 05:45:03 crc kubenswrapper[4594]: I1129 05:45:03.982145 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-decision-engine-0"] Nov 29 05:45:03 crc kubenswrapper[4594]: E1129 05:45:03.982946 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7474ba72-1534-4193-9590-e46dfc840403" containerName="watcher-db-sync" Nov 29 05:45:03 crc kubenswrapper[4594]: I1129 05:45:03.982967 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="7474ba72-1534-4193-9590-e46dfc840403" containerName="watcher-db-sync" Nov 29 05:45:03 crc kubenswrapper[4594]: I1129 05:45:03.983193 4594 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="7474ba72-1534-4193-9590-e46dfc840403" containerName="watcher-db-sync" Nov 29 05:45:03 crc kubenswrapper[4594]: I1129 05:45:03.983897 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Nov 29 05:45:03 crc kubenswrapper[4594]: I1129 05:45:03.988758 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-watcher-dockercfg-xzztn" Nov 29 05:45:03 crc kubenswrapper[4594]: I1129 05:45:03.988984 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-decision-engine-config-data" Nov 29 05:45:03 crc kubenswrapper[4594]: I1129 05:45:03.996208 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Nov 29 05:45:04 crc kubenswrapper[4594]: I1129 05:45:04.020518 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d60aa94-3720-43b5-a5de-ca30dd1b63b2-logs\") pod \"watcher-decision-engine-0\" (UID: \"3d60aa94-3720-43b5-a5de-ca30dd1b63b2\") " pod="openstack/watcher-decision-engine-0" Nov 29 05:45:04 crc kubenswrapper[4594]: I1129 05:45:04.020555 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/3d60aa94-3720-43b5-a5de-ca30dd1b63b2-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"3d60aa94-3720-43b5-a5de-ca30dd1b63b2\") " pod="openstack/watcher-decision-engine-0" Nov 29 05:45:04 crc kubenswrapper[4594]: I1129 05:45:04.022290 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d60aa94-3720-43b5-a5de-ca30dd1b63b2-config-data\") pod \"watcher-decision-engine-0\" (UID: \"3d60aa94-3720-43b5-a5de-ca30dd1b63b2\") " pod="openstack/watcher-decision-engine-0" Nov 29 05:45:04 crc 
kubenswrapper[4594]: I1129 05:45:04.022340 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d60aa94-3720-43b5-a5de-ca30dd1b63b2-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"3d60aa94-3720-43b5-a5de-ca30dd1b63b2\") " pod="openstack/watcher-decision-engine-0" Nov 29 05:45:04 crc kubenswrapper[4594]: I1129 05:45:04.022359 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f55j2\" (UniqueName: \"kubernetes.io/projected/3d60aa94-3720-43b5-a5de-ca30dd1b63b2-kube-api-access-f55j2\") pod \"watcher-decision-engine-0\" (UID: \"3d60aa94-3720-43b5-a5de-ca30dd1b63b2\") " pod="openstack/watcher-decision-engine-0" Nov 29 05:45:04 crc kubenswrapper[4594]: I1129 05:45:04.070026 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"] Nov 29 05:45:04 crc kubenswrapper[4594]: I1129 05:45:04.076919 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Nov 29 05:45:04 crc kubenswrapper[4594]: I1129 05:45:04.081865 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data" Nov 29 05:45:04 crc kubenswrapper[4594]: I1129 05:45:04.123851 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d60aa94-3720-43b5-a5de-ca30dd1b63b2-logs\") pod \"watcher-decision-engine-0\" (UID: \"3d60aa94-3720-43b5-a5de-ca30dd1b63b2\") " pod="openstack/watcher-decision-engine-0" Nov 29 05:45:04 crc kubenswrapper[4594]: I1129 05:45:04.123900 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/3d60aa94-3720-43b5-a5de-ca30dd1b63b2-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"3d60aa94-3720-43b5-a5de-ca30dd1b63b2\") " pod="openstack/watcher-decision-engine-0" Nov 29 05:45:04 crc kubenswrapper[4594]: I1129 05:45:04.124193 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d60aa94-3720-43b5-a5de-ca30dd1b63b2-logs\") pod \"watcher-decision-engine-0\" (UID: \"3d60aa94-3720-43b5-a5de-ca30dd1b63b2\") " pod="openstack/watcher-decision-engine-0" Nov 29 05:45:04 crc kubenswrapper[4594]: I1129 05:45:04.125127 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d60aa94-3720-43b5-a5de-ca30dd1b63b2-config-data\") pod \"watcher-decision-engine-0\" (UID: \"3d60aa94-3720-43b5-a5de-ca30dd1b63b2\") " pod="openstack/watcher-decision-engine-0" Nov 29 05:45:04 crc kubenswrapper[4594]: I1129 05:45:04.125202 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d60aa94-3720-43b5-a5de-ca30dd1b63b2-combined-ca-bundle\") pod 
\"watcher-decision-engine-0\" (UID: \"3d60aa94-3720-43b5-a5de-ca30dd1b63b2\") " pod="openstack/watcher-decision-engine-0" Nov 29 05:45:04 crc kubenswrapper[4594]: I1129 05:45:04.125225 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f55j2\" (UniqueName: \"kubernetes.io/projected/3d60aa94-3720-43b5-a5de-ca30dd1b63b2-kube-api-access-f55j2\") pod \"watcher-decision-engine-0\" (UID: \"3d60aa94-3720-43b5-a5de-ca30dd1b63b2\") " pod="openstack/watcher-decision-engine-0" Nov 29 05:45:04 crc kubenswrapper[4594]: I1129 05:45:04.132037 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d60aa94-3720-43b5-a5de-ca30dd1b63b2-config-data\") pod \"watcher-decision-engine-0\" (UID: \"3d60aa94-3720-43b5-a5de-ca30dd1b63b2\") " pod="openstack/watcher-decision-engine-0" Nov 29 05:45:04 crc kubenswrapper[4594]: I1129 05:45:04.132948 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d60aa94-3720-43b5-a5de-ca30dd1b63b2-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"3d60aa94-3720-43b5-a5de-ca30dd1b63b2\") " pod="openstack/watcher-decision-engine-0" Nov 29 05:45:04 crc kubenswrapper[4594]: I1129 05:45:04.134598 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/3d60aa94-3720-43b5-a5de-ca30dd1b63b2-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"3d60aa94-3720-43b5-a5de-ca30dd1b63b2\") " pod="openstack/watcher-decision-engine-0" Nov 29 05:45:04 crc kubenswrapper[4594]: I1129 05:45:04.144830 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Nov 29 05:45:04 crc kubenswrapper[4594]: I1129 05:45:04.144858 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-applier-0"] Nov 29 05:45:04 crc kubenswrapper[4594]: I1129 
05:45:04.151515 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f55j2\" (UniqueName: \"kubernetes.io/projected/3d60aa94-3720-43b5-a5de-ca30dd1b63b2-kube-api-access-f55j2\") pod \"watcher-decision-engine-0\" (UID: \"3d60aa94-3720-43b5-a5de-ca30dd1b63b2\") " pod="openstack/watcher-decision-engine-0" Nov 29 05:45:04 crc kubenswrapper[4594]: I1129 05:45:04.154059 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Nov 29 05:45:04 crc kubenswrapper[4594]: I1129 05:45:04.154145 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0" Nov 29 05:45:04 crc kubenswrapper[4594]: I1129 05:45:04.157359 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-applier-config-data" Nov 29 05:45:04 crc kubenswrapper[4594]: I1129 05:45:04.227408 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58qnb\" (UniqueName: \"kubernetes.io/projected/ba31669d-d47f-41bc-8476-7acf2a872d15-kube-api-access-58qnb\") pod \"watcher-api-0\" (UID: \"ba31669d-d47f-41bc-8476-7acf2a872d15\") " pod="openstack/watcher-api-0" Nov 29 05:45:04 crc kubenswrapper[4594]: I1129 05:45:04.227464 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba31669d-d47f-41bc-8476-7acf2a872d15-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"ba31669d-d47f-41bc-8476-7acf2a872d15\") " pod="openstack/watcher-api-0" Nov 29 05:45:04 crc kubenswrapper[4594]: I1129 05:45:04.227488 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba31669d-d47f-41bc-8476-7acf2a872d15-logs\") pod \"watcher-api-0\" (UID: \"ba31669d-d47f-41bc-8476-7acf2a872d15\") " pod="openstack/watcher-api-0" Nov 29 05:45:04 crc 
kubenswrapper[4594]: I1129 05:45:04.227545 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba31669d-d47f-41bc-8476-7acf2a872d15-config-data\") pod \"watcher-api-0\" (UID: \"ba31669d-d47f-41bc-8476-7acf2a872d15\") " pod="openstack/watcher-api-0" Nov 29 05:45:04 crc kubenswrapper[4594]: I1129 05:45:04.227628 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/ba31669d-d47f-41bc-8476-7acf2a872d15-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"ba31669d-d47f-41bc-8476-7acf2a872d15\") " pod="openstack/watcher-api-0" Nov 29 05:45:04 crc kubenswrapper[4594]: I1129 05:45:04.227644 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5s7sh\" (UniqueName: \"kubernetes.io/projected/cfbd6de3-fc8d-4d93-a76d-fd2b8a196167-kube-api-access-5s7sh\") pod \"watcher-applier-0\" (UID: \"cfbd6de3-fc8d-4d93-a76d-fd2b8a196167\") " pod="openstack/watcher-applier-0" Nov 29 05:45:04 crc kubenswrapper[4594]: I1129 05:45:04.228062 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfbd6de3-fc8d-4d93-a76d-fd2b8a196167-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"cfbd6de3-fc8d-4d93-a76d-fd2b8a196167\") " pod="openstack/watcher-applier-0" Nov 29 05:45:04 crc kubenswrapper[4594]: I1129 05:45:04.228118 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cfbd6de3-fc8d-4d93-a76d-fd2b8a196167-logs\") pod \"watcher-applier-0\" (UID: \"cfbd6de3-fc8d-4d93-a76d-fd2b8a196167\") " pod="openstack/watcher-applier-0" Nov 29 05:45:04 crc kubenswrapper[4594]: I1129 05:45:04.228206 4594 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfbd6de3-fc8d-4d93-a76d-fd2b8a196167-config-data\") pod \"watcher-applier-0\" (UID: \"cfbd6de3-fc8d-4d93-a76d-fd2b8a196167\") " pod="openstack/watcher-applier-0" Nov 29 05:45:04 crc kubenswrapper[4594]: I1129 05:45:04.325210 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Nov 29 05:45:04 crc kubenswrapper[4594]: I1129 05:45:04.330661 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58qnb\" (UniqueName: \"kubernetes.io/projected/ba31669d-d47f-41bc-8476-7acf2a872d15-kube-api-access-58qnb\") pod \"watcher-api-0\" (UID: \"ba31669d-d47f-41bc-8476-7acf2a872d15\") " pod="openstack/watcher-api-0" Nov 29 05:45:04 crc kubenswrapper[4594]: I1129 05:45:04.330706 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba31669d-d47f-41bc-8476-7acf2a872d15-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"ba31669d-d47f-41bc-8476-7acf2a872d15\") " pod="openstack/watcher-api-0" Nov 29 05:45:04 crc kubenswrapper[4594]: I1129 05:45:04.330736 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba31669d-d47f-41bc-8476-7acf2a872d15-logs\") pod \"watcher-api-0\" (UID: \"ba31669d-d47f-41bc-8476-7acf2a872d15\") " pod="openstack/watcher-api-0" Nov 29 05:45:04 crc kubenswrapper[4594]: I1129 05:45:04.330769 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba31669d-d47f-41bc-8476-7acf2a872d15-config-data\") pod \"watcher-api-0\" (UID: \"ba31669d-d47f-41bc-8476-7acf2a872d15\") " pod="openstack/watcher-api-0" Nov 29 05:45:04 crc kubenswrapper[4594]: I1129 05:45:04.330815 4594 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/ba31669d-d47f-41bc-8476-7acf2a872d15-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"ba31669d-d47f-41bc-8476-7acf2a872d15\") " pod="openstack/watcher-api-0" Nov 29 05:45:04 crc kubenswrapper[4594]: I1129 05:45:04.330835 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5s7sh\" (UniqueName: \"kubernetes.io/projected/cfbd6de3-fc8d-4d93-a76d-fd2b8a196167-kube-api-access-5s7sh\") pod \"watcher-applier-0\" (UID: \"cfbd6de3-fc8d-4d93-a76d-fd2b8a196167\") " pod="openstack/watcher-applier-0" Nov 29 05:45:04 crc kubenswrapper[4594]: I1129 05:45:04.330890 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfbd6de3-fc8d-4d93-a76d-fd2b8a196167-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"cfbd6de3-fc8d-4d93-a76d-fd2b8a196167\") " pod="openstack/watcher-applier-0" Nov 29 05:45:04 crc kubenswrapper[4594]: I1129 05:45:04.330914 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cfbd6de3-fc8d-4d93-a76d-fd2b8a196167-logs\") pod \"watcher-applier-0\" (UID: \"cfbd6de3-fc8d-4d93-a76d-fd2b8a196167\") " pod="openstack/watcher-applier-0" Nov 29 05:45:04 crc kubenswrapper[4594]: I1129 05:45:04.330951 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfbd6de3-fc8d-4d93-a76d-fd2b8a196167-config-data\") pod \"watcher-applier-0\" (UID: \"cfbd6de3-fc8d-4d93-a76d-fd2b8a196167\") " pod="openstack/watcher-applier-0" Nov 29 05:45:04 crc kubenswrapper[4594]: I1129 05:45:04.331080 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba31669d-d47f-41bc-8476-7acf2a872d15-logs\") pod \"watcher-api-0\" (UID: 
\"ba31669d-d47f-41bc-8476-7acf2a872d15\") " pod="openstack/watcher-api-0" Nov 29 05:45:04 crc kubenswrapper[4594]: I1129 05:45:04.335521 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/ba31669d-d47f-41bc-8476-7acf2a872d15-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"ba31669d-d47f-41bc-8476-7acf2a872d15\") " pod="openstack/watcher-api-0" Nov 29 05:45:04 crc kubenswrapper[4594]: I1129 05:45:04.336824 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cfbd6de3-fc8d-4d93-a76d-fd2b8a196167-logs\") pod \"watcher-applier-0\" (UID: \"cfbd6de3-fc8d-4d93-a76d-fd2b8a196167\") " pod="openstack/watcher-applier-0" Nov 29 05:45:04 crc kubenswrapper[4594]: I1129 05:45:04.337480 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba31669d-d47f-41bc-8476-7acf2a872d15-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"ba31669d-d47f-41bc-8476-7acf2a872d15\") " pod="openstack/watcher-api-0" Nov 29 05:45:04 crc kubenswrapper[4594]: I1129 05:45:04.341425 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfbd6de3-fc8d-4d93-a76d-fd2b8a196167-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"cfbd6de3-fc8d-4d93-a76d-fd2b8a196167\") " pod="openstack/watcher-applier-0" Nov 29 05:45:04 crc kubenswrapper[4594]: I1129 05:45:04.344515 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfbd6de3-fc8d-4d93-a76d-fd2b8a196167-config-data\") pod \"watcher-applier-0\" (UID: \"cfbd6de3-fc8d-4d93-a76d-fd2b8a196167\") " pod="openstack/watcher-applier-0" Nov 29 05:45:04 crc kubenswrapper[4594]: I1129 05:45:04.345238 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/ba31669d-d47f-41bc-8476-7acf2a872d15-config-data\") pod \"watcher-api-0\" (UID: \"ba31669d-d47f-41bc-8476-7acf2a872d15\") " pod="openstack/watcher-api-0" Nov 29 05:45:04 crc kubenswrapper[4594]: I1129 05:45:04.348681 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58qnb\" (UniqueName: \"kubernetes.io/projected/ba31669d-d47f-41bc-8476-7acf2a872d15-kube-api-access-58qnb\") pod \"watcher-api-0\" (UID: \"ba31669d-d47f-41bc-8476-7acf2a872d15\") " pod="openstack/watcher-api-0" Nov 29 05:45:04 crc kubenswrapper[4594]: I1129 05:45:04.354658 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5s7sh\" (UniqueName: \"kubernetes.io/projected/cfbd6de3-fc8d-4d93-a76d-fd2b8a196167-kube-api-access-5s7sh\") pod \"watcher-applier-0\" (UID: \"cfbd6de3-fc8d-4d93-a76d-fd2b8a196167\") " pod="openstack/watcher-applier-0" Nov 29 05:45:04 crc kubenswrapper[4594]: I1129 05:45:04.357954 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 29 05:45:04 crc kubenswrapper[4594]: I1129 05:45:04.386486 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7bf6f9df7f-knsk7" event={"ID":"9c4f4aa8-f25f-4684-8707-3bc0eb168954","Type":"ContainerStarted","Data":"8d5eb8d12b40086236a654309921e326c5ea36c0cb093bef3e74044402fe4a45"} Nov 29 05:45:04 crc kubenswrapper[4594]: I1129 05:45:04.386607 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7bf6f9df7f-knsk7" podUID="9c4f4aa8-f25f-4684-8707-3bc0eb168954" containerName="horizon-log" containerID="cri-o://c63f69a44e3afa1d1ac33136814d6b792d5e9273aa32f43e16c6771170da44e1" gracePeriod=30 Nov 29 05:45:04 crc kubenswrapper[4594]: I1129 05:45:04.386745 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7bf6f9df7f-knsk7" podUID="9c4f4aa8-f25f-4684-8707-3bc0eb168954" 
containerName="horizon" containerID="cri-o://8d5eb8d12b40086236a654309921e326c5ea36c0cb093bef3e74044402fe4a45" gracePeriod=30 Nov 29 05:45:04 crc kubenswrapper[4594]: I1129 05:45:04.390976 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-658d8f8767-q5w5m" event={"ID":"e786ace8-0d46-488a-941c-2325002c5edc","Type":"ContainerStarted","Data":"49db1a73a6aa6d53e47df61ed8cc87575dc878c77038b38da031e4b677a8a131"} Nov 29 05:45:04 crc kubenswrapper[4594]: I1129 05:45:04.391110 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-658d8f8767-q5w5m" podUID="e786ace8-0d46-488a-941c-2325002c5edc" containerName="horizon-log" containerID="cri-o://5ebe1b1187197d5f07df1ccb9f78604659450fccdeb6a5cc9fda337a0746d245" gracePeriod=30 Nov 29 05:45:04 crc kubenswrapper[4594]: I1129 05:45:04.391240 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-658d8f8767-q5w5m" podUID="e786ace8-0d46-488a-941c-2325002c5edc" containerName="horizon" containerID="cri-o://49db1a73a6aa6d53e47df61ed8cc87575dc878c77038b38da031e4b677a8a131" gracePeriod=30 Nov 29 05:45:04 crc kubenswrapper[4594]: E1129 05:45:04.398427 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cinder-api:current\\\"\"" pod="openstack/cinder-db-sync-9rddc" podUID="4b46224c-7874-4c4a-abb6-f1cbef3a8462" Nov 29 05:45:04 crc kubenswrapper[4594]: I1129 05:45:04.404750 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7bf6f9df7f-knsk7" podStartSLOduration=14.500256553 podStartE2EDuration="31.404731746s" podCreationTimestamp="2025-11-29 05:44:33 +0000 UTC" firstStartedPulling="2025-11-29 05:44:34.720358913 +0000 UTC m=+998.960868134" lastFinishedPulling="2025-11-29 05:44:51.624834107 +0000 UTC m=+1015.865343327" 
observedRunningTime="2025-11-29 05:45:04.400735779 +0000 UTC m=+1028.641244999" watchObservedRunningTime="2025-11-29 05:45:04.404731746 +0000 UTC m=+1028.645240966" Nov 29 05:45:04 crc kubenswrapper[4594]: I1129 05:45:04.424876 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Nov 29 05:45:04 crc kubenswrapper[4594]: I1129 05:45:04.447029 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-658d8f8767-q5w5m" podStartSLOduration=14.636924745 podStartE2EDuration="31.447007867s" podCreationTimestamp="2025-11-29 05:44:33 +0000 UTC" firstStartedPulling="2025-11-29 05:44:34.830934691 +0000 UTC m=+999.071443911" lastFinishedPulling="2025-11-29 05:44:51.641017812 +0000 UTC m=+1015.881527033" observedRunningTime="2025-11-29 05:45:04.439057701 +0000 UTC m=+1028.679566911" watchObservedRunningTime="2025-11-29 05:45:04.447007867 +0000 UTC m=+1028.687517088" Nov 29 05:45:04 crc kubenswrapper[4594]: I1129 05:45:04.475010 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-applier-0" Nov 29 05:45:04 crc kubenswrapper[4594]: I1129 05:45:04.551018 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5c7df66d69-hd8nh"] Nov 29 05:45:04 crc kubenswrapper[4594]: W1129 05:45:04.605871 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod23f6e7de_b25b_4522_8368_cd17f44dc109.slice/crio-d3475e332b2a2f498dcb27576b7374873005c8ea347fb103aa770f0dba883fe8 WatchSource:0}: Error finding container d3475e332b2a2f498dcb27576b7374873005c8ea347fb103aa770f0dba883fe8: Status 404 returned error can't find the container with id d3475e332b2a2f498dcb27576b7374873005c8ea347fb103aa770f0dba883fe8 Nov 29 05:45:04 crc kubenswrapper[4594]: I1129 05:45:04.646920 4594 scope.go:117] "RemoveContainer" containerID="7b64eb26edf2e1f199277eafed0d901a457c9203a5c6a9996c35e7ec77b12704" Nov 29 05:45:04 crc kubenswrapper[4594]: I1129 05:45:04.941600 4594 scope.go:117] "RemoveContainer" containerID="b030cad5054f07cf573e32c26cb8febee42aa8b7d06fdec89c7ef15bb8ab4474" Nov 29 05:45:05 crc kubenswrapper[4594]: I1129 05:45:05.061323 4594 scope.go:117] "RemoveContainer" containerID="712a83c1e61460790ca8a7f445e196684c977353eb35973b0848a93715bc43e5" Nov 29 05:45:05 crc kubenswrapper[4594]: I1129 05:45:05.208774 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-gkrjf"] Nov 29 05:45:05 crc kubenswrapper[4594]: I1129 05:45:05.234781 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67b86667d5-d25cf"] Nov 29 05:45:05 crc kubenswrapper[4594]: I1129 05:45:05.300421 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 29 05:45:05 crc kubenswrapper[4594]: I1129 05:45:05.415611 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7fb8849f48-9rr52" 
event={"ID":"e8d01e45-1d76-464f-93e6-965d65a055fc","Type":"ContainerStarted","Data":"0df5f632c7bce5417787cef5afd154fcfdf04ab056713da4bdcdd13461f932ca"} Nov 29 05:45:05 crc kubenswrapper[4594]: I1129 05:45:05.432384 4594 scope.go:117] "RemoveContainer" containerID="0ca4986760b3a8419ae6b79414ba7dd532de677c2b5b56b8bd320d52fa808769" Nov 29 05:45:05 crc kubenswrapper[4594]: I1129 05:45:05.437687 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7fb8849f48-9rr52" podStartSLOduration=20.437672579 podStartE2EDuration="20.437672579s" podCreationTimestamp="2025-11-29 05:44:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:45:05.435911667 +0000 UTC m=+1029.676420887" watchObservedRunningTime="2025-11-29 05:45:05.437672579 +0000 UTC m=+1029.678181790" Nov 29 05:45:05 crc kubenswrapper[4594]: I1129 05:45:05.441367 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5c7df66d69-hd8nh" event={"ID":"23f6e7de-b25b-4522-8368-cd17f44dc109","Type":"ContainerStarted","Data":"3adabb984c02b30be212543d1e81748d632601a8d519ebaae4bc024c287e7e05"} Nov 29 05:45:05 crc kubenswrapper[4594]: I1129 05:45:05.441391 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5c7df66d69-hd8nh" event={"ID":"23f6e7de-b25b-4522-8368-cd17f44dc109","Type":"ContainerStarted","Data":"d3475e332b2a2f498dcb27576b7374873005c8ea347fb103aa770f0dba883fe8"} Nov 29 05:45:05 crc kubenswrapper[4594]: I1129 05:45:05.443836 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"075b6980-e8ff-4248-ad5e-fb33fd7199d3","Type":"ContainerStarted","Data":"89d8db4669bcb1fd810e56de63ffe44d8d9010c5458f840b90425f1ed2cad8a1"} Nov 29 05:45:05 crc kubenswrapper[4594]: I1129 05:45:05.450438 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-558d4b85cb-k5j98" 
event={"ID":"19dfde1f-d770-45ec-8735-78549b8fcb90","Type":"ContainerStarted","Data":"55b376ecc6ce3543e9e8b28018648e5c048b14ec18596f64787506c4bf737876"} Nov 29 05:45:05 crc kubenswrapper[4594]: W1129 05:45:05.469582 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf39fdd2d_1f5e_440f_996c_08ae2e749d3e.slice/crio-4b49af9bfd9c385e25c3940a8c8db196d252c9cf4f20f0c2c0e9c00fa4ca834b WatchSource:0}: Error finding container 4b49af9bfd9c385e25c3940a8c8db196d252c9cf4f20f0c2c0e9c00fa4ca834b: Status 404 returned error can't find the container with id 4b49af9bfd9c385e25c3940a8c8db196d252c9cf4f20f0c2c0e9c00fa4ca834b Nov 29 05:45:05 crc kubenswrapper[4594]: I1129 05:45:05.475092 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-558d4b85cb-k5j98" podStartSLOduration=20.475076916 podStartE2EDuration="20.475076916s" podCreationTimestamp="2025-11-29 05:44:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:45:05.469053043 +0000 UTC m=+1029.709562264" watchObservedRunningTime="2025-11-29 05:45:05.475076916 +0000 UTC m=+1029.715586135" Nov 29 05:45:05 crc kubenswrapper[4594]: I1129 05:45:05.517548 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Nov 29 05:45:05 crc kubenswrapper[4594]: W1129 05:45:05.564047 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17adfde2_389b_491b_8c00_8293e37021b4.slice/crio-d579a75da48476d6db1cbfb10ad68a552d1add6602a02b6d7903c1956f6bfbe3 WatchSource:0}: Error finding container d579a75da48476d6db1cbfb10ad68a552d1add6602a02b6d7903c1956f6bfbe3: Status 404 returned error can't find the container with id d579a75da48476d6db1cbfb10ad68a552d1add6602a02b6d7903c1956f6bfbe3 Nov 29 05:45:05 crc kubenswrapper[4594]: 
I1129 05:45:05.572773 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29406585-kz8m2"] Nov 29 05:45:05 crc kubenswrapper[4594]: I1129 05:45:05.585617 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-66bbc5d8dd-fzbhl"] Nov 29 05:45:05 crc kubenswrapper[4594]: I1129 05:45:05.662209 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Nov 29 05:45:05 crc kubenswrapper[4594]: I1129 05:45:05.680124 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Nov 29 05:45:05 crc kubenswrapper[4594]: I1129 05:45:05.691941 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Nov 29 05:45:05 crc kubenswrapper[4594]: I1129 05:45:05.721365 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6cf4cc7b4f-hg8dz" Nov 29 05:45:06 crc kubenswrapper[4594]: I1129 05:45:06.168286 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7fb8849f48-9rr52" Nov 29 05:45:06 crc kubenswrapper[4594]: I1129 05:45:06.168562 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7fb8849f48-9rr52" Nov 29 05:45:06 crc kubenswrapper[4594]: I1129 05:45:06.244346 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-558d4b85cb-k5j98" Nov 29 05:45:06 crc kubenswrapper[4594]: I1129 05:45:06.244400 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-558d4b85cb-k5j98" Nov 29 05:45:06 crc kubenswrapper[4594]: I1129 05:45:06.524138 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-8jkv9" event={"ID":"69498abf-8b80-4b7f-901a-4c6a4bdede2f","Type":"ContainerStarted","Data":"4f378e96290122a024a00bc092e3a9d370ef7e8823da038de4d436c3e6442832"} Nov 29 05:45:06 crc 
kubenswrapper[4594]: I1129 05:45:06.531034 4594 generic.go:334] "Generic (PLEG): container finished" podID="f39fdd2d-1f5e-440f-996c-08ae2e749d3e" containerID="fb7d3a9347a69aee0ab41a123cf4140d30540c8c94392eb9202bf28875a59765" exitCode=0 Nov 29 05:45:06 crc kubenswrapper[4594]: I1129 05:45:06.531088 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b86667d5-d25cf" event={"ID":"f39fdd2d-1f5e-440f-996c-08ae2e749d3e","Type":"ContainerDied","Data":"fb7d3a9347a69aee0ab41a123cf4140d30540c8c94392eb9202bf28875a59765"} Nov 29 05:45:06 crc kubenswrapper[4594]: I1129 05:45:06.531106 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b86667d5-d25cf" event={"ID":"f39fdd2d-1f5e-440f-996c-08ae2e749d3e","Type":"ContainerStarted","Data":"4b49af9bfd9c385e25c3940a8c8db196d252c9cf4f20f0c2c0e9c00fa4ca834b"} Nov 29 05:45:06 crc kubenswrapper[4594]: I1129 05:45:06.536597 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5c7df66d69-hd8nh" event={"ID":"23f6e7de-b25b-4522-8368-cd17f44dc109","Type":"ContainerStarted","Data":"64dc2825549659cb988f04b641f5cb0e70043d4f0b73d5a9c8e6c068aa1532a2"} Nov 29 05:45:06 crc kubenswrapper[4594]: I1129 05:45:06.537479 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5c7df66d69-hd8nh" Nov 29 05:45:06 crc kubenswrapper[4594]: I1129 05:45:06.550307 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-8jkv9" podStartSLOduration=2.614660534 podStartE2EDuration="33.550286565s" podCreationTimestamp="2025-11-29 05:44:33 +0000 UTC" firstStartedPulling="2025-11-29 05:44:34.63124448 +0000 UTC m=+998.871753699" lastFinishedPulling="2025-11-29 05:45:05.56687051 +0000 UTC m=+1029.807379730" observedRunningTime="2025-11-29 05:45:06.547830646 +0000 UTC m=+1030.788339865" watchObservedRunningTime="2025-11-29 05:45:06.550286565 +0000 UTC m=+1030.790795785" Nov 29 05:45:06 crc 
kubenswrapper[4594]: I1129 05:45:06.566974 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"315c99c4-4c5f-4bff-9ff0-116d4e7bf846","Type":"ContainerStarted","Data":"0a1cc8a404d7cf015ad0e183d17c9b05231f2d0deb67012948172391410ebe64"} Nov 29 05:45:06 crc kubenswrapper[4594]: I1129 05:45:06.607800 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dfe84a63-fea5-455e-95d3-523a091b976f","Type":"ContainerStarted","Data":"fc49a19c1a1fc2d60ca494b34631f397817d2181fa37036731550878d86b8601"} Nov 29 05:45:06 crc kubenswrapper[4594]: I1129 05:45:06.627456 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5c7df66d69-hd8nh" podStartSLOduration=11.627434187 podStartE2EDuration="11.627434187s" podCreationTimestamp="2025-11-29 05:44:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:45:06.606962295 +0000 UTC m=+1030.847471525" watchObservedRunningTime="2025-11-29 05:45:06.627434187 +0000 UTC m=+1030.867943408" Nov 29 05:45:06 crc kubenswrapper[4594]: I1129 05:45:06.669740 4594 generic.go:334] "Generic (PLEG): container finished" podID="17adfde2-389b-491b-8c00-8293e37021b4" containerID="b2afaff1905126455942d8e0ed084ec1dc7230e82c53f586130ed351e00ec8ca" exitCode=0 Nov 29 05:45:06 crc kubenswrapper[4594]: I1129 05:45:06.669831 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29406585-kz8m2" event={"ID":"17adfde2-389b-491b-8c00-8293e37021b4","Type":"ContainerDied","Data":"b2afaff1905126455942d8e0ed084ec1dc7230e82c53f586130ed351e00ec8ca"} Nov 29 05:45:06 crc kubenswrapper[4594]: I1129 05:45:06.669883 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29406585-kz8m2" 
event={"ID":"17adfde2-389b-491b-8c00-8293e37021b4","Type":"ContainerStarted","Data":"d579a75da48476d6db1cbfb10ad68a552d1add6602a02b6d7903c1956f6bfbe3"} Nov 29 05:45:06 crc kubenswrapper[4594]: I1129 05:45:06.696884 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-mvvfl" event={"ID":"72edd67c-f301-4e33-8d71-74f9bc7b99c5","Type":"ContainerStarted","Data":"1d568eb8477a4f4096f144cd479a748b8bc8e9654f83a8c2a157e58825d2cdfe"} Nov 29 05:45:06 crc kubenswrapper[4594]: I1129 05:45:06.708784 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-gkrjf" event={"ID":"d7cc6b30-a981-4486-9d8a-e926167f001b","Type":"ContainerStarted","Data":"ff5b256588941b410fcff611cfa2fc2fb38f9bc1742bbecf88bf769aa253ad0b"} Nov 29 05:45:06 crc kubenswrapper[4594]: I1129 05:45:06.708825 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-gkrjf" event={"ID":"d7cc6b30-a981-4486-9d8a-e926167f001b","Type":"ContainerStarted","Data":"77355ba65d78e080d8b8abe611d6572d1d2968913037f7d4c7da5df7c698cd63"} Nov 29 05:45:06 crc kubenswrapper[4594]: I1129 05:45:06.718804 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"cfbd6de3-fc8d-4d93-a76d-fd2b8a196167","Type":"ContainerStarted","Data":"ad5a87f21ab512cc78216e52155594e9ab3fe329808d1683bd64f44acde15b85"} Nov 29 05:45:06 crc kubenswrapper[4594]: I1129 05:45:06.757336 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-66bbc5d8dd-fzbhl" event={"ID":"860e92d3-234a-46aa-b6e1-ea0ffbb8a4be","Type":"ContainerStarted","Data":"fed7d5cea1157666e5f6c9c5e8caeb1eb905bb4c456b268b79cea70ea1240266"} Nov 29 05:45:06 crc kubenswrapper[4594]: I1129 05:45:06.762867 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"075b6980-e8ff-4248-ad5e-fb33fd7199d3","Type":"ContainerStarted","Data":"620cd815098d08aef3c32da10723fa7a38371fa50e7c93cd5df9925729d55756"} Nov 29 05:45:06 crc kubenswrapper[4594]: I1129 05:45:06.764189 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"3d60aa94-3720-43b5-a5de-ca30dd1b63b2","Type":"ContainerStarted","Data":"2b246cf4365743fd554a7900f9a1edf20d90a3e0837ed42e1a78a4c5266dd7c0"} Nov 29 05:45:06 crc kubenswrapper[4594]: I1129 05:45:06.765467 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"ba31669d-d47f-41bc-8476-7acf2a872d15","Type":"ContainerStarted","Data":"3497f3820c6a3775147afba51b6d933e1e59dc64f3e791434f549f0cc2d9e3bb"} Nov 29 05:45:06 crc kubenswrapper[4594]: I1129 05:45:06.874899 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-mvvfl" podStartSLOduration=3.152178286 podStartE2EDuration="33.874880669s" podCreationTimestamp="2025-11-29 05:44:33 +0000 UTC" firstStartedPulling="2025-11-29 05:44:34.844375487 +0000 UTC m=+999.084884707" lastFinishedPulling="2025-11-29 05:45:05.56707787 +0000 UTC m=+1029.807587090" observedRunningTime="2025-11-29 05:45:06.716344758 +0000 UTC m=+1030.956853979" watchObservedRunningTime="2025-11-29 05:45:06.874880669 +0000 UTC m=+1031.115389890" Nov 29 05:45:06 crc kubenswrapper[4594]: I1129 05:45:06.880464 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-gkrjf" podStartSLOduration=14.880454385 podStartE2EDuration="14.880454385s" podCreationTimestamp="2025-11-29 05:44:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:45:06.73245747 +0000 UTC m=+1030.972966690" watchObservedRunningTime="2025-11-29 05:45:06.880454385 +0000 UTC m=+1031.120963605" Nov 29 05:45:07 crc kubenswrapper[4594]: I1129 
05:45:07.899015 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b86667d5-d25cf" event={"ID":"f39fdd2d-1f5e-440f-996c-08ae2e749d3e","Type":"ContainerStarted","Data":"45619c64119c3a13ccbf693a78ab7548d824f21f6d3ee0dc4ad484f5a286877f"} Nov 29 05:45:07 crc kubenswrapper[4594]: I1129 05:45:07.900466 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-67b86667d5-d25cf" Nov 29 05:45:07 crc kubenswrapper[4594]: I1129 05:45:07.943985 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-66bbc5d8dd-fzbhl" event={"ID":"860e92d3-234a-46aa-b6e1-ea0ffbb8a4be","Type":"ContainerStarted","Data":"76ce69fbb890971373cf2b97b55d27c09c8d857179fc93efd2f64f8e76bc3b7e"} Nov 29 05:45:07 crc kubenswrapper[4594]: I1129 05:45:07.944043 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-66bbc5d8dd-fzbhl" event={"ID":"860e92d3-234a-46aa-b6e1-ea0ffbb8a4be","Type":"ContainerStarted","Data":"dbe13314cb482f0e31502e38204649a154044a8792e73d9bcdadd76c3ad1a649"} Nov 29 05:45:07 crc kubenswrapper[4594]: I1129 05:45:07.944114 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-66bbc5d8dd-fzbhl" Nov 29 05:45:07 crc kubenswrapper[4594]: I1129 05:45:07.964065 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"075b6980-e8ff-4248-ad5e-fb33fd7199d3","Type":"ContainerStarted","Data":"0b4551be8d886b9f5430577190f3872fbe450cf0f21de3ed167862dda68c926c"} Nov 29 05:45:07 crc kubenswrapper[4594]: I1129 05:45:07.970613 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-67b86667d5-d25cf" podStartSLOduration=15.970597028 podStartE2EDuration="15.970597028s" podCreationTimestamp="2025-11-29 05:44:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 
05:45:07.955637355 +0000 UTC m=+1032.196146575" watchObservedRunningTime="2025-11-29 05:45:07.970597028 +0000 UTC m=+1032.211106247" Nov 29 05:45:07 crc kubenswrapper[4594]: I1129 05:45:07.981103 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"315c99c4-4c5f-4bff-9ff0-116d4e7bf846","Type":"ContainerStarted","Data":"50aa13ccfea274f02bf9c5b2b2aa6e0ecf276429506012a42ba955bff82ae8f3"} Nov 29 05:45:07 crc kubenswrapper[4594]: I1129 05:45:07.990635 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"ba31669d-d47f-41bc-8476-7acf2a872d15","Type":"ContainerStarted","Data":"707df14a791d700e4e12a9b592b7838821411ea4693ac827f07dcf5d6d5e9497"} Nov 29 05:45:07 crc kubenswrapper[4594]: I1129 05:45:07.990703 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"ba31669d-d47f-41bc-8476-7acf2a872d15","Type":"ContainerStarted","Data":"30d067be360e339d61b0a8f0b5ba4dbb187a6a5aaa62e4345eaf567f287699b7"} Nov 29 05:45:07 crc kubenswrapper[4594]: I1129 05:45:07.990719 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Nov 29 05:45:08 crc kubenswrapper[4594]: I1129 05:45:08.002027 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-66bbc5d8dd-fzbhl" podStartSLOduration=16.00200401 podStartE2EDuration="16.00200401s" podCreationTimestamp="2025-11-29 05:44:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:45:07.986450261 +0000 UTC m=+1032.226959470" watchObservedRunningTime="2025-11-29 05:45:08.00200401 +0000 UTC m=+1032.242513231" Nov 29 05:45:08 crc kubenswrapper[4594]: I1129 05:45:08.027960 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=15.027945062 
podStartE2EDuration="15.027945062s" podCreationTimestamp="2025-11-29 05:44:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:45:08.011815228 +0000 UTC m=+1032.252324449" watchObservedRunningTime="2025-11-29 05:45:08.027945062 +0000 UTC m=+1032.268454282" Nov 29 05:45:08 crc kubenswrapper[4594]: I1129 05:45:08.159887 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=4.159867156 podStartE2EDuration="4.159867156s" podCreationTimestamp="2025-11-29 05:45:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:45:08.157580766 +0000 UTC m=+1032.398089986" watchObservedRunningTime="2025-11-29 05:45:08.159867156 +0000 UTC m=+1032.400376377" Nov 29 05:45:08 crc kubenswrapper[4594]: I1129 05:45:08.561024 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29406585-kz8m2" Nov 29 05:45:08 crc kubenswrapper[4594]: I1129 05:45:08.709521 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xw4rl\" (UniqueName: \"kubernetes.io/projected/17adfde2-389b-491b-8c00-8293e37021b4-kube-api-access-xw4rl\") pod \"17adfde2-389b-491b-8c00-8293e37021b4\" (UID: \"17adfde2-389b-491b-8c00-8293e37021b4\") " Nov 29 05:45:08 crc kubenswrapper[4594]: I1129 05:45:08.709741 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/17adfde2-389b-491b-8c00-8293e37021b4-secret-volume\") pod \"17adfde2-389b-491b-8c00-8293e37021b4\" (UID: \"17adfde2-389b-491b-8c00-8293e37021b4\") " Nov 29 05:45:08 crc kubenswrapper[4594]: I1129 05:45:08.709839 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/17adfde2-389b-491b-8c00-8293e37021b4-config-volume\") pod \"17adfde2-389b-491b-8c00-8293e37021b4\" (UID: \"17adfde2-389b-491b-8c00-8293e37021b4\") " Nov 29 05:45:08 crc kubenswrapper[4594]: I1129 05:45:08.711446 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17adfde2-389b-491b-8c00-8293e37021b4-config-volume" (OuterVolumeSpecName: "config-volume") pod "17adfde2-389b-491b-8c00-8293e37021b4" (UID: "17adfde2-389b-491b-8c00-8293e37021b4"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:45:08 crc kubenswrapper[4594]: I1129 05:45:08.722668 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17adfde2-389b-491b-8c00-8293e37021b4-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "17adfde2-389b-491b-8c00-8293e37021b4" (UID: "17adfde2-389b-491b-8c00-8293e37021b4"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:45:08 crc kubenswrapper[4594]: I1129 05:45:08.737522 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17adfde2-389b-491b-8c00-8293e37021b4-kube-api-access-xw4rl" (OuterVolumeSpecName: "kube-api-access-xw4rl") pod "17adfde2-389b-491b-8c00-8293e37021b4" (UID: "17adfde2-389b-491b-8c00-8293e37021b4"). InnerVolumeSpecName "kube-api-access-xw4rl". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:45:08 crc kubenswrapper[4594]: I1129 05:45:08.816179 4594 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/17adfde2-389b-491b-8c00-8293e37021b4-config-volume\") on node \"crc\" DevicePath \"\"" Nov 29 05:45:08 crc kubenswrapper[4594]: I1129 05:45:08.816477 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xw4rl\" (UniqueName: \"kubernetes.io/projected/17adfde2-389b-491b-8c00-8293e37021b4-kube-api-access-xw4rl\") on node \"crc\" DevicePath \"\"" Nov 29 05:45:08 crc kubenswrapper[4594]: I1129 05:45:08.816491 4594 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/17adfde2-389b-491b-8c00-8293e37021b4-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 29 05:45:09 crc kubenswrapper[4594]: I1129 05:45:09.026182 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29406585-kz8m2" event={"ID":"17adfde2-389b-491b-8c00-8293e37021b4","Type":"ContainerDied","Data":"d579a75da48476d6db1cbfb10ad68a552d1add6602a02b6d7903c1956f6bfbe3"} Nov 29 05:45:09 crc kubenswrapper[4594]: I1129 05:45:09.026227 4594 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d579a75da48476d6db1cbfb10ad68a552d1add6602a02b6d7903c1956f6bfbe3" Nov 29 05:45:09 crc kubenswrapper[4594]: I1129 05:45:09.026303 4594 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29406585-kz8m2" Nov 29 05:45:09 crc kubenswrapper[4594]: I1129 05:45:09.042279 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"315c99c4-4c5f-4bff-9ff0-116d4e7bf846","Type":"ContainerStarted","Data":"53870e54fb8d4798b9443c1d05850e7f7e5082e89e538969f556b5928f4472ad"} Nov 29 05:45:09 crc kubenswrapper[4594]: I1129 05:45:09.426092 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Nov 29 05:45:09 crc kubenswrapper[4594]: I1129 05:45:09.592099 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=17.592078267 podStartE2EDuration="17.592078267s" podCreationTimestamp="2025-11-29 05:44:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:45:09.067755047 +0000 UTC m=+1033.308264267" watchObservedRunningTime="2025-11-29 05:45:09.592078267 +0000 UTC m=+1033.832587487" Nov 29 05:45:10 crc kubenswrapper[4594]: I1129 05:45:10.068531 4594 generic.go:334] "Generic (PLEG): container finished" podID="69498abf-8b80-4b7f-901a-4c6a4bdede2f" containerID="4f378e96290122a024a00bc092e3a9d370ef7e8823da038de4d436c3e6442832" exitCode=0 Nov 29 05:45:10 crc kubenswrapper[4594]: I1129 05:45:10.068589 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-8jkv9" event={"ID":"69498abf-8b80-4b7f-901a-4c6a4bdede2f","Type":"ContainerDied","Data":"4f378e96290122a024a00bc092e3a9d370ef7e8823da038de4d436c3e6442832"} Nov 29 05:45:10 crc kubenswrapper[4594]: I1129 05:45:10.073503 4594 generic.go:334] "Generic (PLEG): container finished" podID="72edd67c-f301-4e33-8d71-74f9bc7b99c5" containerID="1d568eb8477a4f4096f144cd479a748b8bc8e9654f83a8c2a157e58825d2cdfe" exitCode=0 Nov 29 05:45:10 crc 
kubenswrapper[4594]: I1129 05:45:10.074399 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-mvvfl" event={"ID":"72edd67c-f301-4e33-8d71-74f9bc7b99c5","Type":"ContainerDied","Data":"1d568eb8477a4f4096f144cd479a748b8bc8e9654f83a8c2a157e58825d2cdfe"} Nov 29 05:45:10 crc kubenswrapper[4594]: I1129 05:45:10.074502 4594 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 29 05:45:10 crc kubenswrapper[4594]: I1129 05:45:10.849308 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Nov 29 05:45:11 crc kubenswrapper[4594]: I1129 05:45:11.108897 4594 generic.go:334] "Generic (PLEG): container finished" podID="d7cc6b30-a981-4486-9d8a-e926167f001b" containerID="ff5b256588941b410fcff611cfa2fc2fb38f9bc1742bbecf88bf769aa253ad0b" exitCode=0 Nov 29 05:45:11 crc kubenswrapper[4594]: I1129 05:45:11.109394 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-gkrjf" event={"ID":"d7cc6b30-a981-4486-9d8a-e926167f001b","Type":"ContainerDied","Data":"ff5b256588941b410fcff611cfa2fc2fb38f9bc1742bbecf88bf769aa253ad0b"} Nov 29 05:45:12 crc kubenswrapper[4594]: I1129 05:45:12.122062 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"cfbd6de3-fc8d-4d93-a76d-fd2b8a196167","Type":"ContainerStarted","Data":"fc9061f76e86c3a4a6252c22f7a45c93b7418f2ec51915273539b5ff5d02ae00"} Nov 29 05:45:12 crc kubenswrapper[4594]: I1129 05:45:12.125062 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"3d60aa94-3720-43b5-a5de-ca30dd1b63b2","Type":"ContainerStarted","Data":"6c5d7d6b9add4af9ade9a74c6541ee656a7d995bdce1442ea15593434eba643b"} Nov 29 05:45:12 crc kubenswrapper[4594]: I1129 05:45:12.148237 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-applier-0" podStartSLOduration=2.837263033 
podStartE2EDuration="8.148221936s" podCreationTimestamp="2025-11-29 05:45:04 +0000 UTC" firstStartedPulling="2025-11-29 05:45:05.726989208 +0000 UTC m=+1029.967498419" lastFinishedPulling="2025-11-29 05:45:11.037948102 +0000 UTC m=+1035.278457322" observedRunningTime="2025-11-29 05:45:12.144658962 +0000 UTC m=+1036.385168182" watchObservedRunningTime="2025-11-29 05:45:12.148221936 +0000 UTC m=+1036.388731156" Nov 29 05:45:12 crc kubenswrapper[4594]: I1129 05:45:12.172853 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-decision-engine-0" podStartSLOduration=3.811031313 podStartE2EDuration="9.172838734s" podCreationTimestamp="2025-11-29 05:45:03 +0000 UTC" firstStartedPulling="2025-11-29 05:45:05.675297255 +0000 UTC m=+1029.915806474" lastFinishedPulling="2025-11-29 05:45:11.037104675 +0000 UTC m=+1035.277613895" observedRunningTime="2025-11-29 05:45:12.165668746 +0000 UTC m=+1036.406177967" watchObservedRunningTime="2025-11-29 05:45:12.172838734 +0000 UTC m=+1036.413347955" Nov 29 05:45:13 crc kubenswrapper[4594]: I1129 05:45:13.219305 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-67b86667d5-d25cf" Nov 29 05:45:13 crc kubenswrapper[4594]: I1129 05:45:13.350394 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7cf77b4997-stk57"] Nov 29 05:45:13 crc kubenswrapper[4594]: I1129 05:45:13.350648 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7cf77b4997-stk57" podUID="3536563a-1e9e-458e-ace5-acbe9e090404" containerName="dnsmasq-dns" containerID="cri-o://a864656a788effdf9b2da5c2af82abd08d8dfc00247e49a48b84b882074bec97" gracePeriod=10 Nov 29 05:45:13 crc kubenswrapper[4594]: I1129 05:45:13.532022 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Nov 29 05:45:13 crc kubenswrapper[4594]: I1129 05:45:13.532080 4594 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Nov 29 05:45:13 crc kubenswrapper[4594]: I1129 05:45:13.536873 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Nov 29 05:45:13 crc kubenswrapper[4594]: I1129 05:45:13.538499 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Nov 29 05:45:13 crc kubenswrapper[4594]: I1129 05:45:13.592097 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Nov 29 05:45:13 crc kubenswrapper[4594]: I1129 05:45:13.599519 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Nov 29 05:45:13 crc kubenswrapper[4594]: I1129 05:45:13.603477 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Nov 29 05:45:13 crc kubenswrapper[4594]: I1129 05:45:13.625368 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Nov 29 05:45:13 crc kubenswrapper[4594]: I1129 05:45:13.774616 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7bf6f9df7f-knsk7" Nov 29 05:45:13 crc kubenswrapper[4594]: I1129 05:45:13.894318 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-658d8f8767-q5w5m" Nov 29 05:45:14 crc kubenswrapper[4594]: I1129 05:45:14.176604 4594 generic.go:334] "Generic (PLEG): container finished" podID="3536563a-1e9e-458e-ace5-acbe9e090404" containerID="a864656a788effdf9b2da5c2af82abd08d8dfc00247e49a48b84b882074bec97" exitCode=0 Nov 29 05:45:14 crc kubenswrapper[4594]: I1129 05:45:14.176662 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cf77b4997-stk57" 
event={"ID":"3536563a-1e9e-458e-ace5-acbe9e090404","Type":"ContainerDied","Data":"a864656a788effdf9b2da5c2af82abd08d8dfc00247e49a48b84b882074bec97"} Nov 29 05:45:14 crc kubenswrapper[4594]: I1129 05:45:14.177040 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Nov 29 05:45:14 crc kubenswrapper[4594]: I1129 05:45:14.178763 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Nov 29 05:45:14 crc kubenswrapper[4594]: I1129 05:45:14.178791 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Nov 29 05:45:14 crc kubenswrapper[4594]: I1129 05:45:14.178804 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Nov 29 05:45:14 crc kubenswrapper[4594]: I1129 05:45:14.325842 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Nov 29 05:45:14 crc kubenswrapper[4594]: I1129 05:45:14.371956 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0" Nov 29 05:45:14 crc kubenswrapper[4594]: I1129 05:45:14.426506 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-api-0" Nov 29 05:45:14 crc kubenswrapper[4594]: I1129 05:45:14.463055 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-api-0" Nov 29 05:45:14 crc kubenswrapper[4594]: I1129 05:45:14.476007 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-applier-0" Nov 29 05:45:14 crc kubenswrapper[4594]: I1129 05:45:14.476054 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-applier-0" Nov 29 05:45:14 crc kubenswrapper[4594]: I1129 05:45:14.524726 4594 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openstack/watcher-applier-0" Nov 29 05:45:14 crc kubenswrapper[4594]: I1129 05:45:14.712207 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-mvvfl" Nov 29 05:45:14 crc kubenswrapper[4594]: I1129 05:45:14.797065 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72edd67c-f301-4e33-8d71-74f9bc7b99c5-combined-ca-bundle\") pod \"72edd67c-f301-4e33-8d71-74f9bc7b99c5\" (UID: \"72edd67c-f301-4e33-8d71-74f9bc7b99c5\") " Nov 29 05:45:14 crc kubenswrapper[4594]: I1129 05:45:14.797118 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gdcpx\" (UniqueName: \"kubernetes.io/projected/72edd67c-f301-4e33-8d71-74f9bc7b99c5-kube-api-access-gdcpx\") pod \"72edd67c-f301-4e33-8d71-74f9bc7b99c5\" (UID: \"72edd67c-f301-4e33-8d71-74f9bc7b99c5\") " Nov 29 05:45:14 crc kubenswrapper[4594]: I1129 05:45:14.797215 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/72edd67c-f301-4e33-8d71-74f9bc7b99c5-db-sync-config-data\") pod \"72edd67c-f301-4e33-8d71-74f9bc7b99c5\" (UID: \"72edd67c-f301-4e33-8d71-74f9bc7b99c5\") " Nov 29 05:45:14 crc kubenswrapper[4594]: I1129 05:45:14.805715 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72edd67c-f301-4e33-8d71-74f9bc7b99c5-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "72edd67c-f301-4e33-8d71-74f9bc7b99c5" (UID: "72edd67c-f301-4e33-8d71-74f9bc7b99c5"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:45:14 crc kubenswrapper[4594]: I1129 05:45:14.813494 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72edd67c-f301-4e33-8d71-74f9bc7b99c5-kube-api-access-gdcpx" (OuterVolumeSpecName: "kube-api-access-gdcpx") pod "72edd67c-f301-4e33-8d71-74f9bc7b99c5" (UID: "72edd67c-f301-4e33-8d71-74f9bc7b99c5"). InnerVolumeSpecName "kube-api-access-gdcpx". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:45:14 crc kubenswrapper[4594]: I1129 05:45:14.842273 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72edd67c-f301-4e33-8d71-74f9bc7b99c5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "72edd67c-f301-4e33-8d71-74f9bc7b99c5" (UID: "72edd67c-f301-4e33-8d71-74f9bc7b99c5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:45:14 crc kubenswrapper[4594]: I1129 05:45:14.900187 4594 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72edd67c-f301-4e33-8d71-74f9bc7b99c5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 05:45:14 crc kubenswrapper[4594]: I1129 05:45:14.900233 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gdcpx\" (UniqueName: \"kubernetes.io/projected/72edd67c-f301-4e33-8d71-74f9bc7b99c5-kube-api-access-gdcpx\") on node \"crc\" DevicePath \"\"" Nov 29 05:45:14 crc kubenswrapper[4594]: I1129 05:45:14.900269 4594 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/72edd67c-f301-4e33-8d71-74f9bc7b99c5-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 05:45:15 crc kubenswrapper[4594]: I1129 05:45:15.193677 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-mvvfl" 
event={"ID":"72edd67c-f301-4e33-8d71-74f9bc7b99c5","Type":"ContainerDied","Data":"6b4757af3b10e2441691669b25adccf3ac4e99040c80dd8a9c795dade6cd4e41"} Nov 29 05:45:15 crc kubenswrapper[4594]: I1129 05:45:15.193716 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-mvvfl" Nov 29 05:45:15 crc kubenswrapper[4594]: I1129 05:45:15.193725 4594 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b4757af3b10e2441691669b25adccf3ac4e99040c80dd8a9c795dade6cd4e41" Nov 29 05:45:15 crc kubenswrapper[4594]: I1129 05:45:15.195286 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Nov 29 05:45:15 crc kubenswrapper[4594]: I1129 05:45:15.199330 4594 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7cf77b4997-stk57" podUID="3536563a-1e9e-458e-ace5-acbe9e090404" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.156:5353: connect: connection refused" Nov 29 05:45:15 crc kubenswrapper[4594]: I1129 05:45:15.205241 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Nov 29 05:45:15 crc kubenswrapper[4594]: I1129 05:45:15.256449 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-decision-engine-0" Nov 29 05:45:15 crc kubenswrapper[4594]: I1129 05:45:15.271684 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-applier-0" Nov 29 05:45:15 crc kubenswrapper[4594]: I1129 05:45:15.800408 4594 patch_prober.go:28] interesting pod/machine-config-daemon-ggz4n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 05:45:15 crc kubenswrapper[4594]: I1129 05:45:15.800470 4594 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 05:45:15 crc kubenswrapper[4594]: I1129 05:45:15.974405 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-c89ff55f4-zl5h6"] Nov 29 05:45:15 crc kubenswrapper[4594]: E1129 05:45:15.974798 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17adfde2-389b-491b-8c00-8293e37021b4" containerName="collect-profiles" Nov 29 05:45:15 crc kubenswrapper[4594]: I1129 05:45:15.974818 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="17adfde2-389b-491b-8c00-8293e37021b4" containerName="collect-profiles" Nov 29 05:45:15 crc kubenswrapper[4594]: E1129 05:45:15.974856 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72edd67c-f301-4e33-8d71-74f9bc7b99c5" containerName="barbican-db-sync" Nov 29 05:45:15 crc kubenswrapper[4594]: I1129 05:45:15.974862 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="72edd67c-f301-4e33-8d71-74f9bc7b99c5" containerName="barbican-db-sync" Nov 29 05:45:15 crc kubenswrapper[4594]: I1129 05:45:15.975036 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="17adfde2-389b-491b-8c00-8293e37021b4" containerName="collect-profiles" Nov 29 05:45:15 crc kubenswrapper[4594]: I1129 05:45:15.975068 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="72edd67c-f301-4e33-8d71-74f9bc7b99c5" containerName="barbican-db-sync" Nov 29 05:45:15 crc kubenswrapper[4594]: I1129 05:45:15.976082 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-c89ff55f4-zl5h6" Nov 29 05:45:15 crc kubenswrapper[4594]: I1129 05:45:15.983987 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Nov 29 05:45:15 crc kubenswrapper[4594]: I1129 05:45:15.984186 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Nov 29 05:45:15 crc kubenswrapper[4594]: I1129 05:45:15.984410 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-wl49q" Nov 29 05:45:15 crc kubenswrapper[4594]: I1129 05:45:15.995173 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-c89ff55f4-zl5h6"] Nov 29 05:45:16 crc kubenswrapper[4594]: I1129 05:45:16.024330 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-776755c9f7-9ghn5"] Nov 29 05:45:16 crc kubenswrapper[4594]: I1129 05:45:16.026128 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-776755c9f7-9ghn5" Nov 29 05:45:16 crc kubenswrapper[4594]: I1129 05:45:16.030265 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Nov 29 05:45:16 crc kubenswrapper[4594]: I1129 05:45:16.143083 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/744bbd71-1ab1-492d-9148-37be600ef9c8-logs\") pod \"barbican-keystone-listener-c89ff55f4-zl5h6\" (UID: \"744bbd71-1ab1-492d-9148-37be600ef9c8\") " pod="openstack/barbican-keystone-listener-c89ff55f4-zl5h6" Nov 29 05:45:16 crc kubenswrapper[4594]: I1129 05:45:16.143318 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9c5ddf8c-3bb2-4d2e-bf45-0535f59a2374-logs\") pod \"barbican-worker-776755c9f7-9ghn5\" (UID: \"9c5ddf8c-3bb2-4d2e-bf45-0535f59a2374\") " pod="openstack/barbican-worker-776755c9f7-9ghn5" Nov 29 05:45:16 crc kubenswrapper[4594]: I1129 05:45:16.143516 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/744bbd71-1ab1-492d-9148-37be600ef9c8-config-data-custom\") pod \"barbican-keystone-listener-c89ff55f4-zl5h6\" (UID: \"744bbd71-1ab1-492d-9148-37be600ef9c8\") " pod="openstack/barbican-keystone-listener-c89ff55f4-zl5h6" Nov 29 05:45:16 crc kubenswrapper[4594]: I1129 05:45:16.143830 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqpnd\" (UniqueName: \"kubernetes.io/projected/744bbd71-1ab1-492d-9148-37be600ef9c8-kube-api-access-nqpnd\") pod \"barbican-keystone-listener-c89ff55f4-zl5h6\" (UID: \"744bbd71-1ab1-492d-9148-37be600ef9c8\") " pod="openstack/barbican-keystone-listener-c89ff55f4-zl5h6" Nov 29 05:45:16 crc kubenswrapper[4594]: I1129 
05:45:16.143912 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjfww\" (UniqueName: \"kubernetes.io/projected/9c5ddf8c-3bb2-4d2e-bf45-0535f59a2374-kube-api-access-sjfww\") pod \"barbican-worker-776755c9f7-9ghn5\" (UID: \"9c5ddf8c-3bb2-4d2e-bf45-0535f59a2374\") " pod="openstack/barbican-worker-776755c9f7-9ghn5" Nov 29 05:45:16 crc kubenswrapper[4594]: I1129 05:45:16.144064 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c5ddf8c-3bb2-4d2e-bf45-0535f59a2374-combined-ca-bundle\") pod \"barbican-worker-776755c9f7-9ghn5\" (UID: \"9c5ddf8c-3bb2-4d2e-bf45-0535f59a2374\") " pod="openstack/barbican-worker-776755c9f7-9ghn5" Nov 29 05:45:16 crc kubenswrapper[4594]: I1129 05:45:16.144222 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/744bbd71-1ab1-492d-9148-37be600ef9c8-combined-ca-bundle\") pod \"barbican-keystone-listener-c89ff55f4-zl5h6\" (UID: \"744bbd71-1ab1-492d-9148-37be600ef9c8\") " pod="openstack/barbican-keystone-listener-c89ff55f4-zl5h6" Nov 29 05:45:16 crc kubenswrapper[4594]: I1129 05:45:16.144370 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9c5ddf8c-3bb2-4d2e-bf45-0535f59a2374-config-data-custom\") pod \"barbican-worker-776755c9f7-9ghn5\" (UID: \"9c5ddf8c-3bb2-4d2e-bf45-0535f59a2374\") " pod="openstack/barbican-worker-776755c9f7-9ghn5" Nov 29 05:45:16 crc kubenswrapper[4594]: I1129 05:45:16.144486 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c5ddf8c-3bb2-4d2e-bf45-0535f59a2374-config-data\") pod \"barbican-worker-776755c9f7-9ghn5\" (UID: 
\"9c5ddf8c-3bb2-4d2e-bf45-0535f59a2374\") " pod="openstack/barbican-worker-776755c9f7-9ghn5" Nov 29 05:45:16 crc kubenswrapper[4594]: I1129 05:45:16.144556 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/744bbd71-1ab1-492d-9148-37be600ef9c8-config-data\") pod \"barbican-keystone-listener-c89ff55f4-zl5h6\" (UID: \"744bbd71-1ab1-492d-9148-37be600ef9c8\") " pod="openstack/barbican-keystone-listener-c89ff55f4-zl5h6" Nov 29 05:45:16 crc kubenswrapper[4594]: I1129 05:45:16.159909 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-776755c9f7-9ghn5"] Nov 29 05:45:16 crc kubenswrapper[4594]: I1129 05:45:16.160031 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-84c68846bf-7r6g7"] Nov 29 05:45:16 crc kubenswrapper[4594]: I1129 05:45:16.162406 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84c68846bf-7r6g7" Nov 29 05:45:16 crc kubenswrapper[4594]: I1129 05:45:16.191581 4594 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7fb8849f48-9rr52" podUID="e8d01e45-1d76-464f-93e6-965d65a055fc" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.160:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.160:8443: connect: connection refused" Nov 29 05:45:16 crc kubenswrapper[4594]: I1129 05:45:16.197526 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84c68846bf-7r6g7"] Nov 29 05:45:16 crc kubenswrapper[4594]: I1129 05:45:16.202389 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-66865fcd76-jchd8"] Nov 29 05:45:16 crc kubenswrapper[4594]: I1129 05:45:16.204536 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-66865fcd76-jchd8" Nov 29 05:45:16 crc kubenswrapper[4594]: I1129 05:45:16.206731 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Nov 29 05:45:16 crc kubenswrapper[4594]: I1129 05:45:16.229316 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-66865fcd76-jchd8"] Nov 29 05:45:16 crc kubenswrapper[4594]: I1129 05:45:16.236745 4594 generic.go:334] "Generic (PLEG): container finished" podID="3d60aa94-3720-43b5-a5de-ca30dd1b63b2" containerID="6c5d7d6b9add4af9ade9a74c6541ee656a7d995bdce1442ea15593434eba643b" exitCode=1 Nov 29 05:45:16 crc kubenswrapper[4594]: I1129 05:45:16.237728 4594 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 29 05:45:16 crc kubenswrapper[4594]: I1129 05:45:16.237755 4594 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 29 05:45:16 crc kubenswrapper[4594]: I1129 05:45:16.238150 4594 scope.go:117] "RemoveContainer" containerID="6c5d7d6b9add4af9ade9a74c6541ee656a7d995bdce1442ea15593434eba643b" Nov 29 05:45:16 crc kubenswrapper[4594]: I1129 05:45:16.238682 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"3d60aa94-3720-43b5-a5de-ca30dd1b63b2","Type":"ContainerDied","Data":"6c5d7d6b9add4af9ade9a74c6541ee656a7d995bdce1442ea15593434eba643b"} Nov 29 05:45:16 crc kubenswrapper[4594]: I1129 05:45:16.239422 4594 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 29 05:45:16 crc kubenswrapper[4594]: I1129 05:45:16.239446 4594 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 29 05:45:16 crc kubenswrapper[4594]: I1129 05:45:16.251823 4594 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-558d4b85cb-k5j98" podUID="19dfde1f-d770-45ec-8735-78549b8fcb90" containerName="horizon" probeResult="failure" output="Get 
\"https://10.217.0.161:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.161:8443: connect: connection refused" Nov 29 05:45:16 crc kubenswrapper[4594]: I1129 05:45:16.257552 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9c5ddf8c-3bb2-4d2e-bf45-0535f59a2374-config-data-custom\") pod \"barbican-worker-776755c9f7-9ghn5\" (UID: \"9c5ddf8c-3bb2-4d2e-bf45-0535f59a2374\") " pod="openstack/barbican-worker-776755c9f7-9ghn5" Nov 29 05:45:16 crc kubenswrapper[4594]: I1129 05:45:16.257631 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c5ddf8c-3bb2-4d2e-bf45-0535f59a2374-config-data\") pod \"barbican-worker-776755c9f7-9ghn5\" (UID: \"9c5ddf8c-3bb2-4d2e-bf45-0535f59a2374\") " pod="openstack/barbican-worker-776755c9f7-9ghn5" Nov 29 05:45:16 crc kubenswrapper[4594]: I1129 05:45:16.257656 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/744bbd71-1ab1-492d-9148-37be600ef9c8-config-data\") pod \"barbican-keystone-listener-c89ff55f4-zl5h6\" (UID: \"744bbd71-1ab1-492d-9148-37be600ef9c8\") " pod="openstack/barbican-keystone-listener-c89ff55f4-zl5h6" Nov 29 05:45:16 crc kubenswrapper[4594]: I1129 05:45:16.257776 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfcba285-f64f-471e-a2eb-4b38b2a35b3c-combined-ca-bundle\") pod \"barbican-api-66865fcd76-jchd8\" (UID: \"dfcba285-f64f-471e-a2eb-4b38b2a35b3c\") " pod="openstack/barbican-api-66865fcd76-jchd8" Nov 29 05:45:16 crc kubenswrapper[4594]: I1129 05:45:16.257809 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/744bbd71-1ab1-492d-9148-37be600ef9c8-logs\") pod 
\"barbican-keystone-listener-c89ff55f4-zl5h6\" (UID: \"744bbd71-1ab1-492d-9148-37be600ef9c8\") " pod="openstack/barbican-keystone-listener-c89ff55f4-zl5h6" Nov 29 05:45:16 crc kubenswrapper[4594]: I1129 05:45:16.257828 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4z45\" (UniqueName: \"kubernetes.io/projected/dfcba285-f64f-471e-a2eb-4b38b2a35b3c-kube-api-access-j4z45\") pod \"barbican-api-66865fcd76-jchd8\" (UID: \"dfcba285-f64f-471e-a2eb-4b38b2a35b3c\") " pod="openstack/barbican-api-66865fcd76-jchd8" Nov 29 05:45:16 crc kubenswrapper[4594]: I1129 05:45:16.257846 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9c5ddf8c-3bb2-4d2e-bf45-0535f59a2374-logs\") pod \"barbican-worker-776755c9f7-9ghn5\" (UID: \"9c5ddf8c-3bb2-4d2e-bf45-0535f59a2374\") " pod="openstack/barbican-worker-776755c9f7-9ghn5" Nov 29 05:45:16 crc kubenswrapper[4594]: I1129 05:45:16.257873 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f3699cc4-7ae6-459c-b7a6-a2e8ccb1490c-ovsdbserver-sb\") pod \"dnsmasq-dns-84c68846bf-7r6g7\" (UID: \"f3699cc4-7ae6-459c-b7a6-a2e8ccb1490c\") " pod="openstack/dnsmasq-dns-84c68846bf-7r6g7" Nov 29 05:45:16 crc kubenswrapper[4594]: I1129 05:45:16.257908 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dfcba285-f64f-471e-a2eb-4b38b2a35b3c-config-data-custom\") pod \"barbican-api-66865fcd76-jchd8\" (UID: \"dfcba285-f64f-471e-a2eb-4b38b2a35b3c\") " pod="openstack/barbican-api-66865fcd76-jchd8" Nov 29 05:45:16 crc kubenswrapper[4594]: I1129 05:45:16.257932 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/dfcba285-f64f-471e-a2eb-4b38b2a35b3c-logs\") pod \"barbican-api-66865fcd76-jchd8\" (UID: \"dfcba285-f64f-471e-a2eb-4b38b2a35b3c\") " pod="openstack/barbican-api-66865fcd76-jchd8" Nov 29 05:45:16 crc kubenswrapper[4594]: I1129 05:45:16.257959 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/744bbd71-1ab1-492d-9148-37be600ef9c8-config-data-custom\") pod \"barbican-keystone-listener-c89ff55f4-zl5h6\" (UID: \"744bbd71-1ab1-492d-9148-37be600ef9c8\") " pod="openstack/barbican-keystone-listener-c89ff55f4-zl5h6" Nov 29 05:45:16 crc kubenswrapper[4594]: I1129 05:45:16.257985 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3699cc4-7ae6-459c-b7a6-a2e8ccb1490c-config\") pod \"dnsmasq-dns-84c68846bf-7r6g7\" (UID: \"f3699cc4-7ae6-459c-b7a6-a2e8ccb1490c\") " pod="openstack/dnsmasq-dns-84c68846bf-7r6g7" Nov 29 05:45:16 crc kubenswrapper[4594]: I1129 05:45:16.258060 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f3699cc4-7ae6-459c-b7a6-a2e8ccb1490c-dns-svc\") pod \"dnsmasq-dns-84c68846bf-7r6g7\" (UID: \"f3699cc4-7ae6-459c-b7a6-a2e8ccb1490c\") " pod="openstack/dnsmasq-dns-84c68846bf-7r6g7" Nov 29 05:45:16 crc kubenswrapper[4594]: I1129 05:45:16.258087 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfcba285-f64f-471e-a2eb-4b38b2a35b3c-config-data\") pod \"barbican-api-66865fcd76-jchd8\" (UID: \"dfcba285-f64f-471e-a2eb-4b38b2a35b3c\") " pod="openstack/barbican-api-66865fcd76-jchd8" Nov 29 05:45:16 crc kubenswrapper[4594]: I1129 05:45:16.258117 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-ldrzc\" (UniqueName: \"kubernetes.io/projected/f3699cc4-7ae6-459c-b7a6-a2e8ccb1490c-kube-api-access-ldrzc\") pod \"dnsmasq-dns-84c68846bf-7r6g7\" (UID: \"f3699cc4-7ae6-459c-b7a6-a2e8ccb1490c\") " pod="openstack/dnsmasq-dns-84c68846bf-7r6g7" Nov 29 05:45:16 crc kubenswrapper[4594]: I1129 05:45:16.258244 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqpnd\" (UniqueName: \"kubernetes.io/projected/744bbd71-1ab1-492d-9148-37be600ef9c8-kube-api-access-nqpnd\") pod \"barbican-keystone-listener-c89ff55f4-zl5h6\" (UID: \"744bbd71-1ab1-492d-9148-37be600ef9c8\") " pod="openstack/barbican-keystone-listener-c89ff55f4-zl5h6" Nov 29 05:45:16 crc kubenswrapper[4594]: I1129 05:45:16.258306 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjfww\" (UniqueName: \"kubernetes.io/projected/9c5ddf8c-3bb2-4d2e-bf45-0535f59a2374-kube-api-access-sjfww\") pod \"barbican-worker-776755c9f7-9ghn5\" (UID: \"9c5ddf8c-3bb2-4d2e-bf45-0535f59a2374\") " pod="openstack/barbican-worker-776755c9f7-9ghn5" Nov 29 05:45:16 crc kubenswrapper[4594]: I1129 05:45:16.258352 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f3699cc4-7ae6-459c-b7a6-a2e8ccb1490c-ovsdbserver-nb\") pod \"dnsmasq-dns-84c68846bf-7r6g7\" (UID: \"f3699cc4-7ae6-459c-b7a6-a2e8ccb1490c\") " pod="openstack/dnsmasq-dns-84c68846bf-7r6g7" Nov 29 05:45:16 crc kubenswrapper[4594]: I1129 05:45:16.258399 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c5ddf8c-3bb2-4d2e-bf45-0535f59a2374-combined-ca-bundle\") pod \"barbican-worker-776755c9f7-9ghn5\" (UID: \"9c5ddf8c-3bb2-4d2e-bf45-0535f59a2374\") " pod="openstack/barbican-worker-776755c9f7-9ghn5" Nov 29 05:45:16 crc kubenswrapper[4594]: I1129 05:45:16.258416 4594 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f3699cc4-7ae6-459c-b7a6-a2e8ccb1490c-dns-swift-storage-0\") pod \"dnsmasq-dns-84c68846bf-7r6g7\" (UID: \"f3699cc4-7ae6-459c-b7a6-a2e8ccb1490c\") " pod="openstack/dnsmasq-dns-84c68846bf-7r6g7" Nov 29 05:45:16 crc kubenswrapper[4594]: I1129 05:45:16.258512 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/744bbd71-1ab1-492d-9148-37be600ef9c8-combined-ca-bundle\") pod \"barbican-keystone-listener-c89ff55f4-zl5h6\" (UID: \"744bbd71-1ab1-492d-9148-37be600ef9c8\") " pod="openstack/barbican-keystone-listener-c89ff55f4-zl5h6" Nov 29 05:45:16 crc kubenswrapper[4594]: I1129 05:45:16.261269 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/744bbd71-1ab1-492d-9148-37be600ef9c8-logs\") pod \"barbican-keystone-listener-c89ff55f4-zl5h6\" (UID: \"744bbd71-1ab1-492d-9148-37be600ef9c8\") " pod="openstack/barbican-keystone-listener-c89ff55f4-zl5h6" Nov 29 05:45:16 crc kubenswrapper[4594]: I1129 05:45:16.261545 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9c5ddf8c-3bb2-4d2e-bf45-0535f59a2374-logs\") pod \"barbican-worker-776755c9f7-9ghn5\" (UID: \"9c5ddf8c-3bb2-4d2e-bf45-0535f59a2374\") " pod="openstack/barbican-worker-776755c9f7-9ghn5" Nov 29 05:45:16 crc kubenswrapper[4594]: I1129 05:45:16.263789 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9c5ddf8c-3bb2-4d2e-bf45-0535f59a2374-config-data-custom\") pod \"barbican-worker-776755c9f7-9ghn5\" (UID: \"9c5ddf8c-3bb2-4d2e-bf45-0535f59a2374\") " pod="openstack/barbican-worker-776755c9f7-9ghn5" Nov 29 05:45:16 crc kubenswrapper[4594]: I1129 05:45:16.265717 4594 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c5ddf8c-3bb2-4d2e-bf45-0535f59a2374-config-data\") pod \"barbican-worker-776755c9f7-9ghn5\" (UID: \"9c5ddf8c-3bb2-4d2e-bf45-0535f59a2374\") " pod="openstack/barbican-worker-776755c9f7-9ghn5" Nov 29 05:45:16 crc kubenswrapper[4594]: I1129 05:45:16.267700 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/744bbd71-1ab1-492d-9148-37be600ef9c8-config-data\") pod \"barbican-keystone-listener-c89ff55f4-zl5h6\" (UID: \"744bbd71-1ab1-492d-9148-37be600ef9c8\") " pod="openstack/barbican-keystone-listener-c89ff55f4-zl5h6" Nov 29 05:45:16 crc kubenswrapper[4594]: I1129 05:45:16.269023 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/744bbd71-1ab1-492d-9148-37be600ef9c8-combined-ca-bundle\") pod \"barbican-keystone-listener-c89ff55f4-zl5h6\" (UID: \"744bbd71-1ab1-492d-9148-37be600ef9c8\") " pod="openstack/barbican-keystone-listener-c89ff55f4-zl5h6" Nov 29 05:45:16 crc kubenswrapper[4594]: I1129 05:45:16.272693 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c5ddf8c-3bb2-4d2e-bf45-0535f59a2374-combined-ca-bundle\") pod \"barbican-worker-776755c9f7-9ghn5\" (UID: \"9c5ddf8c-3bb2-4d2e-bf45-0535f59a2374\") " pod="openstack/barbican-worker-776755c9f7-9ghn5" Nov 29 05:45:16 crc kubenswrapper[4594]: I1129 05:45:16.277054 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/744bbd71-1ab1-492d-9148-37be600ef9c8-config-data-custom\") pod \"barbican-keystone-listener-c89ff55f4-zl5h6\" (UID: \"744bbd71-1ab1-492d-9148-37be600ef9c8\") " pod="openstack/barbican-keystone-listener-c89ff55f4-zl5h6" Nov 29 05:45:16 crc kubenswrapper[4594]: I1129 05:45:16.278480 
4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjfww\" (UniqueName: \"kubernetes.io/projected/9c5ddf8c-3bb2-4d2e-bf45-0535f59a2374-kube-api-access-sjfww\") pod \"barbican-worker-776755c9f7-9ghn5\" (UID: \"9c5ddf8c-3bb2-4d2e-bf45-0535f59a2374\") " pod="openstack/barbican-worker-776755c9f7-9ghn5" Nov 29 05:45:16 crc kubenswrapper[4594]: I1129 05:45:16.281776 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqpnd\" (UniqueName: \"kubernetes.io/projected/744bbd71-1ab1-492d-9148-37be600ef9c8-kube-api-access-nqpnd\") pod \"barbican-keystone-listener-c89ff55f4-zl5h6\" (UID: \"744bbd71-1ab1-492d-9148-37be600ef9c8\") " pod="openstack/barbican-keystone-listener-c89ff55f4-zl5h6" Nov 29 05:45:16 crc kubenswrapper[4594]: I1129 05:45:16.306864 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-c89ff55f4-zl5h6" Nov 29 05:45:16 crc kubenswrapper[4594]: I1129 05:45:16.361559 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f3699cc4-7ae6-459c-b7a6-a2e8ccb1490c-dns-svc\") pod \"dnsmasq-dns-84c68846bf-7r6g7\" (UID: \"f3699cc4-7ae6-459c-b7a6-a2e8ccb1490c\") " pod="openstack/dnsmasq-dns-84c68846bf-7r6g7" Nov 29 05:45:16 crc kubenswrapper[4594]: I1129 05:45:16.361604 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfcba285-f64f-471e-a2eb-4b38b2a35b3c-config-data\") pod \"barbican-api-66865fcd76-jchd8\" (UID: \"dfcba285-f64f-471e-a2eb-4b38b2a35b3c\") " pod="openstack/barbican-api-66865fcd76-jchd8" Nov 29 05:45:16 crc kubenswrapper[4594]: I1129 05:45:16.361631 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldrzc\" (UniqueName: \"kubernetes.io/projected/f3699cc4-7ae6-459c-b7a6-a2e8ccb1490c-kube-api-access-ldrzc\") pod 
\"dnsmasq-dns-84c68846bf-7r6g7\" (UID: \"f3699cc4-7ae6-459c-b7a6-a2e8ccb1490c\") " pod="openstack/dnsmasq-dns-84c68846bf-7r6g7" Nov 29 05:45:16 crc kubenswrapper[4594]: I1129 05:45:16.361687 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f3699cc4-7ae6-459c-b7a6-a2e8ccb1490c-ovsdbserver-nb\") pod \"dnsmasq-dns-84c68846bf-7r6g7\" (UID: \"f3699cc4-7ae6-459c-b7a6-a2e8ccb1490c\") " pod="openstack/dnsmasq-dns-84c68846bf-7r6g7" Nov 29 05:45:16 crc kubenswrapper[4594]: I1129 05:45:16.361742 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f3699cc4-7ae6-459c-b7a6-a2e8ccb1490c-dns-swift-storage-0\") pod \"dnsmasq-dns-84c68846bf-7r6g7\" (UID: \"f3699cc4-7ae6-459c-b7a6-a2e8ccb1490c\") " pod="openstack/dnsmasq-dns-84c68846bf-7r6g7" Nov 29 05:45:16 crc kubenswrapper[4594]: I1129 05:45:16.361878 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfcba285-f64f-471e-a2eb-4b38b2a35b3c-combined-ca-bundle\") pod \"barbican-api-66865fcd76-jchd8\" (UID: \"dfcba285-f64f-471e-a2eb-4b38b2a35b3c\") " pod="openstack/barbican-api-66865fcd76-jchd8" Nov 29 05:45:16 crc kubenswrapper[4594]: I1129 05:45:16.361913 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4z45\" (UniqueName: \"kubernetes.io/projected/dfcba285-f64f-471e-a2eb-4b38b2a35b3c-kube-api-access-j4z45\") pod \"barbican-api-66865fcd76-jchd8\" (UID: \"dfcba285-f64f-471e-a2eb-4b38b2a35b3c\") " pod="openstack/barbican-api-66865fcd76-jchd8" Nov 29 05:45:16 crc kubenswrapper[4594]: I1129 05:45:16.361960 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f3699cc4-7ae6-459c-b7a6-a2e8ccb1490c-ovsdbserver-sb\") pod 
\"dnsmasq-dns-84c68846bf-7r6g7\" (UID: \"f3699cc4-7ae6-459c-b7a6-a2e8ccb1490c\") " pod="openstack/dnsmasq-dns-84c68846bf-7r6g7" Nov 29 05:45:16 crc kubenswrapper[4594]: I1129 05:45:16.361989 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dfcba285-f64f-471e-a2eb-4b38b2a35b3c-config-data-custom\") pod \"barbican-api-66865fcd76-jchd8\" (UID: \"dfcba285-f64f-471e-a2eb-4b38b2a35b3c\") " pod="openstack/barbican-api-66865fcd76-jchd8" Nov 29 05:45:16 crc kubenswrapper[4594]: I1129 05:45:16.362017 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dfcba285-f64f-471e-a2eb-4b38b2a35b3c-logs\") pod \"barbican-api-66865fcd76-jchd8\" (UID: \"dfcba285-f64f-471e-a2eb-4b38b2a35b3c\") " pod="openstack/barbican-api-66865fcd76-jchd8" Nov 29 05:45:16 crc kubenswrapper[4594]: I1129 05:45:16.362067 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3699cc4-7ae6-459c-b7a6-a2e8ccb1490c-config\") pod \"dnsmasq-dns-84c68846bf-7r6g7\" (UID: \"f3699cc4-7ae6-459c-b7a6-a2e8ccb1490c\") " pod="openstack/dnsmasq-dns-84c68846bf-7r6g7" Nov 29 05:45:16 crc kubenswrapper[4594]: I1129 05:45:16.363334 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f3699cc4-7ae6-459c-b7a6-a2e8ccb1490c-ovsdbserver-nb\") pod \"dnsmasq-dns-84c68846bf-7r6g7\" (UID: \"f3699cc4-7ae6-459c-b7a6-a2e8ccb1490c\") " pod="openstack/dnsmasq-dns-84c68846bf-7r6g7" Nov 29 05:45:16 crc kubenswrapper[4594]: I1129 05:45:16.363925 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f3699cc4-7ae6-459c-b7a6-a2e8ccb1490c-dns-swift-storage-0\") pod \"dnsmasq-dns-84c68846bf-7r6g7\" (UID: \"f3699cc4-7ae6-459c-b7a6-a2e8ccb1490c\") " 
pod="openstack/dnsmasq-dns-84c68846bf-7r6g7" Nov 29 05:45:16 crc kubenswrapper[4594]: I1129 05:45:16.364777 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f3699cc4-7ae6-459c-b7a6-a2e8ccb1490c-ovsdbserver-sb\") pod \"dnsmasq-dns-84c68846bf-7r6g7\" (UID: \"f3699cc4-7ae6-459c-b7a6-a2e8ccb1490c\") " pod="openstack/dnsmasq-dns-84c68846bf-7r6g7" Nov 29 05:45:16 crc kubenswrapper[4594]: I1129 05:45:16.365125 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dfcba285-f64f-471e-a2eb-4b38b2a35b3c-logs\") pod \"barbican-api-66865fcd76-jchd8\" (UID: \"dfcba285-f64f-471e-a2eb-4b38b2a35b3c\") " pod="openstack/barbican-api-66865fcd76-jchd8" Nov 29 05:45:16 crc kubenswrapper[4594]: I1129 05:45:16.365486 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f3699cc4-7ae6-459c-b7a6-a2e8ccb1490c-dns-svc\") pod \"dnsmasq-dns-84c68846bf-7r6g7\" (UID: \"f3699cc4-7ae6-459c-b7a6-a2e8ccb1490c\") " pod="openstack/dnsmasq-dns-84c68846bf-7r6g7" Nov 29 05:45:16 crc kubenswrapper[4594]: I1129 05:45:16.368038 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dfcba285-f64f-471e-a2eb-4b38b2a35b3c-config-data-custom\") pod \"barbican-api-66865fcd76-jchd8\" (UID: \"dfcba285-f64f-471e-a2eb-4b38b2a35b3c\") " pod="openstack/barbican-api-66865fcd76-jchd8" Nov 29 05:45:16 crc kubenswrapper[4594]: I1129 05:45:16.372153 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfcba285-f64f-471e-a2eb-4b38b2a35b3c-config-data\") pod \"barbican-api-66865fcd76-jchd8\" (UID: \"dfcba285-f64f-471e-a2eb-4b38b2a35b3c\") " pod="openstack/barbican-api-66865fcd76-jchd8" Nov 29 05:45:16 crc kubenswrapper[4594]: I1129 05:45:16.372184 4594 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3699cc4-7ae6-459c-b7a6-a2e8ccb1490c-config\") pod \"dnsmasq-dns-84c68846bf-7r6g7\" (UID: \"f3699cc4-7ae6-459c-b7a6-a2e8ccb1490c\") " pod="openstack/dnsmasq-dns-84c68846bf-7r6g7" Nov 29 05:45:16 crc kubenswrapper[4594]: I1129 05:45:16.379064 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-776755c9f7-9ghn5" Nov 29 05:45:16 crc kubenswrapper[4594]: I1129 05:45:16.381177 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfcba285-f64f-471e-a2eb-4b38b2a35b3c-combined-ca-bundle\") pod \"barbican-api-66865fcd76-jchd8\" (UID: \"dfcba285-f64f-471e-a2eb-4b38b2a35b3c\") " pod="openstack/barbican-api-66865fcd76-jchd8" Nov 29 05:45:16 crc kubenswrapper[4594]: I1129 05:45:16.382688 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4z45\" (UniqueName: \"kubernetes.io/projected/dfcba285-f64f-471e-a2eb-4b38b2a35b3c-kube-api-access-j4z45\") pod \"barbican-api-66865fcd76-jchd8\" (UID: \"dfcba285-f64f-471e-a2eb-4b38b2a35b3c\") " pod="openstack/barbican-api-66865fcd76-jchd8" Nov 29 05:45:16 crc kubenswrapper[4594]: I1129 05:45:16.386916 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldrzc\" (UniqueName: \"kubernetes.io/projected/f3699cc4-7ae6-459c-b7a6-a2e8ccb1490c-kube-api-access-ldrzc\") pod \"dnsmasq-dns-84c68846bf-7r6g7\" (UID: \"f3699cc4-7ae6-459c-b7a6-a2e8ccb1490c\") " pod="openstack/dnsmasq-dns-84c68846bf-7r6g7" Nov 29 05:45:16 crc kubenswrapper[4594]: I1129 05:45:16.490696 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84c68846bf-7r6g7" Nov 29 05:45:16 crc kubenswrapper[4594]: I1129 05:45:16.520919 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-66865fcd76-jchd8" Nov 29 05:45:17 crc kubenswrapper[4594]: I1129 05:45:17.221440 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Nov 29 05:45:17 crc kubenswrapper[4594]: I1129 05:45:17.238409 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Nov 29 05:45:17 crc kubenswrapper[4594]: I1129 05:45:17.284774 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Nov 29 05:45:17 crc kubenswrapper[4594]: I1129 05:45:17.284886 4594 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 29 05:45:17 crc kubenswrapper[4594]: I1129 05:45:17.293560 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Nov 29 05:45:17 crc kubenswrapper[4594]: I1129 05:45:17.975813 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-gkrjf" Nov 29 05:45:18 crc kubenswrapper[4594]: I1129 05:45:18.007174 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7cc6b30-a981-4486-9d8a-e926167f001b-combined-ca-bundle\") pod \"d7cc6b30-a981-4486-9d8a-e926167f001b\" (UID: \"d7cc6b30-a981-4486-9d8a-e926167f001b\") " Nov 29 05:45:18 crc kubenswrapper[4594]: I1129 05:45:18.007278 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7cc6b30-a981-4486-9d8a-e926167f001b-scripts\") pod \"d7cc6b30-a981-4486-9d8a-e926167f001b\" (UID: \"d7cc6b30-a981-4486-9d8a-e926167f001b\") " Nov 29 05:45:18 crc kubenswrapper[4594]: I1129 05:45:18.007360 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7qv4b\" (UniqueName: \"kubernetes.io/projected/d7cc6b30-a981-4486-9d8a-e926167f001b-kube-api-access-7qv4b\") pod \"d7cc6b30-a981-4486-9d8a-e926167f001b\" (UID: \"d7cc6b30-a981-4486-9d8a-e926167f001b\") " Nov 29 05:45:18 crc kubenswrapper[4594]: I1129 05:45:18.007433 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d7cc6b30-a981-4486-9d8a-e926167f001b-credential-keys\") pod \"d7cc6b30-a981-4486-9d8a-e926167f001b\" (UID: \"d7cc6b30-a981-4486-9d8a-e926167f001b\") " Nov 29 05:45:18 crc kubenswrapper[4594]: I1129 05:45:18.007454 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d7cc6b30-a981-4486-9d8a-e926167f001b-fernet-keys\") pod \"d7cc6b30-a981-4486-9d8a-e926167f001b\" (UID: \"d7cc6b30-a981-4486-9d8a-e926167f001b\") " Nov 29 05:45:18 crc kubenswrapper[4594]: I1129 05:45:18.007486 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/d7cc6b30-a981-4486-9d8a-e926167f001b-config-data\") pod \"d7cc6b30-a981-4486-9d8a-e926167f001b\" (UID: \"d7cc6b30-a981-4486-9d8a-e926167f001b\") " Nov 29 05:45:18 crc kubenswrapper[4594]: I1129 05:45:18.105855 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7cc6b30-a981-4486-9d8a-e926167f001b-kube-api-access-7qv4b" (OuterVolumeSpecName: "kube-api-access-7qv4b") pod "d7cc6b30-a981-4486-9d8a-e926167f001b" (UID: "d7cc6b30-a981-4486-9d8a-e926167f001b"). InnerVolumeSpecName "kube-api-access-7qv4b". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:45:18 crc kubenswrapper[4594]: I1129 05:45:18.136025 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7cc6b30-a981-4486-9d8a-e926167f001b-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "d7cc6b30-a981-4486-9d8a-e926167f001b" (UID: "d7cc6b30-a981-4486-9d8a-e926167f001b"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:45:18 crc kubenswrapper[4594]: I1129 05:45:18.136653 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-8jkv9" Nov 29 05:45:18 crc kubenswrapper[4594]: I1129 05:45:18.136669 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7cc6b30-a981-4486-9d8a-e926167f001b-scripts" (OuterVolumeSpecName: "scripts") pod "d7cc6b30-a981-4486-9d8a-e926167f001b" (UID: "d7cc6b30-a981-4486-9d8a-e926167f001b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:45:18 crc kubenswrapper[4594]: I1129 05:45:18.138217 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7cc6b30-a981-4486-9d8a-e926167f001b-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "d7cc6b30-a981-4486-9d8a-e926167f001b" (UID: "d7cc6b30-a981-4486-9d8a-e926167f001b"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:45:18 crc kubenswrapper[4594]: I1129 05:45:18.155227 4594 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7cc6b30-a981-4486-9d8a-e926167f001b-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 05:45:18 crc kubenswrapper[4594]: I1129 05:45:18.155343 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7qv4b\" (UniqueName: \"kubernetes.io/projected/d7cc6b30-a981-4486-9d8a-e926167f001b-kube-api-access-7qv4b\") on node \"crc\" DevicePath \"\"" Nov 29 05:45:18 crc kubenswrapper[4594]: I1129 05:45:18.155389 4594 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d7cc6b30-a981-4486-9d8a-e926167f001b-credential-keys\") on node \"crc\" DevicePath \"\"" Nov 29 05:45:18 crc kubenswrapper[4594]: I1129 05:45:18.155403 4594 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d7cc6b30-a981-4486-9d8a-e926167f001b-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 29 05:45:18 crc kubenswrapper[4594]: I1129 05:45:18.233732 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7cc6b30-a981-4486-9d8a-e926167f001b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d7cc6b30-a981-4486-9d8a-e926167f001b" (UID: "d7cc6b30-a981-4486-9d8a-e926167f001b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:45:18 crc kubenswrapper[4594]: I1129 05:45:18.237271 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7cc6b30-a981-4486-9d8a-e926167f001b-config-data" (OuterVolumeSpecName: "config-data") pod "d7cc6b30-a981-4486-9d8a-e926167f001b" (UID: "d7cc6b30-a981-4486-9d8a-e926167f001b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:45:18 crc kubenswrapper[4594]: I1129 05:45:18.256219 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69498abf-8b80-4b7f-901a-4c6a4bdede2f-logs\") pod \"69498abf-8b80-4b7f-901a-4c6a4bdede2f\" (UID: \"69498abf-8b80-4b7f-901a-4c6a4bdede2f\") " Nov 29 05:45:18 crc kubenswrapper[4594]: I1129 05:45:18.256409 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69498abf-8b80-4b7f-901a-4c6a4bdede2f-combined-ca-bundle\") pod \"69498abf-8b80-4b7f-901a-4c6a4bdede2f\" (UID: \"69498abf-8b80-4b7f-901a-4c6a4bdede2f\") " Nov 29 05:45:18 crc kubenswrapper[4594]: I1129 05:45:18.256521 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69498abf-8b80-4b7f-901a-4c6a4bdede2f-scripts\") pod \"69498abf-8b80-4b7f-901a-4c6a4bdede2f\" (UID: \"69498abf-8b80-4b7f-901a-4c6a4bdede2f\") " Nov 29 05:45:18 crc kubenswrapper[4594]: I1129 05:45:18.256558 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wk2n4\" (UniqueName: \"kubernetes.io/projected/69498abf-8b80-4b7f-901a-4c6a4bdede2f-kube-api-access-wk2n4\") pod \"69498abf-8b80-4b7f-901a-4c6a4bdede2f\" (UID: \"69498abf-8b80-4b7f-901a-4c6a4bdede2f\") " Nov 29 05:45:18 crc kubenswrapper[4594]: I1129 05:45:18.256667 4594 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69498abf-8b80-4b7f-901a-4c6a4bdede2f-config-data\") pod \"69498abf-8b80-4b7f-901a-4c6a4bdede2f\" (UID: \"69498abf-8b80-4b7f-901a-4c6a4bdede2f\") " Nov 29 05:45:18 crc kubenswrapper[4594]: I1129 05:45:18.256665 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69498abf-8b80-4b7f-901a-4c6a4bdede2f-logs" (OuterVolumeSpecName: "logs") pod "69498abf-8b80-4b7f-901a-4c6a4bdede2f" (UID: "69498abf-8b80-4b7f-901a-4c6a4bdede2f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 05:45:18 crc kubenswrapper[4594]: I1129 05:45:18.257171 4594 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69498abf-8b80-4b7f-901a-4c6a4bdede2f-logs\") on node \"crc\" DevicePath \"\"" Nov 29 05:45:18 crc kubenswrapper[4594]: I1129 05:45:18.257189 4594 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7cc6b30-a981-4486-9d8a-e926167f001b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 05:45:18 crc kubenswrapper[4594]: I1129 05:45:18.257200 4594 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7cc6b30-a981-4486-9d8a-e926167f001b-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 05:45:18 crc kubenswrapper[4594]: I1129 05:45:18.264567 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-gkrjf" Nov 29 05:45:18 crc kubenswrapper[4594]: I1129 05:45:18.269571 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-8jkv9" Nov 29 05:45:18 crc kubenswrapper[4594]: I1129 05:45:18.270400 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69498abf-8b80-4b7f-901a-4c6a4bdede2f-scripts" (OuterVolumeSpecName: "scripts") pod "69498abf-8b80-4b7f-901a-4c6a4bdede2f" (UID: "69498abf-8b80-4b7f-901a-4c6a4bdede2f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:45:18 crc kubenswrapper[4594]: I1129 05:45:18.271653 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69498abf-8b80-4b7f-901a-4c6a4bdede2f-kube-api-access-wk2n4" (OuterVolumeSpecName: "kube-api-access-wk2n4") pod "69498abf-8b80-4b7f-901a-4c6a4bdede2f" (UID: "69498abf-8b80-4b7f-901a-4c6a4bdede2f"). InnerVolumeSpecName "kube-api-access-wk2n4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:45:18 crc kubenswrapper[4594]: I1129 05:45:18.286319 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69498abf-8b80-4b7f-901a-4c6a4bdede2f-config-data" (OuterVolumeSpecName: "config-data") pod "69498abf-8b80-4b7f-901a-4c6a4bdede2f" (UID: "69498abf-8b80-4b7f-901a-4c6a4bdede2f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:45:18 crc kubenswrapper[4594]: I1129 05:45:18.301885 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69498abf-8b80-4b7f-901a-4c6a4bdede2f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "69498abf-8b80-4b7f-901a-4c6a4bdede2f" (UID: "69498abf-8b80-4b7f-901a-4c6a4bdede2f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:45:18 crc kubenswrapper[4594]: I1129 05:45:18.321692 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cf77b4997-stk57" event={"ID":"3536563a-1e9e-458e-ace5-acbe9e090404","Type":"ContainerDied","Data":"4647490bcb1fe66d1bf64b2c7f0d79c59d2fcbe98515b2d752767c1691db3797"} Nov 29 05:45:18 crc kubenswrapper[4594]: I1129 05:45:18.321746 4594 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4647490bcb1fe66d1bf64b2c7f0d79c59d2fcbe98515b2d752767c1691db3797" Nov 29 05:45:18 crc kubenswrapper[4594]: I1129 05:45:18.321760 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-gkrjf" event={"ID":"d7cc6b30-a981-4486-9d8a-e926167f001b","Type":"ContainerDied","Data":"77355ba65d78e080d8b8abe611d6572d1d2968913037f7d4c7da5df7c698cd63"} Nov 29 05:45:18 crc kubenswrapper[4594]: I1129 05:45:18.321771 4594 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="77355ba65d78e080d8b8abe611d6572d1d2968913037f7d4c7da5df7c698cd63" Nov 29 05:45:18 crc kubenswrapper[4594]: I1129 05:45:18.321780 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-8jkv9" event={"ID":"69498abf-8b80-4b7f-901a-4c6a4bdede2f","Type":"ContainerDied","Data":"cd5efaf7de51bc1e57db373e421f8c0b47d6f4f9afeabb3b08a725191e5f4601"} Nov 29 05:45:18 crc kubenswrapper[4594]: I1129 05:45:18.321788 4594 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cd5efaf7de51bc1e57db373e421f8c0b47d6f4f9afeabb3b08a725191e5f4601" Nov 29 05:45:18 crc kubenswrapper[4594]: I1129 05:45:18.359275 4594 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69498abf-8b80-4b7f-901a-4c6a4bdede2f-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 05:45:18 crc kubenswrapper[4594]: I1129 05:45:18.359371 4594 reconciler_common.go:293] "Volume 
detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69498abf-8b80-4b7f-901a-4c6a4bdede2f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 05:45:18 crc kubenswrapper[4594]: I1129 05:45:18.360704 4594 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69498abf-8b80-4b7f-901a-4c6a4bdede2f-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 05:45:18 crc kubenswrapper[4594]: I1129 05:45:18.360727 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wk2n4\" (UniqueName: \"kubernetes.io/projected/69498abf-8b80-4b7f-901a-4c6a4bdede2f-kube-api-access-wk2n4\") on node \"crc\" DevicePath \"\"" Nov 29 05:45:18 crc kubenswrapper[4594]: I1129 05:45:18.449226 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cf77b4997-stk57" Nov 29 05:45:18 crc kubenswrapper[4594]: I1129 05:45:18.571988 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3536563a-1e9e-458e-ace5-acbe9e090404-config\") pod \"3536563a-1e9e-458e-ace5-acbe9e090404\" (UID: \"3536563a-1e9e-458e-ace5-acbe9e090404\") " Nov 29 05:45:18 crc kubenswrapper[4594]: I1129 05:45:18.572075 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3536563a-1e9e-458e-ace5-acbe9e090404-dns-svc\") pod \"3536563a-1e9e-458e-ace5-acbe9e090404\" (UID: \"3536563a-1e9e-458e-ace5-acbe9e090404\") " Nov 29 05:45:18 crc kubenswrapper[4594]: I1129 05:45:18.572117 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g7l2r\" (UniqueName: \"kubernetes.io/projected/3536563a-1e9e-458e-ace5-acbe9e090404-kube-api-access-g7l2r\") pod \"3536563a-1e9e-458e-ace5-acbe9e090404\" (UID: \"3536563a-1e9e-458e-ace5-acbe9e090404\") " Nov 29 05:45:18 crc kubenswrapper[4594]: I1129 
05:45:18.572161 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3536563a-1e9e-458e-ace5-acbe9e090404-ovsdbserver-sb\") pod \"3536563a-1e9e-458e-ace5-acbe9e090404\" (UID: \"3536563a-1e9e-458e-ace5-acbe9e090404\") "
Nov 29 05:45:18 crc kubenswrapper[4594]: I1129 05:45:18.572384 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3536563a-1e9e-458e-ace5-acbe9e090404-dns-swift-storage-0\") pod \"3536563a-1e9e-458e-ace5-acbe9e090404\" (UID: \"3536563a-1e9e-458e-ace5-acbe9e090404\") "
Nov 29 05:45:18 crc kubenswrapper[4594]: I1129 05:45:18.572409 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3536563a-1e9e-458e-ace5-acbe9e090404-ovsdbserver-nb\") pod \"3536563a-1e9e-458e-ace5-acbe9e090404\" (UID: \"3536563a-1e9e-458e-ace5-acbe9e090404\") "
Nov 29 05:45:18 crc kubenswrapper[4594]: I1129 05:45:18.595799 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3536563a-1e9e-458e-ace5-acbe9e090404-kube-api-access-g7l2r" (OuterVolumeSpecName: "kube-api-access-g7l2r") pod "3536563a-1e9e-458e-ace5-acbe9e090404" (UID: "3536563a-1e9e-458e-ace5-acbe9e090404"). InnerVolumeSpecName "kube-api-access-g7l2r". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 05:45:18 crc kubenswrapper[4594]: I1129 05:45:18.651162 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3536563a-1e9e-458e-ace5-acbe9e090404-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "3536563a-1e9e-458e-ace5-acbe9e090404" (UID: "3536563a-1e9e-458e-ace5-acbe9e090404"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 29 05:45:18 crc kubenswrapper[4594]: I1129 05:45:18.667810 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3536563a-1e9e-458e-ace5-acbe9e090404-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3536563a-1e9e-458e-ace5-acbe9e090404" (UID: "3536563a-1e9e-458e-ace5-acbe9e090404"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 29 05:45:18 crc kubenswrapper[4594]: I1129 05:45:18.675070 4594 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3536563a-1e9e-458e-ace5-acbe9e090404-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Nov 29 05:45:18 crc kubenswrapper[4594]: I1129 05:45:18.675100 4594 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3536563a-1e9e-458e-ace5-acbe9e090404-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Nov 29 05:45:18 crc kubenswrapper[4594]: I1129 05:45:18.675109 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g7l2r\" (UniqueName: \"kubernetes.io/projected/3536563a-1e9e-458e-ace5-acbe9e090404-kube-api-access-g7l2r\") on node \"crc\" DevicePath \"\""
Nov 29 05:45:18 crc kubenswrapper[4594]: I1129 05:45:18.676602 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3536563a-1e9e-458e-ace5-acbe9e090404-config" (OuterVolumeSpecName: "config") pod "3536563a-1e9e-458e-ace5-acbe9e090404" (UID: "3536563a-1e9e-458e-ace5-acbe9e090404"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 29 05:45:18 crc kubenswrapper[4594]: I1129 05:45:18.696145 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3536563a-1e9e-458e-ace5-acbe9e090404-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3536563a-1e9e-458e-ace5-acbe9e090404" (UID: "3536563a-1e9e-458e-ace5-acbe9e090404"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 29 05:45:18 crc kubenswrapper[4594]: I1129 05:45:18.705437 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3536563a-1e9e-458e-ace5-acbe9e090404-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3536563a-1e9e-458e-ace5-acbe9e090404" (UID: "3536563a-1e9e-458e-ace5-acbe9e090404"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 29 05:45:18 crc kubenswrapper[4594]: I1129 05:45:18.781338 4594 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3536563a-1e9e-458e-ace5-acbe9e090404-config\") on node \"crc\" DevicePath \"\""
Nov 29 05:45:18 crc kubenswrapper[4594]: I1129 05:45:18.781866 4594 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3536563a-1e9e-458e-ace5-acbe9e090404-dns-svc\") on node \"crc\" DevicePath \"\""
Nov 29 05:45:18 crc kubenswrapper[4594]: I1129 05:45:18.784026 4594 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3536563a-1e9e-458e-ace5-acbe9e090404-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Nov 29 05:45:18 crc kubenswrapper[4594]: I1129 05:45:18.877819 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-66865fcd76-jchd8"]
Nov 29 05:45:18 crc kubenswrapper[4594]: I1129 05:45:18.891067 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-776755c9f7-9ghn5"]
Nov 29 05:45:18 crc kubenswrapper[4594]: I1129 05:45:18.975833 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-d8fb9b558-k2gdh"]
Nov 29 05:45:18 crc kubenswrapper[4594]: E1129 05:45:18.976425 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3536563a-1e9e-458e-ace5-acbe9e090404" containerName="dnsmasq-dns"
Nov 29 05:45:18 crc kubenswrapper[4594]: I1129 05:45:18.976442 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="3536563a-1e9e-458e-ace5-acbe9e090404" containerName="dnsmasq-dns"
Nov 29 05:45:18 crc kubenswrapper[4594]: E1129 05:45:18.976481 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7cc6b30-a981-4486-9d8a-e926167f001b" containerName="keystone-bootstrap"
Nov 29 05:45:18 crc kubenswrapper[4594]: I1129 05:45:18.976487 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7cc6b30-a981-4486-9d8a-e926167f001b" containerName="keystone-bootstrap"
Nov 29 05:45:18 crc kubenswrapper[4594]: E1129 05:45:18.976499 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3536563a-1e9e-458e-ace5-acbe9e090404" containerName="init"
Nov 29 05:45:18 crc kubenswrapper[4594]: I1129 05:45:18.976504 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="3536563a-1e9e-458e-ace5-acbe9e090404" containerName="init"
Nov 29 05:45:18 crc kubenswrapper[4594]: E1129 05:45:18.976522 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69498abf-8b80-4b7f-901a-4c6a4bdede2f" containerName="placement-db-sync"
Nov 29 05:45:18 crc kubenswrapper[4594]: I1129 05:45:18.976528 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="69498abf-8b80-4b7f-901a-4c6a4bdede2f" containerName="placement-db-sync"
Nov 29 05:45:18 crc kubenswrapper[4594]: I1129 05:45:18.976759 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="3536563a-1e9e-458e-ace5-acbe9e090404" containerName="dnsmasq-dns"
Nov 29 05:45:18 crc kubenswrapper[4594]: I1129 05:45:18.976778 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="69498abf-8b80-4b7f-901a-4c6a4bdede2f" containerName="placement-db-sync"
Nov 29 05:45:18 crc kubenswrapper[4594]: I1129 05:45:18.976798 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7cc6b30-a981-4486-9d8a-e926167f001b" containerName="keystone-bootstrap"
Nov 29 05:45:18 crc kubenswrapper[4594]: I1129 05:45:18.978368 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-d8fb9b558-k2gdh"
Nov 29 05:45:18 crc kubenswrapper[4594]: I1129 05:45:18.989476 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-d8fb9b558-k2gdh"]
Nov 29 05:45:18 crc kubenswrapper[4594]: I1129 05:45:18.991797 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc"
Nov 29 05:45:18 crc kubenswrapper[4594]: I1129 05:45:18.991845 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc"
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.093756 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffk6s\" (UniqueName: \"kubernetes.io/projected/81f68040-3d0b-4f18-85fb-3f29b28c8fbe-kube-api-access-ffk6s\") pod \"barbican-api-d8fb9b558-k2gdh\" (UID: \"81f68040-3d0b-4f18-85fb-3f29b28c8fbe\") " pod="openstack/barbican-api-d8fb9b558-k2gdh"
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.093854 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81f68040-3d0b-4f18-85fb-3f29b28c8fbe-config-data\") pod \"barbican-api-d8fb9b558-k2gdh\" (UID: \"81f68040-3d0b-4f18-85fb-3f29b28c8fbe\") " pod="openstack/barbican-api-d8fb9b558-k2gdh"
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.093875 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81f68040-3d0b-4f18-85fb-3f29b28c8fbe-combined-ca-bundle\") pod \"barbican-api-d8fb9b558-k2gdh\" (UID: \"81f68040-3d0b-4f18-85fb-3f29b28c8fbe\") " pod="openstack/barbican-api-d8fb9b558-k2gdh"
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.093914 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/81f68040-3d0b-4f18-85fb-3f29b28c8fbe-internal-tls-certs\") pod \"barbican-api-d8fb9b558-k2gdh\" (UID: \"81f68040-3d0b-4f18-85fb-3f29b28c8fbe\") " pod="openstack/barbican-api-d8fb9b558-k2gdh"
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.093931 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/81f68040-3d0b-4f18-85fb-3f29b28c8fbe-public-tls-certs\") pod \"barbican-api-d8fb9b558-k2gdh\" (UID: \"81f68040-3d0b-4f18-85fb-3f29b28c8fbe\") " pod="openstack/barbican-api-d8fb9b558-k2gdh"
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.093967 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/81f68040-3d0b-4f18-85fb-3f29b28c8fbe-config-data-custom\") pod \"barbican-api-d8fb9b558-k2gdh\" (UID: \"81f68040-3d0b-4f18-85fb-3f29b28c8fbe\") " pod="openstack/barbican-api-d8fb9b558-k2gdh"
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.094043 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81f68040-3d0b-4f18-85fb-3f29b28c8fbe-logs\") pod \"barbican-api-d8fb9b558-k2gdh\" (UID: \"81f68040-3d0b-4f18-85fb-3f29b28c8fbe\") " pod="openstack/barbican-api-d8fb9b558-k2gdh"
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.143535 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84c68846bf-7r6g7"]
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.174934 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-c89ff55f4-zl5h6"]
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.197156 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81f68040-3d0b-4f18-85fb-3f29b28c8fbe-logs\") pod \"barbican-api-d8fb9b558-k2gdh\" (UID: \"81f68040-3d0b-4f18-85fb-3f29b28c8fbe\") " pod="openstack/barbican-api-d8fb9b558-k2gdh"
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.197550 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffk6s\" (UniqueName: \"kubernetes.io/projected/81f68040-3d0b-4f18-85fb-3f29b28c8fbe-kube-api-access-ffk6s\") pod \"barbican-api-d8fb9b558-k2gdh\" (UID: \"81f68040-3d0b-4f18-85fb-3f29b28c8fbe\") " pod="openstack/barbican-api-d8fb9b558-k2gdh"
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.197696 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81f68040-3d0b-4f18-85fb-3f29b28c8fbe-logs\") pod \"barbican-api-d8fb9b558-k2gdh\" (UID: \"81f68040-3d0b-4f18-85fb-3f29b28c8fbe\") " pod="openstack/barbican-api-d8fb9b558-k2gdh"
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.197905 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81f68040-3d0b-4f18-85fb-3f29b28c8fbe-config-data\") pod \"barbican-api-d8fb9b558-k2gdh\" (UID: \"81f68040-3d0b-4f18-85fb-3f29b28c8fbe\") " pod="openstack/barbican-api-d8fb9b558-k2gdh"
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.198002 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81f68040-3d0b-4f18-85fb-3f29b28c8fbe-combined-ca-bundle\") pod \"barbican-api-d8fb9b558-k2gdh\" (UID: \"81f68040-3d0b-4f18-85fb-3f29b28c8fbe\") " pod="openstack/barbican-api-d8fb9b558-k2gdh"
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.198129 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/81f68040-3d0b-4f18-85fb-3f29b28c8fbe-internal-tls-certs\") pod \"barbican-api-d8fb9b558-k2gdh\" (UID: \"81f68040-3d0b-4f18-85fb-3f29b28c8fbe\") " pod="openstack/barbican-api-d8fb9b558-k2gdh"
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.198194 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/81f68040-3d0b-4f18-85fb-3f29b28c8fbe-public-tls-certs\") pod \"barbican-api-d8fb9b558-k2gdh\" (UID: \"81f68040-3d0b-4f18-85fb-3f29b28c8fbe\") " pod="openstack/barbican-api-d8fb9b558-k2gdh"
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.198307 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/81f68040-3d0b-4f18-85fb-3f29b28c8fbe-config-data-custom\") pod \"barbican-api-d8fb9b558-k2gdh\" (UID: \"81f68040-3d0b-4f18-85fb-3f29b28c8fbe\") " pod="openstack/barbican-api-d8fb9b558-k2gdh"
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.201300 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-87b885ff4-zwt2r"]
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.202921 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-87b885ff4-zwt2r"
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.206378 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.206718 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc"
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.206894 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc"
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.207013 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-nz5kq"
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.207132 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.208208 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.210381 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/81f68040-3d0b-4f18-85fb-3f29b28c8fbe-internal-tls-certs\") pod \"barbican-api-d8fb9b558-k2gdh\" (UID: \"81f68040-3d0b-4f18-85fb-3f29b28c8fbe\") " pod="openstack/barbican-api-d8fb9b558-k2gdh"
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.210612 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81f68040-3d0b-4f18-85fb-3f29b28c8fbe-config-data\") pod \"barbican-api-d8fb9b558-k2gdh\" (UID: \"81f68040-3d0b-4f18-85fb-3f29b28c8fbe\") " pod="openstack/barbican-api-d8fb9b558-k2gdh"
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.211108 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/81f68040-3d0b-4f18-85fb-3f29b28c8fbe-config-data-custom\") pod \"barbican-api-d8fb9b558-k2gdh\" (UID: \"81f68040-3d0b-4f18-85fb-3f29b28c8fbe\") " pod="openstack/barbican-api-d8fb9b558-k2gdh"
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.212340 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/81f68040-3d0b-4f18-85fb-3f29b28c8fbe-public-tls-certs\") pod \"barbican-api-d8fb9b558-k2gdh\" (UID: \"81f68040-3d0b-4f18-85fb-3f29b28c8fbe\") " pod="openstack/barbican-api-d8fb9b558-k2gdh"
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.241318 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-87b885ff4-zwt2r"]
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.244114 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81f68040-3d0b-4f18-85fb-3f29b28c8fbe-combined-ca-bundle\") pod \"barbican-api-d8fb9b558-k2gdh\" (UID: \"81f68040-3d0b-4f18-85fb-3f29b28c8fbe\") " pod="openstack/barbican-api-d8fb9b558-k2gdh"
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.261807 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffk6s\" (UniqueName: \"kubernetes.io/projected/81f68040-3d0b-4f18-85fb-3f29b28c8fbe-kube-api-access-ffk6s\") pod \"barbican-api-d8fb9b558-k2gdh\" (UID: \"81f68040-3d0b-4f18-85fb-3f29b28c8fbe\") " pod="openstack/barbican-api-d8fb9b558-k2gdh"
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.299864 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2e56e5f5-25c3-4bbb-a9ca-47aec5d22564-fernet-keys\") pod \"keystone-87b885ff4-zwt2r\" (UID: \"2e56e5f5-25c3-4bbb-a9ca-47aec5d22564\") " pod="openstack/keystone-87b885ff4-zwt2r"
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.299958 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e56e5f5-25c3-4bbb-a9ca-47aec5d22564-public-tls-certs\") pod \"keystone-87b885ff4-zwt2r\" (UID: \"2e56e5f5-25c3-4bbb-a9ca-47aec5d22564\") " pod="openstack/keystone-87b885ff4-zwt2r"
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.300016 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrznt\" (UniqueName: \"kubernetes.io/projected/2e56e5f5-25c3-4bbb-a9ca-47aec5d22564-kube-api-access-wrznt\") pod \"keystone-87b885ff4-zwt2r\" (UID: \"2e56e5f5-25c3-4bbb-a9ca-47aec5d22564\") " pod="openstack/keystone-87b885ff4-zwt2r"
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.300087 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e56e5f5-25c3-4bbb-a9ca-47aec5d22564-config-data\") pod \"keystone-87b885ff4-zwt2r\" (UID: \"2e56e5f5-25c3-4bbb-a9ca-47aec5d22564\") " pod="openstack/keystone-87b885ff4-zwt2r"
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.300199 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2e56e5f5-25c3-4bbb-a9ca-47aec5d22564-credential-keys\") pod \"keystone-87b885ff4-zwt2r\" (UID: \"2e56e5f5-25c3-4bbb-a9ca-47aec5d22564\") " pod="openstack/keystone-87b885ff4-zwt2r"
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.300221 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e56e5f5-25c3-4bbb-a9ca-47aec5d22564-combined-ca-bundle\") pod \"keystone-87b885ff4-zwt2r\" (UID: \"2e56e5f5-25c3-4bbb-a9ca-47aec5d22564\") " pod="openstack/keystone-87b885ff4-zwt2r"
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.300266 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e56e5f5-25c3-4bbb-a9ca-47aec5d22564-internal-tls-certs\") pod \"keystone-87b885ff4-zwt2r\" (UID: \"2e56e5f5-25c3-4bbb-a9ca-47aec5d22564\") " pod="openstack/keystone-87b885ff4-zwt2r"
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.300361 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e56e5f5-25c3-4bbb-a9ca-47aec5d22564-scripts\") pod \"keystone-87b885ff4-zwt2r\" (UID: \"2e56e5f5-25c3-4bbb-a9ca-47aec5d22564\") " pod="openstack/keystone-87b885ff4-zwt2r"
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.318303 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-55c5775b88-9dvcb"]
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.320223 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-55c5775b88-9dvcb"
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.325207 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.325437 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-m8j6t"
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.325599 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.325751 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc"
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.325875 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc"
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.330500 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-55c5775b88-9dvcb"]
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.332649 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-d8fb9b558-k2gdh"
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.370714 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"3d60aa94-3720-43b5-a5de-ca30dd1b63b2","Type":"ContainerStarted","Data":"ce07c65467f1a09735ac28b33a41432403765fe428167c2bcd92e7ee2bfe40ad"}
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.393779 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dfe84a63-fea5-455e-95d3-523a091b976f","Type":"ContainerStarted","Data":"82a7be62d336acb1c166e710c849a936fe0c5c2dc0533148ba8167ea8bbc75b0"}
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.397184 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-776755c9f7-9ghn5" event={"ID":"9c5ddf8c-3bb2-4d2e-bf45-0535f59a2374","Type":"ContainerStarted","Data":"68df2fa5bccf35dd543ad6b9bb7b549fbf7801212336ea8b89617fd7acdc61ee"}
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.399902 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84c68846bf-7r6g7" event={"ID":"f3699cc4-7ae6-459c-b7a6-a2e8ccb1490c","Type":"ContainerStarted","Data":"265bbd42514568578cd8bec846db87cb8b10bf0cec7b9cfa3974714a4a640859"}
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.402576 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-66865fcd76-jchd8" event={"ID":"dfcba285-f64f-471e-a2eb-4b38b2a35b3c","Type":"ContainerStarted","Data":"59f067181f0f4b2928c91d193891dbcabcbb8d53a55a5fc3a7c53308cf2df132"}
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.405205 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cf77b4997-stk57"
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.405373 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-c89ff55f4-zl5h6" event={"ID":"744bbd71-1ab1-492d-9148-37be600ef9c8","Type":"ContainerStarted","Data":"02ab6f272ebd38e66be2f0d0511e77c4f2eb4071748ef1f3ce25d9a8cda24687"}
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.416554 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2e56e5f5-25c3-4bbb-a9ca-47aec5d22564-fernet-keys\") pod \"keystone-87b885ff4-zwt2r\" (UID: \"2e56e5f5-25c3-4bbb-a9ca-47aec5d22564\") " pod="openstack/keystone-87b885ff4-zwt2r"
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.416670 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95ecbb59-1c3f-4561-a175-ffbd99d0496f-logs\") pod \"placement-55c5775b88-9dvcb\" (UID: \"95ecbb59-1c3f-4561-a175-ffbd99d0496f\") " pod="openstack/placement-55c5775b88-9dvcb"
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.416705 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95ecbb59-1c3f-4561-a175-ffbd99d0496f-combined-ca-bundle\") pod \"placement-55c5775b88-9dvcb\" (UID: \"95ecbb59-1c3f-4561-a175-ffbd99d0496f\") " pod="openstack/placement-55c5775b88-9dvcb"
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.416782 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e56e5f5-25c3-4bbb-a9ca-47aec5d22564-public-tls-certs\") pod \"keystone-87b885ff4-zwt2r\" (UID: \"2e56e5f5-25c3-4bbb-a9ca-47aec5d22564\") " pod="openstack/keystone-87b885ff4-zwt2r"
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.416839 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95ecbb59-1c3f-4561-a175-ffbd99d0496f-config-data\") pod \"placement-55c5775b88-9dvcb\" (UID: \"95ecbb59-1c3f-4561-a175-ffbd99d0496f\") " pod="openstack/placement-55c5775b88-9dvcb"
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.416883 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrznt\" (UniqueName: \"kubernetes.io/projected/2e56e5f5-25c3-4bbb-a9ca-47aec5d22564-kube-api-access-wrznt\") pod \"keystone-87b885ff4-zwt2r\" (UID: \"2e56e5f5-25c3-4bbb-a9ca-47aec5d22564\") " pod="openstack/keystone-87b885ff4-zwt2r"
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.416990 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e56e5f5-25c3-4bbb-a9ca-47aec5d22564-config-data\") pod \"keystone-87b885ff4-zwt2r\" (UID: \"2e56e5f5-25c3-4bbb-a9ca-47aec5d22564\") " pod="openstack/keystone-87b885ff4-zwt2r"
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.418514 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95ecbb59-1c3f-4561-a175-ffbd99d0496f-scripts\") pod \"placement-55c5775b88-9dvcb\" (UID: \"95ecbb59-1c3f-4561-a175-ffbd99d0496f\") " pod="openstack/placement-55c5775b88-9dvcb"
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.418567 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nm2jk\" (UniqueName: \"kubernetes.io/projected/95ecbb59-1c3f-4561-a175-ffbd99d0496f-kube-api-access-nm2jk\") pod \"placement-55c5775b88-9dvcb\" (UID: \"95ecbb59-1c3f-4561-a175-ffbd99d0496f\") " pod="openstack/placement-55c5775b88-9dvcb"
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.418607 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/95ecbb59-1c3f-4561-a175-ffbd99d0496f-internal-tls-certs\") pod \"placement-55c5775b88-9dvcb\" (UID: \"95ecbb59-1c3f-4561-a175-ffbd99d0496f\") " pod="openstack/placement-55c5775b88-9dvcb"
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.418633 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2e56e5f5-25c3-4bbb-a9ca-47aec5d22564-credential-keys\") pod \"keystone-87b885ff4-zwt2r\" (UID: \"2e56e5f5-25c3-4bbb-a9ca-47aec5d22564\") " pod="openstack/keystone-87b885ff4-zwt2r"
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.418663 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e56e5f5-25c3-4bbb-a9ca-47aec5d22564-combined-ca-bundle\") pod \"keystone-87b885ff4-zwt2r\" (UID: \"2e56e5f5-25c3-4bbb-a9ca-47aec5d22564\") " pod="openstack/keystone-87b885ff4-zwt2r"
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.418716 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e56e5f5-25c3-4bbb-a9ca-47aec5d22564-internal-tls-certs\") pod \"keystone-87b885ff4-zwt2r\" (UID: \"2e56e5f5-25c3-4bbb-a9ca-47aec5d22564\") " pod="openstack/keystone-87b885ff4-zwt2r"
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.418802 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/95ecbb59-1c3f-4561-a175-ffbd99d0496f-public-tls-certs\") pod \"placement-55c5775b88-9dvcb\" (UID: \"95ecbb59-1c3f-4561-a175-ffbd99d0496f\") " pod="openstack/placement-55c5775b88-9dvcb"
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.418906 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e56e5f5-25c3-4bbb-a9ca-47aec5d22564-scripts\") pod \"keystone-87b885ff4-zwt2r\" (UID: \"2e56e5f5-25c3-4bbb-a9ca-47aec5d22564\") " pod="openstack/keystone-87b885ff4-zwt2r"
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.424805 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2e56e5f5-25c3-4bbb-a9ca-47aec5d22564-fernet-keys\") pod \"keystone-87b885ff4-zwt2r\" (UID: \"2e56e5f5-25c3-4bbb-a9ca-47aec5d22564\") " pod="openstack/keystone-87b885ff4-zwt2r"
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.437294 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e56e5f5-25c3-4bbb-a9ca-47aec5d22564-internal-tls-certs\") pod \"keystone-87b885ff4-zwt2r\" (UID: \"2e56e5f5-25c3-4bbb-a9ca-47aec5d22564\") " pod="openstack/keystone-87b885ff4-zwt2r"
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.443425 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e56e5f5-25c3-4bbb-a9ca-47aec5d22564-config-data\") pod \"keystone-87b885ff4-zwt2r\" (UID: \"2e56e5f5-25c3-4bbb-a9ca-47aec5d22564\") " pod="openstack/keystone-87b885ff4-zwt2r"
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.444537 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e56e5f5-25c3-4bbb-a9ca-47aec5d22564-public-tls-certs\") pod \"keystone-87b885ff4-zwt2r\" (UID: \"2e56e5f5-25c3-4bbb-a9ca-47aec5d22564\") " pod="openstack/keystone-87b885ff4-zwt2r"
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.444747 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2e56e5f5-25c3-4bbb-a9ca-47aec5d22564-credential-keys\") pod \"keystone-87b885ff4-zwt2r\" (UID: \"2e56e5f5-25c3-4bbb-a9ca-47aec5d22564\") " pod="openstack/keystone-87b885ff4-zwt2r"
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.447414 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrznt\" (UniqueName: \"kubernetes.io/projected/2e56e5f5-25c3-4bbb-a9ca-47aec5d22564-kube-api-access-wrznt\") pod \"keystone-87b885ff4-zwt2r\" (UID: \"2e56e5f5-25c3-4bbb-a9ca-47aec5d22564\") " pod="openstack/keystone-87b885ff4-zwt2r"
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.448653 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e56e5f5-25c3-4bbb-a9ca-47aec5d22564-combined-ca-bundle\") pod \"keystone-87b885ff4-zwt2r\" (UID: \"2e56e5f5-25c3-4bbb-a9ca-47aec5d22564\") " pod="openstack/keystone-87b885ff4-zwt2r"
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.478313 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e56e5f5-25c3-4bbb-a9ca-47aec5d22564-scripts\") pod \"keystone-87b885ff4-zwt2r\" (UID: \"2e56e5f5-25c3-4bbb-a9ca-47aec5d22564\") " pod="openstack/keystone-87b885ff4-zwt2r"
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.524191 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/95ecbb59-1c3f-4561-a175-ffbd99d0496f-public-tls-certs\") pod \"placement-55c5775b88-9dvcb\" (UID: \"95ecbb59-1c3f-4561-a175-ffbd99d0496f\") " pod="openstack/placement-55c5775b88-9dvcb"
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.524399 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95ecbb59-1c3f-4561-a175-ffbd99d0496f-logs\") pod \"placement-55c5775b88-9dvcb\" (UID: \"95ecbb59-1c3f-4561-a175-ffbd99d0496f\") " pod="openstack/placement-55c5775b88-9dvcb"
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.524437 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95ecbb59-1c3f-4561-a175-ffbd99d0496f-combined-ca-bundle\") pod \"placement-55c5775b88-9dvcb\" (UID: \"95ecbb59-1c3f-4561-a175-ffbd99d0496f\") " pod="openstack/placement-55c5775b88-9dvcb"
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.524507 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95ecbb59-1c3f-4561-a175-ffbd99d0496f-config-data\") pod \"placement-55c5775b88-9dvcb\" (UID: \"95ecbb59-1c3f-4561-a175-ffbd99d0496f\") " pod="openstack/placement-55c5775b88-9dvcb"
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.524692 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95ecbb59-1c3f-4561-a175-ffbd99d0496f-scripts\") pod \"placement-55c5775b88-9dvcb\" (UID: \"95ecbb59-1c3f-4561-a175-ffbd99d0496f\") " pod="openstack/placement-55c5775b88-9dvcb"
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.524740 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nm2jk\" (UniqueName: \"kubernetes.io/projected/95ecbb59-1c3f-4561-a175-ffbd99d0496f-kube-api-access-nm2jk\") pod \"placement-55c5775b88-9dvcb\" (UID: \"95ecbb59-1c3f-4561-a175-ffbd99d0496f\") " pod="openstack/placement-55c5775b88-9dvcb"
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.524778 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/95ecbb59-1c3f-4561-a175-ffbd99d0496f-internal-tls-certs\") pod \"placement-55c5775b88-9dvcb\" (UID: \"95ecbb59-1c3f-4561-a175-ffbd99d0496f\") " pod="openstack/placement-55c5775b88-9dvcb"
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.527991 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95ecbb59-1c3f-4561-a175-ffbd99d0496f-logs\") pod \"placement-55c5775b88-9dvcb\" (UID: \"95ecbb59-1c3f-4561-a175-ffbd99d0496f\") " pod="openstack/placement-55c5775b88-9dvcb"
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.529454 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/95ecbb59-1c3f-4561-a175-ffbd99d0496f-public-tls-certs\") pod \"placement-55c5775b88-9dvcb\" (UID: \"95ecbb59-1c3f-4561-a175-ffbd99d0496f\") " pod="openstack/placement-55c5775b88-9dvcb"
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.536078 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95ecbb59-1c3f-4561-a175-ffbd99d0496f-config-data\") pod \"placement-55c5775b88-9dvcb\" (UID: \"95ecbb59-1c3f-4561-a175-ffbd99d0496f\") " pod="openstack/placement-55c5775b88-9dvcb"
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.536151 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7cf77b4997-stk57"]
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.541826 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/95ecbb59-1c3f-4561-a175-ffbd99d0496f-internal-tls-certs\") pod \"placement-55c5775b88-9dvcb\" (UID: \"95ecbb59-1c3f-4561-a175-ffbd99d0496f\") " pod="openstack/placement-55c5775b88-9dvcb"
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.544893 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95ecbb59-1c3f-4561-a175-ffbd99d0496f-scripts\") pod \"placement-55c5775b88-9dvcb\" (UID: \"95ecbb59-1c3f-4561-a175-ffbd99d0496f\") " pod="openstack/placement-55c5775b88-9dvcb"
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.545988 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95ecbb59-1c3f-4561-a175-ffbd99d0496f-combined-ca-bundle\") pod \"placement-55c5775b88-9dvcb\" (UID: \"95ecbb59-1c3f-4561-a175-ffbd99d0496f\") " pod="openstack/placement-55c5775b88-9dvcb"
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.546979 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nm2jk\" (UniqueName: \"kubernetes.io/projected/95ecbb59-1c3f-4561-a175-ffbd99d0496f-kube-api-access-nm2jk\") pod \"placement-55c5775b88-9dvcb\" (UID: \"95ecbb59-1c3f-4561-a175-ffbd99d0496f\") " pod="openstack/placement-55c5775b88-9dvcb"
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.549815 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7cf77b4997-stk57"]
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.564772 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-87b885ff4-zwt2r"
Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.668968 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-55c5775b88-9dvcb" Nov 29 05:45:19 crc kubenswrapper[4594]: I1129 05:45:19.903992 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-d8fb9b558-k2gdh"] Nov 29 05:45:19 crc kubenswrapper[4594]: W1129 05:45:19.908222 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81f68040_3d0b_4f18_85fb_3f29b28c8fbe.slice/crio-16f02df8912fc31ff8e2c4a1df2906572c1122311ba1f589e99fa78e655ae5eb WatchSource:0}: Error finding container 16f02df8912fc31ff8e2c4a1df2906572c1122311ba1f589e99fa78e655ae5eb: Status 404 returned error can't find the container with id 16f02df8912fc31ff8e2c4a1df2906572c1122311ba1f589e99fa78e655ae5eb Nov 29 05:45:20 crc kubenswrapper[4594]: I1129 05:45:20.104918 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3536563a-1e9e-458e-ace5-acbe9e090404" path="/var/lib/kubelet/pods/3536563a-1e9e-458e-ace5-acbe9e090404/volumes" Nov 29 05:45:20 crc kubenswrapper[4594]: I1129 05:45:20.137134 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-87b885ff4-zwt2r"] Nov 29 05:45:20 crc kubenswrapper[4594]: I1129 05:45:20.292701 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-55c5775b88-9dvcb"] Nov 29 05:45:20 crc kubenswrapper[4594]: W1129 05:45:20.298155 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95ecbb59_1c3f_4561_a175_ffbd99d0496f.slice/crio-573fd3f2927ed37086b0e053422ecd8dcc25070a07f56d18409a2b5bd4d90218 WatchSource:0}: Error finding container 573fd3f2927ed37086b0e053422ecd8dcc25070a07f56d18409a2b5bd4d90218: Status 404 returned error can't find the container with id 573fd3f2927ed37086b0e053422ecd8dcc25070a07f56d18409a2b5bd4d90218 Nov 29 05:45:20 crc kubenswrapper[4594]: I1129 05:45:20.423975 4594 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/cinder-db-sync-9rddc" event={"ID":"4b46224c-7874-4c4a-abb6-f1cbef3a8462","Type":"ContainerStarted","Data":"e8e19a6129ae5d861b384fa3d50896c39d975243eddae5b7537018fd968c04f2"} Nov 29 05:45:20 crc kubenswrapper[4594]: I1129 05:45:20.433619 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d8fb9b558-k2gdh" event={"ID":"81f68040-3d0b-4f18-85fb-3f29b28c8fbe","Type":"ContainerStarted","Data":"6b295c4125cb0f036b02846872f090f6824f27217680f250f4738b1f8cb24fe5"} Nov 29 05:45:20 crc kubenswrapper[4594]: I1129 05:45:20.433668 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d8fb9b558-k2gdh" event={"ID":"81f68040-3d0b-4f18-85fb-3f29b28c8fbe","Type":"ContainerStarted","Data":"16f02df8912fc31ff8e2c4a1df2906572c1122311ba1f589e99fa78e655ae5eb"} Nov 29 05:45:20 crc kubenswrapper[4594]: I1129 05:45:20.450594 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-9rddc" podStartSLOduration=4.204855007 podStartE2EDuration="47.450580732s" podCreationTimestamp="2025-11-29 05:44:33 +0000 UTC" firstStartedPulling="2025-11-29 05:44:34.720726776 +0000 UTC m=+998.961235995" lastFinishedPulling="2025-11-29 05:45:17.9664525 +0000 UTC m=+1042.206961720" observedRunningTime="2025-11-29 05:45:20.449467918 +0000 UTC m=+1044.689977138" watchObservedRunningTime="2025-11-29 05:45:20.450580732 +0000 UTC m=+1044.691089952" Nov 29 05:45:20 crc kubenswrapper[4594]: I1129 05:45:20.456494 4594 generic.go:334] "Generic (PLEG): container finished" podID="f3699cc4-7ae6-459c-b7a6-a2e8ccb1490c" containerID="e7bf409814f934d44c93eea45caa6ad5fa41e5fc89de103903bf0095a0e95498" exitCode=0 Nov 29 05:45:20 crc kubenswrapper[4594]: I1129 05:45:20.456574 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84c68846bf-7r6g7" 
event={"ID":"f3699cc4-7ae6-459c-b7a6-a2e8ccb1490c","Type":"ContainerDied","Data":"e7bf409814f934d44c93eea45caa6ad5fa41e5fc89de103903bf0095a0e95498"} Nov 29 05:45:20 crc kubenswrapper[4594]: I1129 05:45:20.463480 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-66865fcd76-jchd8" event={"ID":"dfcba285-f64f-471e-a2eb-4b38b2a35b3c","Type":"ContainerStarted","Data":"be156d5e811f7697d4e71003f07b9d8d4c96fa8233e4a54ed77aa2e726dfc22a"} Nov 29 05:45:20 crc kubenswrapper[4594]: I1129 05:45:20.463510 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-66865fcd76-jchd8" event={"ID":"dfcba285-f64f-471e-a2eb-4b38b2a35b3c","Type":"ContainerStarted","Data":"81e4b892f5e5238485ce5a6f37cf2e0156f1c7ca179226284cd2afd8995e8b0a"} Nov 29 05:45:20 crc kubenswrapper[4594]: I1129 05:45:20.463547 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-66865fcd76-jchd8" Nov 29 05:45:20 crc kubenswrapper[4594]: I1129 05:45:20.464020 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-66865fcd76-jchd8" Nov 29 05:45:20 crc kubenswrapper[4594]: I1129 05:45:20.475242 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-87b885ff4-zwt2r" event={"ID":"2e56e5f5-25c3-4bbb-a9ca-47aec5d22564","Type":"ContainerStarted","Data":"c88ba8e6ed6794eb1970bf772e229768472224a64a63d3af36005d0cc3dedc39"} Nov 29 05:45:20 crc kubenswrapper[4594]: I1129 05:45:20.486522 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-55c5775b88-9dvcb" event={"ID":"95ecbb59-1c3f-4561-a175-ffbd99d0496f","Type":"ContainerStarted","Data":"573fd3f2927ed37086b0e053422ecd8dcc25070a07f56d18409a2b5bd4d90218"} Nov 29 05:45:20 crc kubenswrapper[4594]: I1129 05:45:20.527733 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-66865fcd76-jchd8" podStartSLOduration=4.52768968 
podStartE2EDuration="4.52768968s" podCreationTimestamp="2025-11-29 05:45:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:45:20.52300112 +0000 UTC m=+1044.763510341" watchObservedRunningTime="2025-11-29 05:45:20.52768968 +0000 UTC m=+1044.768198901" Nov 29 05:45:21 crc kubenswrapper[4594]: I1129 05:45:21.512029 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d8fb9b558-k2gdh" event={"ID":"81f68040-3d0b-4f18-85fb-3f29b28c8fbe","Type":"ContainerStarted","Data":"55842da54c1d0215d16f009c8589d9817d93c4888f642df3922beeba3f8124ae"} Nov 29 05:45:21 crc kubenswrapper[4594]: I1129 05:45:21.512688 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-d8fb9b558-k2gdh" Nov 29 05:45:21 crc kubenswrapper[4594]: I1129 05:45:21.512704 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-d8fb9b558-k2gdh" Nov 29 05:45:21 crc kubenswrapper[4594]: I1129 05:45:21.514999 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84c68846bf-7r6g7" event={"ID":"f3699cc4-7ae6-459c-b7a6-a2e8ccb1490c","Type":"ContainerStarted","Data":"7d06ce876e418b7180d721b44fd78e07afeaee58f9f0487f63e6e2be9db0060b"} Nov 29 05:45:21 crc kubenswrapper[4594]: I1129 05:45:21.516051 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-84c68846bf-7r6g7" Nov 29 05:45:21 crc kubenswrapper[4594]: I1129 05:45:21.523673 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-87b885ff4-zwt2r" event={"ID":"2e56e5f5-25c3-4bbb-a9ca-47aec5d22564","Type":"ContainerStarted","Data":"331ef6e337c98378c48827f86511f03250ac92d23cd5cb44f72af9e98a57f2a3"} Nov 29 05:45:21 crc kubenswrapper[4594]: I1129 05:45:21.523841 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-87b885ff4-zwt2r" 
Nov 29 05:45:21 crc kubenswrapper[4594]: I1129 05:45:21.538608 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-55c5775b88-9dvcb" event={"ID":"95ecbb59-1c3f-4561-a175-ffbd99d0496f","Type":"ContainerStarted","Data":"e2748110538f45e07218acb86bb3aa65f2a5609975728cd019adfe925196cfb1"} Nov 29 05:45:21 crc kubenswrapper[4594]: I1129 05:45:21.540458 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-d8fb9b558-k2gdh" podStartSLOduration=3.540427296 podStartE2EDuration="3.540427296s" podCreationTimestamp="2025-11-29 05:45:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:45:21.526403835 +0000 UTC m=+1045.766913056" watchObservedRunningTime="2025-11-29 05:45:21.540427296 +0000 UTC m=+1045.780936517" Nov 29 05:45:21 crc kubenswrapper[4594]: I1129 05:45:21.547688 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-87b885ff4-zwt2r" podStartSLOduration=2.547668259 podStartE2EDuration="2.547668259s" podCreationTimestamp="2025-11-29 05:45:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:45:21.541407751 +0000 UTC m=+1045.781916981" watchObservedRunningTime="2025-11-29 05:45:21.547668259 +0000 UTC m=+1045.788177479" Nov 29 05:45:21 crc kubenswrapper[4594]: I1129 05:45:21.563008 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-84c68846bf-7r6g7" podStartSLOduration=5.562991955 podStartE2EDuration="5.562991955s" podCreationTimestamp="2025-11-29 05:45:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:45:21.558782026 +0000 UTC m=+1045.799291266" watchObservedRunningTime="2025-11-29 05:45:21.562991955 +0000 UTC 
m=+1045.803501175" Nov 29 05:45:22 crc kubenswrapper[4594]: I1129 05:45:22.466318 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Nov 29 05:45:22 crc kubenswrapper[4594]: I1129 05:45:22.466577 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="ba31669d-d47f-41bc-8476-7acf2a872d15" containerName="watcher-api-log" containerID="cri-o://30d067be360e339d61b0a8f0b5ba4dbb187a6a5aaa62e4345eaf567f287699b7" gracePeriod=30 Nov 29 05:45:22 crc kubenswrapper[4594]: I1129 05:45:22.466736 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="ba31669d-d47f-41bc-8476-7acf2a872d15" containerName="watcher-api" containerID="cri-o://707df14a791d700e4e12a9b592b7838821411ea4693ac827f07dcf5d6d5e9497" gracePeriod=30 Nov 29 05:45:22 crc kubenswrapper[4594]: I1129 05:45:22.552026 4594 generic.go:334] "Generic (PLEG): container finished" podID="3d60aa94-3720-43b5-a5de-ca30dd1b63b2" containerID="ce07c65467f1a09735ac28b33a41432403765fe428167c2bcd92e7ee2bfe40ad" exitCode=1 Nov 29 05:45:22 crc kubenswrapper[4594]: I1129 05:45:22.552272 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"3d60aa94-3720-43b5-a5de-ca30dd1b63b2","Type":"ContainerDied","Data":"ce07c65467f1a09735ac28b33a41432403765fe428167c2bcd92e7ee2bfe40ad"} Nov 29 05:45:22 crc kubenswrapper[4594]: I1129 05:45:22.552438 4594 scope.go:117] "RemoveContainer" containerID="6c5d7d6b9add4af9ade9a74c6541ee656a7d995bdce1442ea15593434eba643b" Nov 29 05:45:22 crc kubenswrapper[4594]: I1129 05:45:22.553290 4594 scope.go:117] "RemoveContainer" containerID="ce07c65467f1a09735ac28b33a41432403765fe428167c2bcd92e7ee2bfe40ad" Nov 29 05:45:22 crc kubenswrapper[4594]: E1129 05:45:22.553635 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: 
\"back-off 10s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(3d60aa94-3720-43b5-a5de-ca30dd1b63b2)\"" pod="openstack/watcher-decision-engine-0" podUID="3d60aa94-3720-43b5-a5de-ca30dd1b63b2" Nov 29 05:45:22 crc kubenswrapper[4594]: I1129 05:45:22.556535 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-55c5775b88-9dvcb" event={"ID":"95ecbb59-1c3f-4561-a175-ffbd99d0496f","Type":"ContainerStarted","Data":"88f2c85c6175dcafd85000dc616355a54cf999ca70a0d6935517209543a03fe5"} Nov 29 05:45:22 crc kubenswrapper[4594]: I1129 05:45:22.592425 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-55c5775b88-9dvcb" podStartSLOduration=3.592403138 podStartE2EDuration="3.592403138s" podCreationTimestamp="2025-11-29 05:45:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:45:22.585814795 +0000 UTC m=+1046.826324015" watchObservedRunningTime="2025-11-29 05:45:22.592403138 +0000 UTC m=+1046.832912359" Nov 29 05:45:22 crc kubenswrapper[4594]: I1129 05:45:22.729357 4594 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","podfc29fae8-4b90-4702-8c68-4cec263fd4c9"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort podfc29fae8-4b90-4702-8c68-4cec263fd4c9] : Timed out while waiting for systemd to remove kubepods-besteffort-podfc29fae8_4b90_4702_8c68_4cec263fd4c9.slice" Nov 29 05:45:22 crc kubenswrapper[4594]: E1129 05:45:22.729411 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort podfc29fae8-4b90-4702-8c68-4cec263fd4c9] : unable to destroy cgroup paths for cgroup [kubepods besteffort podfc29fae8-4b90-4702-8c68-4cec263fd4c9] : Timed out while waiting for systemd to remove kubepods-besteffort-podfc29fae8_4b90_4702_8c68_4cec263fd4c9.slice" 
pod="openstack/keystone-bootstrap-749tk" podUID="fc29fae8-4b90-4702-8c68-4cec263fd4c9" Nov 29 05:45:22 crc kubenswrapper[4594]: I1129 05:45:22.749542 4594 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod968317f6-c4d9-4647-a166-0cadc0fa57f2"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod968317f6-c4d9-4647-a166-0cadc0fa57f2] : Timed out while waiting for systemd to remove kubepods-besteffort-pod968317f6_c4d9_4647_a166_0cadc0fa57f2.slice" Nov 29 05:45:22 crc kubenswrapper[4594]: E1129 05:45:22.749594 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod968317f6-c4d9-4647-a166-0cadc0fa57f2] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod968317f6-c4d9-4647-a166-0cadc0fa57f2] : Timed out while waiting for systemd to remove kubepods-besteffort-pod968317f6_c4d9_4647_a166_0cadc0fa57f2.slice" pod="openstack/neutron-db-sync-7b4h2" podUID="968317f6-c4d9-4647-a166-0cadc0fa57f2" Nov 29 05:45:23 crc kubenswrapper[4594]: I1129 05:45:23.293527 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-66bbc5d8dd-fzbhl" Nov 29 05:45:23 crc kubenswrapper[4594]: I1129 05:45:23.344478 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Nov 29 05:45:23 crc kubenswrapper[4594]: I1129 05:45:23.461678 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba31669d-d47f-41bc-8476-7acf2a872d15-config-data\") pod \"ba31669d-d47f-41bc-8476-7acf2a872d15\" (UID: \"ba31669d-d47f-41bc-8476-7acf2a872d15\") " Nov 29 05:45:23 crc kubenswrapper[4594]: I1129 05:45:23.461782 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba31669d-d47f-41bc-8476-7acf2a872d15-logs\") pod \"ba31669d-d47f-41bc-8476-7acf2a872d15\" (UID: \"ba31669d-d47f-41bc-8476-7acf2a872d15\") " Nov 29 05:45:23 crc kubenswrapper[4594]: I1129 05:45:23.461852 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/ba31669d-d47f-41bc-8476-7acf2a872d15-custom-prometheus-ca\") pod \"ba31669d-d47f-41bc-8476-7acf2a872d15\" (UID: \"ba31669d-d47f-41bc-8476-7acf2a872d15\") " Nov 29 05:45:23 crc kubenswrapper[4594]: I1129 05:45:23.461923 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58qnb\" (UniqueName: \"kubernetes.io/projected/ba31669d-d47f-41bc-8476-7acf2a872d15-kube-api-access-58qnb\") pod \"ba31669d-d47f-41bc-8476-7acf2a872d15\" (UID: \"ba31669d-d47f-41bc-8476-7acf2a872d15\") " Nov 29 05:45:23 crc kubenswrapper[4594]: I1129 05:45:23.462148 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba31669d-d47f-41bc-8476-7acf2a872d15-combined-ca-bundle\") pod \"ba31669d-d47f-41bc-8476-7acf2a872d15\" (UID: \"ba31669d-d47f-41bc-8476-7acf2a872d15\") " Nov 29 05:45:23 crc kubenswrapper[4594]: I1129 05:45:23.465582 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/ba31669d-d47f-41bc-8476-7acf2a872d15-logs" (OuterVolumeSpecName: "logs") pod "ba31669d-d47f-41bc-8476-7acf2a872d15" (UID: "ba31669d-d47f-41bc-8476-7acf2a872d15"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 05:45:23 crc kubenswrapper[4594]: I1129 05:45:23.487355 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba31669d-d47f-41bc-8476-7acf2a872d15-kube-api-access-58qnb" (OuterVolumeSpecName: "kube-api-access-58qnb") pod "ba31669d-d47f-41bc-8476-7acf2a872d15" (UID: "ba31669d-d47f-41bc-8476-7acf2a872d15"). InnerVolumeSpecName "kube-api-access-58qnb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:45:23 crc kubenswrapper[4594]: I1129 05:45:23.535967 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba31669d-d47f-41bc-8476-7acf2a872d15-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ba31669d-d47f-41bc-8476-7acf2a872d15" (UID: "ba31669d-d47f-41bc-8476-7acf2a872d15"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:45:23 crc kubenswrapper[4594]: I1129 05:45:23.565214 4594 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba31669d-d47f-41bc-8476-7acf2a872d15-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 05:45:23 crc kubenswrapper[4594]: I1129 05:45:23.565267 4594 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba31669d-d47f-41bc-8476-7acf2a872d15-logs\") on node \"crc\" DevicePath \"\"" Nov 29 05:45:23 crc kubenswrapper[4594]: I1129 05:45:23.565280 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-58qnb\" (UniqueName: \"kubernetes.io/projected/ba31669d-d47f-41bc-8476-7acf2a872d15-kube-api-access-58qnb\") on node \"crc\" DevicePath \"\"" Nov 29 05:45:23 crc kubenswrapper[4594]: I1129 05:45:23.575835 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba31669d-d47f-41bc-8476-7acf2a872d15-config-data" (OuterVolumeSpecName: "config-data") pod "ba31669d-d47f-41bc-8476-7acf2a872d15" (UID: "ba31669d-d47f-41bc-8476-7acf2a872d15"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:45:23 crc kubenswrapper[4594]: I1129 05:45:23.577415 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba31669d-d47f-41bc-8476-7acf2a872d15-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "ba31669d-d47f-41bc-8476-7acf2a872d15" (UID: "ba31669d-d47f-41bc-8476-7acf2a872d15"). InnerVolumeSpecName "custom-prometheus-ca". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:45:23 crc kubenswrapper[4594]: I1129 05:45:23.592476 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-776755c9f7-9ghn5" event={"ID":"9c5ddf8c-3bb2-4d2e-bf45-0535f59a2374","Type":"ContainerStarted","Data":"a88b0e557550ab7e5a3a8dff8b839c413cafaedee3bf517137c7e1f99dfa52a2"} Nov 29 05:45:23 crc kubenswrapper[4594]: I1129 05:45:23.592526 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-776755c9f7-9ghn5" event={"ID":"9c5ddf8c-3bb2-4d2e-bf45-0535f59a2374","Type":"ContainerStarted","Data":"4709af0aaa8b0aa710104070bf1556774b8a52d02a8f3bc9a1da5f72500c2d25"} Nov 29 05:45:23 crc kubenswrapper[4594]: I1129 05:45:23.595958 4594 generic.go:334] "Generic (PLEG): container finished" podID="ba31669d-d47f-41bc-8476-7acf2a872d15" containerID="707df14a791d700e4e12a9b592b7838821411ea4693ac827f07dcf5d6d5e9497" exitCode=0 Nov 29 05:45:23 crc kubenswrapper[4594]: I1129 05:45:23.595989 4594 generic.go:334] "Generic (PLEG): container finished" podID="ba31669d-d47f-41bc-8476-7acf2a872d15" containerID="30d067be360e339d61b0a8f0b5ba4dbb187a6a5aaa62e4345eaf567f287699b7" exitCode=143 Nov 29 05:45:23 crc kubenswrapper[4594]: I1129 05:45:23.596041 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"ba31669d-d47f-41bc-8476-7acf2a872d15","Type":"ContainerDied","Data":"707df14a791d700e4e12a9b592b7838821411ea4693ac827f07dcf5d6d5e9497"} Nov 29 05:45:23 crc kubenswrapper[4594]: I1129 05:45:23.596068 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"ba31669d-d47f-41bc-8476-7acf2a872d15","Type":"ContainerDied","Data":"30d067be360e339d61b0a8f0b5ba4dbb187a6a5aaa62e4345eaf567f287699b7"} Nov 29 05:45:23 crc kubenswrapper[4594]: I1129 05:45:23.596078 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" 
event={"ID":"ba31669d-d47f-41bc-8476-7acf2a872d15","Type":"ContainerDied","Data":"3497f3820c6a3775147afba51b6d933e1e59dc64f3e791434f549f0cc2d9e3bb"} Nov 29 05:45:23 crc kubenswrapper[4594]: I1129 05:45:23.596092 4594 scope.go:117] "RemoveContainer" containerID="707df14a791d700e4e12a9b592b7838821411ea4693ac827f07dcf5d6d5e9497" Nov 29 05:45:23 crc kubenswrapper[4594]: I1129 05:45:23.596199 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Nov 29 05:45:23 crc kubenswrapper[4594]: I1129 05:45:23.612132 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-c89ff55f4-zl5h6" event={"ID":"744bbd71-1ab1-492d-9148-37be600ef9c8","Type":"ContainerStarted","Data":"69ec9363be0bbc7f29bb61c5c8c6f564ff856859ade5b6bba7e0c240aece3610"} Nov 29 05:45:23 crc kubenswrapper[4594]: I1129 05:45:23.612185 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-c89ff55f4-zl5h6" event={"ID":"744bbd71-1ab1-492d-9148-37be600ef9c8","Type":"ContainerStarted","Data":"9807aad1e54a3008215bb5060a2108b3567cfeed367bf053515b6c1d25401974"} Nov 29 05:45:23 crc kubenswrapper[4594]: I1129 05:45:23.619122 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-749tk" Nov 29 05:45:23 crc kubenswrapper[4594]: I1129 05:45:23.620219 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-55c5775b88-9dvcb" Nov 29 05:45:23 crc kubenswrapper[4594]: I1129 05:45:23.620285 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-7b4h2" Nov 29 05:45:23 crc kubenswrapper[4594]: I1129 05:45:23.620420 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-55c5775b88-9dvcb" Nov 29 05:45:23 crc kubenswrapper[4594]: I1129 05:45:23.627159 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-776755c9f7-9ghn5" podStartSLOduration=4.874166178 podStartE2EDuration="8.627143057s" podCreationTimestamp="2025-11-29 05:45:15 +0000 UTC" firstStartedPulling="2025-11-29 05:45:19.004179938 +0000 UTC m=+1043.244689158" lastFinishedPulling="2025-11-29 05:45:22.757156817 +0000 UTC m=+1046.997666037" observedRunningTime="2025-11-29 05:45:23.614466739 +0000 UTC m=+1047.854975960" watchObservedRunningTime="2025-11-29 05:45:23.627143057 +0000 UTC m=+1047.867652276" Nov 29 05:45:23 crc kubenswrapper[4594]: I1129 05:45:23.654914 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Nov 29 05:45:23 crc kubenswrapper[4594]: I1129 05:45:23.656024 4594 scope.go:117] "RemoveContainer" containerID="30d067be360e339d61b0a8f0b5ba4dbb187a6a5aaa62e4345eaf567f287699b7" Nov 29 05:45:23 crc kubenswrapper[4594]: I1129 05:45:23.667688 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-api-0"] Nov 29 05:45:23 crc kubenswrapper[4594]: I1129 05:45:23.669230 4594 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba31669d-d47f-41bc-8476-7acf2a872d15-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 05:45:23 crc kubenswrapper[4594]: I1129 05:45:23.669271 4594 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/ba31669d-d47f-41bc-8476-7acf2a872d15-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Nov 29 05:45:23 crc kubenswrapper[4594]: I1129 05:45:23.677329 4594 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/watcher-api-0"] Nov 29 05:45:23 crc kubenswrapper[4594]: E1129 05:45:23.677896 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba31669d-d47f-41bc-8476-7acf2a872d15" containerName="watcher-api-log" Nov 29 05:45:23 crc kubenswrapper[4594]: I1129 05:45:23.677918 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba31669d-d47f-41bc-8476-7acf2a872d15" containerName="watcher-api-log" Nov 29 05:45:23 crc kubenswrapper[4594]: E1129 05:45:23.677959 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba31669d-d47f-41bc-8476-7acf2a872d15" containerName="watcher-api" Nov 29 05:45:23 crc kubenswrapper[4594]: I1129 05:45:23.677964 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba31669d-d47f-41bc-8476-7acf2a872d15" containerName="watcher-api" Nov 29 05:45:23 crc kubenswrapper[4594]: I1129 05:45:23.678241 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba31669d-d47f-41bc-8476-7acf2a872d15" containerName="watcher-api" Nov 29 05:45:23 crc kubenswrapper[4594]: I1129 05:45:23.678277 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba31669d-d47f-41bc-8476-7acf2a872d15" containerName="watcher-api-log" Nov 29 05:45:23 crc kubenswrapper[4594]: I1129 05:45:23.678844 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-c89ff55f4-zl5h6" podStartSLOduration=5.125298392 podStartE2EDuration="8.678826394s" podCreationTimestamp="2025-11-29 05:45:15 +0000 UTC" firstStartedPulling="2025-11-29 05:45:19.204415986 +0000 UTC m=+1043.444925207" lastFinishedPulling="2025-11-29 05:45:22.757943988 +0000 UTC m=+1046.998453209" observedRunningTime="2025-11-29 05:45:23.658599994 +0000 UTC m=+1047.899109214" watchObservedRunningTime="2025-11-29 05:45:23.678826394 +0000 UTC m=+1047.919335615" Nov 29 05:45:23 crc kubenswrapper[4594]: I1129 05:45:23.679493 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Nov 29 05:45:23 crc kubenswrapper[4594]: I1129 05:45:23.684678 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data" Nov 29 05:45:23 crc kubenswrapper[4594]: I1129 05:45:23.685030 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-watcher-public-svc" Nov 29 05:45:23 crc kubenswrapper[4594]: I1129 05:45:23.686010 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-watcher-internal-svc" Nov 29 05:45:23 crc kubenswrapper[4594]: I1129 05:45:23.710212 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Nov 29 05:45:23 crc kubenswrapper[4594]: I1129 05:45:23.736608 4594 scope.go:117] "RemoveContainer" containerID="707df14a791d700e4e12a9b592b7838821411ea4693ac827f07dcf5d6d5e9497" Nov 29 05:45:23 crc kubenswrapper[4594]: E1129 05:45:23.737262 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"707df14a791d700e4e12a9b592b7838821411ea4693ac827f07dcf5d6d5e9497\": container with ID starting with 707df14a791d700e4e12a9b592b7838821411ea4693ac827f07dcf5d6d5e9497 not found: ID does not exist" containerID="707df14a791d700e4e12a9b592b7838821411ea4693ac827f07dcf5d6d5e9497" Nov 29 05:45:23 crc kubenswrapper[4594]: I1129 05:45:23.737372 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"707df14a791d700e4e12a9b592b7838821411ea4693ac827f07dcf5d6d5e9497"} err="failed to get container status \"707df14a791d700e4e12a9b592b7838821411ea4693ac827f07dcf5d6d5e9497\": rpc error: code = NotFound desc = could not find container \"707df14a791d700e4e12a9b592b7838821411ea4693ac827f07dcf5d6d5e9497\": container with ID starting with 707df14a791d700e4e12a9b592b7838821411ea4693ac827f07dcf5d6d5e9497 not found: ID does not exist" Nov 29 05:45:23 crc kubenswrapper[4594]: I1129 
05:45:23.737455 4594 scope.go:117] "RemoveContainer" containerID="30d067be360e339d61b0a8f0b5ba4dbb187a6a5aaa62e4345eaf567f287699b7" Nov 29 05:45:23 crc kubenswrapper[4594]: E1129 05:45:23.742647 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30d067be360e339d61b0a8f0b5ba4dbb187a6a5aaa62e4345eaf567f287699b7\": container with ID starting with 30d067be360e339d61b0a8f0b5ba4dbb187a6a5aaa62e4345eaf567f287699b7 not found: ID does not exist" containerID="30d067be360e339d61b0a8f0b5ba4dbb187a6a5aaa62e4345eaf567f287699b7" Nov 29 05:45:23 crc kubenswrapper[4594]: I1129 05:45:23.742720 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30d067be360e339d61b0a8f0b5ba4dbb187a6a5aaa62e4345eaf567f287699b7"} err="failed to get container status \"30d067be360e339d61b0a8f0b5ba4dbb187a6a5aaa62e4345eaf567f287699b7\": rpc error: code = NotFound desc = could not find container \"30d067be360e339d61b0a8f0b5ba4dbb187a6a5aaa62e4345eaf567f287699b7\": container with ID starting with 30d067be360e339d61b0a8f0b5ba4dbb187a6a5aaa62e4345eaf567f287699b7 not found: ID does not exist" Nov 29 05:45:23 crc kubenswrapper[4594]: I1129 05:45:23.742748 4594 scope.go:117] "RemoveContainer" containerID="707df14a791d700e4e12a9b592b7838821411ea4693ac827f07dcf5d6d5e9497" Nov 29 05:45:23 crc kubenswrapper[4594]: I1129 05:45:23.743274 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"707df14a791d700e4e12a9b592b7838821411ea4693ac827f07dcf5d6d5e9497"} err="failed to get container status \"707df14a791d700e4e12a9b592b7838821411ea4693ac827f07dcf5d6d5e9497\": rpc error: code = NotFound desc = could not find container \"707df14a791d700e4e12a9b592b7838821411ea4693ac827f07dcf5d6d5e9497\": container with ID starting with 707df14a791d700e4e12a9b592b7838821411ea4693ac827f07dcf5d6d5e9497 not found: ID does not exist" Nov 29 05:45:23 crc 
kubenswrapper[4594]: I1129 05:45:23.743387 4594 scope.go:117] "RemoveContainer" containerID="30d067be360e339d61b0a8f0b5ba4dbb187a6a5aaa62e4345eaf567f287699b7" Nov 29 05:45:23 crc kubenswrapper[4594]: I1129 05:45:23.743658 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30d067be360e339d61b0a8f0b5ba4dbb187a6a5aaa62e4345eaf567f287699b7"} err="failed to get container status \"30d067be360e339d61b0a8f0b5ba4dbb187a6a5aaa62e4345eaf567f287699b7\": rpc error: code = NotFound desc = could not find container \"30d067be360e339d61b0a8f0b5ba4dbb187a6a5aaa62e4345eaf567f287699b7\": container with ID starting with 30d067be360e339d61b0a8f0b5ba4dbb187a6a5aaa62e4345eaf567f287699b7 not found: ID does not exist" Nov 29 05:45:23 crc kubenswrapper[4594]: I1129 05:45:23.772111 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4082da7a-cc96-4b67-a101-48600e49712b-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"4082da7a-cc96-4b67-a101-48600e49712b\") " pod="openstack/watcher-api-0" Nov 29 05:45:23 crc kubenswrapper[4594]: I1129 05:45:23.772484 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4082da7a-cc96-4b67-a101-48600e49712b-public-tls-certs\") pod \"watcher-api-0\" (UID: \"4082da7a-cc96-4b67-a101-48600e49712b\") " pod="openstack/watcher-api-0" Nov 29 05:45:23 crc kubenswrapper[4594]: I1129 05:45:23.772654 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/4082da7a-cc96-4b67-a101-48600e49712b-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"4082da7a-cc96-4b67-a101-48600e49712b\") " pod="openstack/watcher-api-0" Nov 29 05:45:23 crc kubenswrapper[4594]: I1129 05:45:23.772739 4594 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4082da7a-cc96-4b67-a101-48600e49712b-config-data\") pod \"watcher-api-0\" (UID: \"4082da7a-cc96-4b67-a101-48600e49712b\") " pod="openstack/watcher-api-0" Nov 29 05:45:23 crc kubenswrapper[4594]: I1129 05:45:23.772817 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4082da7a-cc96-4b67-a101-48600e49712b-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"4082da7a-cc96-4b67-a101-48600e49712b\") " pod="openstack/watcher-api-0" Nov 29 05:45:23 crc kubenswrapper[4594]: I1129 05:45:23.772865 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnqq8\" (UniqueName: \"kubernetes.io/projected/4082da7a-cc96-4b67-a101-48600e49712b-kube-api-access-wnqq8\") pod \"watcher-api-0\" (UID: \"4082da7a-cc96-4b67-a101-48600e49712b\") " pod="openstack/watcher-api-0" Nov 29 05:45:23 crc kubenswrapper[4594]: I1129 05:45:23.772941 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4082da7a-cc96-4b67-a101-48600e49712b-logs\") pod \"watcher-api-0\" (UID: \"4082da7a-cc96-4b67-a101-48600e49712b\") " pod="openstack/watcher-api-0" Nov 29 05:45:23 crc kubenswrapper[4594]: I1129 05:45:23.876095 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4082da7a-cc96-4b67-a101-48600e49712b-logs\") pod \"watcher-api-0\" (UID: \"4082da7a-cc96-4b67-a101-48600e49712b\") " pod="openstack/watcher-api-0" Nov 29 05:45:23 crc kubenswrapper[4594]: I1129 05:45:23.876147 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/4082da7a-cc96-4b67-a101-48600e49712b-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"4082da7a-cc96-4b67-a101-48600e49712b\") " pod="openstack/watcher-api-0" Nov 29 05:45:23 crc kubenswrapper[4594]: I1129 05:45:23.876172 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4082da7a-cc96-4b67-a101-48600e49712b-public-tls-certs\") pod \"watcher-api-0\" (UID: \"4082da7a-cc96-4b67-a101-48600e49712b\") " pod="openstack/watcher-api-0" Nov 29 05:45:23 crc kubenswrapper[4594]: I1129 05:45:23.876239 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/4082da7a-cc96-4b67-a101-48600e49712b-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"4082da7a-cc96-4b67-a101-48600e49712b\") " pod="openstack/watcher-api-0" Nov 29 05:45:23 crc kubenswrapper[4594]: I1129 05:45:23.876297 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4082da7a-cc96-4b67-a101-48600e49712b-config-data\") pod \"watcher-api-0\" (UID: \"4082da7a-cc96-4b67-a101-48600e49712b\") " pod="openstack/watcher-api-0" Nov 29 05:45:23 crc kubenswrapper[4594]: I1129 05:45:23.876333 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4082da7a-cc96-4b67-a101-48600e49712b-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"4082da7a-cc96-4b67-a101-48600e49712b\") " pod="openstack/watcher-api-0" Nov 29 05:45:23 crc kubenswrapper[4594]: I1129 05:45:23.876380 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnqq8\" (UniqueName: \"kubernetes.io/projected/4082da7a-cc96-4b67-a101-48600e49712b-kube-api-access-wnqq8\") pod \"watcher-api-0\" (UID: \"4082da7a-cc96-4b67-a101-48600e49712b\") " pod="openstack/watcher-api-0" 
Nov 29 05:45:23 crc kubenswrapper[4594]: I1129 05:45:23.876980 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4082da7a-cc96-4b67-a101-48600e49712b-logs\") pod \"watcher-api-0\" (UID: \"4082da7a-cc96-4b67-a101-48600e49712b\") " pod="openstack/watcher-api-0" Nov 29 05:45:23 crc kubenswrapper[4594]: I1129 05:45:23.882352 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4082da7a-cc96-4b67-a101-48600e49712b-public-tls-certs\") pod \"watcher-api-0\" (UID: \"4082da7a-cc96-4b67-a101-48600e49712b\") " pod="openstack/watcher-api-0" Nov 29 05:45:23 crc kubenswrapper[4594]: I1129 05:45:23.882694 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/4082da7a-cc96-4b67-a101-48600e49712b-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"4082da7a-cc96-4b67-a101-48600e49712b\") " pod="openstack/watcher-api-0" Nov 29 05:45:23 crc kubenswrapper[4594]: I1129 05:45:23.885791 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4082da7a-cc96-4b67-a101-48600e49712b-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"4082da7a-cc96-4b67-a101-48600e49712b\") " pod="openstack/watcher-api-0" Nov 29 05:45:23 crc kubenswrapper[4594]: I1129 05:45:23.885902 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4082da7a-cc96-4b67-a101-48600e49712b-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"4082da7a-cc96-4b67-a101-48600e49712b\") " pod="openstack/watcher-api-0" Nov 29 05:45:23 crc kubenswrapper[4594]: I1129 05:45:23.889522 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4082da7a-cc96-4b67-a101-48600e49712b-config-data\") pod 
\"watcher-api-0\" (UID: \"4082da7a-cc96-4b67-a101-48600e49712b\") " pod="openstack/watcher-api-0" Nov 29 05:45:23 crc kubenswrapper[4594]: I1129 05:45:23.895769 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnqq8\" (UniqueName: \"kubernetes.io/projected/4082da7a-cc96-4b67-a101-48600e49712b-kube-api-access-wnqq8\") pod \"watcher-api-0\" (UID: \"4082da7a-cc96-4b67-a101-48600e49712b\") " pod="openstack/watcher-api-0" Nov 29 05:45:24 crc kubenswrapper[4594]: I1129 05:45:24.063523 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Nov 29 05:45:24 crc kubenswrapper[4594]: I1129 05:45:24.120472 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba31669d-d47f-41bc-8476-7acf2a872d15" path="/var/lib/kubelet/pods/ba31669d-d47f-41bc-8476-7acf2a872d15/volumes" Nov 29 05:45:24 crc kubenswrapper[4594]: I1129 05:45:24.326368 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Nov 29 05:45:24 crc kubenswrapper[4594]: I1129 05:45:24.326651 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Nov 29 05:45:24 crc kubenswrapper[4594]: I1129 05:45:24.327325 4594 scope.go:117] "RemoveContainer" containerID="ce07c65467f1a09735ac28b33a41432403765fe428167c2bcd92e7ee2bfe40ad" Nov 29 05:45:24 crc kubenswrapper[4594]: E1129 05:45:24.327620 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 10s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(3d60aa94-3720-43b5-a5de-ca30dd1b63b2)\"" pod="openstack/watcher-decision-engine-0" podUID="3d60aa94-3720-43b5-a5de-ca30dd1b63b2" Nov 29 05:45:24 crc kubenswrapper[4594]: I1129 05:45:24.539975 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/watcher-api-0"] Nov 29 05:45:24 crc kubenswrapper[4594]: I1129 05:45:24.630760 4594 generic.go:334] "Generic (PLEG): container finished" podID="4b46224c-7874-4c4a-abb6-f1cbef3a8462" containerID="e8e19a6129ae5d861b384fa3d50896c39d975243eddae5b7537018fd968c04f2" exitCode=0 Nov 29 05:45:24 crc kubenswrapper[4594]: I1129 05:45:24.631548 4594 scope.go:117] "RemoveContainer" containerID="ce07c65467f1a09735ac28b33a41432403765fe428167c2bcd92e7ee2bfe40ad" Nov 29 05:45:24 crc kubenswrapper[4594]: E1129 05:45:24.631742 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 10s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(3d60aa94-3720-43b5-a5de-ca30dd1b63b2)\"" pod="openstack/watcher-decision-engine-0" podUID="3d60aa94-3720-43b5-a5de-ca30dd1b63b2" Nov 29 05:45:24 crc kubenswrapper[4594]: I1129 05:45:24.632024 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-9rddc" event={"ID":"4b46224c-7874-4c4a-abb6-f1cbef3a8462","Type":"ContainerDied","Data":"e8e19a6129ae5d861b384fa3d50896c39d975243eddae5b7537018fd968c04f2"} Nov 29 05:45:25 crc kubenswrapper[4594]: I1129 05:45:25.737279 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5c7df66d69-hd8nh" Nov 29 05:45:25 crc kubenswrapper[4594]: I1129 05:45:25.790956 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-66bbc5d8dd-fzbhl"] Nov 29 05:45:25 crc kubenswrapper[4594]: I1129 05:45:25.791284 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-66bbc5d8dd-fzbhl" podUID="860e92d3-234a-46aa-b6e1-ea0ffbb8a4be" containerName="neutron-api" containerID="cri-o://dbe13314cb482f0e31502e38204649a154044a8792e73d9bcdadd76c3ad1a649" gracePeriod=30 Nov 29 05:45:25 crc kubenswrapper[4594]: I1129 05:45:25.791355 4594 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-66bbc5d8dd-fzbhl" podUID="860e92d3-234a-46aa-b6e1-ea0ffbb8a4be" containerName="neutron-httpd" containerID="cri-o://76ce69fbb890971373cf2b97b55d27c09c8d857179fc93efd2f64f8e76bc3b7e" gracePeriod=30 Nov 29 05:45:26 crc kubenswrapper[4594]: I1129 05:45:26.493401 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-84c68846bf-7r6g7" Nov 29 05:45:26 crc kubenswrapper[4594]: I1129 05:45:26.623758 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67b86667d5-d25cf"] Nov 29 05:45:26 crc kubenswrapper[4594]: I1129 05:45:26.624413 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-67b86667d5-d25cf" podUID="f39fdd2d-1f5e-440f-996c-08ae2e749d3e" containerName="dnsmasq-dns" containerID="cri-o://45619c64119c3a13ccbf693a78ab7548d824f21f6d3ee0dc4ad484f5a286877f" gracePeriod=10 Nov 29 05:45:26 crc kubenswrapper[4594]: I1129 05:45:26.706684 4594 generic.go:334] "Generic (PLEG): container finished" podID="860e92d3-234a-46aa-b6e1-ea0ffbb8a4be" containerID="76ce69fbb890971373cf2b97b55d27c09c8d857179fc93efd2f64f8e76bc3b7e" exitCode=0 Nov 29 05:45:26 crc kubenswrapper[4594]: I1129 05:45:26.706743 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-66bbc5d8dd-fzbhl" event={"ID":"860e92d3-234a-46aa-b6e1-ea0ffbb8a4be","Type":"ContainerDied","Data":"76ce69fbb890971373cf2b97b55d27c09c8d857179fc93efd2f64f8e76bc3b7e"} Nov 29 05:45:27 crc kubenswrapper[4594]: I1129 05:45:27.726480 4594 generic.go:334] "Generic (PLEG): container finished" podID="f39fdd2d-1f5e-440f-996c-08ae2e749d3e" containerID="45619c64119c3a13ccbf693a78ab7548d824f21f6d3ee0dc4ad484f5a286877f" exitCode=0 Nov 29 05:45:27 crc kubenswrapper[4594]: I1129 05:45:27.726584 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b86667d5-d25cf" 
event={"ID":"f39fdd2d-1f5e-440f-996c-08ae2e749d3e","Type":"ContainerDied","Data":"45619c64119c3a13ccbf693a78ab7548d824f21f6d3ee0dc4ad484f5a286877f"} Nov 29 05:45:27 crc kubenswrapper[4594]: I1129 05:45:27.759459 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-55c5775b88-9dvcb" Nov 29 05:45:28 crc kubenswrapper[4594]: I1129 05:45:28.119852 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-66865fcd76-jchd8" Nov 29 05:45:28 crc kubenswrapper[4594]: I1129 05:45:28.167783 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-66865fcd76-jchd8" Nov 29 05:45:28 crc kubenswrapper[4594]: I1129 05:45:28.217521 4594 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-67b86667d5-d25cf" podUID="f39fdd2d-1f5e-440f-996c-08ae2e749d3e" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.163:5353: connect: connection refused" Nov 29 05:45:28 crc kubenswrapper[4594]: I1129 05:45:28.635372 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-55c5775b88-9dvcb" Nov 29 05:45:28 crc kubenswrapper[4594]: I1129 05:45:28.761443 4594 generic.go:334] "Generic (PLEG): container finished" podID="860e92d3-234a-46aa-b6e1-ea0ffbb8a4be" containerID="dbe13314cb482f0e31502e38204649a154044a8792e73d9bcdadd76c3ad1a649" exitCode=0 Nov 29 05:45:28 crc kubenswrapper[4594]: I1129 05:45:28.762327 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-66bbc5d8dd-fzbhl" event={"ID":"860e92d3-234a-46aa-b6e1-ea0ffbb8a4be","Type":"ContainerDied","Data":"dbe13314cb482f0e31502e38204649a154044a8792e73d9bcdadd76c3ad1a649"} Nov 29 05:45:28 crc kubenswrapper[4594]: I1129 05:45:28.794447 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-558d4b85cb-k5j98" Nov 29 05:45:28 crc kubenswrapper[4594]: I1129 
05:45:28.854143 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-7fb8849f48-9rr52" Nov 29 05:45:30 crc kubenswrapper[4594]: I1129 05:45:30.564402 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-7fb8849f48-9rr52" Nov 29 05:45:30 crc kubenswrapper[4594]: W1129 05:45:30.572623 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4082da7a_cc96_4b67_a101_48600e49712b.slice/crio-c7bc022ab52b0d7acfb0073eec7fa09d44e63be14e08b071b3e2f79cf7b2d81e WatchSource:0}: Error finding container c7bc022ab52b0d7acfb0073eec7fa09d44e63be14e08b071b3e2f79cf7b2d81e: Status 404 returned error can't find the container with id c7bc022ab52b0d7acfb0073eec7fa09d44e63be14e08b071b3e2f79cf7b2d81e Nov 29 05:45:30 crc kubenswrapper[4594]: I1129 05:45:30.681934 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-9rddc" Nov 29 05:45:30 crc kubenswrapper[4594]: I1129 05:45:30.792574 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-d8fb9b558-k2gdh" Nov 29 05:45:30 crc kubenswrapper[4594]: I1129 05:45:30.805111 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"4082da7a-cc96-4b67-a101-48600e49712b","Type":"ContainerStarted","Data":"c7bc022ab52b0d7acfb0073eec7fa09d44e63be14e08b071b3e2f79cf7b2d81e"} Nov 29 05:45:30 crc kubenswrapper[4594]: I1129 05:45:30.807007 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-9rddc" event={"ID":"4b46224c-7874-4c4a-abb6-f1cbef3a8462","Type":"ContainerDied","Data":"ec150e75644d27f26f84fef87057b1ec0dd6566f679884e2c1d94cf46eec4bc4"} Nov 29 05:45:30 crc kubenswrapper[4594]: I1129 05:45:30.807049 4594 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="ec150e75644d27f26f84fef87057b1ec0dd6566f679884e2c1d94cf46eec4bc4" Nov 29 05:45:30 crc kubenswrapper[4594]: I1129 05:45:30.807111 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-9rddc" Nov 29 05:45:30 crc kubenswrapper[4594]: I1129 05:45:30.873782 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26m6r\" (UniqueName: \"kubernetes.io/projected/4b46224c-7874-4c4a-abb6-f1cbef3a8462-kube-api-access-26m6r\") pod \"4b46224c-7874-4c4a-abb6-f1cbef3a8462\" (UID: \"4b46224c-7874-4c4a-abb6-f1cbef3a8462\") " Nov 29 05:45:30 crc kubenswrapper[4594]: I1129 05:45:30.873905 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4b46224c-7874-4c4a-abb6-f1cbef3a8462-db-sync-config-data\") pod \"4b46224c-7874-4c4a-abb6-f1cbef3a8462\" (UID: \"4b46224c-7874-4c4a-abb6-f1cbef3a8462\") " Nov 29 05:45:30 crc kubenswrapper[4594]: I1129 05:45:30.873933 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b46224c-7874-4c4a-abb6-f1cbef3a8462-scripts\") pod \"4b46224c-7874-4c4a-abb6-f1cbef3a8462\" (UID: \"4b46224c-7874-4c4a-abb6-f1cbef3a8462\") " Nov 29 05:45:30 crc kubenswrapper[4594]: I1129 05:45:30.873971 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b46224c-7874-4c4a-abb6-f1cbef3a8462-config-data\") pod \"4b46224c-7874-4c4a-abb6-f1cbef3a8462\" (UID: \"4b46224c-7874-4c4a-abb6-f1cbef3a8462\") " Nov 29 05:45:30 crc kubenswrapper[4594]: I1129 05:45:30.874054 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4b46224c-7874-4c4a-abb6-f1cbef3a8462-etc-machine-id\") pod \"4b46224c-7874-4c4a-abb6-f1cbef3a8462\" (UID: 
\"4b46224c-7874-4c4a-abb6-f1cbef3a8462\") " Nov 29 05:45:30 crc kubenswrapper[4594]: I1129 05:45:30.874242 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b46224c-7874-4c4a-abb6-f1cbef3a8462-combined-ca-bundle\") pod \"4b46224c-7874-4c4a-abb6-f1cbef3a8462\" (UID: \"4b46224c-7874-4c4a-abb6-f1cbef3a8462\") " Nov 29 05:45:30 crc kubenswrapper[4594]: I1129 05:45:30.886521 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4b46224c-7874-4c4a-abb6-f1cbef3a8462-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "4b46224c-7874-4c4a-abb6-f1cbef3a8462" (UID: "4b46224c-7874-4c4a-abb6-f1cbef3a8462"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 05:45:30 crc kubenswrapper[4594]: I1129 05:45:30.891360 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b46224c-7874-4c4a-abb6-f1cbef3a8462-scripts" (OuterVolumeSpecName: "scripts") pod "4b46224c-7874-4c4a-abb6-f1cbef3a8462" (UID: "4b46224c-7874-4c4a-abb6-f1cbef3a8462"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:45:30 crc kubenswrapper[4594]: I1129 05:45:30.893397 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b46224c-7874-4c4a-abb6-f1cbef3a8462-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "4b46224c-7874-4c4a-abb6-f1cbef3a8462" (UID: "4b46224c-7874-4c4a-abb6-f1cbef3a8462"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:45:30 crc kubenswrapper[4594]: I1129 05:45:30.896428 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b46224c-7874-4c4a-abb6-f1cbef3a8462-kube-api-access-26m6r" (OuterVolumeSpecName: "kube-api-access-26m6r") pod "4b46224c-7874-4c4a-abb6-f1cbef3a8462" (UID: "4b46224c-7874-4c4a-abb6-f1cbef3a8462"). InnerVolumeSpecName "kube-api-access-26m6r". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:45:30 crc kubenswrapper[4594]: I1129 05:45:30.910452 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b46224c-7874-4c4a-abb6-f1cbef3a8462-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4b46224c-7874-4c4a-abb6-f1cbef3a8462" (UID: "4b46224c-7874-4c4a-abb6-f1cbef3a8462"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:45:30 crc kubenswrapper[4594]: I1129 05:45:30.910872 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-558d4b85cb-k5j98" Nov 29 05:45:30 crc kubenswrapper[4594]: I1129 05:45:30.957203 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-d8fb9b558-k2gdh" Nov 29 05:45:30 crc kubenswrapper[4594]: I1129 05:45:30.970934 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b46224c-7874-4c4a-abb6-f1cbef3a8462-config-data" (OuterVolumeSpecName: "config-data") pod "4b46224c-7874-4c4a-abb6-f1cbef3a8462" (UID: "4b46224c-7874-4c4a-abb6-f1cbef3a8462"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:45:30 crc kubenswrapper[4594]: I1129 05:45:30.977216 4594 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b46224c-7874-4c4a-abb6-f1cbef3a8462-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 05:45:30 crc kubenswrapper[4594]: I1129 05:45:30.977239 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26m6r\" (UniqueName: \"kubernetes.io/projected/4b46224c-7874-4c4a-abb6-f1cbef3a8462-kube-api-access-26m6r\") on node \"crc\" DevicePath \"\"" Nov 29 05:45:30 crc kubenswrapper[4594]: I1129 05:45:30.977669 4594 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4b46224c-7874-4c4a-abb6-f1cbef3a8462-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 05:45:30 crc kubenswrapper[4594]: I1129 05:45:30.977684 4594 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b46224c-7874-4c4a-abb6-f1cbef3a8462-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 05:45:30 crc kubenswrapper[4594]: I1129 05:45:30.977693 4594 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b46224c-7874-4c4a-abb6-f1cbef3a8462-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 05:45:30 crc kubenswrapper[4594]: I1129 05:45:30.977703 4594 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4b46224c-7874-4c4a-abb6-f1cbef3a8462-etc-machine-id\") on node \"crc\" DevicePath \"\"" Nov 29 05:45:30 crc kubenswrapper[4594]: I1129 05:45:30.983525 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7fb8849f48-9rr52"] Nov 29 05:45:30 crc kubenswrapper[4594]: I1129 05:45:30.983778 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7fb8849f48-9rr52" 
podUID="e8d01e45-1d76-464f-93e6-965d65a055fc" containerName="horizon-log" containerID="cri-o://5ac36bd5450c58ccb44f66fa2fc8a56f6b7dd613a59651336f4d79e162161576" gracePeriod=30 Nov 29 05:45:30 crc kubenswrapper[4594]: I1129 05:45:30.984166 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7fb8849f48-9rr52" podUID="e8d01e45-1d76-464f-93e6-965d65a055fc" containerName="horizon" containerID="cri-o://0df5f632c7bce5417787cef5afd154fcfdf04ab056713da4bdcdd13461f932ca" gracePeriod=30 Nov 29 05:45:31 crc kubenswrapper[4594]: I1129 05:45:31.043130 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-66865fcd76-jchd8"] Nov 29 05:45:31 crc kubenswrapper[4594]: I1129 05:45:31.043375 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-66865fcd76-jchd8" podUID="dfcba285-f64f-471e-a2eb-4b38b2a35b3c" containerName="barbican-api-log" containerID="cri-o://81e4b892f5e5238485ce5a6f37cf2e0156f1c7ca179226284cd2afd8995e8b0a" gracePeriod=30 Nov 29 05:45:31 crc kubenswrapper[4594]: I1129 05:45:31.043483 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-66865fcd76-jchd8" podUID="dfcba285-f64f-471e-a2eb-4b38b2a35b3c" containerName="barbican-api" containerID="cri-o://be156d5e811f7697d4e71003f07b9d8d4c96fa8233e4a54ed77aa2e726dfc22a" gracePeriod=30 Nov 29 05:45:31 crc kubenswrapper[4594]: I1129 05:45:31.760873 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-66bbc5d8dd-fzbhl" Nov 29 05:45:31 crc kubenswrapper[4594]: I1129 05:45:31.788892 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67b86667d5-d25cf" Nov 29 05:45:31 crc kubenswrapper[4594]: I1129 05:45:31.830411 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b86667d5-d25cf" event={"ID":"f39fdd2d-1f5e-440f-996c-08ae2e749d3e","Type":"ContainerDied","Data":"4b49af9bfd9c385e25c3940a8c8db196d252c9cf4f20f0c2c0e9c00fa4ca834b"} Nov 29 05:45:31 crc kubenswrapper[4594]: I1129 05:45:31.830477 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67b86667d5-d25cf" Nov 29 05:45:31 crc kubenswrapper[4594]: I1129 05:45:31.830497 4594 scope.go:117] "RemoveContainer" containerID="45619c64119c3a13ccbf693a78ab7548d824f21f6d3ee0dc4ad484f5a286877f" Nov 29 05:45:31 crc kubenswrapper[4594]: I1129 05:45:31.843708 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-66bbc5d8dd-fzbhl" Nov 29 05:45:31 crc kubenswrapper[4594]: I1129 05:45:31.844552 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-66bbc5d8dd-fzbhl" event={"ID":"860e92d3-234a-46aa-b6e1-ea0ffbb8a4be","Type":"ContainerDied","Data":"fed7d5cea1157666e5f6c9c5e8caeb1eb905bb4c456b268b79cea70ea1240266"} Nov 29 05:45:31 crc kubenswrapper[4594]: I1129 05:45:31.853738 4594 generic.go:334] "Generic (PLEG): container finished" podID="dfcba285-f64f-471e-a2eb-4b38b2a35b3c" containerID="81e4b892f5e5238485ce5a6f37cf2e0156f1c7ca179226284cd2afd8995e8b0a" exitCode=143 Nov 29 05:45:31 crc kubenswrapper[4594]: I1129 05:45:31.854794 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-66865fcd76-jchd8" event={"ID":"dfcba285-f64f-471e-a2eb-4b38b2a35b3c","Type":"ContainerDied","Data":"81e4b892f5e5238485ce5a6f37cf2e0156f1c7ca179226284cd2afd8995e8b0a"} Nov 29 05:45:31 crc kubenswrapper[4594]: I1129 05:45:31.872359 4594 scope.go:117] "RemoveContainer" containerID="fb7d3a9347a69aee0ab41a123cf4140d30540c8c94392eb9202bf28875a59765" Nov 
29 05:45:31 crc kubenswrapper[4594]: I1129 05:45:31.911752 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/860e92d3-234a-46aa-b6e1-ea0ffbb8a4be-httpd-config\") pod \"860e92d3-234a-46aa-b6e1-ea0ffbb8a4be\" (UID: \"860e92d3-234a-46aa-b6e1-ea0ffbb8a4be\") " Nov 29 05:45:31 crc kubenswrapper[4594]: I1129 05:45:31.911802 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f39fdd2d-1f5e-440f-996c-08ae2e749d3e-ovsdbserver-nb\") pod \"f39fdd2d-1f5e-440f-996c-08ae2e749d3e\" (UID: \"f39fdd2d-1f5e-440f-996c-08ae2e749d3e\") " Nov 29 05:45:31 crc kubenswrapper[4594]: I1129 05:45:31.911841 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f39fdd2d-1f5e-440f-996c-08ae2e749d3e-config\") pod \"f39fdd2d-1f5e-440f-996c-08ae2e749d3e\" (UID: \"f39fdd2d-1f5e-440f-996c-08ae2e749d3e\") " Nov 29 05:45:31 crc kubenswrapper[4594]: I1129 05:45:31.911861 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/860e92d3-234a-46aa-b6e1-ea0ffbb8a4be-config\") pod \"860e92d3-234a-46aa-b6e1-ea0ffbb8a4be\" (UID: \"860e92d3-234a-46aa-b6e1-ea0ffbb8a4be\") " Nov 29 05:45:31 crc kubenswrapper[4594]: I1129 05:45:31.911921 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f39fdd2d-1f5e-440f-996c-08ae2e749d3e-dns-swift-storage-0\") pod \"f39fdd2d-1f5e-440f-996c-08ae2e749d3e\" (UID: \"f39fdd2d-1f5e-440f-996c-08ae2e749d3e\") " Nov 29 05:45:31 crc kubenswrapper[4594]: I1129 05:45:31.911997 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/860e92d3-234a-46aa-b6e1-ea0ffbb8a4be-ovndb-tls-certs\") pod 
\"860e92d3-234a-46aa-b6e1-ea0ffbb8a4be\" (UID: \"860e92d3-234a-46aa-b6e1-ea0ffbb8a4be\") " Nov 29 05:45:31 crc kubenswrapper[4594]: I1129 05:45:31.912027 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x6zrw\" (UniqueName: \"kubernetes.io/projected/860e92d3-234a-46aa-b6e1-ea0ffbb8a4be-kube-api-access-x6zrw\") pod \"860e92d3-234a-46aa-b6e1-ea0ffbb8a4be\" (UID: \"860e92d3-234a-46aa-b6e1-ea0ffbb8a4be\") " Nov 29 05:45:31 crc kubenswrapper[4594]: I1129 05:45:31.912132 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtgh7\" (UniqueName: \"kubernetes.io/projected/f39fdd2d-1f5e-440f-996c-08ae2e749d3e-kube-api-access-mtgh7\") pod \"f39fdd2d-1f5e-440f-996c-08ae2e749d3e\" (UID: \"f39fdd2d-1f5e-440f-996c-08ae2e749d3e\") " Nov 29 05:45:31 crc kubenswrapper[4594]: I1129 05:45:31.912174 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/860e92d3-234a-46aa-b6e1-ea0ffbb8a4be-combined-ca-bundle\") pod \"860e92d3-234a-46aa-b6e1-ea0ffbb8a4be\" (UID: \"860e92d3-234a-46aa-b6e1-ea0ffbb8a4be\") " Nov 29 05:45:31 crc kubenswrapper[4594]: I1129 05:45:31.912291 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f39fdd2d-1f5e-440f-996c-08ae2e749d3e-ovsdbserver-sb\") pod \"f39fdd2d-1f5e-440f-996c-08ae2e749d3e\" (UID: \"f39fdd2d-1f5e-440f-996c-08ae2e749d3e\") " Nov 29 05:45:31 crc kubenswrapper[4594]: I1129 05:45:31.912309 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f39fdd2d-1f5e-440f-996c-08ae2e749d3e-dns-svc\") pod \"f39fdd2d-1f5e-440f-996c-08ae2e749d3e\" (UID: \"f39fdd2d-1f5e-440f-996c-08ae2e749d3e\") " Nov 29 05:45:31 crc kubenswrapper[4594]: I1129 05:45:31.924168 4594 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack/barbican-api-66865fcd76-jchd8" podUID="dfcba285-f64f-471e-a2eb-4b38b2a35b3c" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.175:9311/healthcheck\": read tcp 10.217.0.2:34502->10.217.0.175:9311: read: connection reset by peer" Nov 29 05:45:31 crc kubenswrapper[4594]: I1129 05:45:31.924464 4594 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-66865fcd76-jchd8" podUID="dfcba285-f64f-471e-a2eb-4b38b2a35b3c" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.175:9311/healthcheck\": read tcp 10.217.0.2:34488->10.217.0.175:9311: read: connection reset by peer" Nov 29 05:45:31 crc kubenswrapper[4594]: I1129 05:45:31.932402 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/860e92d3-234a-46aa-b6e1-ea0ffbb8a4be-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "860e92d3-234a-46aa-b6e1-ea0ffbb8a4be" (UID: "860e92d3-234a-46aa-b6e1-ea0ffbb8a4be"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:45:31 crc kubenswrapper[4594]: I1129 05:45:31.941425 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f39fdd2d-1f5e-440f-996c-08ae2e749d3e-kube-api-access-mtgh7" (OuterVolumeSpecName: "kube-api-access-mtgh7") pod "f39fdd2d-1f5e-440f-996c-08ae2e749d3e" (UID: "f39fdd2d-1f5e-440f-996c-08ae2e749d3e"). InnerVolumeSpecName "kube-api-access-mtgh7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:45:31 crc kubenswrapper[4594]: I1129 05:45:31.941530 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/860e92d3-234a-46aa-b6e1-ea0ffbb8a4be-kube-api-access-x6zrw" (OuterVolumeSpecName: "kube-api-access-x6zrw") pod "860e92d3-234a-46aa-b6e1-ea0ffbb8a4be" (UID: "860e92d3-234a-46aa-b6e1-ea0ffbb8a4be"). InnerVolumeSpecName "kube-api-access-x6zrw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:45:31 crc kubenswrapper[4594]: I1129 05:45:31.982154 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Nov 29 05:45:31 crc kubenswrapper[4594]: E1129 05:45:31.982681 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="860e92d3-234a-46aa-b6e1-ea0ffbb8a4be" containerName="neutron-httpd" Nov 29 05:45:31 crc kubenswrapper[4594]: I1129 05:45:31.982699 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="860e92d3-234a-46aa-b6e1-ea0ffbb8a4be" containerName="neutron-httpd" Nov 29 05:45:31 crc kubenswrapper[4594]: E1129 05:45:31.982712 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f39fdd2d-1f5e-440f-996c-08ae2e749d3e" containerName="init" Nov 29 05:45:31 crc kubenswrapper[4594]: I1129 05:45:31.982718 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="f39fdd2d-1f5e-440f-996c-08ae2e749d3e" containerName="init" Nov 29 05:45:31 crc kubenswrapper[4594]: E1129 05:45:31.982742 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="860e92d3-234a-46aa-b6e1-ea0ffbb8a4be" containerName="neutron-api" Nov 29 05:45:31 crc kubenswrapper[4594]: I1129 05:45:31.982748 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="860e92d3-234a-46aa-b6e1-ea0ffbb8a4be" containerName="neutron-api" Nov 29 05:45:31 crc kubenswrapper[4594]: E1129 05:45:31.982769 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f39fdd2d-1f5e-440f-996c-08ae2e749d3e" containerName="dnsmasq-dns" Nov 29 05:45:31 crc kubenswrapper[4594]: I1129 05:45:31.982775 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="f39fdd2d-1f5e-440f-996c-08ae2e749d3e" containerName="dnsmasq-dns" Nov 29 05:45:31 crc kubenswrapper[4594]: E1129 05:45:31.982798 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b46224c-7874-4c4a-abb6-f1cbef3a8462" containerName="cinder-db-sync" Nov 29 05:45:31 crc kubenswrapper[4594]: 
I1129 05:45:31.982804 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b46224c-7874-4c4a-abb6-f1cbef3a8462" containerName="cinder-db-sync" Nov 29 05:45:31 crc kubenswrapper[4594]: I1129 05:45:31.983051 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="860e92d3-234a-46aa-b6e1-ea0ffbb8a4be" containerName="neutron-api" Nov 29 05:45:31 crc kubenswrapper[4594]: I1129 05:45:31.983059 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="860e92d3-234a-46aa-b6e1-ea0ffbb8a4be" containerName="neutron-httpd" Nov 29 05:45:31 crc kubenswrapper[4594]: I1129 05:45:31.983076 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b46224c-7874-4c4a-abb6-f1cbef3a8462" containerName="cinder-db-sync" Nov 29 05:45:31 crc kubenswrapper[4594]: I1129 05:45:31.983086 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="f39fdd2d-1f5e-440f-996c-08ae2e749d3e" containerName="dnsmasq-dns" Nov 29 05:45:31 crc kubenswrapper[4594]: I1129 05:45:31.984406 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.001294 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.002086 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-v5t9v" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.002208 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.002431 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.015382 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mtgh7\" (UniqueName: \"kubernetes.io/projected/f39fdd2d-1f5e-440f-996c-08ae2e749d3e-kube-api-access-mtgh7\") on node \"crc\" DevicePath \"\"" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.015409 4594 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/860e92d3-234a-46aa-b6e1-ea0ffbb8a4be-httpd-config\") on node \"crc\" DevicePath \"\"" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.015419 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x6zrw\" (UniqueName: \"kubernetes.io/projected/860e92d3-234a-46aa-b6e1-ea0ffbb8a4be-kube-api-access-x6zrw\") on node \"crc\" DevicePath \"\"" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.043491 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.055817 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f39fdd2d-1f5e-440f-996c-08ae2e749d3e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod 
"f39fdd2d-1f5e-440f-996c-08ae2e749d3e" (UID: "f39fdd2d-1f5e-440f-996c-08ae2e749d3e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.064509 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-75958fc765-4tc8j"] Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.066332 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75958fc765-4tc8j" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.079890 4594 scope.go:117] "RemoveContainer" containerID="76ce69fbb890971373cf2b97b55d27c09c8d857179fc93efd2f64f8e76bc3b7e" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.133737 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f39fdd2d-1f5e-440f-996c-08ae2e749d3e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f39fdd2d-1f5e-440f-996c-08ae2e749d3e" (UID: "f39fdd2d-1f5e-440f-996c-08ae2e749d3e"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.142554 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4495d878-ec89-4a13-8adb-a64316eb2e68-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"4495d878-ec89-4a13-8adb-a64316eb2e68\") " pod="openstack/cinder-scheduler-0" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.142739 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4495d878-ec89-4a13-8adb-a64316eb2e68-config-data\") pod \"cinder-scheduler-0\" (UID: \"4495d878-ec89-4a13-8adb-a64316eb2e68\") " pod="openstack/cinder-scheduler-0" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.143338 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4495d878-ec89-4a13-8adb-a64316eb2e68-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"4495d878-ec89-4a13-8adb-a64316eb2e68\") " pod="openstack/cinder-scheduler-0" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.143539 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4495d878-ec89-4a13-8adb-a64316eb2e68-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"4495d878-ec89-4a13-8adb-a64316eb2e68\") " pod="openstack/cinder-scheduler-0" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.143629 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f39fdd2d-1f5e-440f-996c-08ae2e749d3e-config" (OuterVolumeSpecName: "config") pod "f39fdd2d-1f5e-440f-996c-08ae2e749d3e" (UID: "f39fdd2d-1f5e-440f-996c-08ae2e749d3e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.143740 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4495d878-ec89-4a13-8adb-a64316eb2e68-scripts\") pod \"cinder-scheduler-0\" (UID: \"4495d878-ec89-4a13-8adb-a64316eb2e68\") " pod="openstack/cinder-scheduler-0" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.143881 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kl75p\" (UniqueName: \"kubernetes.io/projected/4495d878-ec89-4a13-8adb-a64316eb2e68-kube-api-access-kl75p\") pod \"cinder-scheduler-0\" (UID: \"4495d878-ec89-4a13-8adb-a64316eb2e68\") " pod="openstack/cinder-scheduler-0" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.146413 4594 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f39fdd2d-1f5e-440f-996c-08ae2e749d3e-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.146506 4594 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f39fdd2d-1f5e-440f-996c-08ae2e749d3e-config\") on node \"crc\" DevicePath \"\"" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.146573 4594 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f39fdd2d-1f5e-440f-996c-08ae2e749d3e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.147431 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75958fc765-4tc8j"] Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.172467 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f39fdd2d-1f5e-440f-996c-08ae2e749d3e-ovsdbserver-nb" 
(OuterVolumeSpecName: "ovsdbserver-nb") pod "f39fdd2d-1f5e-440f-996c-08ae2e749d3e" (UID: "f39fdd2d-1f5e-440f-996c-08ae2e749d3e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.176349 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/860e92d3-234a-46aa-b6e1-ea0ffbb8a4be-config" (OuterVolumeSpecName: "config") pod "860e92d3-234a-46aa-b6e1-ea0ffbb8a4be" (UID: "860e92d3-234a-46aa-b6e1-ea0ffbb8a4be"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.178732 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/860e92d3-234a-46aa-b6e1-ea0ffbb8a4be-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "860e92d3-234a-46aa-b6e1-ea0ffbb8a4be" (UID: "860e92d3-234a-46aa-b6e1-ea0ffbb8a4be"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.186745 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f39fdd2d-1f5e-440f-996c-08ae2e749d3e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f39fdd2d-1f5e-440f-996c-08ae2e749d3e" (UID: "f39fdd2d-1f5e-440f-996c-08ae2e749d3e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.188018 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/860e92d3-234a-46aa-b6e1-ea0ffbb8a4be-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "860e92d3-234a-46aa-b6e1-ea0ffbb8a4be" (UID: "860e92d3-234a-46aa-b6e1-ea0ffbb8a4be"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.204340 4594 scope.go:117] "RemoveContainer" containerID="dbe13314cb482f0e31502e38204649a154044a8792e73d9bcdadd76c3ad1a649" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.249185 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4495d878-ec89-4a13-8adb-a64316eb2e68-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"4495d878-ec89-4a13-8adb-a64316eb2e68\") " pod="openstack/cinder-scheduler-0" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.249523 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4495d878-ec89-4a13-8adb-a64316eb2e68-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"4495d878-ec89-4a13-8adb-a64316eb2e68\") " pod="openstack/cinder-scheduler-0" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.249601 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1cf50aee-5c49-4d8b-9fd9-8b7afd376f41-ovsdbserver-sb\") pod \"dnsmasq-dns-75958fc765-4tc8j\" (UID: \"1cf50aee-5c49-4d8b-9fd9-8b7afd376f41\") " pod="openstack/dnsmasq-dns-75958fc765-4tc8j" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.249642 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4495d878-ec89-4a13-8adb-a64316eb2e68-scripts\") pod \"cinder-scheduler-0\" (UID: \"4495d878-ec89-4a13-8adb-a64316eb2e68\") " pod="openstack/cinder-scheduler-0" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.249699 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1cf50aee-5c49-4d8b-9fd9-8b7afd376f41-dns-svc\") 
pod \"dnsmasq-dns-75958fc765-4tc8j\" (UID: \"1cf50aee-5c49-4d8b-9fd9-8b7afd376f41\") " pod="openstack/dnsmasq-dns-75958fc765-4tc8j" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.249755 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kl75p\" (UniqueName: \"kubernetes.io/projected/4495d878-ec89-4a13-8adb-a64316eb2e68-kube-api-access-kl75p\") pod \"cinder-scheduler-0\" (UID: \"4495d878-ec89-4a13-8adb-a64316eb2e68\") " pod="openstack/cinder-scheduler-0" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.249792 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1cf50aee-5c49-4d8b-9fd9-8b7afd376f41-dns-swift-storage-0\") pod \"dnsmasq-dns-75958fc765-4tc8j\" (UID: \"1cf50aee-5c49-4d8b-9fd9-8b7afd376f41\") " pod="openstack/dnsmasq-dns-75958fc765-4tc8j" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.249818 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4495d878-ec89-4a13-8adb-a64316eb2e68-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"4495d878-ec89-4a13-8adb-a64316eb2e68\") " pod="openstack/cinder-scheduler-0" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.249924 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4495d878-ec89-4a13-8adb-a64316eb2e68-config-data\") pod \"cinder-scheduler-0\" (UID: \"4495d878-ec89-4a13-8adb-a64316eb2e68\") " pod="openstack/cinder-scheduler-0" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.249635 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4495d878-ec89-4a13-8adb-a64316eb2e68-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"4495d878-ec89-4a13-8adb-a64316eb2e68\") " 
pod="openstack/cinder-scheduler-0" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.251702 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1cf50aee-5c49-4d8b-9fd9-8b7afd376f41-config\") pod \"dnsmasq-dns-75958fc765-4tc8j\" (UID: \"1cf50aee-5c49-4d8b-9fd9-8b7afd376f41\") " pod="openstack/dnsmasq-dns-75958fc765-4tc8j" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.251881 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1cf50aee-5c49-4d8b-9fd9-8b7afd376f41-ovsdbserver-nb\") pod \"dnsmasq-dns-75958fc765-4tc8j\" (UID: \"1cf50aee-5c49-4d8b-9fd9-8b7afd376f41\") " pod="openstack/dnsmasq-dns-75958fc765-4tc8j" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.251929 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzxsl\" (UniqueName: \"kubernetes.io/projected/1cf50aee-5c49-4d8b-9fd9-8b7afd376f41-kube-api-access-zzxsl\") pod \"dnsmasq-dns-75958fc765-4tc8j\" (UID: \"1cf50aee-5c49-4d8b-9fd9-8b7afd376f41\") " pod="openstack/dnsmasq-dns-75958fc765-4tc8j" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.252058 4594 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f39fdd2d-1f5e-440f-996c-08ae2e749d3e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.252072 4594 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f39fdd2d-1f5e-440f-996c-08ae2e749d3e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.252083 4594 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/860e92d3-234a-46aa-b6e1-ea0ffbb8a4be-config\") on node \"crc\" DevicePath \"\"" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.252095 4594 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/860e92d3-234a-46aa-b6e1-ea0ffbb8a4be-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.252105 4594 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/860e92d3-234a-46aa-b6e1-ea0ffbb8a4be-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.252402 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4495d878-ec89-4a13-8adb-a64316eb2e68-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"4495d878-ec89-4a13-8adb-a64316eb2e68\") " pod="openstack/cinder-scheduler-0" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.252457 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4495d878-ec89-4a13-8adb-a64316eb2e68-scripts\") pod \"cinder-scheduler-0\" (UID: \"4495d878-ec89-4a13-8adb-a64316eb2e68\") " pod="openstack/cinder-scheduler-0" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.265217 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4495d878-ec89-4a13-8adb-a64316eb2e68-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"4495d878-ec89-4a13-8adb-a64316eb2e68\") " pod="openstack/cinder-scheduler-0" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.276582 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kl75p\" (UniqueName: \"kubernetes.io/projected/4495d878-ec89-4a13-8adb-a64316eb2e68-kube-api-access-kl75p\") pod 
\"cinder-scheduler-0\" (UID: \"4495d878-ec89-4a13-8adb-a64316eb2e68\") " pod="openstack/cinder-scheduler-0" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.277323 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4495d878-ec89-4a13-8adb-a64316eb2e68-config-data\") pod \"cinder-scheduler-0\" (UID: \"4495d878-ec89-4a13-8adb-a64316eb2e68\") " pod="openstack/cinder-scheduler-0" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.298765 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.300695 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.303224 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.313466 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.333984 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.355033 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1cf50aee-5c49-4d8b-9fd9-8b7afd376f41-ovsdbserver-nb\") pod \"dnsmasq-dns-75958fc765-4tc8j\" (UID: \"1cf50aee-5c49-4d8b-9fd9-8b7afd376f41\") " pod="openstack/dnsmasq-dns-75958fc765-4tc8j" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.355080 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzxsl\" (UniqueName: \"kubernetes.io/projected/1cf50aee-5c49-4d8b-9fd9-8b7afd376f41-kube-api-access-zzxsl\") pod \"dnsmasq-dns-75958fc765-4tc8j\" (UID: \"1cf50aee-5c49-4d8b-9fd9-8b7afd376f41\") " pod="openstack/dnsmasq-dns-75958fc765-4tc8j" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.355174 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1cf50aee-5c49-4d8b-9fd9-8b7afd376f41-ovsdbserver-sb\") pod \"dnsmasq-dns-75958fc765-4tc8j\" (UID: \"1cf50aee-5c49-4d8b-9fd9-8b7afd376f41\") " pod="openstack/dnsmasq-dns-75958fc765-4tc8j" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.355244 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1cf50aee-5c49-4d8b-9fd9-8b7afd376f41-dns-svc\") pod \"dnsmasq-dns-75958fc765-4tc8j\" (UID: \"1cf50aee-5c49-4d8b-9fd9-8b7afd376f41\") " pod="openstack/dnsmasq-dns-75958fc765-4tc8j" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.355318 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1cf50aee-5c49-4d8b-9fd9-8b7afd376f41-dns-swift-storage-0\") pod \"dnsmasq-dns-75958fc765-4tc8j\" (UID: \"1cf50aee-5c49-4d8b-9fd9-8b7afd376f41\") " 
pod="openstack/dnsmasq-dns-75958fc765-4tc8j" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.355525 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1cf50aee-5c49-4d8b-9fd9-8b7afd376f41-config\") pod \"dnsmasq-dns-75958fc765-4tc8j\" (UID: \"1cf50aee-5c49-4d8b-9fd9-8b7afd376f41\") " pod="openstack/dnsmasq-dns-75958fc765-4tc8j" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.356514 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1cf50aee-5c49-4d8b-9fd9-8b7afd376f41-config\") pod \"dnsmasq-dns-75958fc765-4tc8j\" (UID: \"1cf50aee-5c49-4d8b-9fd9-8b7afd376f41\") " pod="openstack/dnsmasq-dns-75958fc765-4tc8j" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.357037 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1cf50aee-5c49-4d8b-9fd9-8b7afd376f41-ovsdbserver-nb\") pod \"dnsmasq-dns-75958fc765-4tc8j\" (UID: \"1cf50aee-5c49-4d8b-9fd9-8b7afd376f41\") " pod="openstack/dnsmasq-dns-75958fc765-4tc8j" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.357810 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1cf50aee-5c49-4d8b-9fd9-8b7afd376f41-ovsdbserver-sb\") pod \"dnsmasq-dns-75958fc765-4tc8j\" (UID: \"1cf50aee-5c49-4d8b-9fd9-8b7afd376f41\") " pod="openstack/dnsmasq-dns-75958fc765-4tc8j" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.358347 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1cf50aee-5c49-4d8b-9fd9-8b7afd376f41-dns-svc\") pod \"dnsmasq-dns-75958fc765-4tc8j\" (UID: \"1cf50aee-5c49-4d8b-9fd9-8b7afd376f41\") " pod="openstack/dnsmasq-dns-75958fc765-4tc8j" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.358825 4594 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1cf50aee-5c49-4d8b-9fd9-8b7afd376f41-dns-swift-storage-0\") pod \"dnsmasq-dns-75958fc765-4tc8j\" (UID: \"1cf50aee-5c49-4d8b-9fd9-8b7afd376f41\") " pod="openstack/dnsmasq-dns-75958fc765-4tc8j" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.422219 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzxsl\" (UniqueName: \"kubernetes.io/projected/1cf50aee-5c49-4d8b-9fd9-8b7afd376f41-kube-api-access-zzxsl\") pod \"dnsmasq-dns-75958fc765-4tc8j\" (UID: \"1cf50aee-5c49-4d8b-9fd9-8b7afd376f41\") " pod="openstack/dnsmasq-dns-75958fc765-4tc8j" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.463392 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bfd076bd-1374-4b36-b5b2-c42f207016fd-logs\") pod \"cinder-api-0\" (UID: \"bfd076bd-1374-4b36-b5b2-c42f207016fd\") " pod="openstack/cinder-api-0" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.463442 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfd076bd-1374-4b36-b5b2-c42f207016fd-scripts\") pod \"cinder-api-0\" (UID: \"bfd076bd-1374-4b36-b5b2-c42f207016fd\") " pod="openstack/cinder-api-0" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.463463 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bfd076bd-1374-4b36-b5b2-c42f207016fd-config-data-custom\") pod \"cinder-api-0\" (UID: \"bfd076bd-1374-4b36-b5b2-c42f207016fd\") " pod="openstack/cinder-api-0" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.463538 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" 
(UniqueName: \"kubernetes.io/host-path/bfd076bd-1374-4b36-b5b2-c42f207016fd-etc-machine-id\") pod \"cinder-api-0\" (UID: \"bfd076bd-1374-4b36-b5b2-c42f207016fd\") " pod="openstack/cinder-api-0" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.463564 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7bsk\" (UniqueName: \"kubernetes.io/projected/bfd076bd-1374-4b36-b5b2-c42f207016fd-kube-api-access-h7bsk\") pod \"cinder-api-0\" (UID: \"bfd076bd-1374-4b36-b5b2-c42f207016fd\") " pod="openstack/cinder-api-0" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.463594 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfd076bd-1374-4b36-b5b2-c42f207016fd-config-data\") pod \"cinder-api-0\" (UID: \"bfd076bd-1374-4b36-b5b2-c42f207016fd\") " pod="openstack/cinder-api-0" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.463622 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfd076bd-1374-4b36-b5b2-c42f207016fd-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"bfd076bd-1374-4b36-b5b2-c42f207016fd\") " pod="openstack/cinder-api-0" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.569022 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bfd076bd-1374-4b36-b5b2-c42f207016fd-etc-machine-id\") pod \"cinder-api-0\" (UID: \"bfd076bd-1374-4b36-b5b2-c42f207016fd\") " pod="openstack/cinder-api-0" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.569094 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7bsk\" (UniqueName: \"kubernetes.io/projected/bfd076bd-1374-4b36-b5b2-c42f207016fd-kube-api-access-h7bsk\") pod \"cinder-api-0\" (UID: 
\"bfd076bd-1374-4b36-b5b2-c42f207016fd\") " pod="openstack/cinder-api-0" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.569156 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfd076bd-1374-4b36-b5b2-c42f207016fd-config-data\") pod \"cinder-api-0\" (UID: \"bfd076bd-1374-4b36-b5b2-c42f207016fd\") " pod="openstack/cinder-api-0" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.569236 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfd076bd-1374-4b36-b5b2-c42f207016fd-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"bfd076bd-1374-4b36-b5b2-c42f207016fd\") " pod="openstack/cinder-api-0" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.569391 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bfd076bd-1374-4b36-b5b2-c42f207016fd-logs\") pod \"cinder-api-0\" (UID: \"bfd076bd-1374-4b36-b5b2-c42f207016fd\") " pod="openstack/cinder-api-0" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.569426 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfd076bd-1374-4b36-b5b2-c42f207016fd-scripts\") pod \"cinder-api-0\" (UID: \"bfd076bd-1374-4b36-b5b2-c42f207016fd\") " pod="openstack/cinder-api-0" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.569448 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bfd076bd-1374-4b36-b5b2-c42f207016fd-config-data-custom\") pod \"cinder-api-0\" (UID: \"bfd076bd-1374-4b36-b5b2-c42f207016fd\") " pod="openstack/cinder-api-0" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.570655 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67b86667d5-d25cf"] Nov 29 
05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.578579 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bfd076bd-1374-4b36-b5b2-c42f207016fd-etc-machine-id\") pod \"cinder-api-0\" (UID: \"bfd076bd-1374-4b36-b5b2-c42f207016fd\") " pod="openstack/cinder-api-0" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.579414 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bfd076bd-1374-4b36-b5b2-c42f207016fd-logs\") pod \"cinder-api-0\" (UID: \"bfd076bd-1374-4b36-b5b2-c42f207016fd\") " pod="openstack/cinder-api-0" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.580846 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-66865fcd76-jchd8" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.587090 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bfd076bd-1374-4b36-b5b2-c42f207016fd-config-data-custom\") pod \"cinder-api-0\" (UID: \"bfd076bd-1374-4b36-b5b2-c42f207016fd\") " pod="openstack/cinder-api-0" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.592492 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfd076bd-1374-4b36-b5b2-c42f207016fd-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"bfd076bd-1374-4b36-b5b2-c42f207016fd\") " pod="openstack/cinder-api-0" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.610010 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfd076bd-1374-4b36-b5b2-c42f207016fd-config-data\") pod \"cinder-api-0\" (UID: \"bfd076bd-1374-4b36-b5b2-c42f207016fd\") " pod="openstack/cinder-api-0" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.619821 4594 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7bsk\" (UniqueName: \"kubernetes.io/projected/bfd076bd-1374-4b36-b5b2-c42f207016fd-kube-api-access-h7bsk\") pod \"cinder-api-0\" (UID: \"bfd076bd-1374-4b36-b5b2-c42f207016fd\") " pod="openstack/cinder-api-0" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.623706 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfd076bd-1374-4b36-b5b2-c42f207016fd-scripts\") pod \"cinder-api-0\" (UID: \"bfd076bd-1374-4b36-b5b2-c42f207016fd\") " pod="openstack/cinder-api-0" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.629292 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-67b86667d5-d25cf"] Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.635451 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.695141 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-66bbc5d8dd-fzbhl"] Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.721450 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-66bbc5d8dd-fzbhl"] Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.723244 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75958fc765-4tc8j" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.779265 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dfcba285-f64f-471e-a2eb-4b38b2a35b3c-logs\") pod \"dfcba285-f64f-471e-a2eb-4b38b2a35b3c\" (UID: \"dfcba285-f64f-471e-a2eb-4b38b2a35b3c\") " Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.779461 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j4z45\" (UniqueName: \"kubernetes.io/projected/dfcba285-f64f-471e-a2eb-4b38b2a35b3c-kube-api-access-j4z45\") pod \"dfcba285-f64f-471e-a2eb-4b38b2a35b3c\" (UID: \"dfcba285-f64f-471e-a2eb-4b38b2a35b3c\") " Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.780177 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dfcba285-f64f-471e-a2eb-4b38b2a35b3c-logs" (OuterVolumeSpecName: "logs") pod "dfcba285-f64f-471e-a2eb-4b38b2a35b3c" (UID: "dfcba285-f64f-471e-a2eb-4b38b2a35b3c"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.780798 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfcba285-f64f-471e-a2eb-4b38b2a35b3c-config-data\") pod \"dfcba285-f64f-471e-a2eb-4b38b2a35b3c\" (UID: \"dfcba285-f64f-471e-a2eb-4b38b2a35b3c\") " Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.780842 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dfcba285-f64f-471e-a2eb-4b38b2a35b3c-config-data-custom\") pod \"dfcba285-f64f-471e-a2eb-4b38b2a35b3c\" (UID: \"dfcba285-f64f-471e-a2eb-4b38b2a35b3c\") " Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.780898 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfcba285-f64f-471e-a2eb-4b38b2a35b3c-combined-ca-bundle\") pod \"dfcba285-f64f-471e-a2eb-4b38b2a35b3c\" (UID: \"dfcba285-f64f-471e-a2eb-4b38b2a35b3c\") " Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.781713 4594 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dfcba285-f64f-471e-a2eb-4b38b2a35b3c-logs\") on node \"crc\" DevicePath \"\"" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.795554 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfcba285-f64f-471e-a2eb-4b38b2a35b3c-kube-api-access-j4z45" (OuterVolumeSpecName: "kube-api-access-j4z45") pod "dfcba285-f64f-471e-a2eb-4b38b2a35b3c" (UID: "dfcba285-f64f-471e-a2eb-4b38b2a35b3c"). InnerVolumeSpecName "kube-api-access-j4z45". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.802420 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfcba285-f64f-471e-a2eb-4b38b2a35b3c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "dfcba285-f64f-471e-a2eb-4b38b2a35b3c" (UID: "dfcba285-f64f-471e-a2eb-4b38b2a35b3c"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.824902 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfcba285-f64f-471e-a2eb-4b38b2a35b3c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dfcba285-f64f-471e-a2eb-4b38b2a35b3c" (UID: "dfcba285-f64f-471e-a2eb-4b38b2a35b3c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.878071 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"4082da7a-cc96-4b67-a101-48600e49712b","Type":"ContainerStarted","Data":"8d3ae3a665ae10837794ce268e4cb392ea3275246107862b300647990d5baa96"} Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.884202 4594 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dfcba285-f64f-471e-a2eb-4b38b2a35b3c-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.884233 4594 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfcba285-f64f-471e-a2eb-4b38b2a35b3c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.884246 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j4z45\" (UniqueName: 
\"kubernetes.io/projected/dfcba285-f64f-471e-a2eb-4b38b2a35b3c-kube-api-access-j4z45\") on node \"crc\" DevicePath \"\"" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.885468 4594 generic.go:334] "Generic (PLEG): container finished" podID="dfcba285-f64f-471e-a2eb-4b38b2a35b3c" containerID="be156d5e811f7697d4e71003f07b9d8d4c96fa8233e4a54ed77aa2e726dfc22a" exitCode=0 Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.885537 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-66865fcd76-jchd8" event={"ID":"dfcba285-f64f-471e-a2eb-4b38b2a35b3c","Type":"ContainerDied","Data":"be156d5e811f7697d4e71003f07b9d8d4c96fa8233e4a54ed77aa2e726dfc22a"} Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.885570 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-66865fcd76-jchd8" event={"ID":"dfcba285-f64f-471e-a2eb-4b38b2a35b3c","Type":"ContainerDied","Data":"59f067181f0f4b2928c91d193891dbcabcbb8d53a55a5fc3a7c53308cf2df132"} Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.885587 4594 scope.go:117] "RemoveContainer" containerID="be156d5e811f7697d4e71003f07b9d8d4c96fa8233e4a54ed77aa2e726dfc22a" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.885712 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-66865fcd76-jchd8" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.888307 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7fb8849f48-9rr52" event={"ID":"e8d01e45-1d76-464f-93e6-965d65a055fc","Type":"ContainerDied","Data":"0df5f632c7bce5417787cef5afd154fcfdf04ab056713da4bdcdd13461f932ca"} Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.888319 4594 generic.go:334] "Generic (PLEG): container finished" podID="e8d01e45-1d76-464f-93e6-965d65a055fc" containerID="0df5f632c7bce5417787cef5afd154fcfdf04ab056713da4bdcdd13461f932ca" exitCode=0 Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.900642 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dfe84a63-fea5-455e-95d3-523a091b976f","Type":"ContainerStarted","Data":"d16c7dfef88944e5affe7fb57fc6f7a7ff275273002c78196b3f3b7c77545b96"} Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.900851 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dfe84a63-fea5-455e-95d3-523a091b976f" containerName="ceilometer-central-agent" containerID="cri-o://8b910ba656191f153188fed85a2fe961876dddee8993298e48bca1621bd41c04" gracePeriod=30 Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.900960 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.901346 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dfe84a63-fea5-455e-95d3-523a091b976f" containerName="proxy-httpd" containerID="cri-o://d16c7dfef88944e5affe7fb57fc6f7a7ff275273002c78196b3f3b7c77545b96" gracePeriod=30 Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.901510 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="dfe84a63-fea5-455e-95d3-523a091b976f" containerName="sg-core" containerID="cri-o://82a7be62d336acb1c166e710c849a936fe0c5c2dc0533148ba8167ea8bbc75b0" gracePeriod=30 Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.901564 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dfe84a63-fea5-455e-95d3-523a091b976f" containerName="ceilometer-notification-agent" containerID="cri-o://fc49a19c1a1fc2d60ca494b34631f397817d2181fa37036731550878d86b8601" gracePeriod=30 Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.929433 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.701487417 podStartE2EDuration="59.929401347s" podCreationTimestamp="2025-11-29 05:44:33 +0000 UTC" firstStartedPulling="2025-11-29 05:44:34.407045842 +0000 UTC m=+998.647555063" lastFinishedPulling="2025-11-29 05:45:31.634959773 +0000 UTC m=+1055.875468993" observedRunningTime="2025-11-29 05:45:32.921547653 +0000 UTC m=+1057.162056873" watchObservedRunningTime="2025-11-29 05:45:32.929401347 +0000 UTC m=+1057.169910568" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.944751 4594 scope.go:117] "RemoveContainer" containerID="81e4b892f5e5238485ce5a6f37cf2e0156f1c7ca179226284cd2afd8995e8b0a" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.946939 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfcba285-f64f-471e-a2eb-4b38b2a35b3c-config-data" (OuterVolumeSpecName: "config-data") pod "dfcba285-f64f-471e-a2eb-4b38b2a35b3c" (UID: "dfcba285-f64f-471e-a2eb-4b38b2a35b3c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.987849 4594 scope.go:117] "RemoveContainer" containerID="be156d5e811f7697d4e71003f07b9d8d4c96fa8233e4a54ed77aa2e726dfc22a" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.988769 4594 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfcba285-f64f-471e-a2eb-4b38b2a35b3c-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 05:45:32 crc kubenswrapper[4594]: E1129 05:45:32.990470 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be156d5e811f7697d4e71003f07b9d8d4c96fa8233e4a54ed77aa2e726dfc22a\": container with ID starting with be156d5e811f7697d4e71003f07b9d8d4c96fa8233e4a54ed77aa2e726dfc22a not found: ID does not exist" containerID="be156d5e811f7697d4e71003f07b9d8d4c96fa8233e4a54ed77aa2e726dfc22a" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.990517 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be156d5e811f7697d4e71003f07b9d8d4c96fa8233e4a54ed77aa2e726dfc22a"} err="failed to get container status \"be156d5e811f7697d4e71003f07b9d8d4c96fa8233e4a54ed77aa2e726dfc22a\": rpc error: code = NotFound desc = could not find container \"be156d5e811f7697d4e71003f07b9d8d4c96fa8233e4a54ed77aa2e726dfc22a\": container with ID starting with be156d5e811f7697d4e71003f07b9d8d4c96fa8233e4a54ed77aa2e726dfc22a not found: ID does not exist" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.990548 4594 scope.go:117] "RemoveContainer" containerID="81e4b892f5e5238485ce5a6f37cf2e0156f1c7ca179226284cd2afd8995e8b0a" Nov 29 05:45:32 crc kubenswrapper[4594]: E1129 05:45:32.990937 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81e4b892f5e5238485ce5a6f37cf2e0156f1c7ca179226284cd2afd8995e8b0a\": container 
with ID starting with 81e4b892f5e5238485ce5a6f37cf2e0156f1c7ca179226284cd2afd8995e8b0a not found: ID does not exist" containerID="81e4b892f5e5238485ce5a6f37cf2e0156f1c7ca179226284cd2afd8995e8b0a" Nov 29 05:45:32 crc kubenswrapper[4594]: I1129 05:45:32.990954 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81e4b892f5e5238485ce5a6f37cf2e0156f1c7ca179226284cd2afd8995e8b0a"} err="failed to get container status \"81e4b892f5e5238485ce5a6f37cf2e0156f1c7ca179226284cd2afd8995e8b0a\": rpc error: code = NotFound desc = could not find container \"81e4b892f5e5238485ce5a6f37cf2e0156f1c7ca179226284cd2afd8995e8b0a\": container with ID starting with 81e4b892f5e5238485ce5a6f37cf2e0156f1c7ca179226284cd2afd8995e8b0a not found: ID does not exist" Nov 29 05:45:33 crc kubenswrapper[4594]: I1129 05:45:33.144743 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 29 05:45:33 crc kubenswrapper[4594]: I1129 05:45:33.235076 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-66865fcd76-jchd8"] Nov 29 05:45:33 crc kubenswrapper[4594]: I1129 05:45:33.281336 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-66865fcd76-jchd8"] Nov 29 05:45:33 crc kubenswrapper[4594]: W1129 05:45:33.300880 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbfd076bd_1374_4b36_b5b2_c42f207016fd.slice/crio-bca4d4700aa8114c3ccb2e550fccdea3a39fe5f4c475d60986f8dbf1bd03d11d WatchSource:0}: Error finding container bca4d4700aa8114c3ccb2e550fccdea3a39fe5f4c475d60986f8dbf1bd03d11d: Status 404 returned error can't find the container with id bca4d4700aa8114c3ccb2e550fccdea3a39fe5f4c475d60986f8dbf1bd03d11d Nov 29 05:45:33 crc kubenswrapper[4594]: I1129 05:45:33.301451 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Nov 29 05:45:33 crc 
kubenswrapper[4594]: I1129 05:45:33.452403 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75958fc765-4tc8j"] Nov 29 05:45:33 crc kubenswrapper[4594]: W1129 05:45:33.486645 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1cf50aee_5c49_4d8b_9fd9_8b7afd376f41.slice/crio-5cefd7ae7bc42d9e45e7828df0cf6c9feea056c97e2f31c8268febef3b7503fd WatchSource:0}: Error finding container 5cefd7ae7bc42d9e45e7828df0cf6c9feea056c97e2f31c8268febef3b7503fd: Status 404 returned error can't find the container with id 5cefd7ae7bc42d9e45e7828df0cf6c9feea056c97e2f31c8268febef3b7503fd Nov 29 05:45:33 crc kubenswrapper[4594]: I1129 05:45:33.890000 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6cf4cc7b4f-hg8dz" Nov 29 05:45:33 crc kubenswrapper[4594]: I1129 05:45:33.912329 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4495d878-ec89-4a13-8adb-a64316eb2e68","Type":"ContainerStarted","Data":"a4d3f97ffd84f2678df3d61a8e24a321e36011636bb5c1bd856ce0a244bf79e6"} Nov 29 05:45:33 crc kubenswrapper[4594]: I1129 05:45:33.915518 4594 generic.go:334] "Generic (PLEG): container finished" podID="1cf50aee-5c49-4d8b-9fd9-8b7afd376f41" containerID="170ac117cdf78269bcb9a2c3c6edbba842e165a1dd5f2c25b3de78a089bb7197" exitCode=0 Nov 29 05:45:33 crc kubenswrapper[4594]: I1129 05:45:33.915680 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75958fc765-4tc8j" event={"ID":"1cf50aee-5c49-4d8b-9fd9-8b7afd376f41","Type":"ContainerDied","Data":"170ac117cdf78269bcb9a2c3c6edbba842e165a1dd5f2c25b3de78a089bb7197"} Nov 29 05:45:33 crc kubenswrapper[4594]: I1129 05:45:33.915805 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75958fc765-4tc8j" 
event={"ID":"1cf50aee-5c49-4d8b-9fd9-8b7afd376f41","Type":"ContainerStarted","Data":"5cefd7ae7bc42d9e45e7828df0cf6c9feea056c97e2f31c8268febef3b7503fd"} Nov 29 05:45:33 crc kubenswrapper[4594]: I1129 05:45:33.917967 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"bfd076bd-1374-4b36-b5b2-c42f207016fd","Type":"ContainerStarted","Data":"bca4d4700aa8114c3ccb2e550fccdea3a39fe5f4c475d60986f8dbf1bd03d11d"} Nov 29 05:45:33 crc kubenswrapper[4594]: I1129 05:45:33.930646 4594 generic.go:334] "Generic (PLEG): container finished" podID="dfe84a63-fea5-455e-95d3-523a091b976f" containerID="d16c7dfef88944e5affe7fb57fc6f7a7ff275273002c78196b3f3b7c77545b96" exitCode=0 Nov 29 05:45:33 crc kubenswrapper[4594]: I1129 05:45:33.930757 4594 generic.go:334] "Generic (PLEG): container finished" podID="dfe84a63-fea5-455e-95d3-523a091b976f" containerID="82a7be62d336acb1c166e710c849a936fe0c5c2dc0533148ba8167ea8bbc75b0" exitCode=2 Nov 29 05:45:33 crc kubenswrapper[4594]: I1129 05:45:33.930823 4594 generic.go:334] "Generic (PLEG): container finished" podID="dfe84a63-fea5-455e-95d3-523a091b976f" containerID="8b910ba656191f153188fed85a2fe961876dddee8993298e48bca1621bd41c04" exitCode=0 Nov 29 05:45:33 crc kubenswrapper[4594]: I1129 05:45:33.930927 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dfe84a63-fea5-455e-95d3-523a091b976f","Type":"ContainerDied","Data":"d16c7dfef88944e5affe7fb57fc6f7a7ff275273002c78196b3f3b7c77545b96"} Nov 29 05:45:33 crc kubenswrapper[4594]: I1129 05:45:33.931002 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dfe84a63-fea5-455e-95d3-523a091b976f","Type":"ContainerDied","Data":"82a7be62d336acb1c166e710c849a936fe0c5c2dc0533148ba8167ea8bbc75b0"} Nov 29 05:45:33 crc kubenswrapper[4594]: I1129 05:45:33.931063 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"dfe84a63-fea5-455e-95d3-523a091b976f","Type":"ContainerDied","Data":"8b910ba656191f153188fed85a2fe961876dddee8993298e48bca1621bd41c04"} Nov 29 05:45:33 crc kubenswrapper[4594]: I1129 05:45:33.940054 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"4082da7a-cc96-4b67-a101-48600e49712b","Type":"ContainerStarted","Data":"b6648e17696dea55081e9bd5066e8088b3a4df594deb6ffd3631b0eb8ab06324"} Nov 29 05:45:33 crc kubenswrapper[4594]: I1129 05:45:33.940492 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Nov 29 05:45:33 crc kubenswrapper[4594]: I1129 05:45:33.945589 4594 generic.go:334] "Generic (PLEG): container finished" podID="9ac3735b-d8a9-4723-8576-dc16c7af5756" containerID="c779ec66fec66188c65a57a98fb7502d621a31fb4a19980fd4945d5ba4c967e1" exitCode=137 Nov 29 05:45:33 crc kubenswrapper[4594]: I1129 05:45:33.945623 4594 generic.go:334] "Generic (PLEG): container finished" podID="9ac3735b-d8a9-4723-8576-dc16c7af5756" containerID="cdd6b473e7211578bb12623c83ba72829a586f190d64dd6207737e5e807240b7" exitCode=137 Nov 29 05:45:33 crc kubenswrapper[4594]: I1129 05:45:33.945652 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6cf4cc7b4f-hg8dz" event={"ID":"9ac3735b-d8a9-4723-8576-dc16c7af5756","Type":"ContainerDied","Data":"c779ec66fec66188c65a57a98fb7502d621a31fb4a19980fd4945d5ba4c967e1"} Nov 29 05:45:33 crc kubenswrapper[4594]: I1129 05:45:33.945686 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6cf4cc7b4f-hg8dz" event={"ID":"9ac3735b-d8a9-4723-8576-dc16c7af5756","Type":"ContainerDied","Data":"cdd6b473e7211578bb12623c83ba72829a586f190d64dd6207737e5e807240b7"} Nov 29 05:45:33 crc kubenswrapper[4594]: I1129 05:45:33.945696 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6cf4cc7b4f-hg8dz" 
event={"ID":"9ac3735b-d8a9-4723-8576-dc16c7af5756","Type":"ContainerDied","Data":"bbb875a88884e4d67cb7992573aef55f60d72dbd03fcd968d00f19a3d3da277e"} Nov 29 05:45:33 crc kubenswrapper[4594]: I1129 05:45:33.945716 4594 scope.go:117] "RemoveContainer" containerID="c779ec66fec66188c65a57a98fb7502d621a31fb4a19980fd4945d5ba4c967e1" Nov 29 05:45:33 crc kubenswrapper[4594]: I1129 05:45:33.945866 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6cf4cc7b4f-hg8dz" Nov 29 05:45:34 crc kubenswrapper[4594]: I1129 05:45:34.014308 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9ac3735b-d8a9-4723-8576-dc16c7af5756-horizon-secret-key\") pod \"9ac3735b-d8a9-4723-8576-dc16c7af5756\" (UID: \"9ac3735b-d8a9-4723-8576-dc16c7af5756\") " Nov 29 05:45:34 crc kubenswrapper[4594]: I1129 05:45:34.014764 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ac3735b-d8a9-4723-8576-dc16c7af5756-logs\") pod \"9ac3735b-d8a9-4723-8576-dc16c7af5756\" (UID: \"9ac3735b-d8a9-4723-8576-dc16c7af5756\") " Nov 29 05:45:34 crc kubenswrapper[4594]: I1129 05:45:34.015454 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ac3735b-d8a9-4723-8576-dc16c7af5756-logs" (OuterVolumeSpecName: "logs") pod "9ac3735b-d8a9-4723-8576-dc16c7af5756" (UID: "9ac3735b-d8a9-4723-8576-dc16c7af5756"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 05:45:34 crc kubenswrapper[4594]: I1129 05:45:34.015567 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9ac3735b-d8a9-4723-8576-dc16c7af5756-config-data\") pod \"9ac3735b-d8a9-4723-8576-dc16c7af5756\" (UID: \"9ac3735b-d8a9-4723-8576-dc16c7af5756\") " Nov 29 05:45:34 crc kubenswrapper[4594]: I1129 05:45:34.015598 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9ac3735b-d8a9-4723-8576-dc16c7af5756-scripts\") pod \"9ac3735b-d8a9-4723-8576-dc16c7af5756\" (UID: \"9ac3735b-d8a9-4723-8576-dc16c7af5756\") " Nov 29 05:45:34 crc kubenswrapper[4594]: I1129 05:45:34.016069 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6fvwr\" (UniqueName: \"kubernetes.io/projected/9ac3735b-d8a9-4723-8576-dc16c7af5756-kube-api-access-6fvwr\") pod \"9ac3735b-d8a9-4723-8576-dc16c7af5756\" (UID: \"9ac3735b-d8a9-4723-8576-dc16c7af5756\") " Nov 29 05:45:34 crc kubenswrapper[4594]: I1129 05:45:34.017616 4594 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ac3735b-d8a9-4723-8576-dc16c7af5756-logs\") on node \"crc\" DevicePath \"\"" Nov 29 05:45:34 crc kubenswrapper[4594]: I1129 05:45:34.018780 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ac3735b-d8a9-4723-8576-dc16c7af5756-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "9ac3735b-d8a9-4723-8576-dc16c7af5756" (UID: "9ac3735b-d8a9-4723-8576-dc16c7af5756"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:45:34 crc kubenswrapper[4594]: I1129 05:45:34.021913 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ac3735b-d8a9-4723-8576-dc16c7af5756-kube-api-access-6fvwr" (OuterVolumeSpecName: "kube-api-access-6fvwr") pod "9ac3735b-d8a9-4723-8576-dc16c7af5756" (UID: "9ac3735b-d8a9-4723-8576-dc16c7af5756"). InnerVolumeSpecName "kube-api-access-6fvwr". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:45:34 crc kubenswrapper[4594]: I1129 05:45:34.041230 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ac3735b-d8a9-4723-8576-dc16c7af5756-config-data" (OuterVolumeSpecName: "config-data") pod "9ac3735b-d8a9-4723-8576-dc16c7af5756" (UID: "9ac3735b-d8a9-4723-8576-dc16c7af5756"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:45:34 crc kubenswrapper[4594]: I1129 05:45:34.050995 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ac3735b-d8a9-4723-8576-dc16c7af5756-scripts" (OuterVolumeSpecName: "scripts") pod "9ac3735b-d8a9-4723-8576-dc16c7af5756" (UID: "9ac3735b-d8a9-4723-8576-dc16c7af5756"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:45:34 crc kubenswrapper[4594]: I1129 05:45:34.064662 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Nov 29 05:45:34 crc kubenswrapper[4594]: I1129 05:45:34.064717 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-api-0" Nov 29 05:45:34 crc kubenswrapper[4594]: I1129 05:45:34.100220 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="860e92d3-234a-46aa-b6e1-ea0ffbb8a4be" path="/var/lib/kubelet/pods/860e92d3-234a-46aa-b6e1-ea0ffbb8a4be/volumes" Nov 29 05:45:34 crc kubenswrapper[4594]: I1129 05:45:34.100962 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfcba285-f64f-471e-a2eb-4b38b2a35b3c" path="/var/lib/kubelet/pods/dfcba285-f64f-471e-a2eb-4b38b2a35b3c/volumes" Nov 29 05:45:34 crc kubenswrapper[4594]: I1129 05:45:34.101577 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f39fdd2d-1f5e-440f-996c-08ae2e749d3e" path="/var/lib/kubelet/pods/f39fdd2d-1f5e-440f-996c-08ae2e749d3e/volumes" Nov 29 05:45:34 crc kubenswrapper[4594]: I1129 05:45:34.121026 4594 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9ac3735b-d8a9-4723-8576-dc16c7af5756-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 05:45:34 crc kubenswrapper[4594]: I1129 05:45:34.121074 4594 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9ac3735b-d8a9-4723-8576-dc16c7af5756-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 05:45:34 crc kubenswrapper[4594]: I1129 05:45:34.121085 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6fvwr\" (UniqueName: \"kubernetes.io/projected/9ac3735b-d8a9-4723-8576-dc16c7af5756-kube-api-access-6fvwr\") on node \"crc\" DevicePath \"\"" Nov 29 05:45:34 crc kubenswrapper[4594]: I1129 05:45:34.121098 
4594 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9ac3735b-d8a9-4723-8576-dc16c7af5756-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Nov 29 05:45:34 crc kubenswrapper[4594]: I1129 05:45:34.128666 4594 scope.go:117] "RemoveContainer" containerID="cdd6b473e7211578bb12623c83ba72829a586f190d64dd6207737e5e807240b7" Nov 29 05:45:34 crc kubenswrapper[4594]: I1129 05:45:34.239990 4594 scope.go:117] "RemoveContainer" containerID="c779ec66fec66188c65a57a98fb7502d621a31fb4a19980fd4945d5ba4c967e1" Nov 29 05:45:34 crc kubenswrapper[4594]: E1129 05:45:34.240545 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c779ec66fec66188c65a57a98fb7502d621a31fb4a19980fd4945d5ba4c967e1\": container with ID starting with c779ec66fec66188c65a57a98fb7502d621a31fb4a19980fd4945d5ba4c967e1 not found: ID does not exist" containerID="c779ec66fec66188c65a57a98fb7502d621a31fb4a19980fd4945d5ba4c967e1" Nov 29 05:45:34 crc kubenswrapper[4594]: I1129 05:45:34.240580 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c779ec66fec66188c65a57a98fb7502d621a31fb4a19980fd4945d5ba4c967e1"} err="failed to get container status \"c779ec66fec66188c65a57a98fb7502d621a31fb4a19980fd4945d5ba4c967e1\": rpc error: code = NotFound desc = could not find container \"c779ec66fec66188c65a57a98fb7502d621a31fb4a19980fd4945d5ba4c967e1\": container with ID starting with c779ec66fec66188c65a57a98fb7502d621a31fb4a19980fd4945d5ba4c967e1 not found: ID does not exist" Nov 29 05:45:34 crc kubenswrapper[4594]: I1129 05:45:34.240603 4594 scope.go:117] "RemoveContainer" containerID="cdd6b473e7211578bb12623c83ba72829a586f190d64dd6207737e5e807240b7" Nov 29 05:45:34 crc kubenswrapper[4594]: E1129 05:45:34.240990 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"cdd6b473e7211578bb12623c83ba72829a586f190d64dd6207737e5e807240b7\": container with ID starting with cdd6b473e7211578bb12623c83ba72829a586f190d64dd6207737e5e807240b7 not found: ID does not exist" containerID="cdd6b473e7211578bb12623c83ba72829a586f190d64dd6207737e5e807240b7" Nov 29 05:45:34 crc kubenswrapper[4594]: I1129 05:45:34.241011 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdd6b473e7211578bb12623c83ba72829a586f190d64dd6207737e5e807240b7"} err="failed to get container status \"cdd6b473e7211578bb12623c83ba72829a586f190d64dd6207737e5e807240b7\": rpc error: code = NotFound desc = could not find container \"cdd6b473e7211578bb12623c83ba72829a586f190d64dd6207737e5e807240b7\": container with ID starting with cdd6b473e7211578bb12623c83ba72829a586f190d64dd6207737e5e807240b7 not found: ID does not exist" Nov 29 05:45:34 crc kubenswrapper[4594]: I1129 05:45:34.241027 4594 scope.go:117] "RemoveContainer" containerID="c779ec66fec66188c65a57a98fb7502d621a31fb4a19980fd4945d5ba4c967e1" Nov 29 05:45:34 crc kubenswrapper[4594]: I1129 05:45:34.241401 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c779ec66fec66188c65a57a98fb7502d621a31fb4a19980fd4945d5ba4c967e1"} err="failed to get container status \"c779ec66fec66188c65a57a98fb7502d621a31fb4a19980fd4945d5ba4c967e1\": rpc error: code = NotFound desc = could not find container \"c779ec66fec66188c65a57a98fb7502d621a31fb4a19980fd4945d5ba4c967e1\": container with ID starting with c779ec66fec66188c65a57a98fb7502d621a31fb4a19980fd4945d5ba4c967e1 not found: ID does not exist" Nov 29 05:45:34 crc kubenswrapper[4594]: I1129 05:45:34.241447 4594 scope.go:117] "RemoveContainer" containerID="cdd6b473e7211578bb12623c83ba72829a586f190d64dd6207737e5e807240b7" Nov 29 05:45:34 crc kubenswrapper[4594]: I1129 05:45:34.242710 4594 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"cdd6b473e7211578bb12623c83ba72829a586f190d64dd6207737e5e807240b7"} err="failed to get container status \"cdd6b473e7211578bb12623c83ba72829a586f190d64dd6207737e5e807240b7\": rpc error: code = NotFound desc = could not find container \"cdd6b473e7211578bb12623c83ba72829a586f190d64dd6207737e5e807240b7\": container with ID starting with cdd6b473e7211578bb12623c83ba72829a586f190d64dd6207737e5e807240b7 not found: ID does not exist" Nov 29 05:45:34 crc kubenswrapper[4594]: I1129 05:45:34.261731 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=11.261712393 podStartE2EDuration="11.261712393s" podCreationTimestamp="2025-11-29 05:45:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:45:33.973099888 +0000 UTC m=+1058.213609108" watchObservedRunningTime="2025-11-29 05:45:34.261712393 +0000 UTC m=+1058.502221614" Nov 29 05:45:34 crc kubenswrapper[4594]: I1129 05:45:34.269722 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6cf4cc7b4f-hg8dz"] Nov 29 05:45:34 crc kubenswrapper[4594]: I1129 05:45:34.276096 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6cf4cc7b4f-hg8dz"] Nov 29 05:45:34 crc kubenswrapper[4594]: I1129 05:45:34.325668 4594 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/watcher-decision-engine-0" Nov 29 05:45:34 crc kubenswrapper[4594]: I1129 05:45:34.326322 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Nov 29 05:45:34 crc kubenswrapper[4594]: I1129 05:45:34.327819 4594 scope.go:117] "RemoveContainer" containerID="ce07c65467f1a09735ac28b33a41432403765fe428167c2bcd92e7ee2bfe40ad" Nov 29 05:45:34 crc kubenswrapper[4594]: I1129 05:45:34.382593 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/cinder-api-0"] Nov 29 05:45:35 crc kubenswrapper[4594]: I1129 05:45:35.033800 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75958fc765-4tc8j" event={"ID":"1cf50aee-5c49-4d8b-9fd9-8b7afd376f41","Type":"ContainerStarted","Data":"a3e0c234e95de47b62615205f877e5484af6665111fbf38df64e2dc1f3b3e095"} Nov 29 05:45:35 crc kubenswrapper[4594]: I1129 05:45:35.034730 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-75958fc765-4tc8j" Nov 29 05:45:35 crc kubenswrapper[4594]: I1129 05:45:35.059355 4594 generic.go:334] "Generic (PLEG): container finished" podID="e786ace8-0d46-488a-941c-2325002c5edc" containerID="49db1a73a6aa6d53e47df61ed8cc87575dc878c77038b38da031e4b677a8a131" exitCode=137 Nov 29 05:45:35 crc kubenswrapper[4594]: I1129 05:45:35.059384 4594 generic.go:334] "Generic (PLEG): container finished" podID="e786ace8-0d46-488a-941c-2325002c5edc" containerID="5ebe1b1187197d5f07df1ccb9f78604659450fccdeb6a5cc9fda337a0746d245" exitCode=137 Nov 29 05:45:35 crc kubenswrapper[4594]: I1129 05:45:35.059435 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-658d8f8767-q5w5m" event={"ID":"e786ace8-0d46-488a-941c-2325002c5edc","Type":"ContainerDied","Data":"49db1a73a6aa6d53e47df61ed8cc87575dc878c77038b38da031e4b677a8a131"} Nov 29 05:45:35 crc kubenswrapper[4594]: I1129 05:45:35.059459 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-658d8f8767-q5w5m" event={"ID":"e786ace8-0d46-488a-941c-2325002c5edc","Type":"ContainerDied","Data":"5ebe1b1187197d5f07df1ccb9f78604659450fccdeb6a5cc9fda337a0746d245"} Nov 29 05:45:35 crc kubenswrapper[4594]: I1129 05:45:35.076390 4594 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/watcher-api-0" podUID="4082da7a-cc96-4b67-a101-48600e49712b" containerName="watcher-api-log" probeResult="failure" output="Get \"https://10.217.0.179:9322/\": net/http: request canceled (Client.Timeout exceeded 
while awaiting headers)" Nov 29 05:45:35 crc kubenswrapper[4594]: I1129 05:45:35.084017 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-75958fc765-4tc8j" podStartSLOduration=4.083994971 podStartE2EDuration="4.083994971s" podCreationTimestamp="2025-11-29 05:45:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:45:35.068390905 +0000 UTC m=+1059.308900125" watchObservedRunningTime="2025-11-29 05:45:35.083994971 +0000 UTC m=+1059.324504191" Nov 29 05:45:35 crc kubenswrapper[4594]: I1129 05:45:35.091588 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"3d60aa94-3720-43b5-a5de-ca30dd1b63b2","Type":"ContainerStarted","Data":"327ef444e09f45cf2a239393e3df4bf5d931562ffdda0035f22957079e46a5a5"} Nov 29 05:45:35 crc kubenswrapper[4594]: I1129 05:45:35.102513 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"bfd076bd-1374-4b36-b5b2-c42f207016fd","Type":"ContainerStarted","Data":"f8d28ad80586995020796c297aa08b4673d30ac4c40bcad5281144b1c41459af"} Nov 29 05:45:35 crc kubenswrapper[4594]: I1129 05:45:35.105720 4594 generic.go:334] "Generic (PLEG): container finished" podID="9c4f4aa8-f25f-4684-8707-3bc0eb168954" containerID="8d5eb8d12b40086236a654309921e326c5ea36c0cb093bef3e74044402fe4a45" exitCode=137 Nov 29 05:45:35 crc kubenswrapper[4594]: I1129 05:45:35.105758 4594 generic.go:334] "Generic (PLEG): container finished" podID="9c4f4aa8-f25f-4684-8707-3bc0eb168954" containerID="c63f69a44e3afa1d1ac33136814d6b792d5e9273aa32f43e16c6771170da44e1" exitCode=137 Nov 29 05:45:35 crc kubenswrapper[4594]: I1129 05:45:35.105798 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7bf6f9df7f-knsk7" 
event={"ID":"9c4f4aa8-f25f-4684-8707-3bc0eb168954","Type":"ContainerDied","Data":"8d5eb8d12b40086236a654309921e326c5ea36c0cb093bef3e74044402fe4a45"} Nov 29 05:45:35 crc kubenswrapper[4594]: I1129 05:45:35.105817 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7bf6f9df7f-knsk7" event={"ID":"9c4f4aa8-f25f-4684-8707-3bc0eb168954","Type":"ContainerDied","Data":"c63f69a44e3afa1d1ac33136814d6b792d5e9273aa32f43e16c6771170da44e1"} Nov 29 05:45:35 crc kubenswrapper[4594]: I1129 05:45:35.208396 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-658d8f8767-q5w5m" Nov 29 05:45:35 crc kubenswrapper[4594]: I1129 05:45:35.216272 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7bf6f9df7f-knsk7" Nov 29 05:45:35 crc kubenswrapper[4594]: I1129 05:45:35.355743 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e786ace8-0d46-488a-941c-2325002c5edc-horizon-secret-key\") pod \"e786ace8-0d46-488a-941c-2325002c5edc\" (UID: \"e786ace8-0d46-488a-941c-2325002c5edc\") " Nov 29 05:45:35 crc kubenswrapper[4594]: I1129 05:45:35.355784 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9c4f4aa8-f25f-4684-8707-3bc0eb168954-config-data\") pod \"9c4f4aa8-f25f-4684-8707-3bc0eb168954\" (UID: \"9c4f4aa8-f25f-4684-8707-3bc0eb168954\") " Nov 29 05:45:35 crc kubenswrapper[4594]: I1129 05:45:35.355869 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e786ace8-0d46-488a-941c-2325002c5edc-scripts\") pod \"e786ace8-0d46-488a-941c-2325002c5edc\" (UID: \"e786ace8-0d46-488a-941c-2325002c5edc\") " Nov 29 05:45:35 crc kubenswrapper[4594]: I1129 05:45:35.355898 4594 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-6nbzs\" (UniqueName: \"kubernetes.io/projected/9c4f4aa8-f25f-4684-8707-3bc0eb168954-kube-api-access-6nbzs\") pod \"9c4f4aa8-f25f-4684-8707-3bc0eb168954\" (UID: \"9c4f4aa8-f25f-4684-8707-3bc0eb168954\") " Nov 29 05:45:35 crc kubenswrapper[4594]: I1129 05:45:35.356066 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9c4f4aa8-f25f-4684-8707-3bc0eb168954-scripts\") pod \"9c4f4aa8-f25f-4684-8707-3bc0eb168954\" (UID: \"9c4f4aa8-f25f-4684-8707-3bc0eb168954\") " Nov 29 05:45:35 crc kubenswrapper[4594]: I1129 05:45:35.356136 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e786ace8-0d46-488a-941c-2325002c5edc-logs\") pod \"e786ace8-0d46-488a-941c-2325002c5edc\" (UID: \"e786ace8-0d46-488a-941c-2325002c5edc\") " Nov 29 05:45:35 crc kubenswrapper[4594]: I1129 05:45:35.356164 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e786ace8-0d46-488a-941c-2325002c5edc-config-data\") pod \"e786ace8-0d46-488a-941c-2325002c5edc\" (UID: \"e786ace8-0d46-488a-941c-2325002c5edc\") " Nov 29 05:45:35 crc kubenswrapper[4594]: I1129 05:45:35.356193 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9c4f4aa8-f25f-4684-8707-3bc0eb168954-horizon-secret-key\") pod \"9c4f4aa8-f25f-4684-8707-3bc0eb168954\" (UID: \"9c4f4aa8-f25f-4684-8707-3bc0eb168954\") " Nov 29 05:45:35 crc kubenswrapper[4594]: I1129 05:45:35.356216 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2ztlj\" (UniqueName: \"kubernetes.io/projected/e786ace8-0d46-488a-941c-2325002c5edc-kube-api-access-2ztlj\") pod \"e786ace8-0d46-488a-941c-2325002c5edc\" (UID: 
\"e786ace8-0d46-488a-941c-2325002c5edc\") " Nov 29 05:45:35 crc kubenswrapper[4594]: I1129 05:45:35.356291 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9c4f4aa8-f25f-4684-8707-3bc0eb168954-logs\") pod \"9c4f4aa8-f25f-4684-8707-3bc0eb168954\" (UID: \"9c4f4aa8-f25f-4684-8707-3bc0eb168954\") " Nov 29 05:45:35 crc kubenswrapper[4594]: I1129 05:45:35.356599 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e786ace8-0d46-488a-941c-2325002c5edc-logs" (OuterVolumeSpecName: "logs") pod "e786ace8-0d46-488a-941c-2325002c5edc" (UID: "e786ace8-0d46-488a-941c-2325002c5edc"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 05:45:35 crc kubenswrapper[4594]: I1129 05:45:35.357059 4594 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e786ace8-0d46-488a-941c-2325002c5edc-logs\") on node \"crc\" DevicePath \"\"" Nov 29 05:45:35 crc kubenswrapper[4594]: I1129 05:45:35.357400 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c4f4aa8-f25f-4684-8707-3bc0eb168954-logs" (OuterVolumeSpecName: "logs") pod "9c4f4aa8-f25f-4684-8707-3bc0eb168954" (UID: "9c4f4aa8-f25f-4684-8707-3bc0eb168954"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 05:45:35 crc kubenswrapper[4594]: I1129 05:45:35.359863 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e786ace8-0d46-488a-941c-2325002c5edc-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "e786ace8-0d46-488a-941c-2325002c5edc" (UID: "e786ace8-0d46-488a-941c-2325002c5edc"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:45:35 crc kubenswrapper[4594]: I1129 05:45:35.362419 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c4f4aa8-f25f-4684-8707-3bc0eb168954-kube-api-access-6nbzs" (OuterVolumeSpecName: "kube-api-access-6nbzs") pod "9c4f4aa8-f25f-4684-8707-3bc0eb168954" (UID: "9c4f4aa8-f25f-4684-8707-3bc0eb168954"). InnerVolumeSpecName "kube-api-access-6nbzs". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:45:35 crc kubenswrapper[4594]: I1129 05:45:35.365051 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c4f4aa8-f25f-4684-8707-3bc0eb168954-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "9c4f4aa8-f25f-4684-8707-3bc0eb168954" (UID: "9c4f4aa8-f25f-4684-8707-3bc0eb168954"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:45:35 crc kubenswrapper[4594]: I1129 05:45:35.369379 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e786ace8-0d46-488a-941c-2325002c5edc-kube-api-access-2ztlj" (OuterVolumeSpecName: "kube-api-access-2ztlj") pod "e786ace8-0d46-488a-941c-2325002c5edc" (UID: "e786ace8-0d46-488a-941c-2325002c5edc"). InnerVolumeSpecName "kube-api-access-2ztlj". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:45:35 crc kubenswrapper[4594]: I1129 05:45:35.452558 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c4f4aa8-f25f-4684-8707-3bc0eb168954-scripts" (OuterVolumeSpecName: "scripts") pod "9c4f4aa8-f25f-4684-8707-3bc0eb168954" (UID: "9c4f4aa8-f25f-4684-8707-3bc0eb168954"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:45:35 crc kubenswrapper[4594]: I1129 05:45:35.453810 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e786ace8-0d46-488a-941c-2325002c5edc-scripts" (OuterVolumeSpecName: "scripts") pod "e786ace8-0d46-488a-941c-2325002c5edc" (UID: "e786ace8-0d46-488a-941c-2325002c5edc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:45:35 crc kubenswrapper[4594]: I1129 05:45:35.458042 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e786ace8-0d46-488a-941c-2325002c5edc-config-data" (OuterVolumeSpecName: "config-data") pod "e786ace8-0d46-488a-941c-2325002c5edc" (UID: "e786ace8-0d46-488a-941c-2325002c5edc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:45:35 crc kubenswrapper[4594]: I1129 05:45:35.458860 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c4f4aa8-f25f-4684-8707-3bc0eb168954-config-data" (OuterVolumeSpecName: "config-data") pod "9c4f4aa8-f25f-4684-8707-3bc0eb168954" (UID: "9c4f4aa8-f25f-4684-8707-3bc0eb168954"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:45:35 crc kubenswrapper[4594]: I1129 05:45:35.460428 4594 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9c4f4aa8-f25f-4684-8707-3bc0eb168954-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 05:45:35 crc kubenswrapper[4594]: I1129 05:45:35.460454 4594 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e786ace8-0d46-488a-941c-2325002c5edc-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 05:45:35 crc kubenswrapper[4594]: I1129 05:45:35.460468 4594 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9c4f4aa8-f25f-4684-8707-3bc0eb168954-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Nov 29 05:45:35 crc kubenswrapper[4594]: I1129 05:45:35.460476 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2ztlj\" (UniqueName: \"kubernetes.io/projected/e786ace8-0d46-488a-941c-2325002c5edc-kube-api-access-2ztlj\") on node \"crc\" DevicePath \"\"" Nov 29 05:45:35 crc kubenswrapper[4594]: I1129 05:45:35.460484 4594 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9c4f4aa8-f25f-4684-8707-3bc0eb168954-logs\") on node \"crc\" DevicePath \"\"" Nov 29 05:45:35 crc kubenswrapper[4594]: I1129 05:45:35.460492 4594 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e786ace8-0d46-488a-941c-2325002c5edc-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Nov 29 05:45:35 crc kubenswrapper[4594]: I1129 05:45:35.460499 4594 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9c4f4aa8-f25f-4684-8707-3bc0eb168954-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 05:45:35 crc kubenswrapper[4594]: I1129 05:45:35.460506 4594 
reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e786ace8-0d46-488a-941c-2325002c5edc-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 05:45:35 crc kubenswrapper[4594]: I1129 05:45:35.460514 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6nbzs\" (UniqueName: \"kubernetes.io/projected/9c4f4aa8-f25f-4684-8707-3bc0eb168954-kube-api-access-6nbzs\") on node \"crc\" DevicePath \"\"" Nov 29 05:45:36 crc kubenswrapper[4594]: I1129 05:45:36.098520 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ac3735b-d8a9-4723-8576-dc16c7af5756" path="/var/lib/kubelet/pods/9ac3735b-d8a9-4723-8576-dc16c7af5756/volumes" Nov 29 05:45:36 crc kubenswrapper[4594]: I1129 05:45:36.137840 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4495d878-ec89-4a13-8adb-a64316eb2e68","Type":"ContainerStarted","Data":"78ae5d813f6db587222194c97f2c4f194f98ff9c5031d0f6957d0bbe0e588ec4"} Nov 29 05:45:36 crc kubenswrapper[4594]: I1129 05:45:36.137892 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4495d878-ec89-4a13-8adb-a64316eb2e68","Type":"ContainerStarted","Data":"492da200059560eb48ad284f38e58c77cac031809d5ddc128c26a8551bff3b32"} Nov 29 05:45:36 crc kubenswrapper[4594]: I1129 05:45:36.143139 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-658d8f8767-q5w5m" event={"ID":"e786ace8-0d46-488a-941c-2325002c5edc","Type":"ContainerDied","Data":"9051a9b51962ea17a7205fa4a01dffed74a141ef5047be3a558c4852eb32d254"} Nov 29 05:45:36 crc kubenswrapper[4594]: I1129 05:45:36.143195 4594 scope.go:117] "RemoveContainer" containerID="49db1a73a6aa6d53e47df61ed8cc87575dc878c77038b38da031e4b677a8a131" Nov 29 05:45:36 crc kubenswrapper[4594]: I1129 05:45:36.143388 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-658d8f8767-q5w5m" Nov 29 05:45:36 crc kubenswrapper[4594]: I1129 05:45:36.147096 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"bfd076bd-1374-4b36-b5b2-c42f207016fd","Type":"ContainerStarted","Data":"a689963aed8564e1401a62eba89551a40db462c2990651b0a003fbde80133ada"} Nov 29 05:45:36 crc kubenswrapper[4594]: I1129 05:45:36.147127 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="bfd076bd-1374-4b36-b5b2-c42f207016fd" containerName="cinder-api-log" containerID="cri-o://f8d28ad80586995020796c297aa08b4673d30ac4c40bcad5281144b1c41459af" gracePeriod=30 Nov 29 05:45:36 crc kubenswrapper[4594]: I1129 05:45:36.147203 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Nov 29 05:45:36 crc kubenswrapper[4594]: I1129 05:45:36.147220 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="bfd076bd-1374-4b36-b5b2-c42f207016fd" containerName="cinder-api" containerID="cri-o://a689963aed8564e1401a62eba89551a40db462c2990651b0a003fbde80133ada" gracePeriod=30 Nov 29 05:45:36 crc kubenswrapper[4594]: I1129 05:45:36.151205 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7bf6f9df7f-knsk7" Nov 29 05:45:36 crc kubenswrapper[4594]: I1129 05:45:36.151754 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7bf6f9df7f-knsk7" event={"ID":"9c4f4aa8-f25f-4684-8707-3bc0eb168954","Type":"ContainerDied","Data":"2a056b1493fe63bc74b73500cafb49fbcff476a094a21907d9cf42bfbd83e8a2"} Nov 29 05:45:36 crc kubenswrapper[4594]: I1129 05:45:36.152058 4594 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 29 05:45:36 crc kubenswrapper[4594]: I1129 05:45:36.170150 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.026641425 podStartE2EDuration="5.170125946s" podCreationTimestamp="2025-11-29 05:45:31 +0000 UTC" firstStartedPulling="2025-11-29 05:45:33.149483432 +0000 UTC m=+1057.389992651" lastFinishedPulling="2025-11-29 05:45:34.292967952 +0000 UTC m=+1058.533477172" observedRunningTime="2025-11-29 05:45:36.157788216 +0000 UTC m=+1060.398297437" watchObservedRunningTime="2025-11-29 05:45:36.170125946 +0000 UTC m=+1060.410635166" Nov 29 05:45:36 crc kubenswrapper[4594]: I1129 05:45:36.172545 4594 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7fb8849f48-9rr52" podUID="e8d01e45-1d76-464f-93e6-965d65a055fc" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.160:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.160:8443: connect: connection refused" Nov 29 05:45:36 crc kubenswrapper[4594]: I1129 05:45:36.185450 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-658d8f8767-q5w5m"] Nov 29 05:45:36 crc kubenswrapper[4594]: I1129 05:45:36.190312 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-658d8f8767-q5w5m"] Nov 29 05:45:36 crc kubenswrapper[4594]: I1129 05:45:36.190804 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/cinder-api-0" podStartSLOduration=4.190787906 podStartE2EDuration="4.190787906s" podCreationTimestamp="2025-11-29 05:45:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:45:36.185185526 +0000 UTC m=+1060.425694736" watchObservedRunningTime="2025-11-29 05:45:36.190787906 +0000 UTC m=+1060.431297126" Nov 29 05:45:36 crc kubenswrapper[4594]: I1129 05:45:36.205844 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7bf6f9df7f-knsk7"] Nov 29 05:45:36 crc kubenswrapper[4594]: I1129 05:45:36.211409 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7bf6f9df7f-knsk7"] Nov 29 05:45:36 crc kubenswrapper[4594]: I1129 05:45:36.316490 4594 scope.go:117] "RemoveContainer" containerID="5ebe1b1187197d5f07df1ccb9f78604659450fccdeb6a5cc9fda337a0746d245" Nov 29 05:45:36 crc kubenswrapper[4594]: I1129 05:45:36.437044 4594 scope.go:117] "RemoveContainer" containerID="8d5eb8d12b40086236a654309921e326c5ea36c0cb093bef3e74044402fe4a45" Nov 29 05:45:36 crc kubenswrapper[4594]: I1129 05:45:36.562338 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Nov 29 05:45:36 crc kubenswrapper[4594]: I1129 05:45:36.659578 4594 scope.go:117] "RemoveContainer" containerID="c63f69a44e3afa1d1ac33136814d6b792d5e9273aa32f43e16c6771170da44e1" Nov 29 05:45:36 crc kubenswrapper[4594]: I1129 05:45:36.807463 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Nov 29 05:45:36 crc kubenswrapper[4594]: I1129 05:45:36.902825 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfd076bd-1374-4b36-b5b2-c42f207016fd-scripts\") pod \"bfd076bd-1374-4b36-b5b2-c42f207016fd\" (UID: \"bfd076bd-1374-4b36-b5b2-c42f207016fd\") " Nov 29 05:45:36 crc kubenswrapper[4594]: I1129 05:45:36.902898 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h7bsk\" (UniqueName: \"kubernetes.io/projected/bfd076bd-1374-4b36-b5b2-c42f207016fd-kube-api-access-h7bsk\") pod \"bfd076bd-1374-4b36-b5b2-c42f207016fd\" (UID: \"bfd076bd-1374-4b36-b5b2-c42f207016fd\") " Nov 29 05:45:36 crc kubenswrapper[4594]: I1129 05:45:36.903040 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bfd076bd-1374-4b36-b5b2-c42f207016fd-config-data-custom\") pod \"bfd076bd-1374-4b36-b5b2-c42f207016fd\" (UID: \"bfd076bd-1374-4b36-b5b2-c42f207016fd\") " Nov 29 05:45:36 crc kubenswrapper[4594]: I1129 05:45:36.903274 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bfd076bd-1374-4b36-b5b2-c42f207016fd-etc-machine-id\") pod \"bfd076bd-1374-4b36-b5b2-c42f207016fd\" (UID: \"bfd076bd-1374-4b36-b5b2-c42f207016fd\") " Nov 29 05:45:36 crc kubenswrapper[4594]: I1129 05:45:36.903298 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfd076bd-1374-4b36-b5b2-c42f207016fd-config-data\") pod \"bfd076bd-1374-4b36-b5b2-c42f207016fd\" (UID: \"bfd076bd-1374-4b36-b5b2-c42f207016fd\") " Nov 29 05:45:36 crc kubenswrapper[4594]: I1129 05:45:36.903337 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/bfd076bd-1374-4b36-b5b2-c42f207016fd-logs\") pod \"bfd076bd-1374-4b36-b5b2-c42f207016fd\" (UID: \"bfd076bd-1374-4b36-b5b2-c42f207016fd\") " Nov 29 05:45:36 crc kubenswrapper[4594]: I1129 05:45:36.903364 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfd076bd-1374-4b36-b5b2-c42f207016fd-combined-ca-bundle\") pod \"bfd076bd-1374-4b36-b5b2-c42f207016fd\" (UID: \"bfd076bd-1374-4b36-b5b2-c42f207016fd\") " Nov 29 05:45:36 crc kubenswrapper[4594]: I1129 05:45:36.903641 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bfd076bd-1374-4b36-b5b2-c42f207016fd-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "bfd076bd-1374-4b36-b5b2-c42f207016fd" (UID: "bfd076bd-1374-4b36-b5b2-c42f207016fd"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 05:45:36 crc kubenswrapper[4594]: I1129 05:45:36.904086 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bfd076bd-1374-4b36-b5b2-c42f207016fd-logs" (OuterVolumeSpecName: "logs") pod "bfd076bd-1374-4b36-b5b2-c42f207016fd" (UID: "bfd076bd-1374-4b36-b5b2-c42f207016fd"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 05:45:36 crc kubenswrapper[4594]: I1129 05:45:36.904178 4594 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bfd076bd-1374-4b36-b5b2-c42f207016fd-etc-machine-id\") on node \"crc\" DevicePath \"\"" Nov 29 05:45:36 crc kubenswrapper[4594]: I1129 05:45:36.915521 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfd076bd-1374-4b36-b5b2-c42f207016fd-kube-api-access-h7bsk" (OuterVolumeSpecName: "kube-api-access-h7bsk") pod "bfd076bd-1374-4b36-b5b2-c42f207016fd" (UID: "bfd076bd-1374-4b36-b5b2-c42f207016fd"). InnerVolumeSpecName "kube-api-access-h7bsk". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:45:36 crc kubenswrapper[4594]: I1129 05:45:36.925181 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfd076bd-1374-4b36-b5b2-c42f207016fd-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "bfd076bd-1374-4b36-b5b2-c42f207016fd" (UID: "bfd076bd-1374-4b36-b5b2-c42f207016fd"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:45:36 crc kubenswrapper[4594]: I1129 05:45:36.936368 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfd076bd-1374-4b36-b5b2-c42f207016fd-scripts" (OuterVolumeSpecName: "scripts") pod "bfd076bd-1374-4b36-b5b2-c42f207016fd" (UID: "bfd076bd-1374-4b36-b5b2-c42f207016fd"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:45:36 crc kubenswrapper[4594]: I1129 05:45:36.946625 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfd076bd-1374-4b36-b5b2-c42f207016fd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bfd076bd-1374-4b36-b5b2-c42f207016fd" (UID: "bfd076bd-1374-4b36-b5b2-c42f207016fd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:45:36 crc kubenswrapper[4594]: I1129 05:45:36.969328 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfd076bd-1374-4b36-b5b2-c42f207016fd-config-data" (OuterVolumeSpecName: "config-data") pod "bfd076bd-1374-4b36-b5b2-c42f207016fd" (UID: "bfd076bd-1374-4b36-b5b2-c42f207016fd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.006622 4594 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bfd076bd-1374-4b36-b5b2-c42f207016fd-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.006647 4594 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfd076bd-1374-4b36-b5b2-c42f207016fd-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.006656 4594 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bfd076bd-1374-4b36-b5b2-c42f207016fd-logs\") on node \"crc\" DevicePath \"\"" Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.006666 4594 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfd076bd-1374-4b36-b5b2-c42f207016fd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 05:45:37 
crc kubenswrapper[4594]: I1129 05:45:37.006674 4594 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfd076bd-1374-4b36-b5b2-c42f207016fd-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.006681 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h7bsk\" (UniqueName: \"kubernetes.io/projected/bfd076bd-1374-4b36-b5b2-c42f207016fd-kube-api-access-h7bsk\") on node \"crc\" DevicePath \"\"" Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.168938 4594 generic.go:334] "Generic (PLEG): container finished" podID="dfe84a63-fea5-455e-95d3-523a091b976f" containerID="fc49a19c1a1fc2d60ca494b34631f397817d2181fa37036731550878d86b8601" exitCode=0 Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.169013 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dfe84a63-fea5-455e-95d3-523a091b976f","Type":"ContainerDied","Data":"fc49a19c1a1fc2d60ca494b34631f397817d2181fa37036731550878d86b8601"} Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.174818 4594 generic.go:334] "Generic (PLEG): container finished" podID="bfd076bd-1374-4b36-b5b2-c42f207016fd" containerID="a689963aed8564e1401a62eba89551a40db462c2990651b0a003fbde80133ada" exitCode=0 Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.174852 4594 generic.go:334] "Generic (PLEG): container finished" podID="bfd076bd-1374-4b36-b5b2-c42f207016fd" containerID="f8d28ad80586995020796c297aa08b4673d30ac4c40bcad5281144b1c41459af" exitCode=143 Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.175848 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.175981 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"bfd076bd-1374-4b36-b5b2-c42f207016fd","Type":"ContainerDied","Data":"a689963aed8564e1401a62eba89551a40db462c2990651b0a003fbde80133ada"} Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.176017 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"bfd076bd-1374-4b36-b5b2-c42f207016fd","Type":"ContainerDied","Data":"f8d28ad80586995020796c297aa08b4673d30ac4c40bcad5281144b1c41459af"} Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.176029 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"bfd076bd-1374-4b36-b5b2-c42f207016fd","Type":"ContainerDied","Data":"bca4d4700aa8114c3ccb2e550fccdea3a39fe5f4c475d60986f8dbf1bd03d11d"} Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.176046 4594 scope.go:117] "RemoveContainer" containerID="a689963aed8564e1401a62eba89551a40db462c2990651b0a003fbde80133ada" Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.202853 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.204813 4594 scope.go:117] "RemoveContainer" containerID="f8d28ad80586995020796c297aa08b4673d30ac4c40bcad5281144b1c41459af" Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.211766 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dfe84a63-fea5-455e-95d3-523a091b976f-run-httpd\") pod \"dfe84a63-fea5-455e-95d3-523a091b976f\" (UID: \"dfe84a63-fea5-455e-95d3-523a091b976f\") " Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.211916 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dfe84a63-fea5-455e-95d3-523a091b976f-log-httpd\") pod \"dfe84a63-fea5-455e-95d3-523a091b976f\" (UID: \"dfe84a63-fea5-455e-95d3-523a091b976f\") " Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.211957 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfe84a63-fea5-455e-95d3-523a091b976f-config-data\") pod \"dfe84a63-fea5-455e-95d3-523a091b976f\" (UID: \"dfe84a63-fea5-455e-95d3-523a091b976f\") " Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.212013 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fnlgg\" (UniqueName: \"kubernetes.io/projected/dfe84a63-fea5-455e-95d3-523a091b976f-kube-api-access-fnlgg\") pod \"dfe84a63-fea5-455e-95d3-523a091b976f\" (UID: \"dfe84a63-fea5-455e-95d3-523a091b976f\") " Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.212104 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dfe84a63-fea5-455e-95d3-523a091b976f-sg-core-conf-yaml\") pod \"dfe84a63-fea5-455e-95d3-523a091b976f\" (UID: \"dfe84a63-fea5-455e-95d3-523a091b976f\") " Nov 29 
05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.212354 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dfe84a63-fea5-455e-95d3-523a091b976f-scripts\") pod \"dfe84a63-fea5-455e-95d3-523a091b976f\" (UID: \"dfe84a63-fea5-455e-95d3-523a091b976f\") " Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.212424 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfe84a63-fea5-455e-95d3-523a091b976f-combined-ca-bundle\") pod \"dfe84a63-fea5-455e-95d3-523a091b976f\" (UID: \"dfe84a63-fea5-455e-95d3-523a091b976f\") " Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.212874 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dfe84a63-fea5-455e-95d3-523a091b976f-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "dfe84a63-fea5-455e-95d3-523a091b976f" (UID: "dfe84a63-fea5-455e-95d3-523a091b976f"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.213487 4594 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dfe84a63-fea5-455e-95d3-523a091b976f-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.216455 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.216863 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dfe84a63-fea5-455e-95d3-523a091b976f-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "dfe84a63-fea5-455e-95d3-523a091b976f" (UID: "dfe84a63-fea5-455e-95d3-523a091b976f"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.220627 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfe84a63-fea5-455e-95d3-523a091b976f-scripts" (OuterVolumeSpecName: "scripts") pod "dfe84a63-fea5-455e-95d3-523a091b976f" (UID: "dfe84a63-fea5-455e-95d3-523a091b976f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.223795 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.226424 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfe84a63-fea5-455e-95d3-523a091b976f-kube-api-access-fnlgg" (OuterVolumeSpecName: "kube-api-access-fnlgg") pod "dfe84a63-fea5-455e-95d3-523a091b976f" (UID: "dfe84a63-fea5-455e-95d3-523a091b976f"). InnerVolumeSpecName "kube-api-access-fnlgg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.247049 4594 scope.go:117] "RemoveContainer" containerID="a689963aed8564e1401a62eba89551a40db462c2990651b0a003fbde80133ada" Nov 29 05:45:37 crc kubenswrapper[4594]: E1129 05:45:37.252055 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a689963aed8564e1401a62eba89551a40db462c2990651b0a003fbde80133ada\": container with ID starting with a689963aed8564e1401a62eba89551a40db462c2990651b0a003fbde80133ada not found: ID does not exist" containerID="a689963aed8564e1401a62eba89551a40db462c2990651b0a003fbde80133ada" Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.252088 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a689963aed8564e1401a62eba89551a40db462c2990651b0a003fbde80133ada"} err="failed to get container status \"a689963aed8564e1401a62eba89551a40db462c2990651b0a003fbde80133ada\": rpc error: code = NotFound desc = could not find container \"a689963aed8564e1401a62eba89551a40db462c2990651b0a003fbde80133ada\": container with ID starting with a689963aed8564e1401a62eba89551a40db462c2990651b0a003fbde80133ada not found: ID does not exist" Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.252115 4594 scope.go:117] "RemoveContainer" containerID="f8d28ad80586995020796c297aa08b4673d30ac4c40bcad5281144b1c41459af" Nov 29 05:45:37 crc kubenswrapper[4594]: E1129 05:45:37.252456 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8d28ad80586995020796c297aa08b4673d30ac4c40bcad5281144b1c41459af\": container with ID starting with f8d28ad80586995020796c297aa08b4673d30ac4c40bcad5281144b1c41459af not found: ID does not exist" containerID="f8d28ad80586995020796c297aa08b4673d30ac4c40bcad5281144b1c41459af" Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.252499 
4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8d28ad80586995020796c297aa08b4673d30ac4c40bcad5281144b1c41459af"} err="failed to get container status \"f8d28ad80586995020796c297aa08b4673d30ac4c40bcad5281144b1c41459af\": rpc error: code = NotFound desc = could not find container \"f8d28ad80586995020796c297aa08b4673d30ac4c40bcad5281144b1c41459af\": container with ID starting with f8d28ad80586995020796c297aa08b4673d30ac4c40bcad5281144b1c41459af not found: ID does not exist" Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.252531 4594 scope.go:117] "RemoveContainer" containerID="a689963aed8564e1401a62eba89551a40db462c2990651b0a003fbde80133ada" Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.252797 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a689963aed8564e1401a62eba89551a40db462c2990651b0a003fbde80133ada"} err="failed to get container status \"a689963aed8564e1401a62eba89551a40db462c2990651b0a003fbde80133ada\": rpc error: code = NotFound desc = could not find container \"a689963aed8564e1401a62eba89551a40db462c2990651b0a003fbde80133ada\": container with ID starting with a689963aed8564e1401a62eba89551a40db462c2990651b0a003fbde80133ada not found: ID does not exist" Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.252819 4594 scope.go:117] "RemoveContainer" containerID="f8d28ad80586995020796c297aa08b4673d30ac4c40bcad5281144b1c41459af" Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.253048 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8d28ad80586995020796c297aa08b4673d30ac4c40bcad5281144b1c41459af"} err="failed to get container status \"f8d28ad80586995020796c297aa08b4673d30ac4c40bcad5281144b1c41459af\": rpc error: code = NotFound desc = could not find container \"f8d28ad80586995020796c297aa08b4673d30ac4c40bcad5281144b1c41459af\": container with ID starting with 
f8d28ad80586995020796c297aa08b4673d30ac4c40bcad5281144b1c41459af not found: ID does not exist" Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.257404 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfe84a63-fea5-455e-95d3-523a091b976f-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "dfe84a63-fea5-455e-95d3-523a091b976f" (UID: "dfe84a63-fea5-455e-95d3-523a091b976f"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.294242 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Nov 29 05:45:37 crc kubenswrapper[4594]: E1129 05:45:37.294744 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfd076bd-1374-4b36-b5b2-c42f207016fd" containerName="cinder-api-log" Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.294764 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfd076bd-1374-4b36-b5b2-c42f207016fd" containerName="cinder-api-log" Nov 29 05:45:37 crc kubenswrapper[4594]: E1129 05:45:37.294776 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e786ace8-0d46-488a-941c-2325002c5edc" containerName="horizon" Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.294782 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="e786ace8-0d46-488a-941c-2325002c5edc" containerName="horizon" Nov 29 05:45:37 crc kubenswrapper[4594]: E1129 05:45:37.294792 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfe84a63-fea5-455e-95d3-523a091b976f" containerName="sg-core" Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.294798 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfe84a63-fea5-455e-95d3-523a091b976f" containerName="sg-core" Nov 29 05:45:37 crc kubenswrapper[4594]: E1129 05:45:37.294817 4594 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="bfd076bd-1374-4b36-b5b2-c42f207016fd" containerName="cinder-api" Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.294822 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfd076bd-1374-4b36-b5b2-c42f207016fd" containerName="cinder-api" Nov 29 05:45:37 crc kubenswrapper[4594]: E1129 05:45:37.294836 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfe84a63-fea5-455e-95d3-523a091b976f" containerName="ceilometer-central-agent" Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.294841 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfe84a63-fea5-455e-95d3-523a091b976f" containerName="ceilometer-central-agent" Nov 29 05:45:37 crc kubenswrapper[4594]: E1129 05:45:37.294850 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfe84a63-fea5-455e-95d3-523a091b976f" containerName="proxy-httpd" Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.294856 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfe84a63-fea5-455e-95d3-523a091b976f" containerName="proxy-httpd" Nov 29 05:45:37 crc kubenswrapper[4594]: E1129 05:45:37.294866 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfcba285-f64f-471e-a2eb-4b38b2a35b3c" containerName="barbican-api" Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.294873 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfcba285-f64f-471e-a2eb-4b38b2a35b3c" containerName="barbican-api" Nov 29 05:45:37 crc kubenswrapper[4594]: E1129 05:45:37.294890 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ac3735b-d8a9-4723-8576-dc16c7af5756" containerName="horizon" Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.294895 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ac3735b-d8a9-4723-8576-dc16c7af5756" containerName="horizon" Nov 29 05:45:37 crc kubenswrapper[4594]: E1129 05:45:37.294911 4594 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="9ac3735b-d8a9-4723-8576-dc16c7af5756" containerName="horizon-log" Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.294916 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ac3735b-d8a9-4723-8576-dc16c7af5756" containerName="horizon-log" Nov 29 05:45:37 crc kubenswrapper[4594]: E1129 05:45:37.294929 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e786ace8-0d46-488a-941c-2325002c5edc" containerName="horizon-log" Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.294935 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="e786ace8-0d46-488a-941c-2325002c5edc" containerName="horizon-log" Nov 29 05:45:37 crc kubenswrapper[4594]: E1129 05:45:37.294946 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c4f4aa8-f25f-4684-8707-3bc0eb168954" containerName="horizon-log" Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.294951 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c4f4aa8-f25f-4684-8707-3bc0eb168954" containerName="horizon-log" Nov 29 05:45:37 crc kubenswrapper[4594]: E1129 05:45:37.294958 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfe84a63-fea5-455e-95d3-523a091b976f" containerName="ceilometer-notification-agent" Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.294964 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfe84a63-fea5-455e-95d3-523a091b976f" containerName="ceilometer-notification-agent" Nov 29 05:45:37 crc kubenswrapper[4594]: E1129 05:45:37.294976 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfcba285-f64f-471e-a2eb-4b38b2a35b3c" containerName="barbican-api-log" Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.294981 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfcba285-f64f-471e-a2eb-4b38b2a35b3c" containerName="barbican-api-log" Nov 29 05:45:37 crc kubenswrapper[4594]: E1129 05:45:37.294989 4594 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="9c4f4aa8-f25f-4684-8707-3bc0eb168954" containerName="horizon" Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.294995 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c4f4aa8-f25f-4684-8707-3bc0eb168954" containerName="horizon" Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.295177 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfe84a63-fea5-455e-95d3-523a091b976f" containerName="ceilometer-central-agent" Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.295185 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfcba285-f64f-471e-a2eb-4b38b2a35b3c" containerName="barbican-api-log" Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.295195 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="e786ace8-0d46-488a-941c-2325002c5edc" containerName="horizon-log" Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.295204 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfd076bd-1374-4b36-b5b2-c42f207016fd" containerName="cinder-api-log" Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.295216 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="e786ace8-0d46-488a-941c-2325002c5edc" containerName="horizon" Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.295224 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfe84a63-fea5-455e-95d3-523a091b976f" containerName="proxy-httpd" Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.295232 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfe84a63-fea5-455e-95d3-523a091b976f" containerName="ceilometer-notification-agent" Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.295244 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfd076bd-1374-4b36-b5b2-c42f207016fd" containerName="cinder-api" Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.295268 4594 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="9c4f4aa8-f25f-4684-8707-3bc0eb168954" containerName="horizon-log" Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.295283 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfe84a63-fea5-455e-95d3-523a091b976f" containerName="sg-core" Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.295296 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ac3735b-d8a9-4723-8576-dc16c7af5756" containerName="horizon-log" Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.295303 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c4f4aa8-f25f-4684-8707-3bc0eb168954" containerName="horizon" Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.295308 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ac3735b-d8a9-4723-8576-dc16c7af5756" containerName="horizon" Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.295318 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfcba285-f64f-471e-a2eb-4b38b2a35b3c" containerName="barbican-api" Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.296382 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.299988 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.300520 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.300565 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.300523 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.313336 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfe84a63-fea5-455e-95d3-523a091b976f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dfe84a63-fea5-455e-95d3-523a091b976f" (UID: "dfe84a63-fea5-455e-95d3-523a091b976f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.318573 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d704d9f4-1a8a-4cc8-af37-371bcc9b254b-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d704d9f4-1a8a-4cc8-af37-371bcc9b254b\") " pod="openstack/cinder-api-0" Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.320666 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d704d9f4-1a8a-4cc8-af37-371bcc9b254b-logs\") pod \"cinder-api-0\" (UID: \"d704d9f4-1a8a-4cc8-af37-371bcc9b254b\") " pod="openstack/cinder-api-0" Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.320734 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d704d9f4-1a8a-4cc8-af37-371bcc9b254b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d704d9f4-1a8a-4cc8-af37-371bcc9b254b\") " pod="openstack/cinder-api-0" Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.320807 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d704d9f4-1a8a-4cc8-af37-371bcc9b254b-config-data\") pod \"cinder-api-0\" (UID: \"d704d9f4-1a8a-4cc8-af37-371bcc9b254b\") " pod="openstack/cinder-api-0" Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.321013 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d704d9f4-1a8a-4cc8-af37-371bcc9b254b-public-tls-certs\") pod \"cinder-api-0\" (UID: \"d704d9f4-1a8a-4cc8-af37-371bcc9b254b\") " pod="openstack/cinder-api-0" Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.321050 4594 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5jwj\" (UniqueName: \"kubernetes.io/projected/d704d9f4-1a8a-4cc8-af37-371bcc9b254b-kube-api-access-f5jwj\") pod \"cinder-api-0\" (UID: \"d704d9f4-1a8a-4cc8-af37-371bcc9b254b\") " pod="openstack/cinder-api-0" Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.321222 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d704d9f4-1a8a-4cc8-af37-371bcc9b254b-config-data-custom\") pod \"cinder-api-0\" (UID: \"d704d9f4-1a8a-4cc8-af37-371bcc9b254b\") " pod="openstack/cinder-api-0" Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.321351 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d704d9f4-1a8a-4cc8-af37-371bcc9b254b-scripts\") pod \"cinder-api-0\" (UID: \"d704d9f4-1a8a-4cc8-af37-371bcc9b254b\") " pod="openstack/cinder-api-0" Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.321401 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d704d9f4-1a8a-4cc8-af37-371bcc9b254b-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"d704d9f4-1a8a-4cc8-af37-371bcc9b254b\") " pod="openstack/cinder-api-0" Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.321642 4594 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfe84a63-fea5-455e-95d3-523a091b976f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.321655 4594 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dfe84a63-fea5-455e-95d3-523a091b976f-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 29 05:45:37 crc 
kubenswrapper[4594]: I1129 05:45:37.321665 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fnlgg\" (UniqueName: \"kubernetes.io/projected/dfe84a63-fea5-455e-95d3-523a091b976f-kube-api-access-fnlgg\") on node \"crc\" DevicePath \"\"" Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.321679 4594 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dfe84a63-fea5-455e-95d3-523a091b976f-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.321691 4594 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dfe84a63-fea5-455e-95d3-523a091b976f-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.335392 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.362807 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfe84a63-fea5-455e-95d3-523a091b976f-config-data" (OuterVolumeSpecName: "config-data") pod "dfe84a63-fea5-455e-95d3-523a091b976f" (UID: "dfe84a63-fea5-455e-95d3-523a091b976f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.423768 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d704d9f4-1a8a-4cc8-af37-371bcc9b254b-public-tls-certs\") pod \"cinder-api-0\" (UID: \"d704d9f4-1a8a-4cc8-af37-371bcc9b254b\") " pod="openstack/cinder-api-0" Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.423832 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5jwj\" (UniqueName: \"kubernetes.io/projected/d704d9f4-1a8a-4cc8-af37-371bcc9b254b-kube-api-access-f5jwj\") pod \"cinder-api-0\" (UID: \"d704d9f4-1a8a-4cc8-af37-371bcc9b254b\") " pod="openstack/cinder-api-0" Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.423943 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d704d9f4-1a8a-4cc8-af37-371bcc9b254b-config-data-custom\") pod \"cinder-api-0\" (UID: \"d704d9f4-1a8a-4cc8-af37-371bcc9b254b\") " pod="openstack/cinder-api-0" Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.424077 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d704d9f4-1a8a-4cc8-af37-371bcc9b254b-scripts\") pod \"cinder-api-0\" (UID: \"d704d9f4-1a8a-4cc8-af37-371bcc9b254b\") " pod="openstack/cinder-api-0" Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.424131 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d704d9f4-1a8a-4cc8-af37-371bcc9b254b-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"d704d9f4-1a8a-4cc8-af37-371bcc9b254b\") " pod="openstack/cinder-api-0" Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.424177 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d704d9f4-1a8a-4cc8-af37-371bcc9b254b-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d704d9f4-1a8a-4cc8-af37-371bcc9b254b\") " pod="openstack/cinder-api-0"
Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.424220 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d704d9f4-1a8a-4cc8-af37-371bcc9b254b-logs\") pod \"cinder-api-0\" (UID: \"d704d9f4-1a8a-4cc8-af37-371bcc9b254b\") " pod="openstack/cinder-api-0"
Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.424299 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d704d9f4-1a8a-4cc8-af37-371bcc9b254b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d704d9f4-1a8a-4cc8-af37-371bcc9b254b\") " pod="openstack/cinder-api-0"
Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.424339 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d704d9f4-1a8a-4cc8-af37-371bcc9b254b-config-data\") pod \"cinder-api-0\" (UID: \"d704d9f4-1a8a-4cc8-af37-371bcc9b254b\") " pod="openstack/cinder-api-0"
Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.424370 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d704d9f4-1a8a-4cc8-af37-371bcc9b254b-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d704d9f4-1a8a-4cc8-af37-371bcc9b254b\") " pod="openstack/cinder-api-0"
Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.424409 4594 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfe84a63-fea5-455e-95d3-523a091b976f-config-data\") on node \"crc\" DevicePath \"\""
Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.424719 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d704d9f4-1a8a-4cc8-af37-371bcc9b254b-logs\") pod \"cinder-api-0\" (UID: \"d704d9f4-1a8a-4cc8-af37-371bcc9b254b\") " pod="openstack/cinder-api-0"
Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.428041 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d704d9f4-1a8a-4cc8-af37-371bcc9b254b-config-data-custom\") pod \"cinder-api-0\" (UID: \"d704d9f4-1a8a-4cc8-af37-371bcc9b254b\") " pod="openstack/cinder-api-0"
Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.428117 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d704d9f4-1a8a-4cc8-af37-371bcc9b254b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d704d9f4-1a8a-4cc8-af37-371bcc9b254b\") " pod="openstack/cinder-api-0"
Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.429497 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d704d9f4-1a8a-4cc8-af37-371bcc9b254b-public-tls-certs\") pod \"cinder-api-0\" (UID: \"d704d9f4-1a8a-4cc8-af37-371bcc9b254b\") " pod="openstack/cinder-api-0"
Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.429562 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d704d9f4-1a8a-4cc8-af37-371bcc9b254b-scripts\") pod \"cinder-api-0\" (UID: \"d704d9f4-1a8a-4cc8-af37-371bcc9b254b\") " pod="openstack/cinder-api-0"
Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.431881 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d704d9f4-1a8a-4cc8-af37-371bcc9b254b-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"d704d9f4-1a8a-4cc8-af37-371bcc9b254b\") " pod="openstack/cinder-api-0"
Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.432174 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d704d9f4-1a8a-4cc8-af37-371bcc9b254b-config-data\") pod \"cinder-api-0\" (UID: \"d704d9f4-1a8a-4cc8-af37-371bcc9b254b\") " pod="openstack/cinder-api-0"
Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.439698 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5jwj\" (UniqueName: \"kubernetes.io/projected/d704d9f4-1a8a-4cc8-af37-371bcc9b254b-kube-api-access-f5jwj\") pod \"cinder-api-0\" (UID: \"d704d9f4-1a8a-4cc8-af37-371bcc9b254b\") " pod="openstack/cinder-api-0"
Nov 29 05:45:37 crc kubenswrapper[4594]: I1129 05:45:37.629433 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Nov 29 05:45:38 crc kubenswrapper[4594]: I1129 05:45:38.073122 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Nov 29 05:45:38 crc kubenswrapper[4594]: W1129 05:45:38.083838 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd704d9f4_1a8a_4cc8_af37_371bcc9b254b.slice/crio-98acc44e2a883fb29cf1bbc97c63aae2e4d90197881f10c9c7b8bdb28ee23ef9 WatchSource:0}: Error finding container 98acc44e2a883fb29cf1bbc97c63aae2e4d90197881f10c9c7b8bdb28ee23ef9: Status 404 returned error can't find the container with id 98acc44e2a883fb29cf1bbc97c63aae2e4d90197881f10c9c7b8bdb28ee23ef9
Nov 29 05:45:38 crc kubenswrapper[4594]: I1129 05:45:38.092405 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c4f4aa8-f25f-4684-8707-3bc0eb168954" path="/var/lib/kubelet/pods/9c4f4aa8-f25f-4684-8707-3bc0eb168954/volumes"
Nov 29 05:45:38 crc kubenswrapper[4594]: I1129 05:45:38.093207 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfd076bd-1374-4b36-b5b2-c42f207016fd" path="/var/lib/kubelet/pods/bfd076bd-1374-4b36-b5b2-c42f207016fd/volumes"
Nov 29 05:45:38 crc kubenswrapper[4594]: I1129 05:45:38.093937 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e786ace8-0d46-488a-941c-2325002c5edc" path="/var/lib/kubelet/pods/e786ace8-0d46-488a-941c-2325002c5edc/volumes"
Nov 29 05:45:38 crc kubenswrapper[4594]: I1129 05:45:38.202945 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d704d9f4-1a8a-4cc8-af37-371bcc9b254b","Type":"ContainerStarted","Data":"98acc44e2a883fb29cf1bbc97c63aae2e4d90197881f10c9c7b8bdb28ee23ef9"}
Nov 29 05:45:38 crc kubenswrapper[4594]: I1129 05:45:38.211572 4594 generic.go:334] "Generic (PLEG): container finished" podID="3d60aa94-3720-43b5-a5de-ca30dd1b63b2" containerID="327ef444e09f45cf2a239393e3df4bf5d931562ffdda0035f22957079e46a5a5" exitCode=1
Nov 29 05:45:38 crc kubenswrapper[4594]: I1129 05:45:38.211637 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"3d60aa94-3720-43b5-a5de-ca30dd1b63b2","Type":"ContainerDied","Data":"327ef444e09f45cf2a239393e3df4bf5d931562ffdda0035f22957079e46a5a5"}
Nov 29 05:45:38 crc kubenswrapper[4594]: I1129 05:45:38.211699 4594 scope.go:117] "RemoveContainer" containerID="ce07c65467f1a09735ac28b33a41432403765fe428167c2bcd92e7ee2bfe40ad"
Nov 29 05:45:38 crc kubenswrapper[4594]: I1129 05:45:38.212602 4594 scope.go:117] "RemoveContainer" containerID="327ef444e09f45cf2a239393e3df4bf5d931562ffdda0035f22957079e46a5a5"
Nov 29 05:45:38 crc kubenswrapper[4594]: E1129 05:45:38.213069 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(3d60aa94-3720-43b5-a5de-ca30dd1b63b2)\"" pod="openstack/watcher-decision-engine-0" podUID="3d60aa94-3720-43b5-a5de-ca30dd1b63b2"
Nov 29 05:45:38 crc kubenswrapper[4594]: I1129 05:45:38.227345 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Nov 29 05:45:38 crc kubenswrapper[4594]: I1129 05:45:38.227635 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dfe84a63-fea5-455e-95d3-523a091b976f","Type":"ContainerDied","Data":"9bc0dd9c240581d8a29dee5bc3e11733a20ba6a528c2347a613cb60b099e44e6"}
Nov 29 05:45:38 crc kubenswrapper[4594]: I1129 05:45:38.277940 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Nov 29 05:45:38 crc kubenswrapper[4594]: I1129 05:45:38.284546 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Nov 29 05:45:38 crc kubenswrapper[4594]: I1129 05:45:38.295899 4594 scope.go:117] "RemoveContainer" containerID="d16c7dfef88944e5affe7fb57fc6f7a7ff275273002c78196b3f3b7c77545b96"
Nov 29 05:45:38 crc kubenswrapper[4594]: I1129 05:45:38.300628 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Nov 29 05:45:38 crc kubenswrapper[4594]: I1129 05:45:38.303528 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Nov 29 05:45:38 crc kubenswrapper[4594]: I1129 05:45:38.305660 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Nov 29 05:45:38 crc kubenswrapper[4594]: I1129 05:45:38.305885 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Nov 29 05:45:38 crc kubenswrapper[4594]: I1129 05:45:38.309443 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Nov 29 05:45:38 crc kubenswrapper[4594]: I1129 05:45:38.344473 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ef00ff38-96ae-42ba-8c5f-05c73ec60467-log-httpd\") pod \"ceilometer-0\" (UID: \"ef00ff38-96ae-42ba-8c5f-05c73ec60467\") " pod="openstack/ceilometer-0"
Nov 29 05:45:38 crc kubenswrapper[4594]: I1129 05:45:38.344753 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ef00ff38-96ae-42ba-8c5f-05c73ec60467-run-httpd\") pod \"ceilometer-0\" (UID: \"ef00ff38-96ae-42ba-8c5f-05c73ec60467\") " pod="openstack/ceilometer-0"
Nov 29 05:45:38 crc kubenswrapper[4594]: I1129 05:45:38.344799 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef00ff38-96ae-42ba-8c5f-05c73ec60467-config-data\") pod \"ceilometer-0\" (UID: \"ef00ff38-96ae-42ba-8c5f-05c73ec60467\") " pod="openstack/ceilometer-0"
Nov 29 05:45:38 crc kubenswrapper[4594]: I1129 05:45:38.345034 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvhsq\" (UniqueName: \"kubernetes.io/projected/ef00ff38-96ae-42ba-8c5f-05c73ec60467-kube-api-access-rvhsq\") pod \"ceilometer-0\" (UID: \"ef00ff38-96ae-42ba-8c5f-05c73ec60467\") " pod="openstack/ceilometer-0"
Nov 29 05:45:38 crc kubenswrapper[4594]: I1129 05:45:38.345097 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef00ff38-96ae-42ba-8c5f-05c73ec60467-scripts\") pod \"ceilometer-0\" (UID: \"ef00ff38-96ae-42ba-8c5f-05c73ec60467\") " pod="openstack/ceilometer-0"
Nov 29 05:45:38 crc kubenswrapper[4594]: I1129 05:45:38.345139 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef00ff38-96ae-42ba-8c5f-05c73ec60467-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ef00ff38-96ae-42ba-8c5f-05c73ec60467\") " pod="openstack/ceilometer-0"
Nov 29 05:45:38 crc kubenswrapper[4594]: I1129 05:45:38.345354 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ef00ff38-96ae-42ba-8c5f-05c73ec60467-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ef00ff38-96ae-42ba-8c5f-05c73ec60467\") " pod="openstack/ceilometer-0"
Nov 29 05:45:38 crc kubenswrapper[4594]: I1129 05:45:38.353418 4594 scope.go:117] "RemoveContainer" containerID="82a7be62d336acb1c166e710c849a936fe0c5c2dc0533148ba8167ea8bbc75b0"
Nov 29 05:45:38 crc kubenswrapper[4594]: I1129 05:45:38.387083 4594 scope.go:117] "RemoveContainer" containerID="fc49a19c1a1fc2d60ca494b34631f397817d2181fa37036731550878d86b8601"
Nov 29 05:45:38 crc kubenswrapper[4594]: I1129 05:45:38.413414 4594 scope.go:117] "RemoveContainer" containerID="8b910ba656191f153188fed85a2fe961876dddee8993298e48bca1621bd41c04"
Nov 29 05:45:38 crc kubenswrapper[4594]: I1129 05:45:38.447822 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvhsq\" (UniqueName: \"kubernetes.io/projected/ef00ff38-96ae-42ba-8c5f-05c73ec60467-kube-api-access-rvhsq\") pod \"ceilometer-0\" (UID: \"ef00ff38-96ae-42ba-8c5f-05c73ec60467\") " pod="openstack/ceilometer-0"
Nov 29 05:45:38 crc kubenswrapper[4594]: I1129 05:45:38.447867 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef00ff38-96ae-42ba-8c5f-05c73ec60467-scripts\") pod \"ceilometer-0\" (UID: \"ef00ff38-96ae-42ba-8c5f-05c73ec60467\") " pod="openstack/ceilometer-0"
Nov 29 05:45:38 crc kubenswrapper[4594]: I1129 05:45:38.447891 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef00ff38-96ae-42ba-8c5f-05c73ec60467-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ef00ff38-96ae-42ba-8c5f-05c73ec60467\") " pod="openstack/ceilometer-0"
Nov 29 05:45:38 crc kubenswrapper[4594]: I1129 05:45:38.447940 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ef00ff38-96ae-42ba-8c5f-05c73ec60467-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ef00ff38-96ae-42ba-8c5f-05c73ec60467\") " pod="openstack/ceilometer-0"
Nov 29 05:45:38 crc kubenswrapper[4594]: I1129 05:45:38.447995 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ef00ff38-96ae-42ba-8c5f-05c73ec60467-log-httpd\") pod \"ceilometer-0\" (UID: \"ef00ff38-96ae-42ba-8c5f-05c73ec60467\") " pod="openstack/ceilometer-0"
Nov 29 05:45:38 crc kubenswrapper[4594]: I1129 05:45:38.448065 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ef00ff38-96ae-42ba-8c5f-05c73ec60467-run-httpd\") pod \"ceilometer-0\" (UID: \"ef00ff38-96ae-42ba-8c5f-05c73ec60467\") " pod="openstack/ceilometer-0"
Nov 29 05:45:38 crc kubenswrapper[4594]: I1129 05:45:38.448088 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef00ff38-96ae-42ba-8c5f-05c73ec60467-config-data\") pod \"ceilometer-0\" (UID: \"ef00ff38-96ae-42ba-8c5f-05c73ec60467\") " pod="openstack/ceilometer-0"
Nov 29 05:45:38 crc kubenswrapper[4594]: I1129 05:45:38.449050 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ef00ff38-96ae-42ba-8c5f-05c73ec60467-log-httpd\") pod \"ceilometer-0\" (UID: \"ef00ff38-96ae-42ba-8c5f-05c73ec60467\") " pod="openstack/ceilometer-0"
Nov 29 05:45:38 crc kubenswrapper[4594]: I1129 05:45:38.464993 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef00ff38-96ae-42ba-8c5f-05c73ec60467-scripts\") pod \"ceilometer-0\" (UID: \"ef00ff38-96ae-42ba-8c5f-05c73ec60467\") " pod="openstack/ceilometer-0"
Nov 29 05:45:38 crc kubenswrapper[4594]: I1129 05:45:38.465463 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ef00ff38-96ae-42ba-8c5f-05c73ec60467-run-httpd\") pod \"ceilometer-0\" (UID: \"ef00ff38-96ae-42ba-8c5f-05c73ec60467\") " pod="openstack/ceilometer-0"
Nov 29 05:45:38 crc kubenswrapper[4594]: I1129 05:45:38.469123 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef00ff38-96ae-42ba-8c5f-05c73ec60467-config-data\") pod \"ceilometer-0\" (UID: \"ef00ff38-96ae-42ba-8c5f-05c73ec60467\") " pod="openstack/ceilometer-0"
Nov 29 05:45:38 crc kubenswrapper[4594]: I1129 05:45:38.472946 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ef00ff38-96ae-42ba-8c5f-05c73ec60467-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ef00ff38-96ae-42ba-8c5f-05c73ec60467\") " pod="openstack/ceilometer-0"
Nov 29 05:45:38 crc kubenswrapper[4594]: I1129 05:45:38.477878 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef00ff38-96ae-42ba-8c5f-05c73ec60467-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ef00ff38-96ae-42ba-8c5f-05c73ec60467\") " pod="openstack/ceilometer-0"
Nov 29 05:45:38 crc kubenswrapper[4594]: I1129 05:45:38.512890 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvhsq\" (UniqueName: \"kubernetes.io/projected/ef00ff38-96ae-42ba-8c5f-05c73ec60467-kube-api-access-rvhsq\") pod \"ceilometer-0\" (UID: \"ef00ff38-96ae-42ba-8c5f-05c73ec60467\") " pod="openstack/ceilometer-0"
Nov 29 05:45:38 crc kubenswrapper[4594]: I1129 05:45:38.635900 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Nov 29 05:45:39 crc kubenswrapper[4594]: I1129 05:45:39.096711 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Nov 29 05:45:39 crc kubenswrapper[4594]: W1129 05:45:39.102347 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef00ff38_96ae_42ba_8c5f_05c73ec60467.slice/crio-cfffcbedf88fdaa485b1309f5b941f1aeb5e264d8fb7165246f8b0f80f423f1e WatchSource:0}: Error finding container cfffcbedf88fdaa485b1309f5b941f1aeb5e264d8fb7165246f8b0f80f423f1e: Status 404 returned error can't find the container with id cfffcbedf88fdaa485b1309f5b941f1aeb5e264d8fb7165246f8b0f80f423f1e
Nov 29 05:45:39 crc kubenswrapper[4594]: I1129 05:45:39.243185 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d704d9f4-1a8a-4cc8-af37-371bcc9b254b","Type":"ContainerStarted","Data":"a96331ac5231607017943f69e4749c25b7c07a0e7b06655fb6351d310612ee1d"}
Nov 29 05:45:39 crc kubenswrapper[4594]: I1129 05:45:39.248591 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ef00ff38-96ae-42ba-8c5f-05c73ec60467","Type":"ContainerStarted","Data":"cfffcbedf88fdaa485b1309f5b941f1aeb5e264d8fb7165246f8b0f80f423f1e"}
Nov 29 05:45:40 crc kubenswrapper[4594]: I1129 05:45:40.106199 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfe84a63-fea5-455e-95d3-523a091b976f" path="/var/lib/kubelet/pods/dfe84a63-fea5-455e-95d3-523a091b976f/volumes"
Nov 29 05:45:40 crc kubenswrapper[4594]: I1129 05:45:40.263766 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ef00ff38-96ae-42ba-8c5f-05c73ec60467","Type":"ContainerStarted","Data":"91ef0b975213ecd49a4cb8ca0772a78a05fca16694fee4cbd1819c22d2897038"}
Nov 29 05:45:40 crc kubenswrapper[4594]: I1129 05:45:40.268460 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d704d9f4-1a8a-4cc8-af37-371bcc9b254b","Type":"ContainerStarted","Data":"4ba545ce325392f1bd2c555eb3ab62abf012cb90571636dc149aea6e610f4cff"}
Nov 29 05:45:40 crc kubenswrapper[4594]: I1129 05:45:40.268671 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Nov 29 05:45:40 crc kubenswrapper[4594]: I1129 05:45:40.297283 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.297269627 podStartE2EDuration="3.297269627s" podCreationTimestamp="2025-11-29 05:45:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:45:40.282773557 +0000 UTC m=+1064.523282777" watchObservedRunningTime="2025-11-29 05:45:40.297269627 +0000 UTC m=+1064.537778848"
Nov 29 05:45:41 crc kubenswrapper[4594]: I1129 05:45:41.290877 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ef00ff38-96ae-42ba-8c5f-05c73ec60467","Type":"ContainerStarted","Data":"2a50d6f15b24a6202f9345ab29196faeb824eab5b07fca25dbb884c0c2ae9572"}
Nov 29 05:45:42 crc kubenswrapper[4594]: I1129 05:45:42.302542 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ef00ff38-96ae-42ba-8c5f-05c73ec60467","Type":"ContainerStarted","Data":"3922cd998fc7437f66c830a0c2b287e9cf0c2540e2ea914865cd3ec535ffb488"}
Nov 29 05:45:42 crc kubenswrapper[4594]: I1129 05:45:42.460219 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Nov 29 05:45:42 crc kubenswrapper[4594]: I1129 05:45:42.520444 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Nov 29 05:45:42 crc kubenswrapper[4594]: I1129 05:45:42.725509 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-75958fc765-4tc8j"
Nov 29 05:45:42 crc kubenswrapper[4594]: I1129 05:45:42.831639 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84c68846bf-7r6g7"]
Nov 29 05:45:42 crc kubenswrapper[4594]: I1129 05:45:42.831878 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-84c68846bf-7r6g7" podUID="f3699cc4-7ae6-459c-b7a6-a2e8ccb1490c" containerName="dnsmasq-dns" containerID="cri-o://7d06ce876e418b7180d721b44fd78e07afeaee58f9f0487f63e6e2be9db0060b" gracePeriod=10
Nov 29 05:45:43 crc kubenswrapper[4594]: I1129 05:45:43.320473 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ef00ff38-96ae-42ba-8c5f-05c73ec60467","Type":"ContainerStarted","Data":"6a42b35d40824084353b7d5e1e21a712c35179bbaf94ceba9ed648d6e2afc1f2"}
Nov 29 05:45:43 crc kubenswrapper[4594]: I1129 05:45:43.320986 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Nov 29 05:45:43 crc kubenswrapper[4594]: I1129 05:45:43.323226 4594 generic.go:334] "Generic (PLEG): container finished" podID="f3699cc4-7ae6-459c-b7a6-a2e8ccb1490c" containerID="7d06ce876e418b7180d721b44fd78e07afeaee58f9f0487f63e6e2be9db0060b" exitCode=0
Nov 29 05:45:43 crc kubenswrapper[4594]: I1129 05:45:43.323316 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84c68846bf-7r6g7" event={"ID":"f3699cc4-7ae6-459c-b7a6-a2e8ccb1490c","Type":"ContainerDied","Data":"7d06ce876e418b7180d721b44fd78e07afeaee58f9f0487f63e6e2be9db0060b"}
Nov 29 05:45:43 crc kubenswrapper[4594]: I1129 05:45:43.323857 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="4495d878-ec89-4a13-8adb-a64316eb2e68" containerName="cinder-scheduler" containerID="cri-o://492da200059560eb48ad284f38e58c77cac031809d5ddc128c26a8551bff3b32" gracePeriod=30
Nov 29 05:45:43 crc kubenswrapper[4594]: I1129 05:45:43.323935 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="4495d878-ec89-4a13-8adb-a64316eb2e68" containerName="probe" containerID="cri-o://78ae5d813f6db587222194c97f2c4f194f98ff9c5031d0f6957d0bbe0e588ec4" gracePeriod=30
Nov 29 05:45:43 crc kubenswrapper[4594]: I1129 05:45:43.351571 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.393004333 podStartE2EDuration="5.351554512s" podCreationTimestamp="2025-11-29 05:45:38 +0000 UTC" firstStartedPulling="2025-11-29 05:45:39.106368856 +0000 UTC m=+1063.346878067" lastFinishedPulling="2025-11-29 05:45:43.064919026 +0000 UTC m=+1067.305428246" observedRunningTime="2025-11-29 05:45:43.340599052 +0000 UTC m=+1067.581108272" watchObservedRunningTime="2025-11-29 05:45:43.351554512 +0000 UTC m=+1067.592063732"
Nov 29 05:45:43 crc kubenswrapper[4594]: I1129 05:45:43.757197 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84c68846bf-7r6g7"
Nov 29 05:45:43 crc kubenswrapper[4594]: I1129 05:45:43.890407 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f3699cc4-7ae6-459c-b7a6-a2e8ccb1490c-ovsdbserver-nb\") pod \"f3699cc4-7ae6-459c-b7a6-a2e8ccb1490c\" (UID: \"f3699cc4-7ae6-459c-b7a6-a2e8ccb1490c\") "
Nov 29 05:45:43 crc kubenswrapper[4594]: I1129 05:45:43.890566 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3699cc4-7ae6-459c-b7a6-a2e8ccb1490c-config\") pod \"f3699cc4-7ae6-459c-b7a6-a2e8ccb1490c\" (UID: \"f3699cc4-7ae6-459c-b7a6-a2e8ccb1490c\") "
Nov 29 05:45:43 crc kubenswrapper[4594]: I1129 05:45:43.890599 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f3699cc4-7ae6-459c-b7a6-a2e8ccb1490c-dns-svc\") pod \"f3699cc4-7ae6-459c-b7a6-a2e8ccb1490c\" (UID: \"f3699cc4-7ae6-459c-b7a6-a2e8ccb1490c\") "
Nov 29 05:45:43 crc kubenswrapper[4594]: I1129 05:45:43.890671 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ldrzc\" (UniqueName: \"kubernetes.io/projected/f3699cc4-7ae6-459c-b7a6-a2e8ccb1490c-kube-api-access-ldrzc\") pod \"f3699cc4-7ae6-459c-b7a6-a2e8ccb1490c\" (UID: \"f3699cc4-7ae6-459c-b7a6-a2e8ccb1490c\") "
Nov 29 05:45:43 crc kubenswrapper[4594]: I1129 05:45:43.890741 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f3699cc4-7ae6-459c-b7a6-a2e8ccb1490c-ovsdbserver-sb\") pod \"f3699cc4-7ae6-459c-b7a6-a2e8ccb1490c\" (UID: \"f3699cc4-7ae6-459c-b7a6-a2e8ccb1490c\") "
Nov 29 05:45:43 crc kubenswrapper[4594]: I1129 05:45:43.890870 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f3699cc4-7ae6-459c-b7a6-a2e8ccb1490c-dns-swift-storage-0\") pod \"f3699cc4-7ae6-459c-b7a6-a2e8ccb1490c\" (UID: \"f3699cc4-7ae6-459c-b7a6-a2e8ccb1490c\") "
Nov 29 05:45:43 crc kubenswrapper[4594]: I1129 05:45:43.895490 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3699cc4-7ae6-459c-b7a6-a2e8ccb1490c-kube-api-access-ldrzc" (OuterVolumeSpecName: "kube-api-access-ldrzc") pod "f3699cc4-7ae6-459c-b7a6-a2e8ccb1490c" (UID: "f3699cc4-7ae6-459c-b7a6-a2e8ccb1490c"). InnerVolumeSpecName "kube-api-access-ldrzc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 05:45:43 crc kubenswrapper[4594]: I1129 05:45:43.931141 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3699cc4-7ae6-459c-b7a6-a2e8ccb1490c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f3699cc4-7ae6-459c-b7a6-a2e8ccb1490c" (UID: "f3699cc4-7ae6-459c-b7a6-a2e8ccb1490c"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 29 05:45:43 crc kubenswrapper[4594]: I1129 05:45:43.933140 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3699cc4-7ae6-459c-b7a6-a2e8ccb1490c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f3699cc4-7ae6-459c-b7a6-a2e8ccb1490c" (UID: "f3699cc4-7ae6-459c-b7a6-a2e8ccb1490c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 29 05:45:43 crc kubenswrapper[4594]: I1129 05:45:43.938860 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3699cc4-7ae6-459c-b7a6-a2e8ccb1490c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f3699cc4-7ae6-459c-b7a6-a2e8ccb1490c" (UID: "f3699cc4-7ae6-459c-b7a6-a2e8ccb1490c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 29 05:45:43 crc kubenswrapper[4594]: I1129 05:45:43.942506 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3699cc4-7ae6-459c-b7a6-a2e8ccb1490c-config" (OuterVolumeSpecName: "config") pod "f3699cc4-7ae6-459c-b7a6-a2e8ccb1490c" (UID: "f3699cc4-7ae6-459c-b7a6-a2e8ccb1490c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 29 05:45:43 crc kubenswrapper[4594]: I1129 05:45:43.953076 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3699cc4-7ae6-459c-b7a6-a2e8ccb1490c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f3699cc4-7ae6-459c-b7a6-a2e8ccb1490c" (UID: "f3699cc4-7ae6-459c-b7a6-a2e8ccb1490c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 29 05:45:43 crc kubenswrapper[4594]: I1129 05:45:43.994405 4594 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3699cc4-7ae6-459c-b7a6-a2e8ccb1490c-config\") on node \"crc\" DevicePath \"\""
Nov 29 05:45:43 crc kubenswrapper[4594]: I1129 05:45:43.994443 4594 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f3699cc4-7ae6-459c-b7a6-a2e8ccb1490c-dns-svc\") on node \"crc\" DevicePath \"\""
Nov 29 05:45:43 crc kubenswrapper[4594]: I1129 05:45:43.994455 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ldrzc\" (UniqueName: \"kubernetes.io/projected/f3699cc4-7ae6-459c-b7a6-a2e8ccb1490c-kube-api-access-ldrzc\") on node \"crc\" DevicePath \"\""
Nov 29 05:45:43 crc kubenswrapper[4594]: I1129 05:45:43.994472 4594 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f3699cc4-7ae6-459c-b7a6-a2e8ccb1490c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Nov 29 05:45:43 crc kubenswrapper[4594]: I1129 05:45:43.994482 4594 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f3699cc4-7ae6-459c-b7a6-a2e8ccb1490c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Nov 29 05:45:43 crc kubenswrapper[4594]: I1129 05:45:43.994493 4594 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f3699cc4-7ae6-459c-b7a6-a2e8ccb1490c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Nov 29 05:45:44 crc kubenswrapper[4594]: I1129 05:45:44.073518 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-api-0"
Nov 29 05:45:44 crc kubenswrapper[4594]: I1129 05:45:44.081029 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0"
Nov 29 05:45:44 crc kubenswrapper[4594]: I1129 05:45:44.325526 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0"
Nov 29 05:45:44 crc kubenswrapper[4594]: I1129 05:45:44.326004 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0"
Nov 29 05:45:44 crc kubenswrapper[4594]: I1129 05:45:44.330132 4594 scope.go:117] "RemoveContainer" containerID="327ef444e09f45cf2a239393e3df4bf5d931562ffdda0035f22957079e46a5a5"
Nov 29 05:45:44 crc kubenswrapper[4594]: E1129 05:45:44.330747 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(3d60aa94-3720-43b5-a5de-ca30dd1b63b2)\"" pod="openstack/watcher-decision-engine-0" podUID="3d60aa94-3720-43b5-a5de-ca30dd1b63b2"
Nov 29 05:45:44 crc kubenswrapper[4594]: I1129 05:45:44.336563 4594 generic.go:334] "Generic (PLEG): container finished" podID="4495d878-ec89-4a13-8adb-a64316eb2e68" containerID="78ae5d813f6db587222194c97f2c4f194f98ff9c5031d0f6957d0bbe0e588ec4" exitCode=0
Nov 29 05:45:44 crc kubenswrapper[4594]: I1129 05:45:44.336680 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4495d878-ec89-4a13-8adb-a64316eb2e68","Type":"ContainerDied","Data":"78ae5d813f6db587222194c97f2c4f194f98ff9c5031d0f6957d0bbe0e588ec4"}
Nov 29 05:45:44 crc kubenswrapper[4594]: I1129 05:45:44.339328 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84c68846bf-7r6g7" event={"ID":"f3699cc4-7ae6-459c-b7a6-a2e8ccb1490c","Type":"ContainerDied","Data":"265bbd42514568578cd8bec846db87cb8b10bf0cec7b9cfa3974714a4a640859"}
Nov 29 05:45:44 crc kubenswrapper[4594]: I1129 05:45:44.339400 4594 scope.go:117] "RemoveContainer" containerID="7d06ce876e418b7180d721b44fd78e07afeaee58f9f0487f63e6e2be9db0060b"
Nov 29 05:45:44 crc kubenswrapper[4594]: I1129 05:45:44.339396 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84c68846bf-7r6g7"
Nov 29 05:45:44 crc kubenswrapper[4594]: I1129 05:45:44.365358 4594 scope.go:117] "RemoveContainer" containerID="e7bf409814f934d44c93eea45caa6ad5fa41e5fc89de103903bf0095a0e95498"
Nov 29 05:45:44 crc kubenswrapper[4594]: I1129 05:45:44.370247 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84c68846bf-7r6g7"]
Nov 29 05:45:44 crc kubenswrapper[4594]: I1129 05:45:44.377385 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-84c68846bf-7r6g7"]
Nov 29 05:45:45 crc kubenswrapper[4594]: I1129 05:45:45.800622 4594 patch_prober.go:28] interesting pod/machine-config-daemon-ggz4n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 29 05:45:45 crc kubenswrapper[4594]: I1129 05:45:45.800942 4594 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 29 05:45:45 crc kubenswrapper[4594]: I1129 05:45:45.800995 4594 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n"
Nov 29 05:45:45 crc kubenswrapper[4594]: I1129 05:45:45.801702 4594 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f65325fd45322d2eca6d68603b2fa4746238184f283f293edabcb7b19e64c595"} pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Nov 29 05:45:45 crc kubenswrapper[4594]: I1129 05:45:45.801768 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" containerName="machine-config-daemon" containerID="cri-o://f65325fd45322d2eca6d68603b2fa4746238184f283f293edabcb7b19e64c595" gracePeriod=600
Nov 29 05:45:46 crc kubenswrapper[4594]: I1129 05:45:46.095509 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3699cc4-7ae6-459c-b7a6-a2e8ccb1490c" path="/var/lib/kubelet/pods/f3699cc4-7ae6-459c-b7a6-a2e8ccb1490c/volumes"
Nov 29 05:45:46 crc kubenswrapper[4594]: I1129 05:45:46.168644 4594 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7fb8849f48-9rr52" podUID="e8d01e45-1d76-464f-93e6-965d65a055fc" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.160:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.160:8443: connect: connection refused"
Nov 29 05:45:46 crc kubenswrapper[4594]: I1129 05:45:46.370901 4594 generic.go:334] "Generic (PLEG): container finished" podID="509844d4-3ee1-4059-84bb-6e90200f50c5" containerID="f65325fd45322d2eca6d68603b2fa4746238184f283f293edabcb7b19e64c595" exitCode=0
Nov 29 05:45:46 crc kubenswrapper[4594]: I1129 05:45:46.370950 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" event={"ID":"509844d4-3ee1-4059-84bb-6e90200f50c5","Type":"ContainerDied","Data":"f65325fd45322d2eca6d68603b2fa4746238184f283f293edabcb7b19e64c595"}
Nov 29 05:45:46 crc kubenswrapper[4594]: I1129 05:45:46.370979 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" event={"ID":"509844d4-3ee1-4059-84bb-6e90200f50c5","Type":"ContainerStarted","Data":"8eaecd3b285f1b345905df03e8392cda698400ba526ee824406a745a137450c8"}
Nov 29 05:45:46 crc kubenswrapper[4594]: I1129 05:45:46.370995 4594 scope.go:117] "RemoveContainer" containerID="3d6477f346180f42d9b7737089f66087fbd7adf1dc0feaf01168195b349570d2"
Nov 29 05:45:47 crc kubenswrapper[4594]: I1129 05:45:47.770627 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Nov 29 05:45:47 crc kubenswrapper[4594]: I1129 05:45:47.877037 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4495d878-ec89-4a13-8adb-a64316eb2e68-scripts\") pod \"4495d878-ec89-4a13-8adb-a64316eb2e68\" (UID: \"4495d878-ec89-4a13-8adb-a64316eb2e68\") "
Nov 29 05:45:47 crc kubenswrapper[4594]: I1129 05:45:47.877102 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4495d878-ec89-4a13-8adb-a64316eb2e68-etc-machine-id\") pod \"4495d878-ec89-4a13-8adb-a64316eb2e68\" (UID: \"4495d878-ec89-4a13-8adb-a64316eb2e68\") "
Nov 29 05:45:47 crc kubenswrapper[4594]: I1129 05:45:47.877198 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kl75p\" (UniqueName: \"kubernetes.io/projected/4495d878-ec89-4a13-8adb-a64316eb2e68-kube-api-access-kl75p\") pod \"4495d878-ec89-4a13-8adb-a64316eb2e68\" (UID: \"4495d878-ec89-4a13-8adb-a64316eb2e68\") "
Nov 29 05:45:47 crc kubenswrapper[4594]: I1129 05:45:47.877282 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4495d878-ec89-4a13-8adb-a64316eb2e68-combined-ca-bundle\") pod \"4495d878-ec89-4a13-8adb-a64316eb2e68\" (UID: \"4495d878-ec89-4a13-8adb-a64316eb2e68\") "
Nov 29 05:45:47 crc kubenswrapper[4594]: I1129 05:45:47.877376 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4495d878-ec89-4a13-8adb-a64316eb2e68-config-data\") pod \"4495d878-ec89-4a13-8adb-a64316eb2e68\" (UID: \"4495d878-ec89-4a13-8adb-a64316eb2e68\") "
Nov 29 05:45:47 crc kubenswrapper[4594]: I1129 05:45:47.877414 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4495d878-ec89-4a13-8adb-a64316eb2e68-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "4495d878-ec89-4a13-8adb-a64316eb2e68" (UID: "4495d878-ec89-4a13-8adb-a64316eb2e68"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 29 05:45:47 crc kubenswrapper[4594]: I1129 05:45:47.877575 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4495d878-ec89-4a13-8adb-a64316eb2e68-config-data-custom\") pod \"4495d878-ec89-4a13-8adb-a64316eb2e68\" (UID: \"4495d878-ec89-4a13-8adb-a64316eb2e68\") "
Nov 29 05:45:47 crc kubenswrapper[4594]: I1129 05:45:47.878307 4594 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4495d878-ec89-4a13-8adb-a64316eb2e68-etc-machine-id\") on node \"crc\" DevicePath \"\""
Nov 29 05:45:47 crc kubenswrapper[4594]: I1129 05:45:47.883089 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4495d878-ec89-4a13-8adb-a64316eb2e68-kube-api-access-kl75p" (OuterVolumeSpecName: "kube-api-access-kl75p") pod "4495d878-ec89-4a13-8adb-a64316eb2e68" (UID: "4495d878-ec89-4a13-8adb-a64316eb2e68"). InnerVolumeSpecName "kube-api-access-kl75p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 05:45:47 crc kubenswrapper[4594]: I1129 05:45:47.883354 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4495d878-ec89-4a13-8adb-a64316eb2e68-scripts" (OuterVolumeSpecName: "scripts") pod "4495d878-ec89-4a13-8adb-a64316eb2e68" (UID: "4495d878-ec89-4a13-8adb-a64316eb2e68"). InnerVolumeSpecName "scripts".
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:45:47 crc kubenswrapper[4594]: I1129 05:45:47.884415 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4495d878-ec89-4a13-8adb-a64316eb2e68-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "4495d878-ec89-4a13-8adb-a64316eb2e68" (UID: "4495d878-ec89-4a13-8adb-a64316eb2e68"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:45:47 crc kubenswrapper[4594]: I1129 05:45:47.923970 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4495d878-ec89-4a13-8adb-a64316eb2e68-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4495d878-ec89-4a13-8adb-a64316eb2e68" (UID: "4495d878-ec89-4a13-8adb-a64316eb2e68"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:45:47 crc kubenswrapper[4594]: I1129 05:45:47.957193 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4495d878-ec89-4a13-8adb-a64316eb2e68-config-data" (OuterVolumeSpecName: "config-data") pod "4495d878-ec89-4a13-8adb-a64316eb2e68" (UID: "4495d878-ec89-4a13-8adb-a64316eb2e68"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:45:47 crc kubenswrapper[4594]: I1129 05:45:47.979912 4594 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4495d878-ec89-4a13-8adb-a64316eb2e68-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 05:45:47 crc kubenswrapper[4594]: I1129 05:45:47.979943 4594 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4495d878-ec89-4a13-8adb-a64316eb2e68-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 29 05:45:47 crc kubenswrapper[4594]: I1129 05:45:47.979956 4594 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4495d878-ec89-4a13-8adb-a64316eb2e68-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 05:45:47 crc kubenswrapper[4594]: I1129 05:45:47.979965 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kl75p\" (UniqueName: \"kubernetes.io/projected/4495d878-ec89-4a13-8adb-a64316eb2e68-kube-api-access-kl75p\") on node \"crc\" DevicePath \"\"" Nov 29 05:45:47 crc kubenswrapper[4594]: I1129 05:45:47.979978 4594 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4495d878-ec89-4a13-8adb-a64316eb2e68-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 05:45:48 crc kubenswrapper[4594]: I1129 05:45:48.392869 4594 generic.go:334] "Generic (PLEG): container finished" podID="4495d878-ec89-4a13-8adb-a64316eb2e68" containerID="492da200059560eb48ad284f38e58c77cac031809d5ddc128c26a8551bff3b32" exitCode=0 Nov 29 05:45:48 crc kubenswrapper[4594]: I1129 05:45:48.392934 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 29 05:45:48 crc kubenswrapper[4594]: I1129 05:45:48.392956 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4495d878-ec89-4a13-8adb-a64316eb2e68","Type":"ContainerDied","Data":"492da200059560eb48ad284f38e58c77cac031809d5ddc128c26a8551bff3b32"} Nov 29 05:45:48 crc kubenswrapper[4594]: I1129 05:45:48.394134 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4495d878-ec89-4a13-8adb-a64316eb2e68","Type":"ContainerDied","Data":"a4d3f97ffd84f2678df3d61a8e24a321e36011636bb5c1bd856ce0a244bf79e6"} Nov 29 05:45:48 crc kubenswrapper[4594]: I1129 05:45:48.394160 4594 scope.go:117] "RemoveContainer" containerID="78ae5d813f6db587222194c97f2c4f194f98ff9c5031d0f6957d0bbe0e588ec4" Nov 29 05:45:48 crc kubenswrapper[4594]: I1129 05:45:48.416998 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 29 05:45:48 crc kubenswrapper[4594]: I1129 05:45:48.417020 4594 scope.go:117] "RemoveContainer" containerID="492da200059560eb48ad284f38e58c77cac031809d5ddc128c26a8551bff3b32" Nov 29 05:45:48 crc kubenswrapper[4594]: I1129 05:45:48.433708 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 29 05:45:48 crc kubenswrapper[4594]: I1129 05:45:48.441092 4594 scope.go:117] "RemoveContainer" containerID="78ae5d813f6db587222194c97f2c4f194f98ff9c5031d0f6957d0bbe0e588ec4" Nov 29 05:45:48 crc kubenswrapper[4594]: I1129 05:45:48.442712 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Nov 29 05:45:48 crc kubenswrapper[4594]: E1129 05:45:48.443154 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3699cc4-7ae6-459c-b7a6-a2e8ccb1490c" containerName="init" Nov 29 05:45:48 crc kubenswrapper[4594]: I1129 05:45:48.443175 4594 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f3699cc4-7ae6-459c-b7a6-a2e8ccb1490c" containerName="init" Nov 29 05:45:48 crc kubenswrapper[4594]: E1129 05:45:48.443201 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4495d878-ec89-4a13-8adb-a64316eb2e68" containerName="probe" Nov 29 05:45:48 crc kubenswrapper[4594]: I1129 05:45:48.443207 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="4495d878-ec89-4a13-8adb-a64316eb2e68" containerName="probe" Nov 29 05:45:48 crc kubenswrapper[4594]: E1129 05:45:48.443221 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3699cc4-7ae6-459c-b7a6-a2e8ccb1490c" containerName="dnsmasq-dns" Nov 29 05:45:48 crc kubenswrapper[4594]: I1129 05:45:48.443227 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3699cc4-7ae6-459c-b7a6-a2e8ccb1490c" containerName="dnsmasq-dns" Nov 29 05:45:48 crc kubenswrapper[4594]: E1129 05:45:48.443268 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4495d878-ec89-4a13-8adb-a64316eb2e68" containerName="cinder-scheduler" Nov 29 05:45:48 crc kubenswrapper[4594]: I1129 05:45:48.443275 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="4495d878-ec89-4a13-8adb-a64316eb2e68" containerName="cinder-scheduler" Nov 29 05:45:48 crc kubenswrapper[4594]: I1129 05:45:48.443437 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="4495d878-ec89-4a13-8adb-a64316eb2e68" containerName="probe" Nov 29 05:45:48 crc kubenswrapper[4594]: I1129 05:45:48.443456 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3699cc4-7ae6-459c-b7a6-a2e8ccb1490c" containerName="dnsmasq-dns" Nov 29 05:45:48 crc kubenswrapper[4594]: I1129 05:45:48.443476 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="4495d878-ec89-4a13-8adb-a64316eb2e68" containerName="cinder-scheduler" Nov 29 05:45:48 crc kubenswrapper[4594]: I1129 05:45:48.444549 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 29 05:45:48 crc kubenswrapper[4594]: E1129 05:45:48.444867 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78ae5d813f6db587222194c97f2c4f194f98ff9c5031d0f6957d0bbe0e588ec4\": container with ID starting with 78ae5d813f6db587222194c97f2c4f194f98ff9c5031d0f6957d0bbe0e588ec4 not found: ID does not exist" containerID="78ae5d813f6db587222194c97f2c4f194f98ff9c5031d0f6957d0bbe0e588ec4" Nov 29 05:45:48 crc kubenswrapper[4594]: I1129 05:45:48.444927 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78ae5d813f6db587222194c97f2c4f194f98ff9c5031d0f6957d0bbe0e588ec4"} err="failed to get container status \"78ae5d813f6db587222194c97f2c4f194f98ff9c5031d0f6957d0bbe0e588ec4\": rpc error: code = NotFound desc = could not find container \"78ae5d813f6db587222194c97f2c4f194f98ff9c5031d0f6957d0bbe0e588ec4\": container with ID starting with 78ae5d813f6db587222194c97f2c4f194f98ff9c5031d0f6957d0bbe0e588ec4 not found: ID does not exist" Nov 29 05:45:48 crc kubenswrapper[4594]: I1129 05:45:48.444963 4594 scope.go:117] "RemoveContainer" containerID="492da200059560eb48ad284f38e58c77cac031809d5ddc128c26a8551bff3b32" Nov 29 05:45:48 crc kubenswrapper[4594]: E1129 05:45:48.445902 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"492da200059560eb48ad284f38e58c77cac031809d5ddc128c26a8551bff3b32\": container with ID starting with 492da200059560eb48ad284f38e58c77cac031809d5ddc128c26a8551bff3b32 not found: ID does not exist" containerID="492da200059560eb48ad284f38e58c77cac031809d5ddc128c26a8551bff3b32" Nov 29 05:45:48 crc kubenswrapper[4594]: I1129 05:45:48.445937 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"492da200059560eb48ad284f38e58c77cac031809d5ddc128c26a8551bff3b32"} 
err="failed to get container status \"492da200059560eb48ad284f38e58c77cac031809d5ddc128c26a8551bff3b32\": rpc error: code = NotFound desc = could not find container \"492da200059560eb48ad284f38e58c77cac031809d5ddc128c26a8551bff3b32\": container with ID starting with 492da200059560eb48ad284f38e58c77cac031809d5ddc128c26a8551bff3b32 not found: ID does not exist" Nov 29 05:45:48 crc kubenswrapper[4594]: I1129 05:45:48.446815 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Nov 29 05:45:48 crc kubenswrapper[4594]: I1129 05:45:48.452222 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 29 05:45:48 crc kubenswrapper[4594]: I1129 05:45:48.456998 4594 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","podd7cc6b30-a981-4486-9d8a-e926167f001b"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort podd7cc6b30-a981-4486-9d8a-e926167f001b] : Timed out while waiting for systemd to remove kubepods-besteffort-podd7cc6b30_a981_4486_9d8a_e926167f001b.slice" Nov 29 05:45:48 crc kubenswrapper[4594]: E1129 05:45:48.457042 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort podd7cc6b30-a981-4486-9d8a-e926167f001b] : unable to destroy cgroup paths for cgroup [kubepods besteffort podd7cc6b30-a981-4486-9d8a-e926167f001b] : Timed out while waiting for systemd to remove kubepods-besteffort-podd7cc6b30_a981_4486_9d8a_e926167f001b.slice" pod="openstack/keystone-bootstrap-gkrjf" podUID="d7cc6b30-a981-4486-9d8a-e926167f001b" Nov 29 05:45:48 crc kubenswrapper[4594]: I1129 05:45:48.591812 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a816de0b-732c-46f3-ba52-2a7630623d5b-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: 
\"a816de0b-732c-46f3-ba52-2a7630623d5b\") " pod="openstack/cinder-scheduler-0" Nov 29 05:45:48 crc kubenswrapper[4594]: I1129 05:45:48.591871 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a816de0b-732c-46f3-ba52-2a7630623d5b-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"a816de0b-732c-46f3-ba52-2a7630623d5b\") " pod="openstack/cinder-scheduler-0" Nov 29 05:45:48 crc kubenswrapper[4594]: I1129 05:45:48.592177 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a816de0b-732c-46f3-ba52-2a7630623d5b-scripts\") pod \"cinder-scheduler-0\" (UID: \"a816de0b-732c-46f3-ba52-2a7630623d5b\") " pod="openstack/cinder-scheduler-0" Nov 29 05:45:48 crc kubenswrapper[4594]: I1129 05:45:48.592246 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a816de0b-732c-46f3-ba52-2a7630623d5b-config-data\") pod \"cinder-scheduler-0\" (UID: \"a816de0b-732c-46f3-ba52-2a7630623d5b\") " pod="openstack/cinder-scheduler-0" Nov 29 05:45:48 crc kubenswrapper[4594]: I1129 05:45:48.592639 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzbc2\" (UniqueName: \"kubernetes.io/projected/a816de0b-732c-46f3-ba52-2a7630623d5b-kube-api-access-pzbc2\") pod \"cinder-scheduler-0\" (UID: \"a816de0b-732c-46f3-ba52-2a7630623d5b\") " pod="openstack/cinder-scheduler-0" Nov 29 05:45:48 crc kubenswrapper[4594]: I1129 05:45:48.592730 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a816de0b-732c-46f3-ba52-2a7630623d5b-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"a816de0b-732c-46f3-ba52-2a7630623d5b\") " 
pod="openstack/cinder-scheduler-0" Nov 29 05:45:48 crc kubenswrapper[4594]: I1129 05:45:48.694981 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a816de0b-732c-46f3-ba52-2a7630623d5b-scripts\") pod \"cinder-scheduler-0\" (UID: \"a816de0b-732c-46f3-ba52-2a7630623d5b\") " pod="openstack/cinder-scheduler-0" Nov 29 05:45:48 crc kubenswrapper[4594]: I1129 05:45:48.695045 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a816de0b-732c-46f3-ba52-2a7630623d5b-config-data\") pod \"cinder-scheduler-0\" (UID: \"a816de0b-732c-46f3-ba52-2a7630623d5b\") " pod="openstack/cinder-scheduler-0" Nov 29 05:45:48 crc kubenswrapper[4594]: I1129 05:45:48.695161 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzbc2\" (UniqueName: \"kubernetes.io/projected/a816de0b-732c-46f3-ba52-2a7630623d5b-kube-api-access-pzbc2\") pod \"cinder-scheduler-0\" (UID: \"a816de0b-732c-46f3-ba52-2a7630623d5b\") " pod="openstack/cinder-scheduler-0" Nov 29 05:45:48 crc kubenswrapper[4594]: I1129 05:45:48.695195 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a816de0b-732c-46f3-ba52-2a7630623d5b-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"a816de0b-732c-46f3-ba52-2a7630623d5b\") " pod="openstack/cinder-scheduler-0" Nov 29 05:45:48 crc kubenswrapper[4594]: I1129 05:45:48.695355 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a816de0b-732c-46f3-ba52-2a7630623d5b-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"a816de0b-732c-46f3-ba52-2a7630623d5b\") " pod="openstack/cinder-scheduler-0" Nov 29 05:45:48 crc kubenswrapper[4594]: I1129 05:45:48.695424 4594 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a816de0b-732c-46f3-ba52-2a7630623d5b-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"a816de0b-732c-46f3-ba52-2a7630623d5b\") " pod="openstack/cinder-scheduler-0" Nov 29 05:45:48 crc kubenswrapper[4594]: I1129 05:45:48.695905 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a816de0b-732c-46f3-ba52-2a7630623d5b-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"a816de0b-732c-46f3-ba52-2a7630623d5b\") " pod="openstack/cinder-scheduler-0" Nov 29 05:45:48 crc kubenswrapper[4594]: I1129 05:45:48.701668 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a816de0b-732c-46f3-ba52-2a7630623d5b-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"a816de0b-732c-46f3-ba52-2a7630623d5b\") " pod="openstack/cinder-scheduler-0" Nov 29 05:45:48 crc kubenswrapper[4594]: I1129 05:45:48.703718 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a816de0b-732c-46f3-ba52-2a7630623d5b-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"a816de0b-732c-46f3-ba52-2a7630623d5b\") " pod="openstack/cinder-scheduler-0" Nov 29 05:45:48 crc kubenswrapper[4594]: I1129 05:45:48.704594 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a816de0b-732c-46f3-ba52-2a7630623d5b-config-data\") pod \"cinder-scheduler-0\" (UID: \"a816de0b-732c-46f3-ba52-2a7630623d5b\") " pod="openstack/cinder-scheduler-0" Nov 29 05:45:48 crc kubenswrapper[4594]: I1129 05:45:48.726864 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzbc2\" (UniqueName: \"kubernetes.io/projected/a816de0b-732c-46f3-ba52-2a7630623d5b-kube-api-access-pzbc2\") pod \"cinder-scheduler-0\" (UID: 
\"a816de0b-732c-46f3-ba52-2a7630623d5b\") " pod="openstack/cinder-scheduler-0" Nov 29 05:45:48 crc kubenswrapper[4594]: I1129 05:45:48.727716 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a816de0b-732c-46f3-ba52-2a7630623d5b-scripts\") pod \"cinder-scheduler-0\" (UID: \"a816de0b-732c-46f3-ba52-2a7630623d5b\") " pod="openstack/cinder-scheduler-0" Nov 29 05:45:48 crc kubenswrapper[4594]: I1129 05:45:48.768797 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 29 05:45:49 crc kubenswrapper[4594]: W1129 05:45:49.205151 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda816de0b_732c_46f3_ba52_2a7630623d5b.slice/crio-1d917ff234def3568c61af84d4d25dcbd5811ca414d633321bedce0a95c4cee7 WatchSource:0}: Error finding container 1d917ff234def3568c61af84d4d25dcbd5811ca414d633321bedce0a95c4cee7: Status 404 returned error can't find the container with id 1d917ff234def3568c61af84d4d25dcbd5811ca414d633321bedce0a95c4cee7 Nov 29 05:45:49 crc kubenswrapper[4594]: I1129 05:45:49.206812 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 29 05:45:49 crc kubenswrapper[4594]: I1129 05:45:49.389976 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Nov 29 05:45:49 crc kubenswrapper[4594]: I1129 05:45:49.429247 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-gkrjf" Nov 29 05:45:49 crc kubenswrapper[4594]: I1129 05:45:49.429243 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a816de0b-732c-46f3-ba52-2a7630623d5b","Type":"ContainerStarted","Data":"1d917ff234def3568c61af84d4d25dcbd5811ca414d633321bedce0a95c4cee7"} Nov 29 05:45:50 crc kubenswrapper[4594]: I1129 05:45:50.097090 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4495d878-ec89-4a13-8adb-a64316eb2e68" path="/var/lib/kubelet/pods/4495d878-ec89-4a13-8adb-a64316eb2e68/volumes" Nov 29 05:45:50 crc kubenswrapper[4594]: I1129 05:45:50.451476 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a816de0b-732c-46f3-ba52-2a7630623d5b","Type":"ContainerStarted","Data":"50ce564d65e0af0371b4d2a7894c65797109849175d29d1d69038a9a50337772"} Nov 29 05:45:50 crc kubenswrapper[4594]: I1129 05:45:50.451827 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a816de0b-732c-46f3-ba52-2a7630623d5b","Type":"ContainerStarted","Data":"49c5e602aaf036607b3aadd67f624b7693c113d0460b0d34be392121a4182522"} Nov 29 05:45:50 crc kubenswrapper[4594]: I1129 05:45:50.479215 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=2.479183698 podStartE2EDuration="2.479183698s" podCreationTimestamp="2025-11-29 05:45:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:45:50.472533427 +0000 UTC m=+1074.713042657" watchObservedRunningTime="2025-11-29 05:45:50.479183698 +0000 UTC m=+1074.719692918" Nov 29 05:45:51 crc kubenswrapper[4594]: I1129 05:45:51.874041 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-87b885ff4-zwt2r" Nov 29 05:45:53 crc 
kubenswrapper[4594]: I1129 05:45:53.580770 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Nov 29 05:45:53 crc kubenswrapper[4594]: I1129 05:45:53.582499 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Nov 29 05:45:53 crc kubenswrapper[4594]: I1129 05:45:53.585776 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-x9tbh" Nov 29 05:45:53 crc kubenswrapper[4594]: I1129 05:45:53.586002 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Nov 29 05:45:53 crc kubenswrapper[4594]: I1129 05:45:53.586171 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Nov 29 05:45:53 crc kubenswrapper[4594]: I1129 05:45:53.593859 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Nov 29 05:45:53 crc kubenswrapper[4594]: I1129 05:45:53.723983 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4pwv\" (UniqueName: \"kubernetes.io/projected/3629612b-cfc5-42bc-8584-4abc21ce4b3f-kube-api-access-c4pwv\") pod \"openstackclient\" (UID: \"3629612b-cfc5-42bc-8584-4abc21ce4b3f\") " pod="openstack/openstackclient" Nov 29 05:45:53 crc kubenswrapper[4594]: I1129 05:45:53.724050 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3629612b-cfc5-42bc-8584-4abc21ce4b3f-openstack-config\") pod \"openstackclient\" (UID: \"3629612b-cfc5-42bc-8584-4abc21ce4b3f\") " pod="openstack/openstackclient" Nov 29 05:45:53 crc kubenswrapper[4594]: I1129 05:45:53.724139 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: 
\"kubernetes.io/secret/3629612b-cfc5-42bc-8584-4abc21ce4b3f-openstack-config-secret\") pod \"openstackclient\" (UID: \"3629612b-cfc5-42bc-8584-4abc21ce4b3f\") " pod="openstack/openstackclient" Nov 29 05:45:53 crc kubenswrapper[4594]: I1129 05:45:53.724160 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3629612b-cfc5-42bc-8584-4abc21ce4b3f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"3629612b-cfc5-42bc-8584-4abc21ce4b3f\") " pod="openstack/openstackclient" Nov 29 05:45:53 crc kubenswrapper[4594]: I1129 05:45:53.769670 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Nov 29 05:45:53 crc kubenswrapper[4594]: I1129 05:45:53.825088 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3629612b-cfc5-42bc-8584-4abc21ce4b3f-openstack-config\") pod \"openstackclient\" (UID: \"3629612b-cfc5-42bc-8584-4abc21ce4b3f\") " pod="openstack/openstackclient" Nov 29 05:45:53 crc kubenswrapper[4594]: I1129 05:45:53.825180 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3629612b-cfc5-42bc-8584-4abc21ce4b3f-openstack-config-secret\") pod \"openstackclient\" (UID: \"3629612b-cfc5-42bc-8584-4abc21ce4b3f\") " pod="openstack/openstackclient" Nov 29 05:45:53 crc kubenswrapper[4594]: I1129 05:45:53.825203 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3629612b-cfc5-42bc-8584-4abc21ce4b3f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"3629612b-cfc5-42bc-8584-4abc21ce4b3f\") " pod="openstack/openstackclient" Nov 29 05:45:53 crc kubenswrapper[4594]: I1129 05:45:53.825284 4594 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-c4pwv\" (UniqueName: \"kubernetes.io/projected/3629612b-cfc5-42bc-8584-4abc21ce4b3f-kube-api-access-c4pwv\") pod \"openstackclient\" (UID: \"3629612b-cfc5-42bc-8584-4abc21ce4b3f\") " pod="openstack/openstackclient" Nov 29 05:45:53 crc kubenswrapper[4594]: I1129 05:45:53.826319 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3629612b-cfc5-42bc-8584-4abc21ce4b3f-openstack-config\") pod \"openstackclient\" (UID: \"3629612b-cfc5-42bc-8584-4abc21ce4b3f\") " pod="openstack/openstackclient" Nov 29 05:45:53 crc kubenswrapper[4594]: I1129 05:45:53.834176 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3629612b-cfc5-42bc-8584-4abc21ce4b3f-openstack-config-secret\") pod \"openstackclient\" (UID: \"3629612b-cfc5-42bc-8584-4abc21ce4b3f\") " pod="openstack/openstackclient" Nov 29 05:45:53 crc kubenswrapper[4594]: I1129 05:45:53.837819 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3629612b-cfc5-42bc-8584-4abc21ce4b3f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"3629612b-cfc5-42bc-8584-4abc21ce4b3f\") " pod="openstack/openstackclient" Nov 29 05:45:53 crc kubenswrapper[4594]: I1129 05:45:53.838659 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4pwv\" (UniqueName: \"kubernetes.io/projected/3629612b-cfc5-42bc-8584-4abc21ce4b3f-kube-api-access-c4pwv\") pod \"openstackclient\" (UID: \"3629612b-cfc5-42bc-8584-4abc21ce4b3f\") " pod="openstack/openstackclient" Nov 29 05:45:53 crc kubenswrapper[4594]: I1129 05:45:53.897955 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Nov 29 05:45:54 crc kubenswrapper[4594]: I1129 05:45:54.348036 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Nov 29 05:45:54 crc kubenswrapper[4594]: W1129 05:45:54.349388 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3629612b_cfc5_42bc_8584_4abc21ce4b3f.slice/crio-46dabe920707d804d65c3cb7652110ec4bc4c0c67a9934d7d8bf4dcaf1db4a81 WatchSource:0}: Error finding container 46dabe920707d804d65c3cb7652110ec4bc4c0c67a9934d7d8bf4dcaf1db4a81: Status 404 returned error can't find the container with id 46dabe920707d804d65c3cb7652110ec4bc4c0c67a9934d7d8bf4dcaf1db4a81 Nov 29 05:45:54 crc kubenswrapper[4594]: I1129 05:45:54.493947 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"3629612b-cfc5-42bc-8584-4abc21ce4b3f","Type":"ContainerStarted","Data":"46dabe920707d804d65c3cb7652110ec4bc4c0c67a9934d7d8bf4dcaf1db4a81"} Nov 29 05:45:56 crc kubenswrapper[4594]: I1129 05:45:56.168506 4594 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7fb8849f48-9rr52" podUID="e8d01e45-1d76-464f-93e6-965d65a055fc" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.160:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.160:8443: connect: connection refused" Nov 29 05:45:56 crc kubenswrapper[4594]: I1129 05:45:56.168880 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7fb8849f48-9rr52" Nov 29 05:45:57 crc kubenswrapper[4594]: I1129 05:45:57.039061 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-7d689f55f9-c4bt7"] Nov 29 05:45:57 crc kubenswrapper[4594]: I1129 05:45:57.040826 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-7d689f55f9-c4bt7" Nov 29 05:45:57 crc kubenswrapper[4594]: I1129 05:45:57.042993 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Nov 29 05:45:57 crc kubenswrapper[4594]: I1129 05:45:57.043540 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Nov 29 05:45:57 crc kubenswrapper[4594]: I1129 05:45:57.043734 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Nov 29 05:45:57 crc kubenswrapper[4594]: I1129 05:45:57.052780 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7d689f55f9-c4bt7"] Nov 29 05:45:57 crc kubenswrapper[4594]: I1129 05:45:57.084078 4594 scope.go:117] "RemoveContainer" containerID="327ef444e09f45cf2a239393e3df4bf5d931562ffdda0035f22957079e46a5a5" Nov 29 05:45:57 crc kubenswrapper[4594]: E1129 05:45:57.084546 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(3d60aa94-3720-43b5-a5de-ca30dd1b63b2)\"" pod="openstack/watcher-decision-engine-0" podUID="3d60aa94-3720-43b5-a5de-ca30dd1b63b2" Nov 29 05:45:57 crc kubenswrapper[4594]: I1129 05:45:57.107148 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb181390-82bf-4bd9-9063-9272988db515-log-httpd\") pod \"swift-proxy-7d689f55f9-c4bt7\" (UID: \"bb181390-82bf-4bd9-9063-9272988db515\") " pod="openstack/swift-proxy-7d689f55f9-c4bt7" Nov 29 05:45:57 crc kubenswrapper[4594]: I1129 05:45:57.107384 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/bb181390-82bf-4bd9-9063-9272988db515-run-httpd\") pod \"swift-proxy-7d689f55f9-c4bt7\" (UID: \"bb181390-82bf-4bd9-9063-9272988db515\") " pod="openstack/swift-proxy-7d689f55f9-c4bt7" Nov 29 05:45:57 crc kubenswrapper[4594]: I1129 05:45:57.107545 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb181390-82bf-4bd9-9063-9272988db515-combined-ca-bundle\") pod \"swift-proxy-7d689f55f9-c4bt7\" (UID: \"bb181390-82bf-4bd9-9063-9272988db515\") " pod="openstack/swift-proxy-7d689f55f9-c4bt7" Nov 29 05:45:57 crc kubenswrapper[4594]: I1129 05:45:57.107827 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb181390-82bf-4bd9-9063-9272988db515-config-data\") pod \"swift-proxy-7d689f55f9-c4bt7\" (UID: \"bb181390-82bf-4bd9-9063-9272988db515\") " pod="openstack/swift-proxy-7d689f55f9-c4bt7" Nov 29 05:45:57 crc kubenswrapper[4594]: I1129 05:45:57.107865 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bb181390-82bf-4bd9-9063-9272988db515-etc-swift\") pod \"swift-proxy-7d689f55f9-c4bt7\" (UID: \"bb181390-82bf-4bd9-9063-9272988db515\") " pod="openstack/swift-proxy-7d689f55f9-c4bt7" Nov 29 05:45:57 crc kubenswrapper[4594]: I1129 05:45:57.107888 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb181390-82bf-4bd9-9063-9272988db515-public-tls-certs\") pod \"swift-proxy-7d689f55f9-c4bt7\" (UID: \"bb181390-82bf-4bd9-9063-9272988db515\") " pod="openstack/swift-proxy-7d689f55f9-c4bt7" Nov 29 05:45:57 crc kubenswrapper[4594]: I1129 05:45:57.107912 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-7cxqc\" (UniqueName: \"kubernetes.io/projected/bb181390-82bf-4bd9-9063-9272988db515-kube-api-access-7cxqc\") pod \"swift-proxy-7d689f55f9-c4bt7\" (UID: \"bb181390-82bf-4bd9-9063-9272988db515\") " pod="openstack/swift-proxy-7d689f55f9-c4bt7" Nov 29 05:45:57 crc kubenswrapper[4594]: I1129 05:45:57.108232 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb181390-82bf-4bd9-9063-9272988db515-internal-tls-certs\") pod \"swift-proxy-7d689f55f9-c4bt7\" (UID: \"bb181390-82bf-4bd9-9063-9272988db515\") " pod="openstack/swift-proxy-7d689f55f9-c4bt7" Nov 29 05:45:57 crc kubenswrapper[4594]: I1129 05:45:57.210314 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb181390-82bf-4bd9-9063-9272988db515-internal-tls-certs\") pod \"swift-proxy-7d689f55f9-c4bt7\" (UID: \"bb181390-82bf-4bd9-9063-9272988db515\") " pod="openstack/swift-proxy-7d689f55f9-c4bt7" Nov 29 05:45:57 crc kubenswrapper[4594]: I1129 05:45:57.210477 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb181390-82bf-4bd9-9063-9272988db515-log-httpd\") pod \"swift-proxy-7d689f55f9-c4bt7\" (UID: \"bb181390-82bf-4bd9-9063-9272988db515\") " pod="openstack/swift-proxy-7d689f55f9-c4bt7" Nov 29 05:45:57 crc kubenswrapper[4594]: I1129 05:45:57.210575 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb181390-82bf-4bd9-9063-9272988db515-run-httpd\") pod \"swift-proxy-7d689f55f9-c4bt7\" (UID: \"bb181390-82bf-4bd9-9063-9272988db515\") " pod="openstack/swift-proxy-7d689f55f9-c4bt7" Nov 29 05:45:57 crc kubenswrapper[4594]: I1129 05:45:57.210644 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/bb181390-82bf-4bd9-9063-9272988db515-combined-ca-bundle\") pod \"swift-proxy-7d689f55f9-c4bt7\" (UID: \"bb181390-82bf-4bd9-9063-9272988db515\") " pod="openstack/swift-proxy-7d689f55f9-c4bt7" Nov 29 05:45:57 crc kubenswrapper[4594]: I1129 05:45:57.210673 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb181390-82bf-4bd9-9063-9272988db515-config-data\") pod \"swift-proxy-7d689f55f9-c4bt7\" (UID: \"bb181390-82bf-4bd9-9063-9272988db515\") " pod="openstack/swift-proxy-7d689f55f9-c4bt7" Nov 29 05:45:57 crc kubenswrapper[4594]: I1129 05:45:57.210689 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bb181390-82bf-4bd9-9063-9272988db515-etc-swift\") pod \"swift-proxy-7d689f55f9-c4bt7\" (UID: \"bb181390-82bf-4bd9-9063-9272988db515\") " pod="openstack/swift-proxy-7d689f55f9-c4bt7" Nov 29 05:45:57 crc kubenswrapper[4594]: I1129 05:45:57.210707 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb181390-82bf-4bd9-9063-9272988db515-public-tls-certs\") pod \"swift-proxy-7d689f55f9-c4bt7\" (UID: \"bb181390-82bf-4bd9-9063-9272988db515\") " pod="openstack/swift-proxy-7d689f55f9-c4bt7" Nov 29 05:45:57 crc kubenswrapper[4594]: I1129 05:45:57.210728 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7cxqc\" (UniqueName: \"kubernetes.io/projected/bb181390-82bf-4bd9-9063-9272988db515-kube-api-access-7cxqc\") pod \"swift-proxy-7d689f55f9-c4bt7\" (UID: \"bb181390-82bf-4bd9-9063-9272988db515\") " pod="openstack/swift-proxy-7d689f55f9-c4bt7" Nov 29 05:45:57 crc kubenswrapper[4594]: I1129 05:45:57.210975 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/bb181390-82bf-4bd9-9063-9272988db515-log-httpd\") pod \"swift-proxy-7d689f55f9-c4bt7\" (UID: \"bb181390-82bf-4bd9-9063-9272988db515\") " pod="openstack/swift-proxy-7d689f55f9-c4bt7" Nov 29 05:45:57 crc kubenswrapper[4594]: I1129 05:45:57.211046 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb181390-82bf-4bd9-9063-9272988db515-run-httpd\") pod \"swift-proxy-7d689f55f9-c4bt7\" (UID: \"bb181390-82bf-4bd9-9063-9272988db515\") " pod="openstack/swift-proxy-7d689f55f9-c4bt7" Nov 29 05:45:57 crc kubenswrapper[4594]: I1129 05:45:57.217442 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bb181390-82bf-4bd9-9063-9272988db515-etc-swift\") pod \"swift-proxy-7d689f55f9-c4bt7\" (UID: \"bb181390-82bf-4bd9-9063-9272988db515\") " pod="openstack/swift-proxy-7d689f55f9-c4bt7" Nov 29 05:45:57 crc kubenswrapper[4594]: I1129 05:45:57.217837 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb181390-82bf-4bd9-9063-9272988db515-combined-ca-bundle\") pod \"swift-proxy-7d689f55f9-c4bt7\" (UID: \"bb181390-82bf-4bd9-9063-9272988db515\") " pod="openstack/swift-proxy-7d689f55f9-c4bt7" Nov 29 05:45:57 crc kubenswrapper[4594]: I1129 05:45:57.218129 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb181390-82bf-4bd9-9063-9272988db515-config-data\") pod \"swift-proxy-7d689f55f9-c4bt7\" (UID: \"bb181390-82bf-4bd9-9063-9272988db515\") " pod="openstack/swift-proxy-7d689f55f9-c4bt7" Nov 29 05:45:57 crc kubenswrapper[4594]: I1129 05:45:57.218518 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb181390-82bf-4bd9-9063-9272988db515-public-tls-certs\") pod \"swift-proxy-7d689f55f9-c4bt7\" 
(UID: \"bb181390-82bf-4bd9-9063-9272988db515\") " pod="openstack/swift-proxy-7d689f55f9-c4bt7" Nov 29 05:45:57 crc kubenswrapper[4594]: I1129 05:45:57.218980 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb181390-82bf-4bd9-9063-9272988db515-internal-tls-certs\") pod \"swift-proxy-7d689f55f9-c4bt7\" (UID: \"bb181390-82bf-4bd9-9063-9272988db515\") " pod="openstack/swift-proxy-7d689f55f9-c4bt7" Nov 29 05:45:57 crc kubenswrapper[4594]: I1129 05:45:57.225076 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cxqc\" (UniqueName: \"kubernetes.io/projected/bb181390-82bf-4bd9-9063-9272988db515-kube-api-access-7cxqc\") pod \"swift-proxy-7d689f55f9-c4bt7\" (UID: \"bb181390-82bf-4bd9-9063-9272988db515\") " pod="openstack/swift-proxy-7d689f55f9-c4bt7" Nov 29 05:45:57 crc kubenswrapper[4594]: I1129 05:45:57.359517 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-7d689f55f9-c4bt7" Nov 29 05:45:57 crc kubenswrapper[4594]: I1129 05:45:57.677490 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 29 05:45:57 crc kubenswrapper[4594]: I1129 05:45:57.678354 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ef00ff38-96ae-42ba-8c5f-05c73ec60467" containerName="ceilometer-central-agent" containerID="cri-o://91ef0b975213ecd49a4cb8ca0772a78a05fca16694fee4cbd1819c22d2897038" gracePeriod=30 Nov 29 05:45:57 crc kubenswrapper[4594]: I1129 05:45:57.678466 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ef00ff38-96ae-42ba-8c5f-05c73ec60467" containerName="sg-core" containerID="cri-o://3922cd998fc7437f66c830a0c2b287e9cf0c2540e2ea914865cd3ec535ffb488" gracePeriod=30 Nov 29 05:45:57 crc kubenswrapper[4594]: I1129 05:45:57.678539 4594 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ef00ff38-96ae-42ba-8c5f-05c73ec60467" containerName="ceilometer-notification-agent" containerID="cri-o://2a50d6f15b24a6202f9345ab29196faeb824eab5b07fca25dbb884c0c2ae9572" gracePeriod=30 Nov 29 05:45:57 crc kubenswrapper[4594]: I1129 05:45:57.678624 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ef00ff38-96ae-42ba-8c5f-05c73ec60467" containerName="proxy-httpd" containerID="cri-o://6a42b35d40824084353b7d5e1e21a712c35179bbaf94ceba9ed648d6e2afc1f2" gracePeriod=30 Nov 29 05:45:57 crc kubenswrapper[4594]: I1129 05:45:57.692814 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Nov 29 05:45:57 crc kubenswrapper[4594]: I1129 05:45:57.918102 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7d689f55f9-c4bt7"] Nov 29 05:45:57 crc kubenswrapper[4594]: W1129 05:45:57.928067 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbb181390_82bf_4bd9_9063_9272988db515.slice/crio-a09ad0d1b64e79b21bee2ad31763bf812e0418e8d0c40d8ec8604e3a3307607d WatchSource:0}: Error finding container a09ad0d1b64e79b21bee2ad31763bf812e0418e8d0c40d8ec8604e3a3307607d: Status 404 returned error can't find the container with id a09ad0d1b64e79b21bee2ad31763bf812e0418e8d0c40d8ec8604e3a3307607d Nov 29 05:45:58 crc kubenswrapper[4594]: I1129 05:45:58.556474 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7d689f55f9-c4bt7" event={"ID":"bb181390-82bf-4bd9-9063-9272988db515","Type":"ContainerStarted","Data":"76492a17bc67f781fb681e9dc25594512f5ae29a17124bd44062d0c575e1f1cb"} Nov 29 05:45:58 crc kubenswrapper[4594]: I1129 05:45:58.556789 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7d689f55f9-c4bt7" 
event={"ID":"bb181390-82bf-4bd9-9063-9272988db515","Type":"ContainerStarted","Data":"a20fa1b9096d61c71d1f294b4642ba1c331111227eb81a832f5762e979099106"} Nov 29 05:45:58 crc kubenswrapper[4594]: I1129 05:45:58.556816 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-7d689f55f9-c4bt7" Nov 29 05:45:58 crc kubenswrapper[4594]: I1129 05:45:58.556831 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7d689f55f9-c4bt7" event={"ID":"bb181390-82bf-4bd9-9063-9272988db515","Type":"ContainerStarted","Data":"a09ad0d1b64e79b21bee2ad31763bf812e0418e8d0c40d8ec8604e3a3307607d"} Nov 29 05:45:58 crc kubenswrapper[4594]: I1129 05:45:58.561087 4594 generic.go:334] "Generic (PLEG): container finished" podID="ef00ff38-96ae-42ba-8c5f-05c73ec60467" containerID="6a42b35d40824084353b7d5e1e21a712c35179bbaf94ceba9ed648d6e2afc1f2" exitCode=0 Nov 29 05:45:58 crc kubenswrapper[4594]: I1129 05:45:58.561130 4594 generic.go:334] "Generic (PLEG): container finished" podID="ef00ff38-96ae-42ba-8c5f-05c73ec60467" containerID="3922cd998fc7437f66c830a0c2b287e9cf0c2540e2ea914865cd3ec535ffb488" exitCode=2 Nov 29 05:45:58 crc kubenswrapper[4594]: I1129 05:45:58.561143 4594 generic.go:334] "Generic (PLEG): container finished" podID="ef00ff38-96ae-42ba-8c5f-05c73ec60467" containerID="91ef0b975213ecd49a4cb8ca0772a78a05fca16694fee4cbd1819c22d2897038" exitCode=0 Nov 29 05:45:58 crc kubenswrapper[4594]: I1129 05:45:58.561185 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ef00ff38-96ae-42ba-8c5f-05c73ec60467","Type":"ContainerDied","Data":"6a42b35d40824084353b7d5e1e21a712c35179bbaf94ceba9ed648d6e2afc1f2"} Nov 29 05:45:58 crc kubenswrapper[4594]: I1129 05:45:58.561285 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"ef00ff38-96ae-42ba-8c5f-05c73ec60467","Type":"ContainerDied","Data":"3922cd998fc7437f66c830a0c2b287e9cf0c2540e2ea914865cd3ec535ffb488"} Nov 29 05:45:58 crc kubenswrapper[4594]: I1129 05:45:58.561303 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ef00ff38-96ae-42ba-8c5f-05c73ec60467","Type":"ContainerDied","Data":"91ef0b975213ecd49a4cb8ca0772a78a05fca16694fee4cbd1819c22d2897038"} Nov 29 05:45:58 crc kubenswrapper[4594]: I1129 05:45:58.580118 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-7d689f55f9-c4bt7" podStartSLOduration=1.580103552 podStartE2EDuration="1.580103552s" podCreationTimestamp="2025-11-29 05:45:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:45:58.574550797 +0000 UTC m=+1082.815060016" watchObservedRunningTime="2025-11-29 05:45:58.580103552 +0000 UTC m=+1082.820612772" Nov 29 05:45:58 crc kubenswrapper[4594]: I1129 05:45:58.944591 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Nov 29 05:45:59 crc kubenswrapper[4594]: I1129 05:45:59.578231 4594 generic.go:334] "Generic (PLEG): container finished" podID="ef00ff38-96ae-42ba-8c5f-05c73ec60467" containerID="2a50d6f15b24a6202f9345ab29196faeb824eab5b07fca25dbb884c0c2ae9572" exitCode=0 Nov 29 05:45:59 crc kubenswrapper[4594]: I1129 05:45:59.578513 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ef00ff38-96ae-42ba-8c5f-05c73ec60467","Type":"ContainerDied","Data":"2a50d6f15b24a6202f9345ab29196faeb824eab5b07fca25dbb884c0c2ae9572"} Nov 29 05:45:59 crc kubenswrapper[4594]: I1129 05:45:59.578926 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-7d689f55f9-c4bt7" Nov 29 05:45:59 crc kubenswrapper[4594]: I1129 05:45:59.928393 4594 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 29 05:46:00 crc kubenswrapper[4594]: I1129 05:46:00.107230 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef00ff38-96ae-42ba-8c5f-05c73ec60467-scripts\") pod \"ef00ff38-96ae-42ba-8c5f-05c73ec60467\" (UID: \"ef00ff38-96ae-42ba-8c5f-05c73ec60467\") " Nov 29 05:46:00 crc kubenswrapper[4594]: I1129 05:46:00.107329 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ef00ff38-96ae-42ba-8c5f-05c73ec60467-sg-core-conf-yaml\") pod \"ef00ff38-96ae-42ba-8c5f-05c73ec60467\" (UID: \"ef00ff38-96ae-42ba-8c5f-05c73ec60467\") " Nov 29 05:46:00 crc kubenswrapper[4594]: I1129 05:46:00.107389 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvhsq\" (UniqueName: \"kubernetes.io/projected/ef00ff38-96ae-42ba-8c5f-05c73ec60467-kube-api-access-rvhsq\") pod \"ef00ff38-96ae-42ba-8c5f-05c73ec60467\" (UID: \"ef00ff38-96ae-42ba-8c5f-05c73ec60467\") " Nov 29 05:46:00 crc kubenswrapper[4594]: I1129 05:46:00.107504 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef00ff38-96ae-42ba-8c5f-05c73ec60467-config-data\") pod \"ef00ff38-96ae-42ba-8c5f-05c73ec60467\" (UID: \"ef00ff38-96ae-42ba-8c5f-05c73ec60467\") " Nov 29 05:46:00 crc kubenswrapper[4594]: I1129 05:46:00.107549 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ef00ff38-96ae-42ba-8c5f-05c73ec60467-run-httpd\") pod \"ef00ff38-96ae-42ba-8c5f-05c73ec60467\" (UID: \"ef00ff38-96ae-42ba-8c5f-05c73ec60467\") " Nov 29 05:46:00 crc kubenswrapper[4594]: I1129 05:46:00.107704 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef00ff38-96ae-42ba-8c5f-05c73ec60467-combined-ca-bundle\") pod \"ef00ff38-96ae-42ba-8c5f-05c73ec60467\" (UID: \"ef00ff38-96ae-42ba-8c5f-05c73ec60467\") " Nov 29 05:46:00 crc kubenswrapper[4594]: I1129 05:46:00.107929 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ef00ff38-96ae-42ba-8c5f-05c73ec60467-log-httpd\") pod \"ef00ff38-96ae-42ba-8c5f-05c73ec60467\" (UID: \"ef00ff38-96ae-42ba-8c5f-05c73ec60467\") " Nov 29 05:46:00 crc kubenswrapper[4594]: I1129 05:46:00.109126 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef00ff38-96ae-42ba-8c5f-05c73ec60467-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ef00ff38-96ae-42ba-8c5f-05c73ec60467" (UID: "ef00ff38-96ae-42ba-8c5f-05c73ec60467"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 05:46:00 crc kubenswrapper[4594]: I1129 05:46:00.110744 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef00ff38-96ae-42ba-8c5f-05c73ec60467-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ef00ff38-96ae-42ba-8c5f-05c73ec60467" (UID: "ef00ff38-96ae-42ba-8c5f-05c73ec60467"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 05:46:00 crc kubenswrapper[4594]: I1129 05:46:00.128715 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef00ff38-96ae-42ba-8c5f-05c73ec60467-scripts" (OuterVolumeSpecName: "scripts") pod "ef00ff38-96ae-42ba-8c5f-05c73ec60467" (UID: "ef00ff38-96ae-42ba-8c5f-05c73ec60467"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:46:00 crc kubenswrapper[4594]: I1129 05:46:00.128806 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef00ff38-96ae-42ba-8c5f-05c73ec60467-kube-api-access-rvhsq" (OuterVolumeSpecName: "kube-api-access-rvhsq") pod "ef00ff38-96ae-42ba-8c5f-05c73ec60467" (UID: "ef00ff38-96ae-42ba-8c5f-05c73ec60467"). InnerVolumeSpecName "kube-api-access-rvhsq". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:46:00 crc kubenswrapper[4594]: I1129 05:46:00.139967 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef00ff38-96ae-42ba-8c5f-05c73ec60467-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ef00ff38-96ae-42ba-8c5f-05c73ec60467" (UID: "ef00ff38-96ae-42ba-8c5f-05c73ec60467"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:46:00 crc kubenswrapper[4594]: I1129 05:46:00.197056 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef00ff38-96ae-42ba-8c5f-05c73ec60467-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ef00ff38-96ae-42ba-8c5f-05c73ec60467" (UID: "ef00ff38-96ae-42ba-8c5f-05c73ec60467"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:46:00 crc kubenswrapper[4594]: I1129 05:46:00.211105 4594 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ef00ff38-96ae-42ba-8c5f-05c73ec60467-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 29 05:46:00 crc kubenswrapper[4594]: I1129 05:46:00.211136 4594 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef00ff38-96ae-42ba-8c5f-05c73ec60467-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 05:46:00 crc kubenswrapper[4594]: I1129 05:46:00.211151 4594 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ef00ff38-96ae-42ba-8c5f-05c73ec60467-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 29 05:46:00 crc kubenswrapper[4594]: I1129 05:46:00.211164 4594 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef00ff38-96ae-42ba-8c5f-05c73ec60467-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 05:46:00 crc kubenswrapper[4594]: I1129 05:46:00.211174 4594 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ef00ff38-96ae-42ba-8c5f-05c73ec60467-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 29 05:46:00 crc kubenswrapper[4594]: I1129 05:46:00.211184 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvhsq\" (UniqueName: \"kubernetes.io/projected/ef00ff38-96ae-42ba-8c5f-05c73ec60467-kube-api-access-rvhsq\") on node \"crc\" DevicePath \"\"" Nov 29 05:46:00 crc kubenswrapper[4594]: I1129 05:46:00.234160 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef00ff38-96ae-42ba-8c5f-05c73ec60467-config-data" (OuterVolumeSpecName: "config-data") pod "ef00ff38-96ae-42ba-8c5f-05c73ec60467" (UID: "ef00ff38-96ae-42ba-8c5f-05c73ec60467"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:46:00 crc kubenswrapper[4594]: I1129 05:46:00.313655 4594 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef00ff38-96ae-42ba-8c5f-05c73ec60467-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 05:46:00 crc kubenswrapper[4594]: I1129 05:46:00.594292 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ef00ff38-96ae-42ba-8c5f-05c73ec60467","Type":"ContainerDied","Data":"cfffcbedf88fdaa485b1309f5b941f1aeb5e264d8fb7165246f8b0f80f423f1e"} Nov 29 05:46:00 crc kubenswrapper[4594]: I1129 05:46:00.594357 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 29 05:46:00 crc kubenswrapper[4594]: I1129 05:46:00.594385 4594 scope.go:117] "RemoveContainer" containerID="6a42b35d40824084353b7d5e1e21a712c35179bbaf94ceba9ed648d6e2afc1f2" Nov 29 05:46:00 crc kubenswrapper[4594]: I1129 05:46:00.641506 4594 scope.go:117] "RemoveContainer" containerID="3922cd998fc7437f66c830a0c2b287e9cf0c2540e2ea914865cd3ec535ffb488" Nov 29 05:46:00 crc kubenswrapper[4594]: I1129 05:46:00.641020 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 29 05:46:00 crc kubenswrapper[4594]: I1129 05:46:00.652287 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 29 05:46:00 crc kubenswrapper[4594]: I1129 05:46:00.660347 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 29 05:46:00 crc kubenswrapper[4594]: E1129 05:46:00.660843 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef00ff38-96ae-42ba-8c5f-05c73ec60467" containerName="ceilometer-notification-agent" Nov 29 05:46:00 crc kubenswrapper[4594]: I1129 05:46:00.660863 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef00ff38-96ae-42ba-8c5f-05c73ec60467" 
containerName="ceilometer-notification-agent" Nov 29 05:46:00 crc kubenswrapper[4594]: E1129 05:46:00.660880 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef00ff38-96ae-42ba-8c5f-05c73ec60467" containerName="sg-core" Nov 29 05:46:00 crc kubenswrapper[4594]: I1129 05:46:00.660886 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef00ff38-96ae-42ba-8c5f-05c73ec60467" containerName="sg-core" Nov 29 05:46:00 crc kubenswrapper[4594]: E1129 05:46:00.660905 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef00ff38-96ae-42ba-8c5f-05c73ec60467" containerName="ceilometer-central-agent" Nov 29 05:46:00 crc kubenswrapper[4594]: I1129 05:46:00.660911 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef00ff38-96ae-42ba-8c5f-05c73ec60467" containerName="ceilometer-central-agent" Nov 29 05:46:00 crc kubenswrapper[4594]: E1129 05:46:00.660936 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef00ff38-96ae-42ba-8c5f-05c73ec60467" containerName="proxy-httpd" Nov 29 05:46:00 crc kubenswrapper[4594]: I1129 05:46:00.660942 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef00ff38-96ae-42ba-8c5f-05c73ec60467" containerName="proxy-httpd" Nov 29 05:46:00 crc kubenswrapper[4594]: I1129 05:46:00.661155 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef00ff38-96ae-42ba-8c5f-05c73ec60467" containerName="proxy-httpd" Nov 29 05:46:00 crc kubenswrapper[4594]: I1129 05:46:00.661184 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef00ff38-96ae-42ba-8c5f-05c73ec60467" containerName="sg-core" Nov 29 05:46:00 crc kubenswrapper[4594]: I1129 05:46:00.661194 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef00ff38-96ae-42ba-8c5f-05c73ec60467" containerName="ceilometer-central-agent" Nov 29 05:46:00 crc kubenswrapper[4594]: I1129 05:46:00.661207 4594 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="ef00ff38-96ae-42ba-8c5f-05c73ec60467" containerName="ceilometer-notification-agent" Nov 29 05:46:00 crc kubenswrapper[4594]: I1129 05:46:00.663105 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 29 05:46:00 crc kubenswrapper[4594]: I1129 05:46:00.665761 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 29 05:46:00 crc kubenswrapper[4594]: I1129 05:46:00.666672 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 29 05:46:00 crc kubenswrapper[4594]: I1129 05:46:00.671442 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 29 05:46:00 crc kubenswrapper[4594]: I1129 05:46:00.678139 4594 scope.go:117] "RemoveContainer" containerID="2a50d6f15b24a6202f9345ab29196faeb824eab5b07fca25dbb884c0c2ae9572" Nov 29 05:46:00 crc kubenswrapper[4594]: I1129 05:46:00.829590 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bdeb9d7d-ae93-49e5-a3f2-128a4cc17353-scripts\") pod \"ceilometer-0\" (UID: \"bdeb9d7d-ae93-49e5-a3f2-128a4cc17353\") " pod="openstack/ceilometer-0" Nov 29 05:46:00 crc kubenswrapper[4594]: I1129 05:46:00.829878 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdeb9d7d-ae93-49e5-a3f2-128a4cc17353-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bdeb9d7d-ae93-49e5-a3f2-128a4cc17353\") " pod="openstack/ceilometer-0" Nov 29 05:46:00 crc kubenswrapper[4594]: I1129 05:46:00.830028 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jd9bh\" (UniqueName: \"kubernetes.io/projected/bdeb9d7d-ae93-49e5-a3f2-128a4cc17353-kube-api-access-jd9bh\") pod \"ceilometer-0\" (UID: 
\"bdeb9d7d-ae93-49e5-a3f2-128a4cc17353\") " pod="openstack/ceilometer-0" Nov 29 05:46:00 crc kubenswrapper[4594]: I1129 05:46:00.830192 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bdeb9d7d-ae93-49e5-a3f2-128a4cc17353-run-httpd\") pod \"ceilometer-0\" (UID: \"bdeb9d7d-ae93-49e5-a3f2-128a4cc17353\") " pod="openstack/ceilometer-0" Nov 29 05:46:00 crc kubenswrapper[4594]: I1129 05:46:00.830249 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bdeb9d7d-ae93-49e5-a3f2-128a4cc17353-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bdeb9d7d-ae93-49e5-a3f2-128a4cc17353\") " pod="openstack/ceilometer-0" Nov 29 05:46:00 crc kubenswrapper[4594]: I1129 05:46:00.830400 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdeb9d7d-ae93-49e5-a3f2-128a4cc17353-config-data\") pod \"ceilometer-0\" (UID: \"bdeb9d7d-ae93-49e5-a3f2-128a4cc17353\") " pod="openstack/ceilometer-0" Nov 29 05:46:00 crc kubenswrapper[4594]: I1129 05:46:00.830434 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bdeb9d7d-ae93-49e5-a3f2-128a4cc17353-log-httpd\") pod \"ceilometer-0\" (UID: \"bdeb9d7d-ae93-49e5-a3f2-128a4cc17353\") " pod="openstack/ceilometer-0" Nov 29 05:46:00 crc kubenswrapper[4594]: I1129 05:46:00.933228 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bdeb9d7d-ae93-49e5-a3f2-128a4cc17353-run-httpd\") pod \"ceilometer-0\" (UID: \"bdeb9d7d-ae93-49e5-a3f2-128a4cc17353\") " pod="openstack/ceilometer-0" Nov 29 05:46:00 crc kubenswrapper[4594]: I1129 05:46:00.933337 4594 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bdeb9d7d-ae93-49e5-a3f2-128a4cc17353-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bdeb9d7d-ae93-49e5-a3f2-128a4cc17353\") " pod="openstack/ceilometer-0" Nov 29 05:46:00 crc kubenswrapper[4594]: I1129 05:46:00.933426 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdeb9d7d-ae93-49e5-a3f2-128a4cc17353-config-data\") pod \"ceilometer-0\" (UID: \"bdeb9d7d-ae93-49e5-a3f2-128a4cc17353\") " pod="openstack/ceilometer-0" Nov 29 05:46:00 crc kubenswrapper[4594]: I1129 05:46:00.933481 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bdeb9d7d-ae93-49e5-a3f2-128a4cc17353-log-httpd\") pod \"ceilometer-0\" (UID: \"bdeb9d7d-ae93-49e5-a3f2-128a4cc17353\") " pod="openstack/ceilometer-0" Nov 29 05:46:00 crc kubenswrapper[4594]: I1129 05:46:00.933547 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bdeb9d7d-ae93-49e5-a3f2-128a4cc17353-scripts\") pod \"ceilometer-0\" (UID: \"bdeb9d7d-ae93-49e5-a3f2-128a4cc17353\") " pod="openstack/ceilometer-0" Nov 29 05:46:00 crc kubenswrapper[4594]: I1129 05:46:00.933749 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdeb9d7d-ae93-49e5-a3f2-128a4cc17353-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bdeb9d7d-ae93-49e5-a3f2-128a4cc17353\") " pod="openstack/ceilometer-0" Nov 29 05:46:00 crc kubenswrapper[4594]: I1129 05:46:00.933881 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jd9bh\" (UniqueName: \"kubernetes.io/projected/bdeb9d7d-ae93-49e5-a3f2-128a4cc17353-kube-api-access-jd9bh\") pod \"ceilometer-0\" (UID: 
\"bdeb9d7d-ae93-49e5-a3f2-128a4cc17353\") " pod="openstack/ceilometer-0" Nov 29 05:46:00 crc kubenswrapper[4594]: I1129 05:46:00.933996 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bdeb9d7d-ae93-49e5-a3f2-128a4cc17353-run-httpd\") pod \"ceilometer-0\" (UID: \"bdeb9d7d-ae93-49e5-a3f2-128a4cc17353\") " pod="openstack/ceilometer-0" Nov 29 05:46:00 crc kubenswrapper[4594]: I1129 05:46:00.934438 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bdeb9d7d-ae93-49e5-a3f2-128a4cc17353-log-httpd\") pod \"ceilometer-0\" (UID: \"bdeb9d7d-ae93-49e5-a3f2-128a4cc17353\") " pod="openstack/ceilometer-0" Nov 29 05:46:00 crc kubenswrapper[4594]: I1129 05:46:00.943910 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bdeb9d7d-ae93-49e5-a3f2-128a4cc17353-scripts\") pod \"ceilometer-0\" (UID: \"bdeb9d7d-ae93-49e5-a3f2-128a4cc17353\") " pod="openstack/ceilometer-0" Nov 29 05:46:00 crc kubenswrapper[4594]: I1129 05:46:00.949226 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bdeb9d7d-ae93-49e5-a3f2-128a4cc17353-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bdeb9d7d-ae93-49e5-a3f2-128a4cc17353\") " pod="openstack/ceilometer-0" Nov 29 05:46:00 crc kubenswrapper[4594]: I1129 05:46:00.950328 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdeb9d7d-ae93-49e5-a3f2-128a4cc17353-config-data\") pod \"ceilometer-0\" (UID: \"bdeb9d7d-ae93-49e5-a3f2-128a4cc17353\") " pod="openstack/ceilometer-0" Nov 29 05:46:00 crc kubenswrapper[4594]: I1129 05:46:00.949560 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/bdeb9d7d-ae93-49e5-a3f2-128a4cc17353-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bdeb9d7d-ae93-49e5-a3f2-128a4cc17353\") " pod="openstack/ceilometer-0" Nov 29 05:46:00 crc kubenswrapper[4594]: I1129 05:46:00.953895 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jd9bh\" (UniqueName: \"kubernetes.io/projected/bdeb9d7d-ae93-49e5-a3f2-128a4cc17353-kube-api-access-jd9bh\") pod \"ceilometer-0\" (UID: \"bdeb9d7d-ae93-49e5-a3f2-128a4cc17353\") " pod="openstack/ceilometer-0" Nov 29 05:46:01 crc kubenswrapper[4594]: I1129 05:46:01.001896 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 29 05:46:01 crc kubenswrapper[4594]: I1129 05:46:01.082135 4594 scope.go:117] "RemoveContainer" containerID="91ef0b975213ecd49a4cb8ca0772a78a05fca16694fee4cbd1819c22d2897038" Nov 29 05:46:01 crc kubenswrapper[4594]: I1129 05:46:01.387424 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7fb8849f48-9rr52" Nov 29 05:46:01 crc kubenswrapper[4594]: I1129 05:46:01.540174 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 29 05:46:01 crc kubenswrapper[4594]: W1129 05:46:01.541166 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbdeb9d7d_ae93_49e5_a3f2_128a4cc17353.slice/crio-6e456579ee9080aa5f16fbcea1e1244ad9f63f2e5ce1780879746d6ae436a970 WatchSource:0}: Error finding container 6e456579ee9080aa5f16fbcea1e1244ad9f63f2e5ce1780879746d6ae436a970: Status 404 returned error can't find the container with id 6e456579ee9080aa5f16fbcea1e1244ad9f63f2e5ce1780879746d6ae436a970 Nov 29 05:46:01 crc kubenswrapper[4594]: I1129 05:46:01.547860 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l7qgf\" (UniqueName: \"kubernetes.io/projected/e8d01e45-1d76-464f-93e6-965d65a055fc-kube-api-access-l7qgf\") pod \"e8d01e45-1d76-464f-93e6-965d65a055fc\" (UID: \"e8d01e45-1d76-464f-93e6-965d65a055fc\") " Nov 29 05:46:01 crc kubenswrapper[4594]: I1129 05:46:01.548913 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e8d01e45-1d76-464f-93e6-965d65a055fc-scripts\") pod \"e8d01e45-1d76-464f-93e6-965d65a055fc\" (UID: \"e8d01e45-1d76-464f-93e6-965d65a055fc\") " Nov 29 05:46:01 crc kubenswrapper[4594]: I1129 05:46:01.549094 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e8d01e45-1d76-464f-93e6-965d65a055fc-horizon-secret-key\") pod \"e8d01e45-1d76-464f-93e6-965d65a055fc\" (UID: \"e8d01e45-1d76-464f-93e6-965d65a055fc\") " Nov 29 05:46:01 crc kubenswrapper[4594]: I1129 05:46:01.549563 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/e8d01e45-1d76-464f-93e6-965d65a055fc-combined-ca-bundle\") pod \"e8d01e45-1d76-464f-93e6-965d65a055fc\" (UID: \"e8d01e45-1d76-464f-93e6-965d65a055fc\") " Nov 29 05:46:01 crc kubenswrapper[4594]: I1129 05:46:01.549795 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8d01e45-1d76-464f-93e6-965d65a055fc-horizon-tls-certs\") pod \"e8d01e45-1d76-464f-93e6-965d65a055fc\" (UID: \"e8d01e45-1d76-464f-93e6-965d65a055fc\") " Nov 29 05:46:01 crc kubenswrapper[4594]: I1129 05:46:01.549924 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e8d01e45-1d76-464f-93e6-965d65a055fc-config-data\") pod \"e8d01e45-1d76-464f-93e6-965d65a055fc\" (UID: \"e8d01e45-1d76-464f-93e6-965d65a055fc\") " Nov 29 05:46:01 crc kubenswrapper[4594]: I1129 05:46:01.550008 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8d01e45-1d76-464f-93e6-965d65a055fc-logs\") pod \"e8d01e45-1d76-464f-93e6-965d65a055fc\" (UID: \"e8d01e45-1d76-464f-93e6-965d65a055fc\") " Nov 29 05:46:01 crc kubenswrapper[4594]: I1129 05:46:01.550579 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8d01e45-1d76-464f-93e6-965d65a055fc-logs" (OuterVolumeSpecName: "logs") pod "e8d01e45-1d76-464f-93e6-965d65a055fc" (UID: "e8d01e45-1d76-464f-93e6-965d65a055fc"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 05:46:01 crc kubenswrapper[4594]: I1129 05:46:01.550902 4594 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8d01e45-1d76-464f-93e6-965d65a055fc-logs\") on node \"crc\" DevicePath \"\"" Nov 29 05:46:01 crc kubenswrapper[4594]: I1129 05:46:01.554923 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8d01e45-1d76-464f-93e6-965d65a055fc-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "e8d01e45-1d76-464f-93e6-965d65a055fc" (UID: "e8d01e45-1d76-464f-93e6-965d65a055fc"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:46:01 crc kubenswrapper[4594]: I1129 05:46:01.555046 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8d01e45-1d76-464f-93e6-965d65a055fc-kube-api-access-l7qgf" (OuterVolumeSpecName: "kube-api-access-l7qgf") pod "e8d01e45-1d76-464f-93e6-965d65a055fc" (UID: "e8d01e45-1d76-464f-93e6-965d65a055fc"). InnerVolumeSpecName "kube-api-access-l7qgf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:46:01 crc kubenswrapper[4594]: I1129 05:46:01.570651 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8d01e45-1d76-464f-93e6-965d65a055fc-config-data" (OuterVolumeSpecName: "config-data") pod "e8d01e45-1d76-464f-93e6-965d65a055fc" (UID: "e8d01e45-1d76-464f-93e6-965d65a055fc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:46:01 crc kubenswrapper[4594]: I1129 05:46:01.571268 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8d01e45-1d76-464f-93e6-965d65a055fc-scripts" (OuterVolumeSpecName: "scripts") pod "e8d01e45-1d76-464f-93e6-965d65a055fc" (UID: "e8d01e45-1d76-464f-93e6-965d65a055fc"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:46:01 crc kubenswrapper[4594]: I1129 05:46:01.582827 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8d01e45-1d76-464f-93e6-965d65a055fc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e8d01e45-1d76-464f-93e6-965d65a055fc" (UID: "e8d01e45-1d76-464f-93e6-965d65a055fc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:46:01 crc kubenswrapper[4594]: I1129 05:46:01.601962 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8d01e45-1d76-464f-93e6-965d65a055fc-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "e8d01e45-1d76-464f-93e6-965d65a055fc" (UID: "e8d01e45-1d76-464f-93e6-965d65a055fc"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:46:01 crc kubenswrapper[4594]: I1129 05:46:01.611602 4594 generic.go:334] "Generic (PLEG): container finished" podID="e8d01e45-1d76-464f-93e6-965d65a055fc" containerID="5ac36bd5450c58ccb44f66fa2fc8a56f6b7dd613a59651336f4d79e162161576" exitCode=137 Nov 29 05:46:01 crc kubenswrapper[4594]: I1129 05:46:01.611673 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7fb8849f48-9rr52" Nov 29 05:46:01 crc kubenswrapper[4594]: I1129 05:46:01.611683 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7fb8849f48-9rr52" event={"ID":"e8d01e45-1d76-464f-93e6-965d65a055fc","Type":"ContainerDied","Data":"5ac36bd5450c58ccb44f66fa2fc8a56f6b7dd613a59651336f4d79e162161576"} Nov 29 05:46:01 crc kubenswrapper[4594]: I1129 05:46:01.611727 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7fb8849f48-9rr52" event={"ID":"e8d01e45-1d76-464f-93e6-965d65a055fc","Type":"ContainerDied","Data":"cbea952840713cbdc824cef609595cd34ec189700a5184841e64c76a84b7656b"} Nov 29 05:46:01 crc kubenswrapper[4594]: I1129 05:46:01.611760 4594 scope.go:117] "RemoveContainer" containerID="0df5f632c7bce5417787cef5afd154fcfdf04ab056713da4bdcdd13461f932ca" Nov 29 05:46:01 crc kubenswrapper[4594]: I1129 05:46:01.613324 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bdeb9d7d-ae93-49e5-a3f2-128a4cc17353","Type":"ContainerStarted","Data":"6e456579ee9080aa5f16fbcea1e1244ad9f63f2e5ce1780879746d6ae436a970"} Nov 29 05:46:01 crc kubenswrapper[4594]: I1129 05:46:01.656059 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l7qgf\" (UniqueName: \"kubernetes.io/projected/e8d01e45-1d76-464f-93e6-965d65a055fc-kube-api-access-l7qgf\") on node \"crc\" DevicePath \"\"" Nov 29 05:46:01 crc kubenswrapper[4594]: I1129 05:46:01.656115 4594 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e8d01e45-1d76-464f-93e6-965d65a055fc-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 05:46:01 crc kubenswrapper[4594]: I1129 05:46:01.656132 4594 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e8d01e45-1d76-464f-93e6-965d65a055fc-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Nov 29 05:46:01 crc 
kubenswrapper[4594]: I1129 05:46:01.656141 4594 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8d01e45-1d76-464f-93e6-965d65a055fc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 05:46:01 crc kubenswrapper[4594]: I1129 05:46:01.656150 4594 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8d01e45-1d76-464f-93e6-965d65a055fc-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 29 05:46:01 crc kubenswrapper[4594]: I1129 05:46:01.656157 4594 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e8d01e45-1d76-464f-93e6-965d65a055fc-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 05:46:01 crc kubenswrapper[4594]: I1129 05:46:01.681557 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7fb8849f48-9rr52"] Nov 29 05:46:01 crc kubenswrapper[4594]: I1129 05:46:01.697318 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7fb8849f48-9rr52"] Nov 29 05:46:01 crc kubenswrapper[4594]: I1129 05:46:01.819893 4594 scope.go:117] "RemoveContainer" containerID="5ac36bd5450c58ccb44f66fa2fc8a56f6b7dd613a59651336f4d79e162161576" Nov 29 05:46:02 crc kubenswrapper[4594]: I1129 05:46:02.098459 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8d01e45-1d76-464f-93e6-965d65a055fc" path="/var/lib/kubelet/pods/e8d01e45-1d76-464f-93e6-965d65a055fc/volumes" Nov 29 05:46:02 crc kubenswrapper[4594]: I1129 05:46:02.099647 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef00ff38-96ae-42ba-8c5f-05c73ec60467" path="/var/lib/kubelet/pods/ef00ff38-96ae-42ba-8c5f-05c73ec60467/volumes" Nov 29 05:46:04 crc kubenswrapper[4594]: I1129 05:46:04.325875 4594 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/watcher-decision-engine-0" Nov 29 05:46:04 crc 
kubenswrapper[4594]: I1129 05:46:04.326363 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Nov 29 05:46:04 crc kubenswrapper[4594]: I1129 05:46:04.327673 4594 scope.go:117] "RemoveContainer" containerID="327ef444e09f45cf2a239393e3df4bf5d931562ffdda0035f22957079e46a5a5" Nov 29 05:46:04 crc kubenswrapper[4594]: I1129 05:46:04.552801 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 29 05:46:04 crc kubenswrapper[4594]: I1129 05:46:04.951150 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-dmbbc"] Nov 29 05:46:04 crc kubenswrapper[4594]: E1129 05:46:04.957060 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8d01e45-1d76-464f-93e6-965d65a055fc" containerName="horizon-log" Nov 29 05:46:04 crc kubenswrapper[4594]: I1129 05:46:04.957185 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8d01e45-1d76-464f-93e6-965d65a055fc" containerName="horizon-log" Nov 29 05:46:04 crc kubenswrapper[4594]: E1129 05:46:04.957297 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8d01e45-1d76-464f-93e6-965d65a055fc" containerName="horizon" Nov 29 05:46:04 crc kubenswrapper[4594]: I1129 05:46:04.957362 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8d01e45-1d76-464f-93e6-965d65a055fc" containerName="horizon" Nov 29 05:46:04 crc kubenswrapper[4594]: I1129 05:46:04.981278 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8d01e45-1d76-464f-93e6-965d65a055fc" containerName="horizon-log" Nov 29 05:46:04 crc kubenswrapper[4594]: I1129 05:46:04.981345 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8d01e45-1d76-464f-93e6-965d65a055fc" containerName="horizon" Nov 29 05:46:04 crc kubenswrapper[4594]: I1129 05:46:04.982477 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-dmbbc"] Nov 29 05:46:04 crc 
kubenswrapper[4594]: I1129 05:46:04.982602 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-dmbbc" Nov 29 05:46:05 crc kubenswrapper[4594]: I1129 05:46:05.043973 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxjd5\" (UniqueName: \"kubernetes.io/projected/3371a852-7321-4c19-896c-a31265ebf283-kube-api-access-cxjd5\") pod \"nova-api-db-create-dmbbc\" (UID: \"3371a852-7321-4c19-896c-a31265ebf283\") " pod="openstack/nova-api-db-create-dmbbc" Nov 29 05:46:05 crc kubenswrapper[4594]: I1129 05:46:05.044160 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3371a852-7321-4c19-896c-a31265ebf283-operator-scripts\") pod \"nova-api-db-create-dmbbc\" (UID: \"3371a852-7321-4c19-896c-a31265ebf283\") " pod="openstack/nova-api-db-create-dmbbc" Nov 29 05:46:05 crc kubenswrapper[4594]: I1129 05:46:05.065025 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-7b8a-account-create-update-r8lrn"] Nov 29 05:46:05 crc kubenswrapper[4594]: I1129 05:46:05.066738 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-7b8a-account-create-update-r8lrn" Nov 29 05:46:05 crc kubenswrapper[4594]: I1129 05:46:05.078950 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Nov 29 05:46:05 crc kubenswrapper[4594]: I1129 05:46:05.093510 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-2btnd"] Nov 29 05:46:05 crc kubenswrapper[4594]: I1129 05:46:05.095982 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-2btnd" Nov 29 05:46:05 crc kubenswrapper[4594]: I1129 05:46:05.105093 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-2btnd"] Nov 29 05:46:05 crc kubenswrapper[4594]: I1129 05:46:05.122752 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-7b8a-account-create-update-r8lrn"] Nov 29 05:46:05 crc kubenswrapper[4594]: I1129 05:46:05.154584 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f3d7959-7013-4279-8173-077ed6cefbda-operator-scripts\") pod \"nova-cell0-db-create-2btnd\" (UID: \"5f3d7959-7013-4279-8173-077ed6cefbda\") " pod="openstack/nova-cell0-db-create-2btnd" Nov 29 05:46:05 crc kubenswrapper[4594]: I1129 05:46:05.154860 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sz4qr\" (UniqueName: \"kubernetes.io/projected/bbfcad00-10e0-4501-b728-449a2a2f03b0-kube-api-access-sz4qr\") pod \"nova-api-7b8a-account-create-update-r8lrn\" (UID: \"bbfcad00-10e0-4501-b728-449a2a2f03b0\") " pod="openstack/nova-api-7b8a-account-create-update-r8lrn" Nov 29 05:46:05 crc kubenswrapper[4594]: I1129 05:46:05.154992 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxjd5\" (UniqueName: \"kubernetes.io/projected/3371a852-7321-4c19-896c-a31265ebf283-kube-api-access-cxjd5\") pod \"nova-api-db-create-dmbbc\" (UID: \"3371a852-7321-4c19-896c-a31265ebf283\") " pod="openstack/nova-api-db-create-dmbbc" Nov 29 05:46:05 crc kubenswrapper[4594]: I1129 05:46:05.155136 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3371a852-7321-4c19-896c-a31265ebf283-operator-scripts\") pod \"nova-api-db-create-dmbbc\" (UID: \"3371a852-7321-4c19-896c-a31265ebf283\") 
" pod="openstack/nova-api-db-create-dmbbc" Nov 29 05:46:05 crc kubenswrapper[4594]: I1129 05:46:05.155291 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bbfcad00-10e0-4501-b728-449a2a2f03b0-operator-scripts\") pod \"nova-api-7b8a-account-create-update-r8lrn\" (UID: \"bbfcad00-10e0-4501-b728-449a2a2f03b0\") " pod="openstack/nova-api-7b8a-account-create-update-r8lrn" Nov 29 05:46:05 crc kubenswrapper[4594]: I1129 05:46:05.155382 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxjc4\" (UniqueName: \"kubernetes.io/projected/5f3d7959-7013-4279-8173-077ed6cefbda-kube-api-access-vxjc4\") pod \"nova-cell0-db-create-2btnd\" (UID: \"5f3d7959-7013-4279-8173-077ed6cefbda\") " pod="openstack/nova-cell0-db-create-2btnd" Nov 29 05:46:05 crc kubenswrapper[4594]: I1129 05:46:05.156367 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3371a852-7321-4c19-896c-a31265ebf283-operator-scripts\") pod \"nova-api-db-create-dmbbc\" (UID: \"3371a852-7321-4c19-896c-a31265ebf283\") " pod="openstack/nova-api-db-create-dmbbc" Nov 29 05:46:05 crc kubenswrapper[4594]: I1129 05:46:05.176472 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxjd5\" (UniqueName: \"kubernetes.io/projected/3371a852-7321-4c19-896c-a31265ebf283-kube-api-access-cxjd5\") pod \"nova-api-db-create-dmbbc\" (UID: \"3371a852-7321-4c19-896c-a31265ebf283\") " pod="openstack/nova-api-db-create-dmbbc" Nov 29 05:46:05 crc kubenswrapper[4594]: I1129 05:46:05.258029 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bbfcad00-10e0-4501-b728-449a2a2f03b0-operator-scripts\") pod \"nova-api-7b8a-account-create-update-r8lrn\" (UID: 
\"bbfcad00-10e0-4501-b728-449a2a2f03b0\") " pod="openstack/nova-api-7b8a-account-create-update-r8lrn" Nov 29 05:46:05 crc kubenswrapper[4594]: I1129 05:46:05.258414 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxjc4\" (UniqueName: \"kubernetes.io/projected/5f3d7959-7013-4279-8173-077ed6cefbda-kube-api-access-vxjc4\") pod \"nova-cell0-db-create-2btnd\" (UID: \"5f3d7959-7013-4279-8173-077ed6cefbda\") " pod="openstack/nova-cell0-db-create-2btnd" Nov 29 05:46:05 crc kubenswrapper[4594]: I1129 05:46:05.258531 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f3d7959-7013-4279-8173-077ed6cefbda-operator-scripts\") pod \"nova-cell0-db-create-2btnd\" (UID: \"5f3d7959-7013-4279-8173-077ed6cefbda\") " pod="openstack/nova-cell0-db-create-2btnd" Nov 29 05:46:05 crc kubenswrapper[4594]: I1129 05:46:05.258627 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sz4qr\" (UniqueName: \"kubernetes.io/projected/bbfcad00-10e0-4501-b728-449a2a2f03b0-kube-api-access-sz4qr\") pod \"nova-api-7b8a-account-create-update-r8lrn\" (UID: \"bbfcad00-10e0-4501-b728-449a2a2f03b0\") " pod="openstack/nova-api-7b8a-account-create-update-r8lrn" Nov 29 05:46:05 crc kubenswrapper[4594]: I1129 05:46:05.259181 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bbfcad00-10e0-4501-b728-449a2a2f03b0-operator-scripts\") pod \"nova-api-7b8a-account-create-update-r8lrn\" (UID: \"bbfcad00-10e0-4501-b728-449a2a2f03b0\") " pod="openstack/nova-api-7b8a-account-create-update-r8lrn" Nov 29 05:46:05 crc kubenswrapper[4594]: I1129 05:46:05.259705 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f3d7959-7013-4279-8173-077ed6cefbda-operator-scripts\") pod 
\"nova-cell0-db-create-2btnd\" (UID: \"5f3d7959-7013-4279-8173-077ed6cefbda\") " pod="openstack/nova-cell0-db-create-2btnd" Nov 29 05:46:05 crc kubenswrapper[4594]: I1129 05:46:05.268092 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-pj4xh"] Nov 29 05:46:05 crc kubenswrapper[4594]: I1129 05:46:05.269601 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-pj4xh" Nov 29 05:46:05 crc kubenswrapper[4594]: I1129 05:46:05.281263 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-pj4xh"] Nov 29 05:46:05 crc kubenswrapper[4594]: I1129 05:46:05.282973 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sz4qr\" (UniqueName: \"kubernetes.io/projected/bbfcad00-10e0-4501-b728-449a2a2f03b0-kube-api-access-sz4qr\") pod \"nova-api-7b8a-account-create-update-r8lrn\" (UID: \"bbfcad00-10e0-4501-b728-449a2a2f03b0\") " pod="openstack/nova-api-7b8a-account-create-update-r8lrn" Nov 29 05:46:05 crc kubenswrapper[4594]: I1129 05:46:05.288605 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxjc4\" (UniqueName: \"kubernetes.io/projected/5f3d7959-7013-4279-8173-077ed6cefbda-kube-api-access-vxjc4\") pod \"nova-cell0-db-create-2btnd\" (UID: \"5f3d7959-7013-4279-8173-077ed6cefbda\") " pod="openstack/nova-cell0-db-create-2btnd" Nov 29 05:46:05 crc kubenswrapper[4594]: I1129 05:46:05.295354 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-b707-account-create-update-x94hq"] Nov 29 05:46:05 crc kubenswrapper[4594]: I1129 05:46:05.297049 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-b707-account-create-update-x94hq" Nov 29 05:46:05 crc kubenswrapper[4594]: I1129 05:46:05.298407 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Nov 29 05:46:05 crc kubenswrapper[4594]: I1129 05:46:05.307079 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-b707-account-create-update-x94hq"] Nov 29 05:46:05 crc kubenswrapper[4594]: I1129 05:46:05.308889 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-dmbbc" Nov 29 05:46:05 crc kubenswrapper[4594]: I1129 05:46:05.359356 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92b184e0-f52c-47d3-8488-b9e8e8552b1c-operator-scripts\") pod \"nova-cell1-db-create-pj4xh\" (UID: \"92b184e0-f52c-47d3-8488-b9e8e8552b1c\") " pod="openstack/nova-cell1-db-create-pj4xh" Nov 29 05:46:05 crc kubenswrapper[4594]: I1129 05:46:05.359434 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2abc53c5-05f4-457c-be49-c34e962a9522-operator-scripts\") pod \"nova-cell0-b707-account-create-update-x94hq\" (UID: \"2abc53c5-05f4-457c-be49-c34e962a9522\") " pod="openstack/nova-cell0-b707-account-create-update-x94hq" Nov 29 05:46:05 crc kubenswrapper[4594]: I1129 05:46:05.359497 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fq7gd\" (UniqueName: \"kubernetes.io/projected/92b184e0-f52c-47d3-8488-b9e8e8552b1c-kube-api-access-fq7gd\") pod \"nova-cell1-db-create-pj4xh\" (UID: \"92b184e0-f52c-47d3-8488-b9e8e8552b1c\") " pod="openstack/nova-cell1-db-create-pj4xh" Nov 29 05:46:05 crc kubenswrapper[4594]: I1129 05:46:05.359633 4594 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26rb2\" (UniqueName: \"kubernetes.io/projected/2abc53c5-05f4-457c-be49-c34e962a9522-kube-api-access-26rb2\") pod \"nova-cell0-b707-account-create-update-x94hq\" (UID: \"2abc53c5-05f4-457c-be49-c34e962a9522\") " pod="openstack/nova-cell0-b707-account-create-update-x94hq" Nov 29 05:46:05 crc kubenswrapper[4594]: I1129 05:46:05.404215 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-7b8a-account-create-update-r8lrn" Nov 29 05:46:05 crc kubenswrapper[4594]: I1129 05:46:05.416792 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-2btnd" Nov 29 05:46:05 crc kubenswrapper[4594]: I1129 05:46:05.457231 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-6990-account-create-update-w9h2r"] Nov 29 05:46:05 crc kubenswrapper[4594]: I1129 05:46:05.458769 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-6990-account-create-update-w9h2r" Nov 29 05:46:05 crc kubenswrapper[4594]: I1129 05:46:05.461173 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Nov 29 05:46:05 crc kubenswrapper[4594]: I1129 05:46:05.461674 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2abc53c5-05f4-457c-be49-c34e962a9522-operator-scripts\") pod \"nova-cell0-b707-account-create-update-x94hq\" (UID: \"2abc53c5-05f4-457c-be49-c34e962a9522\") " pod="openstack/nova-cell0-b707-account-create-update-x94hq" Nov 29 05:46:05 crc kubenswrapper[4594]: I1129 05:46:05.461814 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fq7gd\" (UniqueName: \"kubernetes.io/projected/92b184e0-f52c-47d3-8488-b9e8e8552b1c-kube-api-access-fq7gd\") pod \"nova-cell1-db-create-pj4xh\" (UID: \"92b184e0-f52c-47d3-8488-b9e8e8552b1c\") " pod="openstack/nova-cell1-db-create-pj4xh" Nov 29 05:46:05 crc kubenswrapper[4594]: I1129 05:46:05.461861 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26rb2\" (UniqueName: \"kubernetes.io/projected/2abc53c5-05f4-457c-be49-c34e962a9522-kube-api-access-26rb2\") pod \"nova-cell0-b707-account-create-update-x94hq\" (UID: \"2abc53c5-05f4-457c-be49-c34e962a9522\") " pod="openstack/nova-cell0-b707-account-create-update-x94hq" Nov 29 05:46:05 crc kubenswrapper[4594]: I1129 05:46:05.462063 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92b184e0-f52c-47d3-8488-b9e8e8552b1c-operator-scripts\") pod \"nova-cell1-db-create-pj4xh\" (UID: \"92b184e0-f52c-47d3-8488-b9e8e8552b1c\") " pod="openstack/nova-cell1-db-create-pj4xh" Nov 29 05:46:05 crc kubenswrapper[4594]: I1129 05:46:05.462922 4594 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2abc53c5-05f4-457c-be49-c34e962a9522-operator-scripts\") pod \"nova-cell0-b707-account-create-update-x94hq\" (UID: \"2abc53c5-05f4-457c-be49-c34e962a9522\") " pod="openstack/nova-cell0-b707-account-create-update-x94hq" Nov 29 05:46:05 crc kubenswrapper[4594]: I1129 05:46:05.463035 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92b184e0-f52c-47d3-8488-b9e8e8552b1c-operator-scripts\") pod \"nova-cell1-db-create-pj4xh\" (UID: \"92b184e0-f52c-47d3-8488-b9e8e8552b1c\") " pod="openstack/nova-cell1-db-create-pj4xh" Nov 29 05:46:05 crc kubenswrapper[4594]: I1129 05:46:05.467963 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-6990-account-create-update-w9h2r"] Nov 29 05:46:05 crc kubenswrapper[4594]: I1129 05:46:05.484625 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26rb2\" (UniqueName: \"kubernetes.io/projected/2abc53c5-05f4-457c-be49-c34e962a9522-kube-api-access-26rb2\") pod \"nova-cell0-b707-account-create-update-x94hq\" (UID: \"2abc53c5-05f4-457c-be49-c34e962a9522\") " pod="openstack/nova-cell0-b707-account-create-update-x94hq" Nov 29 05:46:05 crc kubenswrapper[4594]: I1129 05:46:05.487373 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fq7gd\" (UniqueName: \"kubernetes.io/projected/92b184e0-f52c-47d3-8488-b9e8e8552b1c-kube-api-access-fq7gd\") pod \"nova-cell1-db-create-pj4xh\" (UID: \"92b184e0-f52c-47d3-8488-b9e8e8552b1c\") " pod="openstack/nova-cell1-db-create-pj4xh" Nov 29 05:46:05 crc kubenswrapper[4594]: I1129 05:46:05.564190 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ada5c910-bb44-43e9-9589-649f2226ebab-operator-scripts\") pod 
\"nova-cell1-6990-account-create-update-w9h2r\" (UID: \"ada5c910-bb44-43e9-9589-649f2226ebab\") " pod="openstack/nova-cell1-6990-account-create-update-w9h2r" Nov 29 05:46:05 crc kubenswrapper[4594]: I1129 05:46:05.564498 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdnlh\" (UniqueName: \"kubernetes.io/projected/ada5c910-bb44-43e9-9589-649f2226ebab-kube-api-access-mdnlh\") pod \"nova-cell1-6990-account-create-update-w9h2r\" (UID: \"ada5c910-bb44-43e9-9589-649f2226ebab\") " pod="openstack/nova-cell1-6990-account-create-update-w9h2r" Nov 29 05:46:05 crc kubenswrapper[4594]: I1129 05:46:05.665574 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-pj4xh" Nov 29 05:46:05 crc kubenswrapper[4594]: I1129 05:46:05.667878 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ada5c910-bb44-43e9-9589-649f2226ebab-operator-scripts\") pod \"nova-cell1-6990-account-create-update-w9h2r\" (UID: \"ada5c910-bb44-43e9-9589-649f2226ebab\") " pod="openstack/nova-cell1-6990-account-create-update-w9h2r" Nov 29 05:46:05 crc kubenswrapper[4594]: I1129 05:46:05.668033 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdnlh\" (UniqueName: \"kubernetes.io/projected/ada5c910-bb44-43e9-9589-649f2226ebab-kube-api-access-mdnlh\") pod \"nova-cell1-6990-account-create-update-w9h2r\" (UID: \"ada5c910-bb44-43e9-9589-649f2226ebab\") " pod="openstack/nova-cell1-6990-account-create-update-w9h2r" Nov 29 05:46:05 crc kubenswrapper[4594]: I1129 05:46:05.668616 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ada5c910-bb44-43e9-9589-649f2226ebab-operator-scripts\") pod \"nova-cell1-6990-account-create-update-w9h2r\" (UID: 
\"ada5c910-bb44-43e9-9589-649f2226ebab\") " pod="openstack/nova-cell1-6990-account-create-update-w9h2r" Nov 29 05:46:05 crc kubenswrapper[4594]: I1129 05:46:05.673821 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-b707-account-create-update-x94hq" Nov 29 05:46:05 crc kubenswrapper[4594]: I1129 05:46:05.680669 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdnlh\" (UniqueName: \"kubernetes.io/projected/ada5c910-bb44-43e9-9589-649f2226ebab-kube-api-access-mdnlh\") pod \"nova-cell1-6990-account-create-update-w9h2r\" (UID: \"ada5c910-bb44-43e9-9589-649f2226ebab\") " pod="openstack/nova-cell1-6990-account-create-update-w9h2r" Nov 29 05:46:05 crc kubenswrapper[4594]: I1129 05:46:05.776038 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-6990-account-create-update-w9h2r" Nov 29 05:46:06 crc kubenswrapper[4594]: I1129 05:46:06.778325 4594 scope.go:117] "RemoveContainer" containerID="0df5f632c7bce5417787cef5afd154fcfdf04ab056713da4bdcdd13461f932ca" Nov 29 05:46:06 crc kubenswrapper[4594]: E1129 05:46:06.779719 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0df5f632c7bce5417787cef5afd154fcfdf04ab056713da4bdcdd13461f932ca\": container with ID starting with 0df5f632c7bce5417787cef5afd154fcfdf04ab056713da4bdcdd13461f932ca not found: ID does not exist" containerID="0df5f632c7bce5417787cef5afd154fcfdf04ab056713da4bdcdd13461f932ca" Nov 29 05:46:06 crc kubenswrapper[4594]: I1129 05:46:06.779773 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0df5f632c7bce5417787cef5afd154fcfdf04ab056713da4bdcdd13461f932ca"} err="failed to get container status \"0df5f632c7bce5417787cef5afd154fcfdf04ab056713da4bdcdd13461f932ca\": rpc error: code = NotFound desc = could not find container 
\"0df5f632c7bce5417787cef5afd154fcfdf04ab056713da4bdcdd13461f932ca\": container with ID starting with 0df5f632c7bce5417787cef5afd154fcfdf04ab056713da4bdcdd13461f932ca not found: ID does not exist" Nov 29 05:46:06 crc kubenswrapper[4594]: I1129 05:46:06.779818 4594 scope.go:117] "RemoveContainer" containerID="5ac36bd5450c58ccb44f66fa2fc8a56f6b7dd613a59651336f4d79e162161576" Nov 29 05:46:06 crc kubenswrapper[4594]: E1129 05:46:06.780138 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ac36bd5450c58ccb44f66fa2fc8a56f6b7dd613a59651336f4d79e162161576\": container with ID starting with 5ac36bd5450c58ccb44f66fa2fc8a56f6b7dd613a59651336f4d79e162161576 not found: ID does not exist" containerID="5ac36bd5450c58ccb44f66fa2fc8a56f6b7dd613a59651336f4d79e162161576" Nov 29 05:46:06 crc kubenswrapper[4594]: I1129 05:46:06.780177 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ac36bd5450c58ccb44f66fa2fc8a56f6b7dd613a59651336f4d79e162161576"} err="failed to get container status \"5ac36bd5450c58ccb44f66fa2fc8a56f6b7dd613a59651336f4d79e162161576\": rpc error: code = NotFound desc = could not find container \"5ac36bd5450c58ccb44f66fa2fc8a56f6b7dd613a59651336f4d79e162161576\": container with ID starting with 5ac36bd5450c58ccb44f66fa2fc8a56f6b7dd613a59651336f4d79e162161576 not found: ID does not exist" Nov 29 05:46:07 crc kubenswrapper[4594]: I1129 05:46:07.294450 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-7b8a-account-create-update-r8lrn"] Nov 29 05:46:07 crc kubenswrapper[4594]: I1129 05:46:07.368361 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-7d689f55f9-c4bt7" Nov 29 05:46:07 crc kubenswrapper[4594]: I1129 05:46:07.370692 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-7d689f55f9-c4bt7" Nov 29 05:46:07 crc 
kubenswrapper[4594]: I1129 05:46:07.484410 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-2btnd"] Nov 29 05:46:07 crc kubenswrapper[4594]: I1129 05:46:07.499689 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-6990-account-create-update-w9h2r"] Nov 29 05:46:07 crc kubenswrapper[4594]: I1129 05:46:07.650600 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-b707-account-create-update-x94hq"] Nov 29 05:46:07 crc kubenswrapper[4594]: I1129 05:46:07.668698 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-pj4xh"] Nov 29 05:46:07 crc kubenswrapper[4594]: I1129 05:46:07.697628 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-2btnd" event={"ID":"5f3d7959-7013-4279-8173-077ed6cefbda","Type":"ContainerStarted","Data":"4f5ba5d3e446ebf2f2b1334764f72d80a4e8907b61e38444cd339bc9f681ab5a"} Nov 29 05:46:07 crc kubenswrapper[4594]: I1129 05:46:07.709844 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-6990-account-create-update-w9h2r" event={"ID":"ada5c910-bb44-43e9-9589-649f2226ebab","Type":"ContainerStarted","Data":"a641bb97477180bb25c030ce487677eb690faf0269627e27560ca25e359cf991"} Nov 29 05:46:07 crc kubenswrapper[4594]: I1129 05:46:07.717335 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-b707-account-create-update-x94hq" event={"ID":"2abc53c5-05f4-457c-be49-c34e962a9522","Type":"ContainerStarted","Data":"b27b179c36ec661b3f54dbca2dec6ecf54ff5936be342dac71311cb2e27da215"} Nov 29 05:46:07 crc kubenswrapper[4594]: I1129 05:46:07.721955 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bdeb9d7d-ae93-49e5-a3f2-128a4cc17353","Type":"ContainerStarted","Data":"88fb822152d931be0cec289aa8482d7f3c67e5ce6aad97d8be04955120761479"} Nov 29 05:46:07 crc kubenswrapper[4594]: I1129 05:46:07.726365 
4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"3d60aa94-3720-43b5-a5de-ca30dd1b63b2","Type":"ContainerStarted","Data":"3629936ba22e800f31489d08bd273609b38199d12a0e515a370b697d753a24b2"} Nov 29 05:46:07 crc kubenswrapper[4594]: I1129 05:46:07.727962 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"3629612b-cfc5-42bc-8584-4abc21ce4b3f","Type":"ContainerStarted","Data":"bfd5b3b8d296d7d8888742544cc8a49abcb5e9562275b17f477ce15efff1c825"} Nov 29 05:46:07 crc kubenswrapper[4594]: I1129 05:46:07.732503 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-7b8a-account-create-update-r8lrn" event={"ID":"bbfcad00-10e0-4501-b728-449a2a2f03b0","Type":"ContainerStarted","Data":"276a56701b377bfceb696b0d8e95eaa28edb6a9e217c8abd01583090c35129c8"} Nov 29 05:46:07 crc kubenswrapper[4594]: I1129 05:46:07.732536 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-7b8a-account-create-update-r8lrn" event={"ID":"bbfcad00-10e0-4501-b728-449a2a2f03b0","Type":"ContainerStarted","Data":"d4888b143f7f8f781618dbf15a3c945977bec7ad3b4f18467fe8c26c46638ab6"} Nov 29 05:46:07 crc kubenswrapper[4594]: I1129 05:46:07.764514 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-dmbbc"] Nov 29 05:46:07 crc kubenswrapper[4594]: I1129 05:46:07.771900 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-7b8a-account-create-update-r8lrn" podStartSLOduration=2.7718770360000002 podStartE2EDuration="2.771877036s" podCreationTimestamp="2025-11-29 05:46:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:46:07.756737616 +0000 UTC m=+1091.997246836" watchObservedRunningTime="2025-11-29 05:46:07.771877036 +0000 UTC m=+1092.012386246" Nov 29 05:46:07 crc kubenswrapper[4594]: 
I1129 05:46:07.782469 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.215551106 podStartE2EDuration="14.782458944s" podCreationTimestamp="2025-11-29 05:45:53 +0000 UTC" firstStartedPulling="2025-11-29 05:45:54.352564104 +0000 UTC m=+1078.593073324" lastFinishedPulling="2025-11-29 05:46:06.919471942 +0000 UTC m=+1091.159981162" observedRunningTime="2025-11-29 05:46:07.766522795 +0000 UTC m=+1092.007032014" watchObservedRunningTime="2025-11-29 05:46:07.782458944 +0000 UTC m=+1092.022968164" Nov 29 05:46:08 crc kubenswrapper[4594]: I1129 05:46:08.745347 4594 generic.go:334] "Generic (PLEG): container finished" podID="3371a852-7321-4c19-896c-a31265ebf283" containerID="15a7df3933c5684260924dc998b66227db0e746b52712f9fd0112ed9bceff343" exitCode=0 Nov 29 05:46:08 crc kubenswrapper[4594]: I1129 05:46:08.745391 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-dmbbc" event={"ID":"3371a852-7321-4c19-896c-a31265ebf283","Type":"ContainerDied","Data":"15a7df3933c5684260924dc998b66227db0e746b52712f9fd0112ed9bceff343"} Nov 29 05:46:08 crc kubenswrapper[4594]: I1129 05:46:08.746030 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-dmbbc" event={"ID":"3371a852-7321-4c19-896c-a31265ebf283","Type":"ContainerStarted","Data":"e43161f6527cc43f14ee40aebed41e15ff2773ecf3ba6fe240b7d4ba71820e88"} Nov 29 05:46:08 crc kubenswrapper[4594]: I1129 05:46:08.748385 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bdeb9d7d-ae93-49e5-a3f2-128a4cc17353","Type":"ContainerStarted","Data":"3ad7251e66ba279f9ba73e88cc0726bdeae4e9f177beced32cda601809e8f140"} Nov 29 05:46:08 crc kubenswrapper[4594]: I1129 05:46:08.750285 4594 generic.go:334] "Generic (PLEG): container finished" podID="5f3d7959-7013-4279-8173-077ed6cefbda" containerID="8d717d82a2740ec4060ca75c87c2a9ea54822b82b1ca4a0b1e192b27dbd9d9a1" 
exitCode=0 Nov 29 05:46:08 crc kubenswrapper[4594]: I1129 05:46:08.750322 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-2btnd" event={"ID":"5f3d7959-7013-4279-8173-077ed6cefbda","Type":"ContainerDied","Data":"8d717d82a2740ec4060ca75c87c2a9ea54822b82b1ca4a0b1e192b27dbd9d9a1"} Nov 29 05:46:08 crc kubenswrapper[4594]: I1129 05:46:08.759296 4594 generic.go:334] "Generic (PLEG): container finished" podID="92b184e0-f52c-47d3-8488-b9e8e8552b1c" containerID="edac931bbed8944e474eacc46259cf62ee3a21806a5578f51b5bdcc0546efb57" exitCode=0 Nov 29 05:46:08 crc kubenswrapper[4594]: I1129 05:46:08.759379 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-pj4xh" event={"ID":"92b184e0-f52c-47d3-8488-b9e8e8552b1c","Type":"ContainerDied","Data":"edac931bbed8944e474eacc46259cf62ee3a21806a5578f51b5bdcc0546efb57"} Nov 29 05:46:08 crc kubenswrapper[4594]: I1129 05:46:08.759408 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-pj4xh" event={"ID":"92b184e0-f52c-47d3-8488-b9e8e8552b1c","Type":"ContainerStarted","Data":"6da9781025d7e628f955cb90657769c41e4abf9b4387cf18eba1f619e3be878f"} Nov 29 05:46:08 crc kubenswrapper[4594]: I1129 05:46:08.770854 4594 generic.go:334] "Generic (PLEG): container finished" podID="ada5c910-bb44-43e9-9589-649f2226ebab" containerID="f27d8c0edefcd2ff26a6c5af4dd9f23d94369e78e08cd9af2e8f886dfd38bad7" exitCode=0 Nov 29 05:46:08 crc kubenswrapper[4594]: I1129 05:46:08.770954 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-6990-account-create-update-w9h2r" event={"ID":"ada5c910-bb44-43e9-9589-649f2226ebab","Type":"ContainerDied","Data":"f27d8c0edefcd2ff26a6c5af4dd9f23d94369e78e08cd9af2e8f886dfd38bad7"} Nov 29 05:46:08 crc kubenswrapper[4594]: I1129 05:46:08.774977 4594 generic.go:334] "Generic (PLEG): container finished" podID="bbfcad00-10e0-4501-b728-449a2a2f03b0" 
containerID="276a56701b377bfceb696b0d8e95eaa28edb6a9e217c8abd01583090c35129c8" exitCode=0 Nov 29 05:46:08 crc kubenswrapper[4594]: I1129 05:46:08.775082 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-7b8a-account-create-update-r8lrn" event={"ID":"bbfcad00-10e0-4501-b728-449a2a2f03b0","Type":"ContainerDied","Data":"276a56701b377bfceb696b0d8e95eaa28edb6a9e217c8abd01583090c35129c8"} Nov 29 05:46:08 crc kubenswrapper[4594]: I1129 05:46:08.779616 4594 generic.go:334] "Generic (PLEG): container finished" podID="2abc53c5-05f4-457c-be49-c34e962a9522" containerID="123ee440b039bf270d01cc85c87b7aeed13611fcf6da1c0ca3070508a75c318c" exitCode=0 Nov 29 05:46:08 crc kubenswrapper[4594]: I1129 05:46:08.779786 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-b707-account-create-update-x94hq" event={"ID":"2abc53c5-05f4-457c-be49-c34e962a9522","Type":"ContainerDied","Data":"123ee440b039bf270d01cc85c87b7aeed13611fcf6da1c0ca3070508a75c318c"} Nov 29 05:46:09 crc kubenswrapper[4594]: I1129 05:46:09.791264 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bdeb9d7d-ae93-49e5-a3f2-128a4cc17353","Type":"ContainerStarted","Data":"7cc0989b780c16653bb2ac16babf1d27f470d54ac007ef271f44832f65429d83"} Nov 29 05:46:10 crc kubenswrapper[4594]: I1129 05:46:10.197089 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-dmbbc" Nov 29 05:46:10 crc kubenswrapper[4594]: I1129 05:46:10.372646 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxjd5\" (UniqueName: \"kubernetes.io/projected/3371a852-7321-4c19-896c-a31265ebf283-kube-api-access-cxjd5\") pod \"3371a852-7321-4c19-896c-a31265ebf283\" (UID: \"3371a852-7321-4c19-896c-a31265ebf283\") " Nov 29 05:46:10 crc kubenswrapper[4594]: I1129 05:46:10.373017 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3371a852-7321-4c19-896c-a31265ebf283-operator-scripts\") pod \"3371a852-7321-4c19-896c-a31265ebf283\" (UID: \"3371a852-7321-4c19-896c-a31265ebf283\") " Nov 29 05:46:10 crc kubenswrapper[4594]: I1129 05:46:10.373623 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3371a852-7321-4c19-896c-a31265ebf283-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3371a852-7321-4c19-896c-a31265ebf283" (UID: "3371a852-7321-4c19-896c-a31265ebf283"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:46:10 crc kubenswrapper[4594]: I1129 05:46:10.376685 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3371a852-7321-4c19-896c-a31265ebf283-kube-api-access-cxjd5" (OuterVolumeSpecName: "kube-api-access-cxjd5") pod "3371a852-7321-4c19-896c-a31265ebf283" (UID: "3371a852-7321-4c19-896c-a31265ebf283"). InnerVolumeSpecName "kube-api-access-cxjd5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:46:10 crc kubenswrapper[4594]: I1129 05:46:10.396397 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-2btnd" Nov 29 05:46:10 crc kubenswrapper[4594]: I1129 05:46:10.405555 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-6990-account-create-update-w9h2r" Nov 29 05:46:10 crc kubenswrapper[4594]: I1129 05:46:10.408992 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-b707-account-create-update-x94hq" Nov 29 05:46:10 crc kubenswrapper[4594]: I1129 05:46:10.422419 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-pj4xh" Nov 29 05:46:10 crc kubenswrapper[4594]: I1129 05:46:10.438611 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-7b8a-account-create-update-r8lrn" Nov 29 05:46:10 crc kubenswrapper[4594]: I1129 05:46:10.474863 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f3d7959-7013-4279-8173-077ed6cefbda-operator-scripts\") pod \"5f3d7959-7013-4279-8173-077ed6cefbda\" (UID: \"5f3d7959-7013-4279-8173-077ed6cefbda\") " Nov 29 05:46:10 crc kubenswrapper[4594]: I1129 05:46:10.475094 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92b184e0-f52c-47d3-8488-b9e8e8552b1c-operator-scripts\") pod \"92b184e0-f52c-47d3-8488-b9e8e8552b1c\" (UID: \"92b184e0-f52c-47d3-8488-b9e8e8552b1c\") " Nov 29 05:46:10 crc kubenswrapper[4594]: I1129 05:46:10.475502 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f3d7959-7013-4279-8173-077ed6cefbda-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5f3d7959-7013-4279-8173-077ed6cefbda" (UID: "5f3d7959-7013-4279-8173-077ed6cefbda"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:46:10 crc kubenswrapper[4594]: I1129 05:46:10.475529 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92b184e0-f52c-47d3-8488-b9e8e8552b1c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "92b184e0-f52c-47d3-8488-b9e8e8552b1c" (UID: "92b184e0-f52c-47d3-8488-b9e8e8552b1c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:46:10 crc kubenswrapper[4594]: I1129 05:46:10.478611 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2abc53c5-05f4-457c-be49-c34e962a9522-operator-scripts\") pod \"2abc53c5-05f4-457c-be49-c34e962a9522\" (UID: \"2abc53c5-05f4-457c-be49-c34e962a9522\") " Nov 29 05:46:10 crc kubenswrapper[4594]: I1129 05:46:10.478744 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ada5c910-bb44-43e9-9589-649f2226ebab-operator-scripts\") pod \"ada5c910-bb44-43e9-9589-649f2226ebab\" (UID: \"ada5c910-bb44-43e9-9589-649f2226ebab\") " Nov 29 05:46:10 crc kubenswrapper[4594]: I1129 05:46:10.479021 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2abc53c5-05f4-457c-be49-c34e962a9522-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2abc53c5-05f4-457c-be49-c34e962a9522" (UID: "2abc53c5-05f4-457c-be49-c34e962a9522"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:46:10 crc kubenswrapper[4594]: I1129 05:46:10.480454 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sz4qr\" (UniqueName: \"kubernetes.io/projected/bbfcad00-10e0-4501-b728-449a2a2f03b0-kube-api-access-sz4qr\") pod \"bbfcad00-10e0-4501-b728-449a2a2f03b0\" (UID: \"bbfcad00-10e0-4501-b728-449a2a2f03b0\") " Nov 29 05:46:10 crc kubenswrapper[4594]: I1129 05:46:10.480567 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fq7gd\" (UniqueName: \"kubernetes.io/projected/92b184e0-f52c-47d3-8488-b9e8e8552b1c-kube-api-access-fq7gd\") pod \"92b184e0-f52c-47d3-8488-b9e8e8552b1c\" (UID: \"92b184e0-f52c-47d3-8488-b9e8e8552b1c\") " Nov 29 05:46:10 crc kubenswrapper[4594]: I1129 05:46:10.480693 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vxjc4\" (UniqueName: \"kubernetes.io/projected/5f3d7959-7013-4279-8173-077ed6cefbda-kube-api-access-vxjc4\") pod \"5f3d7959-7013-4279-8173-077ed6cefbda\" (UID: \"5f3d7959-7013-4279-8173-077ed6cefbda\") " Nov 29 05:46:10 crc kubenswrapper[4594]: I1129 05:46:10.480922 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bbfcad00-10e0-4501-b728-449a2a2f03b0-operator-scripts\") pod \"bbfcad00-10e0-4501-b728-449a2a2f03b0\" (UID: \"bbfcad00-10e0-4501-b728-449a2a2f03b0\") " Nov 29 05:46:10 crc kubenswrapper[4594]: I1129 05:46:10.481069 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26rb2\" (UniqueName: \"kubernetes.io/projected/2abc53c5-05f4-457c-be49-c34e962a9522-kube-api-access-26rb2\") pod \"2abc53c5-05f4-457c-be49-c34e962a9522\" (UID: \"2abc53c5-05f4-457c-be49-c34e962a9522\") " Nov 29 05:46:10 crc kubenswrapper[4594]: I1129 05:46:10.481216 4594 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-mdnlh\" (UniqueName: \"kubernetes.io/projected/ada5c910-bb44-43e9-9589-649f2226ebab-kube-api-access-mdnlh\") pod \"ada5c910-bb44-43e9-9589-649f2226ebab\" (UID: \"ada5c910-bb44-43e9-9589-649f2226ebab\") " Nov 29 05:46:10 crc kubenswrapper[4594]: I1129 05:46:10.481021 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ada5c910-bb44-43e9-9589-649f2226ebab-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ada5c910-bb44-43e9-9589-649f2226ebab" (UID: "ada5c910-bb44-43e9-9589-649f2226ebab"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:46:10 crc kubenswrapper[4594]: I1129 05:46:10.482083 4594 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3371a852-7321-4c19-896c-a31265ebf283-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 05:46:10 crc kubenswrapper[4594]: I1129 05:46:10.482350 4594 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2abc53c5-05f4-457c-be49-c34e962a9522-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 05:46:10 crc kubenswrapper[4594]: I1129 05:46:10.482424 4594 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ada5c910-bb44-43e9-9589-649f2226ebab-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 05:46:10 crc kubenswrapper[4594]: I1129 05:46:10.482550 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxjd5\" (UniqueName: \"kubernetes.io/projected/3371a852-7321-4c19-896c-a31265ebf283-kube-api-access-cxjd5\") on node \"crc\" DevicePath \"\"" Nov 29 05:46:10 crc kubenswrapper[4594]: I1129 05:46:10.482611 4594 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/5f3d7959-7013-4279-8173-077ed6cefbda-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 05:46:10 crc kubenswrapper[4594]: I1129 05:46:10.482663 4594 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92b184e0-f52c-47d3-8488-b9e8e8552b1c-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 05:46:10 crc kubenswrapper[4594]: I1129 05:46:10.482956 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bbfcad00-10e0-4501-b728-449a2a2f03b0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bbfcad00-10e0-4501-b728-449a2a2f03b0" (UID: "bbfcad00-10e0-4501-b728-449a2a2f03b0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:46:10 crc kubenswrapper[4594]: I1129 05:46:10.485833 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbfcad00-10e0-4501-b728-449a2a2f03b0-kube-api-access-sz4qr" (OuterVolumeSpecName: "kube-api-access-sz4qr") pod "bbfcad00-10e0-4501-b728-449a2a2f03b0" (UID: "bbfcad00-10e0-4501-b728-449a2a2f03b0"). InnerVolumeSpecName "kube-api-access-sz4qr". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:46:10 crc kubenswrapper[4594]: I1129 05:46:10.485927 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f3d7959-7013-4279-8173-077ed6cefbda-kube-api-access-vxjc4" (OuterVolumeSpecName: "kube-api-access-vxjc4") pod "5f3d7959-7013-4279-8173-077ed6cefbda" (UID: "5f3d7959-7013-4279-8173-077ed6cefbda"). InnerVolumeSpecName "kube-api-access-vxjc4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:46:10 crc kubenswrapper[4594]: I1129 05:46:10.486623 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ada5c910-bb44-43e9-9589-649f2226ebab-kube-api-access-mdnlh" (OuterVolumeSpecName: "kube-api-access-mdnlh") pod "ada5c910-bb44-43e9-9589-649f2226ebab" (UID: "ada5c910-bb44-43e9-9589-649f2226ebab"). InnerVolumeSpecName "kube-api-access-mdnlh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:46:10 crc kubenswrapper[4594]: I1129 05:46:10.488159 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2abc53c5-05f4-457c-be49-c34e962a9522-kube-api-access-26rb2" (OuterVolumeSpecName: "kube-api-access-26rb2") pod "2abc53c5-05f4-457c-be49-c34e962a9522" (UID: "2abc53c5-05f4-457c-be49-c34e962a9522"). InnerVolumeSpecName "kube-api-access-26rb2". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:46:10 crc kubenswrapper[4594]: I1129 05:46:10.490804 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92b184e0-f52c-47d3-8488-b9e8e8552b1c-kube-api-access-fq7gd" (OuterVolumeSpecName: "kube-api-access-fq7gd") pod "92b184e0-f52c-47d3-8488-b9e8e8552b1c" (UID: "92b184e0-f52c-47d3-8488-b9e8e8552b1c"). InnerVolumeSpecName "kube-api-access-fq7gd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:46:10 crc kubenswrapper[4594]: I1129 05:46:10.584144 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sz4qr\" (UniqueName: \"kubernetes.io/projected/bbfcad00-10e0-4501-b728-449a2a2f03b0-kube-api-access-sz4qr\") on node \"crc\" DevicePath \"\"" Nov 29 05:46:10 crc kubenswrapper[4594]: I1129 05:46:10.584175 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fq7gd\" (UniqueName: \"kubernetes.io/projected/92b184e0-f52c-47d3-8488-b9e8e8552b1c-kube-api-access-fq7gd\") on node \"crc\" DevicePath \"\"" Nov 29 05:46:10 crc kubenswrapper[4594]: I1129 05:46:10.584191 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vxjc4\" (UniqueName: \"kubernetes.io/projected/5f3d7959-7013-4279-8173-077ed6cefbda-kube-api-access-vxjc4\") on node \"crc\" DevicePath \"\"" Nov 29 05:46:10 crc kubenswrapper[4594]: I1129 05:46:10.584203 4594 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bbfcad00-10e0-4501-b728-449a2a2f03b0-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 05:46:10 crc kubenswrapper[4594]: I1129 05:46:10.584216 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26rb2\" (UniqueName: \"kubernetes.io/projected/2abc53c5-05f4-457c-be49-c34e962a9522-kube-api-access-26rb2\") on node \"crc\" DevicePath \"\"" Nov 29 05:46:10 crc kubenswrapper[4594]: I1129 05:46:10.584226 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mdnlh\" (UniqueName: \"kubernetes.io/projected/ada5c910-bb44-43e9-9589-649f2226ebab-kube-api-access-mdnlh\") on node \"crc\" DevicePath \"\"" Nov 29 05:46:10 crc kubenswrapper[4594]: I1129 05:46:10.804603 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-b707-account-create-update-x94hq" 
event={"ID":"2abc53c5-05f4-457c-be49-c34e962a9522","Type":"ContainerDied","Data":"b27b179c36ec661b3f54dbca2dec6ecf54ff5936be342dac71311cb2e27da215"} Nov 29 05:46:10 crc kubenswrapper[4594]: I1129 05:46:10.804668 4594 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b27b179c36ec661b3f54dbca2dec6ecf54ff5936be342dac71311cb2e27da215" Nov 29 05:46:10 crc kubenswrapper[4594]: I1129 05:46:10.804753 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-b707-account-create-update-x94hq" Nov 29 05:46:10 crc kubenswrapper[4594]: I1129 05:46:10.812440 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-dmbbc" event={"ID":"3371a852-7321-4c19-896c-a31265ebf283","Type":"ContainerDied","Data":"e43161f6527cc43f14ee40aebed41e15ff2773ecf3ba6fe240b7d4ba71820e88"} Nov 29 05:46:10 crc kubenswrapper[4594]: I1129 05:46:10.812508 4594 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e43161f6527cc43f14ee40aebed41e15ff2773ecf3ba6fe240b7d4ba71820e88" Nov 29 05:46:10 crc kubenswrapper[4594]: I1129 05:46:10.812631 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-dmbbc" Nov 29 05:46:10 crc kubenswrapper[4594]: I1129 05:46:10.820636 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bdeb9d7d-ae93-49e5-a3f2-128a4cc17353","Type":"ContainerStarted","Data":"83d247213af92af5ac284728ceeab266aa90ddd1c9ceb9b0fa12cb37ed4c0650"} Nov 29 05:46:10 crc kubenswrapper[4594]: I1129 05:46:10.820861 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bdeb9d7d-ae93-49e5-a3f2-128a4cc17353" containerName="ceilometer-central-agent" containerID="cri-o://88fb822152d931be0cec289aa8482d7f3c67e5ce6aad97d8be04955120761479" gracePeriod=30 Nov 29 05:46:10 crc kubenswrapper[4594]: I1129 05:46:10.821153 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 29 05:46:10 crc kubenswrapper[4594]: I1129 05:46:10.821536 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bdeb9d7d-ae93-49e5-a3f2-128a4cc17353" containerName="proxy-httpd" containerID="cri-o://83d247213af92af5ac284728ceeab266aa90ddd1c9ceb9b0fa12cb37ed4c0650" gracePeriod=30 Nov 29 05:46:10 crc kubenswrapper[4594]: I1129 05:46:10.821646 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bdeb9d7d-ae93-49e5-a3f2-128a4cc17353" containerName="sg-core" containerID="cri-o://7cc0989b780c16653bb2ac16babf1d27f470d54ac007ef271f44832f65429d83" gracePeriod=30 Nov 29 05:46:10 crc kubenswrapper[4594]: I1129 05:46:10.821627 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bdeb9d7d-ae93-49e5-a3f2-128a4cc17353" containerName="ceilometer-notification-agent" containerID="cri-o://3ad7251e66ba279f9ba73e88cc0726bdeae4e9f177beced32cda601809e8f140" gracePeriod=30 Nov 29 05:46:10 crc kubenswrapper[4594]: I1129 05:46:10.827582 4594 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-2btnd" event={"ID":"5f3d7959-7013-4279-8173-077ed6cefbda","Type":"ContainerDied","Data":"4f5ba5d3e446ebf2f2b1334764f72d80a4e8907b61e38444cd339bc9f681ab5a"} Nov 29 05:46:10 crc kubenswrapper[4594]: I1129 05:46:10.827735 4594 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4f5ba5d3e446ebf2f2b1334764f72d80a4e8907b61e38444cd339bc9f681ab5a" Nov 29 05:46:10 crc kubenswrapper[4594]: I1129 05:46:10.827818 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-2btnd" Nov 29 05:46:10 crc kubenswrapper[4594]: I1129 05:46:10.847054 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-pj4xh" event={"ID":"92b184e0-f52c-47d3-8488-b9e8e8552b1c","Type":"ContainerDied","Data":"6da9781025d7e628f955cb90657769c41e4abf9b4387cf18eba1f619e3be878f"} Nov 29 05:46:10 crc kubenswrapper[4594]: I1129 05:46:10.847092 4594 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6da9781025d7e628f955cb90657769c41e4abf9b4387cf18eba1f619e3be878f" Nov 29 05:46:10 crc kubenswrapper[4594]: I1129 05:46:10.847281 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-pj4xh" Nov 29 05:46:10 crc kubenswrapper[4594]: I1129 05:46:10.857178 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.094343144 podStartE2EDuration="10.85715237s" podCreationTimestamp="2025-11-29 05:46:00 +0000 UTC" firstStartedPulling="2025-11-29 05:46:01.543941455 +0000 UTC m=+1085.784450675" lastFinishedPulling="2025-11-29 05:46:10.306750681 +0000 UTC m=+1094.547259901" observedRunningTime="2025-11-29 05:46:10.847808753 +0000 UTC m=+1095.088317974" watchObservedRunningTime="2025-11-29 05:46:10.85715237 +0000 UTC m=+1095.097661591" Nov 29 05:46:10 crc kubenswrapper[4594]: I1129 05:46:10.858050 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-6990-account-create-update-w9h2r" Nov 29 05:46:10 crc kubenswrapper[4594]: I1129 05:46:10.858082 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-6990-account-create-update-w9h2r" event={"ID":"ada5c910-bb44-43e9-9589-649f2226ebab","Type":"ContainerDied","Data":"a641bb97477180bb25c030ce487677eb690faf0269627e27560ca25e359cf991"} Nov 29 05:46:10 crc kubenswrapper[4594]: I1129 05:46:10.858154 4594 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a641bb97477180bb25c030ce487677eb690faf0269627e27560ca25e359cf991" Nov 29 05:46:10 crc kubenswrapper[4594]: I1129 05:46:10.860700 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-7b8a-account-create-update-r8lrn" event={"ID":"bbfcad00-10e0-4501-b728-449a2a2f03b0","Type":"ContainerDied","Data":"d4888b143f7f8f781618dbf15a3c945977bec7ad3b4f18467fe8c26c46638ab6"} Nov 29 05:46:10 crc kubenswrapper[4594]: I1129 05:46:10.860740 4594 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4888b143f7f8f781618dbf15a3c945977bec7ad3b4f18467fe8c26c46638ab6" Nov 29 05:46:10 crc 
kubenswrapper[4594]: I1129 05:46:10.860796 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-7b8a-account-create-update-r8lrn" Nov 29 05:46:11 crc kubenswrapper[4594]: E1129 05:46:11.013932 4594 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2abc53c5_05f4_457c_be49_c34e962a9522.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3371a852_7321_4c19_896c_a31265ebf283.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbbfcad00_10e0_4501_b728_449a2a2f03b0.slice/crio-d4888b143f7f8f781618dbf15a3c945977bec7ad3b4f18467fe8c26c46638ab6\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbdeb9d7d_ae93_49e5_a3f2_128a4cc17353.slice/crio-conmon-7cc0989b780c16653bb2ac16babf1d27f470d54ac007ef271f44832f65429d83.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3371a852_7321_4c19_896c_a31265ebf283.slice/crio-e43161f6527cc43f14ee40aebed41e15ff2773ecf3ba6fe240b7d4ba71820e88\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2abc53c5_05f4_457c_be49_c34e962a9522.slice/crio-b27b179c36ec661b3f54dbca2dec6ecf54ff5936be342dac71311cb2e27da215\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod92b184e0_f52c_47d3_8488_b9e8e8552b1c.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbdeb9d7d_ae93_49e5_a3f2_128a4cc17353.slice/crio-7cc0989b780c16653bb2ac16babf1d27f470d54ac007ef271f44832f65429d83.scope\": 
RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5f3d7959_7013_4279_8173_077ed6cefbda.slice\": RecentStats: unable to find data in memory cache]" Nov 29 05:46:11 crc kubenswrapper[4594]: I1129 05:46:11.874393 4594 generic.go:334] "Generic (PLEG): container finished" podID="bdeb9d7d-ae93-49e5-a3f2-128a4cc17353" containerID="83d247213af92af5ac284728ceeab266aa90ddd1c9ceb9b0fa12cb37ed4c0650" exitCode=0 Nov 29 05:46:11 crc kubenswrapper[4594]: I1129 05:46:11.874736 4594 generic.go:334] "Generic (PLEG): container finished" podID="bdeb9d7d-ae93-49e5-a3f2-128a4cc17353" containerID="7cc0989b780c16653bb2ac16babf1d27f470d54ac007ef271f44832f65429d83" exitCode=2 Nov 29 05:46:11 crc kubenswrapper[4594]: I1129 05:46:11.874746 4594 generic.go:334] "Generic (PLEG): container finished" podID="bdeb9d7d-ae93-49e5-a3f2-128a4cc17353" containerID="3ad7251e66ba279f9ba73e88cc0726bdeae4e9f177beced32cda601809e8f140" exitCode=0 Nov 29 05:46:11 crc kubenswrapper[4594]: I1129 05:46:11.874516 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bdeb9d7d-ae93-49e5-a3f2-128a4cc17353","Type":"ContainerDied","Data":"83d247213af92af5ac284728ceeab266aa90ddd1c9ceb9b0fa12cb37ed4c0650"} Nov 29 05:46:11 crc kubenswrapper[4594]: I1129 05:46:11.874808 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bdeb9d7d-ae93-49e5-a3f2-128a4cc17353","Type":"ContainerDied","Data":"7cc0989b780c16653bb2ac16babf1d27f470d54ac007ef271f44832f65429d83"} Nov 29 05:46:11 crc kubenswrapper[4594]: I1129 05:46:11.874826 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bdeb9d7d-ae93-49e5-a3f2-128a4cc17353","Type":"ContainerDied","Data":"3ad7251e66ba279f9ba73e88cc0726bdeae4e9f177beced32cda601809e8f140"} Nov 29 05:46:14 crc kubenswrapper[4594]: I1129 05:46:14.325834 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openstack/watcher-decision-engine-0" Nov 29 05:46:14 crc kubenswrapper[4594]: I1129 05:46:14.362168 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0" Nov 29 05:46:14 crc kubenswrapper[4594]: I1129 05:46:14.917232 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Nov 29 05:46:15 crc kubenswrapper[4594]: I1129 05:46:15.123220 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-decision-engine-0" Nov 29 05:46:15 crc kubenswrapper[4594]: I1129 05:46:15.163948 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-decision-engine-0"] Nov 29 05:46:15 crc kubenswrapper[4594]: I1129 05:46:15.667241 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-p6cqb"] Nov 29 05:46:15 crc kubenswrapper[4594]: E1129 05:46:15.667682 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92b184e0-f52c-47d3-8488-b9e8e8552b1c" containerName="mariadb-database-create" Nov 29 05:46:15 crc kubenswrapper[4594]: I1129 05:46:15.667696 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="92b184e0-f52c-47d3-8488-b9e8e8552b1c" containerName="mariadb-database-create" Nov 29 05:46:15 crc kubenswrapper[4594]: E1129 05:46:15.667711 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3371a852-7321-4c19-896c-a31265ebf283" containerName="mariadb-database-create" Nov 29 05:46:15 crc kubenswrapper[4594]: I1129 05:46:15.667716 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="3371a852-7321-4c19-896c-a31265ebf283" containerName="mariadb-database-create" Nov 29 05:46:15 crc kubenswrapper[4594]: E1129 05:46:15.667730 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f3d7959-7013-4279-8173-077ed6cefbda" containerName="mariadb-database-create" Nov 29 05:46:15 crc 
kubenswrapper[4594]: I1129 05:46:15.667736 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f3d7959-7013-4279-8173-077ed6cefbda" containerName="mariadb-database-create" Nov 29 05:46:15 crc kubenswrapper[4594]: E1129 05:46:15.667750 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ada5c910-bb44-43e9-9589-649f2226ebab" containerName="mariadb-account-create-update" Nov 29 05:46:15 crc kubenswrapper[4594]: I1129 05:46:15.667758 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="ada5c910-bb44-43e9-9589-649f2226ebab" containerName="mariadb-account-create-update" Nov 29 05:46:15 crc kubenswrapper[4594]: E1129 05:46:15.667779 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbfcad00-10e0-4501-b728-449a2a2f03b0" containerName="mariadb-account-create-update" Nov 29 05:46:15 crc kubenswrapper[4594]: I1129 05:46:15.667785 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbfcad00-10e0-4501-b728-449a2a2f03b0" containerName="mariadb-account-create-update" Nov 29 05:46:15 crc kubenswrapper[4594]: E1129 05:46:15.667804 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2abc53c5-05f4-457c-be49-c34e962a9522" containerName="mariadb-account-create-update" Nov 29 05:46:15 crc kubenswrapper[4594]: I1129 05:46:15.667809 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="2abc53c5-05f4-457c-be49-c34e962a9522" containerName="mariadb-account-create-update" Nov 29 05:46:15 crc kubenswrapper[4594]: I1129 05:46:15.668006 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f3d7959-7013-4279-8173-077ed6cefbda" containerName="mariadb-database-create" Nov 29 05:46:15 crc kubenswrapper[4594]: I1129 05:46:15.668021 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="ada5c910-bb44-43e9-9589-649f2226ebab" containerName="mariadb-account-create-update" Nov 29 05:46:15 crc kubenswrapper[4594]: I1129 05:46:15.668029 4594 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="92b184e0-f52c-47d3-8488-b9e8e8552b1c" containerName="mariadb-database-create" Nov 29 05:46:15 crc kubenswrapper[4594]: I1129 05:46:15.668036 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="2abc53c5-05f4-457c-be49-c34e962a9522" containerName="mariadb-account-create-update" Nov 29 05:46:15 crc kubenswrapper[4594]: I1129 05:46:15.668046 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbfcad00-10e0-4501-b728-449a2a2f03b0" containerName="mariadb-account-create-update" Nov 29 05:46:15 crc kubenswrapper[4594]: I1129 05:46:15.668062 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="3371a852-7321-4c19-896c-a31265ebf283" containerName="mariadb-database-create" Nov 29 05:46:15 crc kubenswrapper[4594]: I1129 05:46:15.668763 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-p6cqb" Nov 29 05:46:15 crc kubenswrapper[4594]: I1129 05:46:15.670477 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-m55vf" Nov 29 05:46:15 crc kubenswrapper[4594]: I1129 05:46:15.670810 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Nov 29 05:46:15 crc kubenswrapper[4594]: I1129 05:46:15.671448 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Nov 29 05:46:15 crc kubenswrapper[4594]: I1129 05:46:15.678181 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-p6cqb"] Nov 29 05:46:15 crc kubenswrapper[4594]: I1129 05:46:15.801656 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de4685b4-8bf9-4a38-ba5e-7062994790c8-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-p6cqb\" (UID: \"de4685b4-8bf9-4a38-ba5e-7062994790c8\") 
" pod="openstack/nova-cell0-conductor-db-sync-p6cqb" Nov 29 05:46:15 crc kubenswrapper[4594]: I1129 05:46:15.801718 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5t9q\" (UniqueName: \"kubernetes.io/projected/de4685b4-8bf9-4a38-ba5e-7062994790c8-kube-api-access-m5t9q\") pod \"nova-cell0-conductor-db-sync-p6cqb\" (UID: \"de4685b4-8bf9-4a38-ba5e-7062994790c8\") " pod="openstack/nova-cell0-conductor-db-sync-p6cqb" Nov 29 05:46:15 crc kubenswrapper[4594]: I1129 05:46:15.801844 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de4685b4-8bf9-4a38-ba5e-7062994790c8-scripts\") pod \"nova-cell0-conductor-db-sync-p6cqb\" (UID: \"de4685b4-8bf9-4a38-ba5e-7062994790c8\") " pod="openstack/nova-cell0-conductor-db-sync-p6cqb" Nov 29 05:46:15 crc kubenswrapper[4594]: I1129 05:46:15.801890 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de4685b4-8bf9-4a38-ba5e-7062994790c8-config-data\") pod \"nova-cell0-conductor-db-sync-p6cqb\" (UID: \"de4685b4-8bf9-4a38-ba5e-7062994790c8\") " pod="openstack/nova-cell0-conductor-db-sync-p6cqb" Nov 29 05:46:15 crc kubenswrapper[4594]: I1129 05:46:15.904409 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de4685b4-8bf9-4a38-ba5e-7062994790c8-config-data\") pod \"nova-cell0-conductor-db-sync-p6cqb\" (UID: \"de4685b4-8bf9-4a38-ba5e-7062994790c8\") " pod="openstack/nova-cell0-conductor-db-sync-p6cqb" Nov 29 05:46:15 crc kubenswrapper[4594]: I1129 05:46:15.904663 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de4685b4-8bf9-4a38-ba5e-7062994790c8-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-p6cqb\" 
(UID: \"de4685b4-8bf9-4a38-ba5e-7062994790c8\") " pod="openstack/nova-cell0-conductor-db-sync-p6cqb" Nov 29 05:46:15 crc kubenswrapper[4594]: I1129 05:46:15.904804 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5t9q\" (UniqueName: \"kubernetes.io/projected/de4685b4-8bf9-4a38-ba5e-7062994790c8-kube-api-access-m5t9q\") pod \"nova-cell0-conductor-db-sync-p6cqb\" (UID: \"de4685b4-8bf9-4a38-ba5e-7062994790c8\") " pod="openstack/nova-cell0-conductor-db-sync-p6cqb" Nov 29 05:46:15 crc kubenswrapper[4594]: I1129 05:46:15.905005 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de4685b4-8bf9-4a38-ba5e-7062994790c8-scripts\") pod \"nova-cell0-conductor-db-sync-p6cqb\" (UID: \"de4685b4-8bf9-4a38-ba5e-7062994790c8\") " pod="openstack/nova-cell0-conductor-db-sync-p6cqb" Nov 29 05:46:15 crc kubenswrapper[4594]: I1129 05:46:15.911905 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de4685b4-8bf9-4a38-ba5e-7062994790c8-scripts\") pod \"nova-cell0-conductor-db-sync-p6cqb\" (UID: \"de4685b4-8bf9-4a38-ba5e-7062994790c8\") " pod="openstack/nova-cell0-conductor-db-sync-p6cqb" Nov 29 05:46:15 crc kubenswrapper[4594]: I1129 05:46:15.911941 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de4685b4-8bf9-4a38-ba5e-7062994790c8-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-p6cqb\" (UID: \"de4685b4-8bf9-4a38-ba5e-7062994790c8\") " pod="openstack/nova-cell0-conductor-db-sync-p6cqb" Nov 29 05:46:15 crc kubenswrapper[4594]: I1129 05:46:15.918909 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de4685b4-8bf9-4a38-ba5e-7062994790c8-config-data\") pod \"nova-cell0-conductor-db-sync-p6cqb\" (UID: 
\"de4685b4-8bf9-4a38-ba5e-7062994790c8\") " pod="openstack/nova-cell0-conductor-db-sync-p6cqb" Nov 29 05:46:15 crc kubenswrapper[4594]: I1129 05:46:15.922436 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5t9q\" (UniqueName: \"kubernetes.io/projected/de4685b4-8bf9-4a38-ba5e-7062994790c8-kube-api-access-m5t9q\") pod \"nova-cell0-conductor-db-sync-p6cqb\" (UID: \"de4685b4-8bf9-4a38-ba5e-7062994790c8\") " pod="openstack/nova-cell0-conductor-db-sync-p6cqb" Nov 29 05:46:15 crc kubenswrapper[4594]: I1129 05:46:15.987430 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-p6cqb" Nov 29 05:46:16 crc kubenswrapper[4594]: I1129 05:46:16.397193 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-p6cqb"] Nov 29 05:46:16 crc kubenswrapper[4594]: I1129 05:46:16.942016 4594 generic.go:334] "Generic (PLEG): container finished" podID="bdeb9d7d-ae93-49e5-a3f2-128a4cc17353" containerID="88fb822152d931be0cec289aa8482d7f3c67e5ce6aad97d8be04955120761479" exitCode=0 Nov 29 05:46:16 crc kubenswrapper[4594]: I1129 05:46:16.942129 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bdeb9d7d-ae93-49e5-a3f2-128a4cc17353","Type":"ContainerDied","Data":"88fb822152d931be0cec289aa8482d7f3c67e5ce6aad97d8be04955120761479"} Nov 29 05:46:16 crc kubenswrapper[4594]: I1129 05:46:16.943275 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bdeb9d7d-ae93-49e5-a3f2-128a4cc17353","Type":"ContainerDied","Data":"6e456579ee9080aa5f16fbcea1e1244ad9f63f2e5ce1780879746d6ae436a970"} Nov 29 05:46:16 crc kubenswrapper[4594]: I1129 05:46:16.943357 4594 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6e456579ee9080aa5f16fbcea1e1244ad9f63f2e5ce1780879746d6ae436a970" Nov 29 05:46:16 crc kubenswrapper[4594]: I1129 05:46:16.944692 4594 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-p6cqb" event={"ID":"de4685b4-8bf9-4a38-ba5e-7062994790c8","Type":"ContainerStarted","Data":"4c35ab24a9357f350454b41fca8d92d74b3166e08c5799be0a16433bffaeae77"} Nov 29 05:46:16 crc kubenswrapper[4594]: I1129 05:46:16.944918 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-decision-engine-0" podUID="3d60aa94-3720-43b5-a5de-ca30dd1b63b2" containerName="watcher-decision-engine" containerID="cri-o://3629936ba22e800f31489d08bd273609b38199d12a0e515a370b697d753a24b2" gracePeriod=30 Nov 29 05:46:16 crc kubenswrapper[4594]: I1129 05:46:16.989087 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 29 05:46:17 crc kubenswrapper[4594]: I1129 05:46:17.036075 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdeb9d7d-ae93-49e5-a3f2-128a4cc17353-combined-ca-bundle\") pod \"bdeb9d7d-ae93-49e5-a3f2-128a4cc17353\" (UID: \"bdeb9d7d-ae93-49e5-a3f2-128a4cc17353\") " Nov 29 05:46:17 crc kubenswrapper[4594]: I1129 05:46:17.036338 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bdeb9d7d-ae93-49e5-a3f2-128a4cc17353-log-httpd\") pod \"bdeb9d7d-ae93-49e5-a3f2-128a4cc17353\" (UID: \"bdeb9d7d-ae93-49e5-a3f2-128a4cc17353\") " Nov 29 05:46:17 crc kubenswrapper[4594]: I1129 05:46:17.037101 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jd9bh\" (UniqueName: \"kubernetes.io/projected/bdeb9d7d-ae93-49e5-a3f2-128a4cc17353-kube-api-access-jd9bh\") pod \"bdeb9d7d-ae93-49e5-a3f2-128a4cc17353\" (UID: \"bdeb9d7d-ae93-49e5-a3f2-128a4cc17353\") " Nov 29 05:46:17 crc kubenswrapper[4594]: I1129 05:46:17.037726 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bdeb9d7d-ae93-49e5-a3f2-128a4cc17353-scripts\") pod \"bdeb9d7d-ae93-49e5-a3f2-128a4cc17353\" (UID: \"bdeb9d7d-ae93-49e5-a3f2-128a4cc17353\") " Nov 29 05:46:17 crc kubenswrapper[4594]: I1129 05:46:17.037858 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bdeb9d7d-ae93-49e5-a3f2-128a4cc17353-sg-core-conf-yaml\") pod \"bdeb9d7d-ae93-49e5-a3f2-128a4cc17353\" (UID: \"bdeb9d7d-ae93-49e5-a3f2-128a4cc17353\") " Nov 29 05:46:17 crc kubenswrapper[4594]: I1129 05:46:17.037942 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdeb9d7d-ae93-49e5-a3f2-128a4cc17353-config-data\") pod \"bdeb9d7d-ae93-49e5-a3f2-128a4cc17353\" (UID: \"bdeb9d7d-ae93-49e5-a3f2-128a4cc17353\") " Nov 29 05:46:17 crc kubenswrapper[4594]: I1129 05:46:17.037041 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bdeb9d7d-ae93-49e5-a3f2-128a4cc17353-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "bdeb9d7d-ae93-49e5-a3f2-128a4cc17353" (UID: "bdeb9d7d-ae93-49e5-a3f2-128a4cc17353"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 05:46:17 crc kubenswrapper[4594]: I1129 05:46:17.040854 4594 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bdeb9d7d-ae93-49e5-a3f2-128a4cc17353-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 29 05:46:17 crc kubenswrapper[4594]: I1129 05:46:17.044867 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bdeb9d7d-ae93-49e5-a3f2-128a4cc17353-kube-api-access-jd9bh" (OuterVolumeSpecName: "kube-api-access-jd9bh") pod "bdeb9d7d-ae93-49e5-a3f2-128a4cc17353" (UID: "bdeb9d7d-ae93-49e5-a3f2-128a4cc17353"). 
InnerVolumeSpecName "kube-api-access-jd9bh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:46:17 crc kubenswrapper[4594]: I1129 05:46:17.046616 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdeb9d7d-ae93-49e5-a3f2-128a4cc17353-scripts" (OuterVolumeSpecName: "scripts") pod "bdeb9d7d-ae93-49e5-a3f2-128a4cc17353" (UID: "bdeb9d7d-ae93-49e5-a3f2-128a4cc17353"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:46:17 crc kubenswrapper[4594]: I1129 05:46:17.071338 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdeb9d7d-ae93-49e5-a3f2-128a4cc17353-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "bdeb9d7d-ae93-49e5-a3f2-128a4cc17353" (UID: "bdeb9d7d-ae93-49e5-a3f2-128a4cc17353"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:46:17 crc kubenswrapper[4594]: I1129 05:46:17.107809 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdeb9d7d-ae93-49e5-a3f2-128a4cc17353-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bdeb9d7d-ae93-49e5-a3f2-128a4cc17353" (UID: "bdeb9d7d-ae93-49e5-a3f2-128a4cc17353"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:46:17 crc kubenswrapper[4594]: I1129 05:46:17.137225 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdeb9d7d-ae93-49e5-a3f2-128a4cc17353-config-data" (OuterVolumeSpecName: "config-data") pod "bdeb9d7d-ae93-49e5-a3f2-128a4cc17353" (UID: "bdeb9d7d-ae93-49e5-a3f2-128a4cc17353"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:46:17 crc kubenswrapper[4594]: I1129 05:46:17.141712 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bdeb9d7d-ae93-49e5-a3f2-128a4cc17353-run-httpd\") pod \"bdeb9d7d-ae93-49e5-a3f2-128a4cc17353\" (UID: \"bdeb9d7d-ae93-49e5-a3f2-128a4cc17353\") " Nov 29 05:46:17 crc kubenswrapper[4594]: I1129 05:46:17.141975 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bdeb9d7d-ae93-49e5-a3f2-128a4cc17353-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "bdeb9d7d-ae93-49e5-a3f2-128a4cc17353" (UID: "bdeb9d7d-ae93-49e5-a3f2-128a4cc17353"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 05:46:17 crc kubenswrapper[4594]: I1129 05:46:17.142348 4594 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdeb9d7d-ae93-49e5-a3f2-128a4cc17353-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 05:46:17 crc kubenswrapper[4594]: I1129 05:46:17.142370 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jd9bh\" (UniqueName: \"kubernetes.io/projected/bdeb9d7d-ae93-49e5-a3f2-128a4cc17353-kube-api-access-jd9bh\") on node \"crc\" DevicePath \"\"" Nov 29 05:46:17 crc kubenswrapper[4594]: I1129 05:46:17.142384 4594 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bdeb9d7d-ae93-49e5-a3f2-128a4cc17353-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 05:46:17 crc kubenswrapper[4594]: I1129 05:46:17.142393 4594 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bdeb9d7d-ae93-49e5-a3f2-128a4cc17353-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 29 05:46:17 crc kubenswrapper[4594]: I1129 05:46:17.142399 4594 reconciler_common.go:293] 
"Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdeb9d7d-ae93-49e5-a3f2-128a4cc17353-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 05:46:17 crc kubenswrapper[4594]: I1129 05:46:17.142408 4594 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bdeb9d7d-ae93-49e5-a3f2-128a4cc17353-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 29 05:46:17 crc kubenswrapper[4594]: I1129 05:46:17.960998 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 29 05:46:18 crc kubenswrapper[4594]: I1129 05:46:18.000426 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 29 05:46:18 crc kubenswrapper[4594]: I1129 05:46:18.019409 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 29 05:46:18 crc kubenswrapper[4594]: I1129 05:46:18.063683 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 29 05:46:18 crc kubenswrapper[4594]: E1129 05:46:18.064965 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdeb9d7d-ae93-49e5-a3f2-128a4cc17353" containerName="sg-core" Nov 29 05:46:18 crc kubenswrapper[4594]: I1129 05:46:18.065090 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdeb9d7d-ae93-49e5-a3f2-128a4cc17353" containerName="sg-core" Nov 29 05:46:18 crc kubenswrapper[4594]: E1129 05:46:18.065177 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdeb9d7d-ae93-49e5-a3f2-128a4cc17353" containerName="ceilometer-notification-agent" Nov 29 05:46:18 crc kubenswrapper[4594]: I1129 05:46:18.065236 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdeb9d7d-ae93-49e5-a3f2-128a4cc17353" containerName="ceilometer-notification-agent" Nov 29 05:46:18 crc kubenswrapper[4594]: E1129 05:46:18.065314 4594 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="bdeb9d7d-ae93-49e5-a3f2-128a4cc17353" containerName="proxy-httpd" Nov 29 05:46:18 crc kubenswrapper[4594]: I1129 05:46:18.065600 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdeb9d7d-ae93-49e5-a3f2-128a4cc17353" containerName="proxy-httpd" Nov 29 05:46:18 crc kubenswrapper[4594]: E1129 05:46:18.065715 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdeb9d7d-ae93-49e5-a3f2-128a4cc17353" containerName="ceilometer-central-agent" Nov 29 05:46:18 crc kubenswrapper[4594]: I1129 05:46:18.065765 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdeb9d7d-ae93-49e5-a3f2-128a4cc17353" containerName="ceilometer-central-agent" Nov 29 05:46:18 crc kubenswrapper[4594]: I1129 05:46:18.066135 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="bdeb9d7d-ae93-49e5-a3f2-128a4cc17353" containerName="proxy-httpd" Nov 29 05:46:18 crc kubenswrapper[4594]: I1129 05:46:18.066218 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="bdeb9d7d-ae93-49e5-a3f2-128a4cc17353" containerName="sg-core" Nov 29 05:46:18 crc kubenswrapper[4594]: I1129 05:46:18.066287 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="bdeb9d7d-ae93-49e5-a3f2-128a4cc17353" containerName="ceilometer-central-agent" Nov 29 05:46:18 crc kubenswrapper[4594]: I1129 05:46:18.066341 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="bdeb9d7d-ae93-49e5-a3f2-128a4cc17353" containerName="ceilometer-notification-agent" Nov 29 05:46:18 crc kubenswrapper[4594]: I1129 05:46:18.069038 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 29 05:46:18 crc kubenswrapper[4594]: I1129 05:46:18.072732 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 29 05:46:18 crc kubenswrapper[4594]: I1129 05:46:18.073291 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 29 05:46:18 crc kubenswrapper[4594]: I1129 05:46:18.100139 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bdeb9d7d-ae93-49e5-a3f2-128a4cc17353" path="/var/lib/kubelet/pods/bdeb9d7d-ae93-49e5-a3f2-128a4cc17353/volumes" Nov 29 05:46:18 crc kubenswrapper[4594]: I1129 05:46:18.100890 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 29 05:46:18 crc kubenswrapper[4594]: I1129 05:46:18.166076 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c12c56f3-363e-4998-9c43-51d5903ba7fe-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c12c56f3-363e-4998-9c43-51d5903ba7fe\") " pod="openstack/ceilometer-0" Nov 29 05:46:18 crc kubenswrapper[4594]: I1129 05:46:18.166170 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c12c56f3-363e-4998-9c43-51d5903ba7fe-config-data\") pod \"ceilometer-0\" (UID: \"c12c56f3-363e-4998-9c43-51d5903ba7fe\") " pod="openstack/ceilometer-0" Nov 29 05:46:18 crc kubenswrapper[4594]: I1129 05:46:18.166195 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-629m8\" (UniqueName: \"kubernetes.io/projected/c12c56f3-363e-4998-9c43-51d5903ba7fe-kube-api-access-629m8\") pod \"ceilometer-0\" (UID: \"c12c56f3-363e-4998-9c43-51d5903ba7fe\") " pod="openstack/ceilometer-0" Nov 29 05:46:18 crc kubenswrapper[4594]: I1129 05:46:18.166429 4594 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c12c56f3-363e-4998-9c43-51d5903ba7fe-log-httpd\") pod \"ceilometer-0\" (UID: \"c12c56f3-363e-4998-9c43-51d5903ba7fe\") " pod="openstack/ceilometer-0" Nov 29 05:46:18 crc kubenswrapper[4594]: I1129 05:46:18.166540 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c12c56f3-363e-4998-9c43-51d5903ba7fe-scripts\") pod \"ceilometer-0\" (UID: \"c12c56f3-363e-4998-9c43-51d5903ba7fe\") " pod="openstack/ceilometer-0" Nov 29 05:46:18 crc kubenswrapper[4594]: I1129 05:46:18.166662 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c12c56f3-363e-4998-9c43-51d5903ba7fe-run-httpd\") pod \"ceilometer-0\" (UID: \"c12c56f3-363e-4998-9c43-51d5903ba7fe\") " pod="openstack/ceilometer-0" Nov 29 05:46:18 crc kubenswrapper[4594]: I1129 05:46:18.166911 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c12c56f3-363e-4998-9c43-51d5903ba7fe-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c12c56f3-363e-4998-9c43-51d5903ba7fe\") " pod="openstack/ceilometer-0" Nov 29 05:46:18 crc kubenswrapper[4594]: I1129 05:46:18.269206 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c12c56f3-363e-4998-9c43-51d5903ba7fe-run-httpd\") pod \"ceilometer-0\" (UID: \"c12c56f3-363e-4998-9c43-51d5903ba7fe\") " pod="openstack/ceilometer-0" Nov 29 05:46:18 crc kubenswrapper[4594]: I1129 05:46:18.269332 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/c12c56f3-363e-4998-9c43-51d5903ba7fe-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c12c56f3-363e-4998-9c43-51d5903ba7fe\") " pod="openstack/ceilometer-0" Nov 29 05:46:18 crc kubenswrapper[4594]: I1129 05:46:18.269394 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c12c56f3-363e-4998-9c43-51d5903ba7fe-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c12c56f3-363e-4998-9c43-51d5903ba7fe\") " pod="openstack/ceilometer-0" Nov 29 05:46:18 crc kubenswrapper[4594]: I1129 05:46:18.269435 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c12c56f3-363e-4998-9c43-51d5903ba7fe-config-data\") pod \"ceilometer-0\" (UID: \"c12c56f3-363e-4998-9c43-51d5903ba7fe\") " pod="openstack/ceilometer-0" Nov 29 05:46:18 crc kubenswrapper[4594]: I1129 05:46:18.269455 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-629m8\" (UniqueName: \"kubernetes.io/projected/c12c56f3-363e-4998-9c43-51d5903ba7fe-kube-api-access-629m8\") pod \"ceilometer-0\" (UID: \"c12c56f3-363e-4998-9c43-51d5903ba7fe\") " pod="openstack/ceilometer-0" Nov 29 05:46:18 crc kubenswrapper[4594]: I1129 05:46:18.269530 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c12c56f3-363e-4998-9c43-51d5903ba7fe-log-httpd\") pod \"ceilometer-0\" (UID: \"c12c56f3-363e-4998-9c43-51d5903ba7fe\") " pod="openstack/ceilometer-0" Nov 29 05:46:18 crc kubenswrapper[4594]: I1129 05:46:18.269559 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c12c56f3-363e-4998-9c43-51d5903ba7fe-scripts\") pod \"ceilometer-0\" (UID: \"c12c56f3-363e-4998-9c43-51d5903ba7fe\") " pod="openstack/ceilometer-0" Nov 29 05:46:18 crc kubenswrapper[4594]: I1129 
05:46:18.269709 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c12c56f3-363e-4998-9c43-51d5903ba7fe-run-httpd\") pod \"ceilometer-0\" (UID: \"c12c56f3-363e-4998-9c43-51d5903ba7fe\") " pod="openstack/ceilometer-0" Nov 29 05:46:18 crc kubenswrapper[4594]: I1129 05:46:18.270431 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c12c56f3-363e-4998-9c43-51d5903ba7fe-log-httpd\") pod \"ceilometer-0\" (UID: \"c12c56f3-363e-4998-9c43-51d5903ba7fe\") " pod="openstack/ceilometer-0" Nov 29 05:46:18 crc kubenswrapper[4594]: I1129 05:46:18.275810 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c12c56f3-363e-4998-9c43-51d5903ba7fe-scripts\") pod \"ceilometer-0\" (UID: \"c12c56f3-363e-4998-9c43-51d5903ba7fe\") " pod="openstack/ceilometer-0" Nov 29 05:46:18 crc kubenswrapper[4594]: I1129 05:46:18.276502 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c12c56f3-363e-4998-9c43-51d5903ba7fe-config-data\") pod \"ceilometer-0\" (UID: \"c12c56f3-363e-4998-9c43-51d5903ba7fe\") " pod="openstack/ceilometer-0" Nov 29 05:46:18 crc kubenswrapper[4594]: I1129 05:46:18.292993 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c12c56f3-363e-4998-9c43-51d5903ba7fe-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c12c56f3-363e-4998-9c43-51d5903ba7fe\") " pod="openstack/ceilometer-0" Nov 29 05:46:18 crc kubenswrapper[4594]: I1129 05:46:18.293030 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c12c56f3-363e-4998-9c43-51d5903ba7fe-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c12c56f3-363e-4998-9c43-51d5903ba7fe\") " 
pod="openstack/ceilometer-0" Nov 29 05:46:18 crc kubenswrapper[4594]: I1129 05:46:18.296926 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-629m8\" (UniqueName: \"kubernetes.io/projected/c12c56f3-363e-4998-9c43-51d5903ba7fe-kube-api-access-629m8\") pod \"ceilometer-0\" (UID: \"c12c56f3-363e-4998-9c43-51d5903ba7fe\") " pod="openstack/ceilometer-0" Nov 29 05:46:18 crc kubenswrapper[4594]: I1129 05:46:18.395482 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 29 05:46:18 crc kubenswrapper[4594]: I1129 05:46:18.412665 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 29 05:46:18 crc kubenswrapper[4594]: I1129 05:46:18.826365 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 29 05:46:18 crc kubenswrapper[4594]: I1129 05:46:18.979113 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c12c56f3-363e-4998-9c43-51d5903ba7fe","Type":"ContainerStarted","Data":"03337fcc1359bbf8a03919e9740cd24d8d5dc69761d2a63511137bd6c61c7820"} Nov 29 05:46:19 crc kubenswrapper[4594]: I1129 05:46:19.053595 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 29 05:46:19 crc kubenswrapper[4594]: I1129 05:46:19.053843 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="315c99c4-4c5f-4bff-9ff0-116d4e7bf846" containerName="glance-log" containerID="cri-o://50aa13ccfea274f02bf9c5b2b2aa6e0ecf276429506012a42ba955bff82ae8f3" gracePeriod=30 Nov 29 05:46:19 crc kubenswrapper[4594]: I1129 05:46:19.054179 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="315c99c4-4c5f-4bff-9ff0-116d4e7bf846" containerName="glance-httpd" 
containerID="cri-o://53870e54fb8d4798b9443c1d05850e7f7e5082e89e538969f556b5928f4472ad" gracePeriod=30 Nov 29 05:46:19 crc kubenswrapper[4594]: I1129 05:46:19.628680 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Nov 29 05:46:19 crc kubenswrapper[4594]: I1129 05:46:19.694453 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d60aa94-3720-43b5-a5de-ca30dd1b63b2-combined-ca-bundle\") pod \"3d60aa94-3720-43b5-a5de-ca30dd1b63b2\" (UID: \"3d60aa94-3720-43b5-a5de-ca30dd1b63b2\") " Nov 29 05:46:19 crc kubenswrapper[4594]: I1129 05:46:19.694803 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f55j2\" (UniqueName: \"kubernetes.io/projected/3d60aa94-3720-43b5-a5de-ca30dd1b63b2-kube-api-access-f55j2\") pod \"3d60aa94-3720-43b5-a5de-ca30dd1b63b2\" (UID: \"3d60aa94-3720-43b5-a5de-ca30dd1b63b2\") " Nov 29 05:46:19 crc kubenswrapper[4594]: I1129 05:46:19.694839 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d60aa94-3720-43b5-a5de-ca30dd1b63b2-logs\") pod \"3d60aa94-3720-43b5-a5de-ca30dd1b63b2\" (UID: \"3d60aa94-3720-43b5-a5de-ca30dd1b63b2\") " Nov 29 05:46:19 crc kubenswrapper[4594]: I1129 05:46:19.694861 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d60aa94-3720-43b5-a5de-ca30dd1b63b2-config-data\") pod \"3d60aa94-3720-43b5-a5de-ca30dd1b63b2\" (UID: \"3d60aa94-3720-43b5-a5de-ca30dd1b63b2\") " Nov 29 05:46:19 crc kubenswrapper[4594]: I1129 05:46:19.695579 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/3d60aa94-3720-43b5-a5de-ca30dd1b63b2-custom-prometheus-ca\") pod 
\"3d60aa94-3720-43b5-a5de-ca30dd1b63b2\" (UID: \"3d60aa94-3720-43b5-a5de-ca30dd1b63b2\") " Nov 29 05:46:19 crc kubenswrapper[4594]: I1129 05:46:19.696273 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d60aa94-3720-43b5-a5de-ca30dd1b63b2-logs" (OuterVolumeSpecName: "logs") pod "3d60aa94-3720-43b5-a5de-ca30dd1b63b2" (UID: "3d60aa94-3720-43b5-a5de-ca30dd1b63b2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 05:46:19 crc kubenswrapper[4594]: I1129 05:46:19.696868 4594 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d60aa94-3720-43b5-a5de-ca30dd1b63b2-logs\") on node \"crc\" DevicePath \"\"" Nov 29 05:46:19 crc kubenswrapper[4594]: I1129 05:46:19.699156 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d60aa94-3720-43b5-a5de-ca30dd1b63b2-kube-api-access-f55j2" (OuterVolumeSpecName: "kube-api-access-f55j2") pod "3d60aa94-3720-43b5-a5de-ca30dd1b63b2" (UID: "3d60aa94-3720-43b5-a5de-ca30dd1b63b2"). InnerVolumeSpecName "kube-api-access-f55j2". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:46:19 crc kubenswrapper[4594]: I1129 05:46:19.724986 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d60aa94-3720-43b5-a5de-ca30dd1b63b2-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "3d60aa94-3720-43b5-a5de-ca30dd1b63b2" (UID: "3d60aa94-3720-43b5-a5de-ca30dd1b63b2"). InnerVolumeSpecName "custom-prometheus-ca". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:46:19 crc kubenswrapper[4594]: I1129 05:46:19.729377 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d60aa94-3720-43b5-a5de-ca30dd1b63b2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3d60aa94-3720-43b5-a5de-ca30dd1b63b2" (UID: "3d60aa94-3720-43b5-a5de-ca30dd1b63b2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:46:19 crc kubenswrapper[4594]: I1129 05:46:19.742873 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d60aa94-3720-43b5-a5de-ca30dd1b63b2-config-data" (OuterVolumeSpecName: "config-data") pod "3d60aa94-3720-43b5-a5de-ca30dd1b63b2" (UID: "3d60aa94-3720-43b5-a5de-ca30dd1b63b2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:46:19 crc kubenswrapper[4594]: I1129 05:46:19.799310 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f55j2\" (UniqueName: \"kubernetes.io/projected/3d60aa94-3720-43b5-a5de-ca30dd1b63b2-kube-api-access-f55j2\") on node \"crc\" DevicePath \"\"" Nov 29 05:46:19 crc kubenswrapper[4594]: I1129 05:46:19.799357 4594 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d60aa94-3720-43b5-a5de-ca30dd1b63b2-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 05:46:19 crc kubenswrapper[4594]: I1129 05:46:19.799371 4594 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/3d60aa94-3720-43b5-a5de-ca30dd1b63b2-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Nov 29 05:46:19 crc kubenswrapper[4594]: I1129 05:46:19.799381 4594 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d60aa94-3720-43b5-a5de-ca30dd1b63b2-combined-ca-bundle\") on node 
\"crc\" DevicePath \"\"" Nov 29 05:46:19 crc kubenswrapper[4594]: I1129 05:46:19.995157 4594 generic.go:334] "Generic (PLEG): container finished" podID="315c99c4-4c5f-4bff-9ff0-116d4e7bf846" containerID="53870e54fb8d4798b9443c1d05850e7f7e5082e89e538969f556b5928f4472ad" exitCode=0 Nov 29 05:46:19 crc kubenswrapper[4594]: I1129 05:46:19.995198 4594 generic.go:334] "Generic (PLEG): container finished" podID="315c99c4-4c5f-4bff-9ff0-116d4e7bf846" containerID="50aa13ccfea274f02bf9c5b2b2aa6e0ecf276429506012a42ba955bff82ae8f3" exitCode=143 Nov 29 05:46:19 crc kubenswrapper[4594]: I1129 05:46:19.995283 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"315c99c4-4c5f-4bff-9ff0-116d4e7bf846","Type":"ContainerDied","Data":"53870e54fb8d4798b9443c1d05850e7f7e5082e89e538969f556b5928f4472ad"} Nov 29 05:46:19 crc kubenswrapper[4594]: I1129 05:46:19.995321 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"315c99c4-4c5f-4bff-9ff0-116d4e7bf846","Type":"ContainerDied","Data":"50aa13ccfea274f02bf9c5b2b2aa6e0ecf276429506012a42ba955bff82ae8f3"} Nov 29 05:46:19 crc kubenswrapper[4594]: I1129 05:46:19.998438 4594 generic.go:334] "Generic (PLEG): container finished" podID="3d60aa94-3720-43b5-a5de-ca30dd1b63b2" containerID="3629936ba22e800f31489d08bd273609b38199d12a0e515a370b697d753a24b2" exitCode=0 Nov 29 05:46:19 crc kubenswrapper[4594]: I1129 05:46:19.998505 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"3d60aa94-3720-43b5-a5de-ca30dd1b63b2","Type":"ContainerDied","Data":"3629936ba22e800f31489d08bd273609b38199d12a0e515a370b697d753a24b2"} Nov 29 05:46:19 crc kubenswrapper[4594]: I1129 05:46:19.998529 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" 
event={"ID":"3d60aa94-3720-43b5-a5de-ca30dd1b63b2","Type":"ContainerDied","Data":"2b246cf4365743fd554a7900f9a1edf20d90a3e0837ed42e1a78a4c5266dd7c0"} Nov 29 05:46:19 crc kubenswrapper[4594]: I1129 05:46:19.998550 4594 scope.go:117] "RemoveContainer" containerID="3629936ba22e800f31489d08bd273609b38199d12a0e515a370b697d753a24b2" Nov 29 05:46:19 crc kubenswrapper[4594]: I1129 05:46:19.998550 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Nov 29 05:46:20 crc kubenswrapper[4594]: I1129 05:46:20.001362 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c12c56f3-363e-4998-9c43-51d5903ba7fe","Type":"ContainerStarted","Data":"e4049b46aed09ed3eee3598ff47657ab9a7b329b93d5c7868e50be98d61c688c"} Nov 29 05:46:20 crc kubenswrapper[4594]: I1129 05:46:20.046321 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-decision-engine-0"] Nov 29 05:46:20 crc kubenswrapper[4594]: I1129 05:46:20.062851 4594 scope.go:117] "RemoveContainer" containerID="327ef444e09f45cf2a239393e3df4bf5d931562ffdda0035f22957079e46a5a5" Nov 29 05:46:20 crc kubenswrapper[4594]: I1129 05:46:20.063659 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-decision-engine-0"] Nov 29 05:46:20 crc kubenswrapper[4594]: I1129 05:46:20.071591 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-decision-engine-0"] Nov 29 05:46:20 crc kubenswrapper[4594]: E1129 05:46:20.072120 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d60aa94-3720-43b5-a5de-ca30dd1b63b2" containerName="watcher-decision-engine" Nov 29 05:46:20 crc kubenswrapper[4594]: I1129 05:46:20.072140 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d60aa94-3720-43b5-a5de-ca30dd1b63b2" containerName="watcher-decision-engine" Nov 29 05:46:20 crc kubenswrapper[4594]: E1129 05:46:20.072153 4594 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="3d60aa94-3720-43b5-a5de-ca30dd1b63b2" containerName="watcher-decision-engine" Nov 29 05:46:20 crc kubenswrapper[4594]: I1129 05:46:20.072159 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d60aa94-3720-43b5-a5de-ca30dd1b63b2" containerName="watcher-decision-engine" Nov 29 05:46:20 crc kubenswrapper[4594]: I1129 05:46:20.072364 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d60aa94-3720-43b5-a5de-ca30dd1b63b2" containerName="watcher-decision-engine" Nov 29 05:46:20 crc kubenswrapper[4594]: I1129 05:46:20.072390 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d60aa94-3720-43b5-a5de-ca30dd1b63b2" containerName="watcher-decision-engine" Nov 29 05:46:20 crc kubenswrapper[4594]: I1129 05:46:20.072400 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d60aa94-3720-43b5-a5de-ca30dd1b63b2" containerName="watcher-decision-engine" Nov 29 05:46:20 crc kubenswrapper[4594]: I1129 05:46:20.073117 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Nov 29 05:46:20 crc kubenswrapper[4594]: I1129 05:46:20.076327 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-decision-engine-config-data" Nov 29 05:46:20 crc kubenswrapper[4594]: I1129 05:46:20.105065 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/716cbd33-cb95-4be2-a9c9-98c742ee4e17-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"716cbd33-cb95-4be2-a9c9-98c742ee4e17\") " pod="openstack/watcher-decision-engine-0" Nov 29 05:46:20 crc kubenswrapper[4594]: I1129 05:46:20.105154 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/716cbd33-cb95-4be2-a9c9-98c742ee4e17-config-data\") pod \"watcher-decision-engine-0\" (UID: \"716cbd33-cb95-4be2-a9c9-98c742ee4e17\") " pod="openstack/watcher-decision-engine-0" Nov 29 05:46:20 crc kubenswrapper[4594]: I1129 05:46:20.105182 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4pl8\" (UniqueName: \"kubernetes.io/projected/716cbd33-cb95-4be2-a9c9-98c742ee4e17-kube-api-access-p4pl8\") pod \"watcher-decision-engine-0\" (UID: \"716cbd33-cb95-4be2-a9c9-98c742ee4e17\") " pod="openstack/watcher-decision-engine-0" Nov 29 05:46:20 crc kubenswrapper[4594]: I1129 05:46:20.105218 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/716cbd33-cb95-4be2-a9c9-98c742ee4e17-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"716cbd33-cb95-4be2-a9c9-98c742ee4e17\") " pod="openstack/watcher-decision-engine-0" Nov 29 05:46:20 crc kubenswrapper[4594]: I1129 05:46:20.105245 4594 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/716cbd33-cb95-4be2-a9c9-98c742ee4e17-logs\") pod \"watcher-decision-engine-0\" (UID: \"716cbd33-cb95-4be2-a9c9-98c742ee4e17\") " pod="openstack/watcher-decision-engine-0" Nov 29 05:46:20 crc kubenswrapper[4594]: I1129 05:46:20.106242 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d60aa94-3720-43b5-a5de-ca30dd1b63b2" path="/var/lib/kubelet/pods/3d60aa94-3720-43b5-a5de-ca30dd1b63b2/volumes" Nov 29 05:46:20 crc kubenswrapper[4594]: I1129 05:46:20.106852 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Nov 29 05:46:20 crc kubenswrapper[4594]: I1129 05:46:20.155208 4594 scope.go:117] "RemoveContainer" containerID="3629936ba22e800f31489d08bd273609b38199d12a0e515a370b697d753a24b2" Nov 29 05:46:20 crc kubenswrapper[4594]: E1129 05:46:20.157341 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3629936ba22e800f31489d08bd273609b38199d12a0e515a370b697d753a24b2\": container with ID starting with 3629936ba22e800f31489d08bd273609b38199d12a0e515a370b697d753a24b2 not found: ID does not exist" containerID="3629936ba22e800f31489d08bd273609b38199d12a0e515a370b697d753a24b2" Nov 29 05:46:20 crc kubenswrapper[4594]: I1129 05:46:20.157390 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3629936ba22e800f31489d08bd273609b38199d12a0e515a370b697d753a24b2"} err="failed to get container status \"3629936ba22e800f31489d08bd273609b38199d12a0e515a370b697d753a24b2\": rpc error: code = NotFound desc = could not find container \"3629936ba22e800f31489d08bd273609b38199d12a0e515a370b697d753a24b2\": container with ID starting with 3629936ba22e800f31489d08bd273609b38199d12a0e515a370b697d753a24b2 not found: ID does not exist" Nov 29 05:46:20 crc kubenswrapper[4594]: I1129 05:46:20.157415 
4594 scope.go:117] "RemoveContainer" containerID="327ef444e09f45cf2a239393e3df4bf5d931562ffdda0035f22957079e46a5a5" Nov 29 05:46:20 crc kubenswrapper[4594]: E1129 05:46:20.157756 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"327ef444e09f45cf2a239393e3df4bf5d931562ffdda0035f22957079e46a5a5\": container with ID starting with 327ef444e09f45cf2a239393e3df4bf5d931562ffdda0035f22957079e46a5a5 not found: ID does not exist" containerID="327ef444e09f45cf2a239393e3df4bf5d931562ffdda0035f22957079e46a5a5" Nov 29 05:46:20 crc kubenswrapper[4594]: I1129 05:46:20.157790 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"327ef444e09f45cf2a239393e3df4bf5d931562ffdda0035f22957079e46a5a5"} err="failed to get container status \"327ef444e09f45cf2a239393e3df4bf5d931562ffdda0035f22957079e46a5a5\": rpc error: code = NotFound desc = could not find container \"327ef444e09f45cf2a239393e3df4bf5d931562ffdda0035f22957079e46a5a5\": container with ID starting with 327ef444e09f45cf2a239393e3df4bf5d931562ffdda0035f22957079e46a5a5 not found: ID does not exist" Nov 29 05:46:20 crc kubenswrapper[4594]: I1129 05:46:20.207297 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/716cbd33-cb95-4be2-a9c9-98c742ee4e17-logs\") pod \"watcher-decision-engine-0\" (UID: \"716cbd33-cb95-4be2-a9c9-98c742ee4e17\") " pod="openstack/watcher-decision-engine-0" Nov 29 05:46:20 crc kubenswrapper[4594]: I1129 05:46:20.207508 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/716cbd33-cb95-4be2-a9c9-98c742ee4e17-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"716cbd33-cb95-4be2-a9c9-98c742ee4e17\") " pod="openstack/watcher-decision-engine-0" Nov 29 05:46:20 crc kubenswrapper[4594]: I1129 05:46:20.207675 4594 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/716cbd33-cb95-4be2-a9c9-98c742ee4e17-config-data\") pod \"watcher-decision-engine-0\" (UID: \"716cbd33-cb95-4be2-a9c9-98c742ee4e17\") " pod="openstack/watcher-decision-engine-0" Nov 29 05:46:20 crc kubenswrapper[4594]: I1129 05:46:20.207711 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4pl8\" (UniqueName: \"kubernetes.io/projected/716cbd33-cb95-4be2-a9c9-98c742ee4e17-kube-api-access-p4pl8\") pod \"watcher-decision-engine-0\" (UID: \"716cbd33-cb95-4be2-a9c9-98c742ee4e17\") " pod="openstack/watcher-decision-engine-0" Nov 29 05:46:20 crc kubenswrapper[4594]: I1129 05:46:20.207785 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/716cbd33-cb95-4be2-a9c9-98c742ee4e17-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"716cbd33-cb95-4be2-a9c9-98c742ee4e17\") " pod="openstack/watcher-decision-engine-0" Nov 29 05:46:20 crc kubenswrapper[4594]: I1129 05:46:20.208029 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 29 05:46:20 crc kubenswrapper[4594]: I1129 05:46:20.208294 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="075b6980-e8ff-4248-ad5e-fb33fd7199d3" containerName="glance-log" containerID="cri-o://620cd815098d08aef3c32da10723fa7a38371fa50e7c93cd5df9925729d55756" gracePeriod=30 Nov 29 05:46:20 crc kubenswrapper[4594]: I1129 05:46:20.208722 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="075b6980-e8ff-4248-ad5e-fb33fd7199d3" containerName="glance-httpd" containerID="cri-o://0b4551be8d886b9f5430577190f3872fbe450cf0f21de3ed167862dda68c926c" gracePeriod=30 Nov 29 05:46:20 crc 
kubenswrapper[4594]: I1129 05:46:20.213013 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/716cbd33-cb95-4be2-a9c9-98c742ee4e17-config-data\") pod \"watcher-decision-engine-0\" (UID: \"716cbd33-cb95-4be2-a9c9-98c742ee4e17\") " pod="openstack/watcher-decision-engine-0" Nov 29 05:46:20 crc kubenswrapper[4594]: I1129 05:46:20.213991 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/716cbd33-cb95-4be2-a9c9-98c742ee4e17-logs\") pod \"watcher-decision-engine-0\" (UID: \"716cbd33-cb95-4be2-a9c9-98c742ee4e17\") " pod="openstack/watcher-decision-engine-0" Nov 29 05:46:20 crc kubenswrapper[4594]: I1129 05:46:20.217403 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/716cbd33-cb95-4be2-a9c9-98c742ee4e17-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"716cbd33-cb95-4be2-a9c9-98c742ee4e17\") " pod="openstack/watcher-decision-engine-0" Nov 29 05:46:20 crc kubenswrapper[4594]: I1129 05:46:20.218076 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/716cbd33-cb95-4be2-a9c9-98c742ee4e17-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"716cbd33-cb95-4be2-a9c9-98c742ee4e17\") " pod="openstack/watcher-decision-engine-0" Nov 29 05:46:20 crc kubenswrapper[4594]: I1129 05:46:20.235952 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4pl8\" (UniqueName: \"kubernetes.io/projected/716cbd33-cb95-4be2-a9c9-98c742ee4e17-kube-api-access-p4pl8\") pod \"watcher-decision-engine-0\" (UID: \"716cbd33-cb95-4be2-a9c9-98c742ee4e17\") " pod="openstack/watcher-decision-engine-0" Nov 29 05:46:20 crc kubenswrapper[4594]: I1129 05:46:20.397508 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Nov 29 05:46:20 crc kubenswrapper[4594]: I1129 05:46:20.493931 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 29 05:46:20 crc kubenswrapper[4594]: I1129 05:46:20.522638 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/315c99c4-4c5f-4bff-9ff0-116d4e7bf846-httpd-run\") pod \"315c99c4-4c5f-4bff-9ff0-116d4e7bf846\" (UID: \"315c99c4-4c5f-4bff-9ff0-116d4e7bf846\") " Nov 29 05:46:20 crc kubenswrapper[4594]: I1129 05:46:20.522882 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/315c99c4-4c5f-4bff-9ff0-116d4e7bf846-combined-ca-bundle\") pod \"315c99c4-4c5f-4bff-9ff0-116d4e7bf846\" (UID: \"315c99c4-4c5f-4bff-9ff0-116d4e7bf846\") " Nov 29 05:46:20 crc kubenswrapper[4594]: I1129 05:46:20.522970 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/315c99c4-4c5f-4bff-9ff0-116d4e7bf846-logs\") pod \"315c99c4-4c5f-4bff-9ff0-116d4e7bf846\" (UID: \"315c99c4-4c5f-4bff-9ff0-116d4e7bf846\") " Nov 29 05:46:20 crc kubenswrapper[4594]: I1129 05:46:20.523051 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/315c99c4-4c5f-4bff-9ff0-116d4e7bf846-public-tls-certs\") pod \"315c99c4-4c5f-4bff-9ff0-116d4e7bf846\" (UID: \"315c99c4-4c5f-4bff-9ff0-116d4e7bf846\") " Nov 29 05:46:20 crc kubenswrapper[4594]: I1129 05:46:20.523140 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/315c99c4-4c5f-4bff-9ff0-116d4e7bf846-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "315c99c4-4c5f-4bff-9ff0-116d4e7bf846" (UID: "315c99c4-4c5f-4bff-9ff0-116d4e7bf846"). 
InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 05:46:20 crc kubenswrapper[4594]: I1129 05:46:20.523266 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/315c99c4-4c5f-4bff-9ff0-116d4e7bf846-config-data\") pod \"315c99c4-4c5f-4bff-9ff0-116d4e7bf846\" (UID: \"315c99c4-4c5f-4bff-9ff0-116d4e7bf846\") " Nov 29 05:46:20 crc kubenswrapper[4594]: I1129 05:46:20.523631 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/315c99c4-4c5f-4bff-9ff0-116d4e7bf846-logs" (OuterVolumeSpecName: "logs") pod "315c99c4-4c5f-4bff-9ff0-116d4e7bf846" (UID: "315c99c4-4c5f-4bff-9ff0-116d4e7bf846"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 05:46:20 crc kubenswrapper[4594]: I1129 05:46:20.523800 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"315c99c4-4c5f-4bff-9ff0-116d4e7bf846\" (UID: \"315c99c4-4c5f-4bff-9ff0-116d4e7bf846\") " Nov 29 05:46:20 crc kubenswrapper[4594]: I1129 05:46:20.523952 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-blm49\" (UniqueName: \"kubernetes.io/projected/315c99c4-4c5f-4bff-9ff0-116d4e7bf846-kube-api-access-blm49\") pod \"315c99c4-4c5f-4bff-9ff0-116d4e7bf846\" (UID: \"315c99c4-4c5f-4bff-9ff0-116d4e7bf846\") " Nov 29 05:46:20 crc kubenswrapper[4594]: I1129 05:46:20.524070 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/315c99c4-4c5f-4bff-9ff0-116d4e7bf846-scripts\") pod \"315c99c4-4c5f-4bff-9ff0-116d4e7bf846\" (UID: \"315c99c4-4c5f-4bff-9ff0-116d4e7bf846\") " Nov 29 05:46:20 crc kubenswrapper[4594]: I1129 05:46:20.524654 4594 reconciler_common.go:293] "Volume detached for volume 
\"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/315c99c4-4c5f-4bff-9ff0-116d4e7bf846-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 29 05:46:20 crc kubenswrapper[4594]: I1129 05:46:20.524724 4594 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/315c99c4-4c5f-4bff-9ff0-116d4e7bf846-logs\") on node \"crc\" DevicePath \"\"" Nov 29 05:46:20 crc kubenswrapper[4594]: I1129 05:46:20.531976 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "315c99c4-4c5f-4bff-9ff0-116d4e7bf846" (UID: "315c99c4-4c5f-4bff-9ff0-116d4e7bf846"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 29 05:46:20 crc kubenswrapper[4594]: I1129 05:46:20.533176 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/315c99c4-4c5f-4bff-9ff0-116d4e7bf846-kube-api-access-blm49" (OuterVolumeSpecName: "kube-api-access-blm49") pod "315c99c4-4c5f-4bff-9ff0-116d4e7bf846" (UID: "315c99c4-4c5f-4bff-9ff0-116d4e7bf846"). InnerVolumeSpecName "kube-api-access-blm49". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:46:20 crc kubenswrapper[4594]: I1129 05:46:20.535451 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/315c99c4-4c5f-4bff-9ff0-116d4e7bf846-scripts" (OuterVolumeSpecName: "scripts") pod "315c99c4-4c5f-4bff-9ff0-116d4e7bf846" (UID: "315c99c4-4c5f-4bff-9ff0-116d4e7bf846"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:46:20 crc kubenswrapper[4594]: I1129 05:46:20.627493 4594 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Nov 29 05:46:20 crc kubenswrapper[4594]: I1129 05:46:20.627524 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-blm49\" (UniqueName: \"kubernetes.io/projected/315c99c4-4c5f-4bff-9ff0-116d4e7bf846-kube-api-access-blm49\") on node \"crc\" DevicePath \"\"" Nov 29 05:46:20 crc kubenswrapper[4594]: I1129 05:46:20.627534 4594 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/315c99c4-4c5f-4bff-9ff0-116d4e7bf846-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 05:46:20 crc kubenswrapper[4594]: I1129 05:46:20.650606 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/315c99c4-4c5f-4bff-9ff0-116d4e7bf846-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "315c99c4-4c5f-4bff-9ff0-116d4e7bf846" (UID: "315c99c4-4c5f-4bff-9ff0-116d4e7bf846"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:46:20 crc kubenswrapper[4594]: I1129 05:46:20.718984 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/315c99c4-4c5f-4bff-9ff0-116d4e7bf846-config-data" (OuterVolumeSpecName: "config-data") pod "315c99c4-4c5f-4bff-9ff0-116d4e7bf846" (UID: "315c99c4-4c5f-4bff-9ff0-116d4e7bf846"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:46:20 crc kubenswrapper[4594]: I1129 05:46:20.725389 4594 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Nov 29 05:46:20 crc kubenswrapper[4594]: I1129 05:46:20.734127 4594 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/315c99c4-4c5f-4bff-9ff0-116d4e7bf846-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 05:46:20 crc kubenswrapper[4594]: I1129 05:46:20.734149 4594 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/315c99c4-4c5f-4bff-9ff0-116d4e7bf846-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 05:46:20 crc kubenswrapper[4594]: I1129 05:46:20.734158 4594 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Nov 29 05:46:20 crc kubenswrapper[4594]: I1129 05:46:20.770355 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/315c99c4-4c5f-4bff-9ff0-116d4e7bf846-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "315c99c4-4c5f-4bff-9ff0-116d4e7bf846" (UID: "315c99c4-4c5f-4bff-9ff0-116d4e7bf846"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:46:20 crc kubenswrapper[4594]: I1129 05:46:20.836417 4594 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/315c99c4-4c5f-4bff-9ff0-116d4e7bf846-public-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 29 05:46:20 crc kubenswrapper[4594]: I1129 05:46:20.900963 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Nov 29 05:46:20 crc kubenswrapper[4594]: W1129 05:46:20.905781 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod716cbd33_cb95_4be2_a9c9_98c742ee4e17.slice/crio-c2f9f4a1f7f6b9e75a3bc98cf6fe54d914ee28d421133757f255426e7f1dbe4b WatchSource:0}: Error finding container c2f9f4a1f7f6b9e75a3bc98cf6fe54d914ee28d421133757f255426e7f1dbe4b: Status 404 returned error can't find the container with id c2f9f4a1f7f6b9e75a3bc98cf6fe54d914ee28d421133757f255426e7f1dbe4b Nov 29 05:46:21 crc kubenswrapper[4594]: I1129 05:46:21.014225 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c12c56f3-363e-4998-9c43-51d5903ba7fe","Type":"ContainerStarted","Data":"e23bde88c62dae985d6ce355ec9b121be2e4f9e6056bb8b6525e1f64b9c47a4f"} Nov 29 05:46:21 crc kubenswrapper[4594]: I1129 05:46:21.015597 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"716cbd33-cb95-4be2-a9c9-98c742ee4e17","Type":"ContainerStarted","Data":"c2f9f4a1f7f6b9e75a3bc98cf6fe54d914ee28d421133757f255426e7f1dbe4b"} Nov 29 05:46:21 crc kubenswrapper[4594]: I1129 05:46:21.017815 4594 generic.go:334] "Generic (PLEG): container finished" podID="075b6980-e8ff-4248-ad5e-fb33fd7199d3" containerID="620cd815098d08aef3c32da10723fa7a38371fa50e7c93cd5df9925729d55756" exitCode=143 Nov 29 05:46:21 crc kubenswrapper[4594]: I1129 05:46:21.017857 4594 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"075b6980-e8ff-4248-ad5e-fb33fd7199d3","Type":"ContainerDied","Data":"620cd815098d08aef3c32da10723fa7a38371fa50e7c93cd5df9925729d55756"} Nov 29 05:46:21 crc kubenswrapper[4594]: I1129 05:46:21.019915 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"315c99c4-4c5f-4bff-9ff0-116d4e7bf846","Type":"ContainerDied","Data":"0a1cc8a404d7cf015ad0e183d17c9b05231f2d0deb67012948172391410ebe64"} Nov 29 05:46:21 crc kubenswrapper[4594]: I1129 05:46:21.019939 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 29 05:46:21 crc kubenswrapper[4594]: I1129 05:46:21.019977 4594 scope.go:117] "RemoveContainer" containerID="53870e54fb8d4798b9443c1d05850e7f7e5082e89e538969f556b5928f4472ad" Nov 29 05:46:21 crc kubenswrapper[4594]: I1129 05:46:21.050148 4594 scope.go:117] "RemoveContainer" containerID="50aa13ccfea274f02bf9c5b2b2aa6e0ecf276429506012a42ba955bff82ae8f3" Nov 29 05:46:21 crc kubenswrapper[4594]: I1129 05:46:21.055487 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 29 05:46:21 crc kubenswrapper[4594]: I1129 05:46:21.066940 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 29 05:46:21 crc kubenswrapper[4594]: I1129 05:46:21.103715 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Nov 29 05:46:21 crc kubenswrapper[4594]: E1129 05:46:21.104072 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d60aa94-3720-43b5-a5de-ca30dd1b63b2" containerName="watcher-decision-engine" Nov 29 05:46:21 crc kubenswrapper[4594]: I1129 05:46:21.104086 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d60aa94-3720-43b5-a5de-ca30dd1b63b2" containerName="watcher-decision-engine" Nov 29 05:46:21 crc 
kubenswrapper[4594]: E1129 05:46:21.104123 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="315c99c4-4c5f-4bff-9ff0-116d4e7bf846" containerName="glance-httpd" Nov 29 05:46:21 crc kubenswrapper[4594]: I1129 05:46:21.104129 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="315c99c4-4c5f-4bff-9ff0-116d4e7bf846" containerName="glance-httpd" Nov 29 05:46:21 crc kubenswrapper[4594]: E1129 05:46:21.104150 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="315c99c4-4c5f-4bff-9ff0-116d4e7bf846" containerName="glance-log" Nov 29 05:46:21 crc kubenswrapper[4594]: I1129 05:46:21.104155 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="315c99c4-4c5f-4bff-9ff0-116d4e7bf846" containerName="glance-log" Nov 29 05:46:21 crc kubenswrapper[4594]: I1129 05:46:21.107064 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d60aa94-3720-43b5-a5de-ca30dd1b63b2" containerName="watcher-decision-engine" Nov 29 05:46:21 crc kubenswrapper[4594]: I1129 05:46:21.107109 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="315c99c4-4c5f-4bff-9ff0-116d4e7bf846" containerName="glance-log" Nov 29 05:46:21 crc kubenswrapper[4594]: I1129 05:46:21.107121 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="315c99c4-4c5f-4bff-9ff0-116d4e7bf846" containerName="glance-httpd" Nov 29 05:46:21 crc kubenswrapper[4594]: E1129 05:46:21.107317 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d60aa94-3720-43b5-a5de-ca30dd1b63b2" containerName="watcher-decision-engine" Nov 29 05:46:21 crc kubenswrapper[4594]: I1129 05:46:21.107327 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d60aa94-3720-43b5-a5de-ca30dd1b63b2" containerName="watcher-decision-engine" Nov 29 05:46:21 crc kubenswrapper[4594]: I1129 05:46:21.108141 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 29 05:46:21 crc kubenswrapper[4594]: I1129 05:46:21.120592 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Nov 29 05:46:21 crc kubenswrapper[4594]: I1129 05:46:21.120823 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Nov 29 05:46:21 crc kubenswrapper[4594]: I1129 05:46:21.138524 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 29 05:46:21 crc kubenswrapper[4594]: I1129 05:46:21.143209 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bfda0b74-99d7-4176-89f9-71d8385ddc6f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"bfda0b74-99d7-4176-89f9-71d8385ddc6f\") " pod="openstack/glance-default-external-api-0" Nov 29 05:46:21 crc kubenswrapper[4594]: I1129 05:46:21.143271 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfda0b74-99d7-4176-89f9-71d8385ddc6f-scripts\") pod \"glance-default-external-api-0\" (UID: \"bfda0b74-99d7-4176-89f9-71d8385ddc6f\") " pod="openstack/glance-default-external-api-0" Nov 29 05:46:21 crc kubenswrapper[4594]: I1129 05:46:21.143440 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"bfda0b74-99d7-4176-89f9-71d8385ddc6f\") " pod="openstack/glance-default-external-api-0" Nov 29 05:46:21 crc kubenswrapper[4594]: I1129 05:46:21.143463 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/bfda0b74-99d7-4176-89f9-71d8385ddc6f-config-data\") pod \"glance-default-external-api-0\" (UID: \"bfda0b74-99d7-4176-89f9-71d8385ddc6f\") " pod="openstack/glance-default-external-api-0" Nov 29 05:46:21 crc kubenswrapper[4594]: I1129 05:46:21.143491 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfda0b74-99d7-4176-89f9-71d8385ddc6f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"bfda0b74-99d7-4176-89f9-71d8385ddc6f\") " pod="openstack/glance-default-external-api-0" Nov 29 05:46:21 crc kubenswrapper[4594]: I1129 05:46:21.143512 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bfda0b74-99d7-4176-89f9-71d8385ddc6f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"bfda0b74-99d7-4176-89f9-71d8385ddc6f\") " pod="openstack/glance-default-external-api-0" Nov 29 05:46:21 crc kubenswrapper[4594]: I1129 05:46:21.143592 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bfda0b74-99d7-4176-89f9-71d8385ddc6f-logs\") pod \"glance-default-external-api-0\" (UID: \"bfda0b74-99d7-4176-89f9-71d8385ddc6f\") " pod="openstack/glance-default-external-api-0" Nov 29 05:46:21 crc kubenswrapper[4594]: I1129 05:46:21.143623 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54242\" (UniqueName: \"kubernetes.io/projected/bfda0b74-99d7-4176-89f9-71d8385ddc6f-kube-api-access-54242\") pod \"glance-default-external-api-0\" (UID: \"bfda0b74-99d7-4176-89f9-71d8385ddc6f\") " pod="openstack/glance-default-external-api-0" Nov 29 05:46:21 crc kubenswrapper[4594]: I1129 05:46:21.246758 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/bfda0b74-99d7-4176-89f9-71d8385ddc6f-logs\") pod \"glance-default-external-api-0\" (UID: \"bfda0b74-99d7-4176-89f9-71d8385ddc6f\") " pod="openstack/glance-default-external-api-0" Nov 29 05:46:21 crc kubenswrapper[4594]: I1129 05:46:21.246830 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54242\" (UniqueName: \"kubernetes.io/projected/bfda0b74-99d7-4176-89f9-71d8385ddc6f-kube-api-access-54242\") pod \"glance-default-external-api-0\" (UID: \"bfda0b74-99d7-4176-89f9-71d8385ddc6f\") " pod="openstack/glance-default-external-api-0" Nov 29 05:46:21 crc kubenswrapper[4594]: I1129 05:46:21.246899 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bfda0b74-99d7-4176-89f9-71d8385ddc6f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"bfda0b74-99d7-4176-89f9-71d8385ddc6f\") " pod="openstack/glance-default-external-api-0" Nov 29 05:46:21 crc kubenswrapper[4594]: I1129 05:46:21.246929 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfda0b74-99d7-4176-89f9-71d8385ddc6f-scripts\") pod \"glance-default-external-api-0\" (UID: \"bfda0b74-99d7-4176-89f9-71d8385ddc6f\") " pod="openstack/glance-default-external-api-0" Nov 29 05:46:21 crc kubenswrapper[4594]: I1129 05:46:21.247123 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"bfda0b74-99d7-4176-89f9-71d8385ddc6f\") " pod="openstack/glance-default-external-api-0" Nov 29 05:46:21 crc kubenswrapper[4594]: I1129 05:46:21.247151 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/bfda0b74-99d7-4176-89f9-71d8385ddc6f-config-data\") pod \"glance-default-external-api-0\" (UID: \"bfda0b74-99d7-4176-89f9-71d8385ddc6f\") " pod="openstack/glance-default-external-api-0" Nov 29 05:46:21 crc kubenswrapper[4594]: I1129 05:46:21.247185 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfda0b74-99d7-4176-89f9-71d8385ddc6f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"bfda0b74-99d7-4176-89f9-71d8385ddc6f\") " pod="openstack/glance-default-external-api-0" Nov 29 05:46:21 crc kubenswrapper[4594]: I1129 05:46:21.247204 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bfda0b74-99d7-4176-89f9-71d8385ddc6f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"bfda0b74-99d7-4176-89f9-71d8385ddc6f\") " pod="openstack/glance-default-external-api-0" Nov 29 05:46:21 crc kubenswrapper[4594]: I1129 05:46:21.247722 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bfda0b74-99d7-4176-89f9-71d8385ddc6f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"bfda0b74-99d7-4176-89f9-71d8385ddc6f\") " pod="openstack/glance-default-external-api-0" Nov 29 05:46:21 crc kubenswrapper[4594]: I1129 05:46:21.247954 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bfda0b74-99d7-4176-89f9-71d8385ddc6f-logs\") pod \"glance-default-external-api-0\" (UID: \"bfda0b74-99d7-4176-89f9-71d8385ddc6f\") " pod="openstack/glance-default-external-api-0" Nov 29 05:46:21 crc kubenswrapper[4594]: I1129 05:46:21.248501 4594 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: 
\"bfda0b74-99d7-4176-89f9-71d8385ddc6f\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-external-api-0" Nov 29 05:46:21 crc kubenswrapper[4594]: I1129 05:46:21.278983 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfda0b74-99d7-4176-89f9-71d8385ddc6f-scripts\") pod \"glance-default-external-api-0\" (UID: \"bfda0b74-99d7-4176-89f9-71d8385ddc6f\") " pod="openstack/glance-default-external-api-0" Nov 29 05:46:21 crc kubenswrapper[4594]: I1129 05:46:21.280878 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfda0b74-99d7-4176-89f9-71d8385ddc6f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"bfda0b74-99d7-4176-89f9-71d8385ddc6f\") " pod="openstack/glance-default-external-api-0" Nov 29 05:46:21 crc kubenswrapper[4594]: I1129 05:46:21.285863 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54242\" (UniqueName: \"kubernetes.io/projected/bfda0b74-99d7-4176-89f9-71d8385ddc6f-kube-api-access-54242\") pod \"glance-default-external-api-0\" (UID: \"bfda0b74-99d7-4176-89f9-71d8385ddc6f\") " pod="openstack/glance-default-external-api-0" Nov 29 05:46:21 crc kubenswrapper[4594]: I1129 05:46:21.294351 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bfda0b74-99d7-4176-89f9-71d8385ddc6f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"bfda0b74-99d7-4176-89f9-71d8385ddc6f\") " pod="openstack/glance-default-external-api-0" Nov 29 05:46:21 crc kubenswrapper[4594]: I1129 05:46:21.308421 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfda0b74-99d7-4176-89f9-71d8385ddc6f-config-data\") pod \"glance-default-external-api-0\" (UID: \"bfda0b74-99d7-4176-89f9-71d8385ddc6f\") " 
pod="openstack/glance-default-external-api-0" Nov 29 05:46:21 crc kubenswrapper[4594]: I1129 05:46:21.329770 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"bfda0b74-99d7-4176-89f9-71d8385ddc6f\") " pod="openstack/glance-default-external-api-0" Nov 29 05:46:21 crc kubenswrapper[4594]: I1129 05:46:21.550908 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 29 05:46:21 crc kubenswrapper[4594]: E1129 05:46:21.562942 4594 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod075b6980_e8ff_4248_ad5e_fb33fd7199d3.slice/crio-0b4551be8d886b9f5430577190f3872fbe450cf0f21de3ed167862dda68c926c.scope\": RecentStats: unable to find data in memory cache]" Nov 29 05:46:21 crc kubenswrapper[4594]: I1129 05:46:21.683583 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 29 05:46:21 crc kubenswrapper[4594]: I1129 05:46:21.861546 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/075b6980-e8ff-4248-ad5e-fb33fd7199d3-internal-tls-certs\") pod \"075b6980-e8ff-4248-ad5e-fb33fd7199d3\" (UID: \"075b6980-e8ff-4248-ad5e-fb33fd7199d3\") " Nov 29 05:46:21 crc kubenswrapper[4594]: I1129 05:46:21.861594 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vl7r2\" (UniqueName: \"kubernetes.io/projected/075b6980-e8ff-4248-ad5e-fb33fd7199d3-kube-api-access-vl7r2\") pod \"075b6980-e8ff-4248-ad5e-fb33fd7199d3\" (UID: \"075b6980-e8ff-4248-ad5e-fb33fd7199d3\") " Nov 29 05:46:21 crc kubenswrapper[4594]: I1129 05:46:21.861632 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/075b6980-e8ff-4248-ad5e-fb33fd7199d3-httpd-run\") pod \"075b6980-e8ff-4248-ad5e-fb33fd7199d3\" (UID: \"075b6980-e8ff-4248-ad5e-fb33fd7199d3\") " Nov 29 05:46:21 crc kubenswrapper[4594]: I1129 05:46:21.861684 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/075b6980-e8ff-4248-ad5e-fb33fd7199d3-config-data\") pod \"075b6980-e8ff-4248-ad5e-fb33fd7199d3\" (UID: \"075b6980-e8ff-4248-ad5e-fb33fd7199d3\") " Nov 29 05:46:21 crc kubenswrapper[4594]: I1129 05:46:21.861730 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"075b6980-e8ff-4248-ad5e-fb33fd7199d3\" (UID: \"075b6980-e8ff-4248-ad5e-fb33fd7199d3\") " Nov 29 05:46:21 crc kubenswrapper[4594]: I1129 05:46:21.861854 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/075b6980-e8ff-4248-ad5e-fb33fd7199d3-scripts\") pod \"075b6980-e8ff-4248-ad5e-fb33fd7199d3\" (UID: \"075b6980-e8ff-4248-ad5e-fb33fd7199d3\") " Nov 29 05:46:21 crc kubenswrapper[4594]: I1129 05:46:21.861927 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/075b6980-e8ff-4248-ad5e-fb33fd7199d3-combined-ca-bundle\") pod \"075b6980-e8ff-4248-ad5e-fb33fd7199d3\" (UID: \"075b6980-e8ff-4248-ad5e-fb33fd7199d3\") " Nov 29 05:46:21 crc kubenswrapper[4594]: I1129 05:46:21.861952 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/075b6980-e8ff-4248-ad5e-fb33fd7199d3-logs\") pod \"075b6980-e8ff-4248-ad5e-fb33fd7199d3\" (UID: \"075b6980-e8ff-4248-ad5e-fb33fd7199d3\") " Nov 29 05:46:21 crc kubenswrapper[4594]: I1129 05:46:21.863332 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/075b6980-e8ff-4248-ad5e-fb33fd7199d3-logs" (OuterVolumeSpecName: "logs") pod "075b6980-e8ff-4248-ad5e-fb33fd7199d3" (UID: "075b6980-e8ff-4248-ad5e-fb33fd7199d3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 05:46:21 crc kubenswrapper[4594]: I1129 05:46:21.868621 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/075b6980-e8ff-4248-ad5e-fb33fd7199d3-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "075b6980-e8ff-4248-ad5e-fb33fd7199d3" (UID: "075b6980-e8ff-4248-ad5e-fb33fd7199d3"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 05:46:21 crc kubenswrapper[4594]: I1129 05:46:21.871868 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "075b6980-e8ff-4248-ad5e-fb33fd7199d3" (UID: "075b6980-e8ff-4248-ad5e-fb33fd7199d3"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 29 05:46:21 crc kubenswrapper[4594]: I1129 05:46:21.877592 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/075b6980-e8ff-4248-ad5e-fb33fd7199d3-scripts" (OuterVolumeSpecName: "scripts") pod "075b6980-e8ff-4248-ad5e-fb33fd7199d3" (UID: "075b6980-e8ff-4248-ad5e-fb33fd7199d3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:46:21 crc kubenswrapper[4594]: I1129 05:46:21.892478 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/075b6980-e8ff-4248-ad5e-fb33fd7199d3-kube-api-access-vl7r2" (OuterVolumeSpecName: "kube-api-access-vl7r2") pod "075b6980-e8ff-4248-ad5e-fb33fd7199d3" (UID: "075b6980-e8ff-4248-ad5e-fb33fd7199d3"). InnerVolumeSpecName "kube-api-access-vl7r2". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:46:21 crc kubenswrapper[4594]: I1129 05:46:21.905233 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/075b6980-e8ff-4248-ad5e-fb33fd7199d3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "075b6980-e8ff-4248-ad5e-fb33fd7199d3" (UID: "075b6980-e8ff-4248-ad5e-fb33fd7199d3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:46:21 crc kubenswrapper[4594]: I1129 05:46:21.925248 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/075b6980-e8ff-4248-ad5e-fb33fd7199d3-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "075b6980-e8ff-4248-ad5e-fb33fd7199d3" (UID: "075b6980-e8ff-4248-ad5e-fb33fd7199d3"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:46:21 crc kubenswrapper[4594]: I1129 05:46:21.929528 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/075b6980-e8ff-4248-ad5e-fb33fd7199d3-config-data" (OuterVolumeSpecName: "config-data") pod "075b6980-e8ff-4248-ad5e-fb33fd7199d3" (UID: "075b6980-e8ff-4248-ad5e-fb33fd7199d3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:46:21 crc kubenswrapper[4594]: I1129 05:46:21.965669 4594 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/075b6980-e8ff-4248-ad5e-fb33fd7199d3-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 05:46:21 crc kubenswrapper[4594]: I1129 05:46:21.965703 4594 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/075b6980-e8ff-4248-ad5e-fb33fd7199d3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 05:46:21 crc kubenswrapper[4594]: I1129 05:46:21.965717 4594 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/075b6980-e8ff-4248-ad5e-fb33fd7199d3-logs\") on node \"crc\" DevicePath \"\"" Nov 29 05:46:21 crc kubenswrapper[4594]: I1129 05:46:21.965729 4594 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/075b6980-e8ff-4248-ad5e-fb33fd7199d3-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 29 05:46:21 crc 
kubenswrapper[4594]: I1129 05:46:21.965738 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vl7r2\" (UniqueName: \"kubernetes.io/projected/075b6980-e8ff-4248-ad5e-fb33fd7199d3-kube-api-access-vl7r2\") on node \"crc\" DevicePath \"\"" Nov 29 05:46:21 crc kubenswrapper[4594]: I1129 05:46:21.965746 4594 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/075b6980-e8ff-4248-ad5e-fb33fd7199d3-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 29 05:46:21 crc kubenswrapper[4594]: I1129 05:46:21.965755 4594 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/075b6980-e8ff-4248-ad5e-fb33fd7199d3-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 05:46:21 crc kubenswrapper[4594]: I1129 05:46:21.965795 4594 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Nov 29 05:46:21 crc kubenswrapper[4594]: I1129 05:46:21.989053 4594 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Nov 29 05:46:22 crc kubenswrapper[4594]: I1129 05:46:22.034581 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"716cbd33-cb95-4be2-a9c9-98c742ee4e17","Type":"ContainerStarted","Data":"372e22b729f01ed44b946bee61f62d22bca6c4b4494f014511e181dc486b156a"} Nov 29 05:46:22 crc kubenswrapper[4594]: I1129 05:46:22.039042 4594 generic.go:334] "Generic (PLEG): container finished" podID="075b6980-e8ff-4248-ad5e-fb33fd7199d3" containerID="0b4551be8d886b9f5430577190f3872fbe450cf0f21de3ed167862dda68c926c" exitCode=0 Nov 29 05:46:22 crc kubenswrapper[4594]: I1129 05:46:22.039092 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"075b6980-e8ff-4248-ad5e-fb33fd7199d3","Type":"ContainerDied","Data":"0b4551be8d886b9f5430577190f3872fbe450cf0f21de3ed167862dda68c926c"} Nov 29 05:46:22 crc kubenswrapper[4594]: I1129 05:46:22.039112 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"075b6980-e8ff-4248-ad5e-fb33fd7199d3","Type":"ContainerDied","Data":"89d8db4669bcb1fd810e56de63ffe44d8d9010c5458f840b90425f1ed2cad8a1"} Nov 29 05:46:22 crc kubenswrapper[4594]: I1129 05:46:22.039129 4594 scope.go:117] "RemoveContainer" containerID="0b4551be8d886b9f5430577190f3872fbe450cf0f21de3ed167862dda68c926c" Nov 29 05:46:22 crc kubenswrapper[4594]: I1129 05:46:22.039200 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 29 05:46:22 crc kubenswrapper[4594]: I1129 05:46:22.057895 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-decision-engine-0" podStartSLOduration=2.057877859 podStartE2EDuration="2.057877859s" podCreationTimestamp="2025-11-29 05:46:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:46:22.051204037 +0000 UTC m=+1106.291713257" watchObservedRunningTime="2025-11-29 05:46:22.057877859 +0000 UTC m=+1106.298387079" Nov 29 05:46:22 crc kubenswrapper[4594]: I1129 05:46:22.063814 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c12c56f3-363e-4998-9c43-51d5903ba7fe","Type":"ContainerStarted","Data":"d94b05b53343aa758629f98158d0a2624ffcbbdf9f14205d7deced63722bb075"} Nov 29 05:46:22 crc kubenswrapper[4594]: I1129 05:46:22.067675 4594 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Nov 29 05:46:22 crc kubenswrapper[4594]: I1129 
05:46:22.096575 4594 scope.go:117] "RemoveContainer" containerID="620cd815098d08aef3c32da10723fa7a38371fa50e7c93cd5df9925729d55756" Nov 29 05:46:22 crc kubenswrapper[4594]: I1129 05:46:22.100136 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="315c99c4-4c5f-4bff-9ff0-116d4e7bf846" path="/var/lib/kubelet/pods/315c99c4-4c5f-4bff-9ff0-116d4e7bf846/volumes" Nov 29 05:46:22 crc kubenswrapper[4594]: I1129 05:46:22.100940 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 29 05:46:22 crc kubenswrapper[4594]: I1129 05:46:22.100979 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 29 05:46:22 crc kubenswrapper[4594]: I1129 05:46:22.108161 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 29 05:46:22 crc kubenswrapper[4594]: E1129 05:46:22.108767 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="075b6980-e8ff-4248-ad5e-fb33fd7199d3" containerName="glance-log" Nov 29 05:46:22 crc kubenswrapper[4594]: I1129 05:46:22.108799 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="075b6980-e8ff-4248-ad5e-fb33fd7199d3" containerName="glance-log" Nov 29 05:46:22 crc kubenswrapper[4594]: E1129 05:46:22.108822 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="075b6980-e8ff-4248-ad5e-fb33fd7199d3" containerName="glance-httpd" Nov 29 05:46:22 crc kubenswrapper[4594]: I1129 05:46:22.108830 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="075b6980-e8ff-4248-ad5e-fb33fd7199d3" containerName="glance-httpd" Nov 29 05:46:22 crc kubenswrapper[4594]: I1129 05:46:22.109087 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="075b6980-e8ff-4248-ad5e-fb33fd7199d3" containerName="glance-httpd" Nov 29 05:46:22 crc kubenswrapper[4594]: I1129 05:46:22.109109 4594 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="075b6980-e8ff-4248-ad5e-fb33fd7199d3" containerName="glance-log" Nov 29 05:46:22 crc kubenswrapper[4594]: I1129 05:46:22.110372 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 29 05:46:22 crc kubenswrapper[4594]: I1129 05:46:22.112661 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Nov 29 05:46:22 crc kubenswrapper[4594]: I1129 05:46:22.117333 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 29 05:46:22 crc kubenswrapper[4594]: I1129 05:46:22.117613 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Nov 29 05:46:22 crc kubenswrapper[4594]: I1129 05:46:22.194977 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 29 05:46:22 crc kubenswrapper[4594]: I1129 05:46:22.205648 4594 scope.go:117] "RemoveContainer" containerID="0b4551be8d886b9f5430577190f3872fbe450cf0f21de3ed167862dda68c926c" Nov 29 05:46:22 crc kubenswrapper[4594]: E1129 05:46:22.206437 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b4551be8d886b9f5430577190f3872fbe450cf0f21de3ed167862dda68c926c\": container with ID starting with 0b4551be8d886b9f5430577190f3872fbe450cf0f21de3ed167862dda68c926c not found: ID does not exist" containerID="0b4551be8d886b9f5430577190f3872fbe450cf0f21de3ed167862dda68c926c" Nov 29 05:46:22 crc kubenswrapper[4594]: I1129 05:46:22.206489 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b4551be8d886b9f5430577190f3872fbe450cf0f21de3ed167862dda68c926c"} err="failed to get container status \"0b4551be8d886b9f5430577190f3872fbe450cf0f21de3ed167862dda68c926c\": rpc error: code = NotFound desc = could not find container 
\"0b4551be8d886b9f5430577190f3872fbe450cf0f21de3ed167862dda68c926c\": container with ID starting with 0b4551be8d886b9f5430577190f3872fbe450cf0f21de3ed167862dda68c926c not found: ID does not exist" Nov 29 05:46:22 crc kubenswrapper[4594]: I1129 05:46:22.206524 4594 scope.go:117] "RemoveContainer" containerID="620cd815098d08aef3c32da10723fa7a38371fa50e7c93cd5df9925729d55756" Nov 29 05:46:22 crc kubenswrapper[4594]: E1129 05:46:22.209006 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"620cd815098d08aef3c32da10723fa7a38371fa50e7c93cd5df9925729d55756\": container with ID starting with 620cd815098d08aef3c32da10723fa7a38371fa50e7c93cd5df9925729d55756 not found: ID does not exist" containerID="620cd815098d08aef3c32da10723fa7a38371fa50e7c93cd5df9925729d55756" Nov 29 05:46:22 crc kubenswrapper[4594]: I1129 05:46:22.209062 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"620cd815098d08aef3c32da10723fa7a38371fa50e7c93cd5df9925729d55756"} err="failed to get container status \"620cd815098d08aef3c32da10723fa7a38371fa50e7c93cd5df9925729d55756\": rpc error: code = NotFound desc = could not find container \"620cd815098d08aef3c32da10723fa7a38371fa50e7c93cd5df9925729d55756\": container with ID starting with 620cd815098d08aef3c32da10723fa7a38371fa50e7c93cd5df9925729d55756 not found: ID does not exist" Nov 29 05:46:22 crc kubenswrapper[4594]: I1129 05:46:22.276211 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"9f4133eb-5349-4bad-a993-d4e880a2f1be\") " pod="openstack/glance-default-internal-api-0" Nov 29 05:46:22 crc kubenswrapper[4594]: I1129 05:46:22.276323 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-ktlch\" (UniqueName: \"kubernetes.io/projected/9f4133eb-5349-4bad-a993-d4e880a2f1be-kube-api-access-ktlch\") pod \"glance-default-internal-api-0\" (UID: \"9f4133eb-5349-4bad-a993-d4e880a2f1be\") " pod="openstack/glance-default-internal-api-0" Nov 29 05:46:22 crc kubenswrapper[4594]: I1129 05:46:22.276359 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f4133eb-5349-4bad-a993-d4e880a2f1be-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9f4133eb-5349-4bad-a993-d4e880a2f1be\") " pod="openstack/glance-default-internal-api-0" Nov 29 05:46:22 crc kubenswrapper[4594]: I1129 05:46:22.276423 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f4133eb-5349-4bad-a993-d4e880a2f1be-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9f4133eb-5349-4bad-a993-d4e880a2f1be\") " pod="openstack/glance-default-internal-api-0" Nov 29 05:46:22 crc kubenswrapper[4594]: I1129 05:46:22.276466 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f4133eb-5349-4bad-a993-d4e880a2f1be-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"9f4133eb-5349-4bad-a993-d4e880a2f1be\") " pod="openstack/glance-default-internal-api-0" Nov 29 05:46:22 crc kubenswrapper[4594]: I1129 05:46:22.276492 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f4133eb-5349-4bad-a993-d4e880a2f1be-logs\") pod \"glance-default-internal-api-0\" (UID: \"9f4133eb-5349-4bad-a993-d4e880a2f1be\") " pod="openstack/glance-default-internal-api-0" Nov 29 05:46:22 crc kubenswrapper[4594]: I1129 05:46:22.276520 4594 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9f4133eb-5349-4bad-a993-d4e880a2f1be-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9f4133eb-5349-4bad-a993-d4e880a2f1be\") " pod="openstack/glance-default-internal-api-0" Nov 29 05:46:22 crc kubenswrapper[4594]: I1129 05:46:22.276545 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f4133eb-5349-4bad-a993-d4e880a2f1be-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9f4133eb-5349-4bad-a993-d4e880a2f1be\") " pod="openstack/glance-default-internal-api-0" Nov 29 05:46:22 crc kubenswrapper[4594]: I1129 05:46:22.381916 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f4133eb-5349-4bad-a993-d4e880a2f1be-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9f4133eb-5349-4bad-a993-d4e880a2f1be\") " pod="openstack/glance-default-internal-api-0" Nov 29 05:46:22 crc kubenswrapper[4594]: I1129 05:46:22.382018 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f4133eb-5349-4bad-a993-d4e880a2f1be-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"9f4133eb-5349-4bad-a993-d4e880a2f1be\") " pod="openstack/glance-default-internal-api-0" Nov 29 05:46:22 crc kubenswrapper[4594]: I1129 05:46:22.382044 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f4133eb-5349-4bad-a993-d4e880a2f1be-logs\") pod \"glance-default-internal-api-0\" (UID: \"9f4133eb-5349-4bad-a993-d4e880a2f1be\") " pod="openstack/glance-default-internal-api-0" Nov 29 05:46:22 crc kubenswrapper[4594]: I1129 05:46:22.382084 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9f4133eb-5349-4bad-a993-d4e880a2f1be-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9f4133eb-5349-4bad-a993-d4e880a2f1be\") " pod="openstack/glance-default-internal-api-0" Nov 29 05:46:22 crc kubenswrapper[4594]: I1129 05:46:22.382117 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f4133eb-5349-4bad-a993-d4e880a2f1be-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9f4133eb-5349-4bad-a993-d4e880a2f1be\") " pod="openstack/glance-default-internal-api-0" Nov 29 05:46:22 crc kubenswrapper[4594]: I1129 05:46:22.382161 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"9f4133eb-5349-4bad-a993-d4e880a2f1be\") " pod="openstack/glance-default-internal-api-0" Nov 29 05:46:22 crc kubenswrapper[4594]: I1129 05:46:22.382230 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktlch\" (UniqueName: \"kubernetes.io/projected/9f4133eb-5349-4bad-a993-d4e880a2f1be-kube-api-access-ktlch\") pod \"glance-default-internal-api-0\" (UID: \"9f4133eb-5349-4bad-a993-d4e880a2f1be\") " pod="openstack/glance-default-internal-api-0" Nov 29 05:46:22 crc kubenswrapper[4594]: I1129 05:46:22.382287 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f4133eb-5349-4bad-a993-d4e880a2f1be-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9f4133eb-5349-4bad-a993-d4e880a2f1be\") " pod="openstack/glance-default-internal-api-0" Nov 29 05:46:22 crc kubenswrapper[4594]: I1129 05:46:22.382622 4594 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"9f4133eb-5349-4bad-a993-d4e880a2f1be\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-internal-api-0" Nov 29 05:46:22 crc kubenswrapper[4594]: I1129 05:46:22.382729 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9f4133eb-5349-4bad-a993-d4e880a2f1be-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9f4133eb-5349-4bad-a993-d4e880a2f1be\") " pod="openstack/glance-default-internal-api-0" Nov 29 05:46:22 crc kubenswrapper[4594]: I1129 05:46:22.385335 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f4133eb-5349-4bad-a993-d4e880a2f1be-logs\") pod \"glance-default-internal-api-0\" (UID: \"9f4133eb-5349-4bad-a993-d4e880a2f1be\") " pod="openstack/glance-default-internal-api-0" Nov 29 05:46:22 crc kubenswrapper[4594]: I1129 05:46:22.392220 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f4133eb-5349-4bad-a993-d4e880a2f1be-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"9f4133eb-5349-4bad-a993-d4e880a2f1be\") " pod="openstack/glance-default-internal-api-0" Nov 29 05:46:22 crc kubenswrapper[4594]: I1129 05:46:22.392284 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f4133eb-5349-4bad-a993-d4e880a2f1be-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9f4133eb-5349-4bad-a993-d4e880a2f1be\") " pod="openstack/glance-default-internal-api-0" Nov 29 05:46:22 crc kubenswrapper[4594]: I1129 05:46:22.400028 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f4133eb-5349-4bad-a993-d4e880a2f1be-combined-ca-bundle\") pod 
\"glance-default-internal-api-0\" (UID: \"9f4133eb-5349-4bad-a993-d4e880a2f1be\") " pod="openstack/glance-default-internal-api-0" Nov 29 05:46:22 crc kubenswrapper[4594]: I1129 05:46:22.405767 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktlch\" (UniqueName: \"kubernetes.io/projected/9f4133eb-5349-4bad-a993-d4e880a2f1be-kube-api-access-ktlch\") pod \"glance-default-internal-api-0\" (UID: \"9f4133eb-5349-4bad-a993-d4e880a2f1be\") " pod="openstack/glance-default-internal-api-0" Nov 29 05:46:22 crc kubenswrapper[4594]: I1129 05:46:22.406704 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f4133eb-5349-4bad-a993-d4e880a2f1be-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9f4133eb-5349-4bad-a993-d4e880a2f1be\") " pod="openstack/glance-default-internal-api-0" Nov 29 05:46:22 crc kubenswrapper[4594]: I1129 05:46:22.415195 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"9f4133eb-5349-4bad-a993-d4e880a2f1be\") " pod="openstack/glance-default-internal-api-0" Nov 29 05:46:22 crc kubenswrapper[4594]: I1129 05:46:22.468856 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 29 05:46:23 crc kubenswrapper[4594]: I1129 05:46:23.001245 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 29 05:46:23 crc kubenswrapper[4594]: I1129 05:46:23.075153 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9f4133eb-5349-4bad-a993-d4e880a2f1be","Type":"ContainerStarted","Data":"66cf437809c64f781720f0648403b158fc0c072eef00d291750960496bbf80b4"} Nov 29 05:46:23 crc kubenswrapper[4594]: I1129 05:46:23.081030 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bfda0b74-99d7-4176-89f9-71d8385ddc6f","Type":"ContainerStarted","Data":"2b6446190da17e2a1bca2370980d8dfbfe54834ba2d1dfb0a3f9e51d0b4c1cd4"} Nov 29 05:46:23 crc kubenswrapper[4594]: I1129 05:46:23.081056 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bfda0b74-99d7-4176-89f9-71d8385ddc6f","Type":"ContainerStarted","Data":"5d4bd45b166a089c10674daef269091068735487d7f94dabf41f694c947c266e"} Nov 29 05:46:24 crc kubenswrapper[4594]: I1129 05:46:24.105249 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="075b6980-e8ff-4248-ad5e-fb33fd7199d3" path="/var/lib/kubelet/pods/075b6980-e8ff-4248-ad5e-fb33fd7199d3/volumes" Nov 29 05:46:24 crc kubenswrapper[4594]: I1129 05:46:24.106685 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bfda0b74-99d7-4176-89f9-71d8385ddc6f","Type":"ContainerStarted","Data":"60b88a52c7b17a4baa264e5e30e5ecccb8da95fa5cbbff2a85a0968995cba86f"} Nov 29 05:46:24 crc kubenswrapper[4594]: I1129 05:46:24.109485 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"c12c56f3-363e-4998-9c43-51d5903ba7fe","Type":"ContainerStarted","Data":"8e18995c55407fd7afa205d990b5c5e332d9faf140edf57e6fa8da0a699b7ec6"} Nov 29 05:46:24 crc kubenswrapper[4594]: I1129 05:46:24.109705 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 29 05:46:24 crc kubenswrapper[4594]: I1129 05:46:24.109710 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c12c56f3-363e-4998-9c43-51d5903ba7fe" containerName="proxy-httpd" containerID="cri-o://8e18995c55407fd7afa205d990b5c5e332d9faf140edf57e6fa8da0a699b7ec6" gracePeriod=30 Nov 29 05:46:24 crc kubenswrapper[4594]: I1129 05:46:24.109705 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c12c56f3-363e-4998-9c43-51d5903ba7fe" containerName="ceilometer-central-agent" containerID="cri-o://e4049b46aed09ed3eee3598ff47657ab9a7b329b93d5c7868e50be98d61c688c" gracePeriod=30 Nov 29 05:46:24 crc kubenswrapper[4594]: I1129 05:46:24.109786 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c12c56f3-363e-4998-9c43-51d5903ba7fe" containerName="sg-core" containerID="cri-o://d94b05b53343aa758629f98158d0a2624ffcbbdf9f14205d7deced63722bb075" gracePeriod=30 Nov 29 05:46:24 crc kubenswrapper[4594]: I1129 05:46:24.109797 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c12c56f3-363e-4998-9c43-51d5903ba7fe" containerName="ceilometer-notification-agent" containerID="cri-o://e23bde88c62dae985d6ce355ec9b121be2e4f9e6056bb8b6525e1f64b9c47a4f" gracePeriod=30 Nov 29 05:46:24 crc kubenswrapper[4594]: I1129 05:46:24.116298 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"9f4133eb-5349-4bad-a993-d4e880a2f1be","Type":"ContainerStarted","Data":"29e4d46f073dd2522be2fb94e27bf83a9b54f4c6bc06a31128935fed788ef0a0"} Nov 29 05:46:24 crc kubenswrapper[4594]: I1129 05:46:24.137996 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.137979271 podStartE2EDuration="3.137979271s" podCreationTimestamp="2025-11-29 05:46:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:46:24.128958777 +0000 UTC m=+1108.369467997" watchObservedRunningTime="2025-11-29 05:46:24.137979271 +0000 UTC m=+1108.378488492" Nov 29 05:46:24 crc kubenswrapper[4594]: I1129 05:46:24.157404 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.678280123 podStartE2EDuration="7.157383677s" podCreationTimestamp="2025-11-29 05:46:17 +0000 UTC" firstStartedPulling="2025-11-29 05:46:18.843185597 +0000 UTC m=+1103.083694817" lastFinishedPulling="2025-11-29 05:46:23.322289152 +0000 UTC m=+1107.562798371" observedRunningTime="2025-11-29 05:46:24.151690078 +0000 UTC m=+1108.392199299" watchObservedRunningTime="2025-11-29 05:46:24.157383677 +0000 UTC m=+1108.397892887" Nov 29 05:46:25 crc kubenswrapper[4594]: I1129 05:46:25.135870 4594 generic.go:334] "Generic (PLEG): container finished" podID="c12c56f3-363e-4998-9c43-51d5903ba7fe" containerID="8e18995c55407fd7afa205d990b5c5e332d9faf140edf57e6fa8da0a699b7ec6" exitCode=0 Nov 29 05:46:25 crc kubenswrapper[4594]: I1129 05:46:25.135905 4594 generic.go:334] "Generic (PLEG): container finished" podID="c12c56f3-363e-4998-9c43-51d5903ba7fe" containerID="d94b05b53343aa758629f98158d0a2624ffcbbdf9f14205d7deced63722bb075" exitCode=2 Nov 29 05:46:25 crc kubenswrapper[4594]: I1129 05:46:25.135912 4594 generic.go:334] "Generic (PLEG): container finished" 
podID="c12c56f3-363e-4998-9c43-51d5903ba7fe" containerID="e23bde88c62dae985d6ce355ec9b121be2e4f9e6056bb8b6525e1f64b9c47a4f" exitCode=0 Nov 29 05:46:25 crc kubenswrapper[4594]: I1129 05:46:25.136003 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c12c56f3-363e-4998-9c43-51d5903ba7fe","Type":"ContainerDied","Data":"8e18995c55407fd7afa205d990b5c5e332d9faf140edf57e6fa8da0a699b7ec6"} Nov 29 05:46:25 crc kubenswrapper[4594]: I1129 05:46:25.136055 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c12c56f3-363e-4998-9c43-51d5903ba7fe","Type":"ContainerDied","Data":"d94b05b53343aa758629f98158d0a2624ffcbbdf9f14205d7deced63722bb075"} Nov 29 05:46:25 crc kubenswrapper[4594]: I1129 05:46:25.136071 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c12c56f3-363e-4998-9c43-51d5903ba7fe","Type":"ContainerDied","Data":"e23bde88c62dae985d6ce355ec9b121be2e4f9e6056bb8b6525e1f64b9c47a4f"} Nov 29 05:46:29 crc kubenswrapper[4594]: I1129 05:46:29.195144 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9f4133eb-5349-4bad-a993-d4e880a2f1be","Type":"ContainerStarted","Data":"da4821e14c6aceea8d03c876bb4458a8674868f46675d4e84546710eac38d64d"} Nov 29 05:46:29 crc kubenswrapper[4594]: I1129 05:46:29.201724 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-p6cqb" event={"ID":"de4685b4-8bf9-4a38-ba5e-7062994790c8","Type":"ContainerStarted","Data":"589add5bafc50b7a89058d95a6bae5bc2c81dfd4d598b6a855d23afdc0c03ff1"} Nov 29 05:46:29 crc kubenswrapper[4594]: I1129 05:46:29.205807 4594 generic.go:334] "Generic (PLEG): container finished" podID="c12c56f3-363e-4998-9c43-51d5903ba7fe" containerID="e4049b46aed09ed3eee3598ff47657ab9a7b329b93d5c7868e50be98d61c688c" exitCode=0 Nov 29 05:46:29 crc kubenswrapper[4594]: I1129 05:46:29.205862 4594 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c12c56f3-363e-4998-9c43-51d5903ba7fe","Type":"ContainerDied","Data":"e4049b46aed09ed3eee3598ff47657ab9a7b329b93d5c7868e50be98d61c688c"} Nov 29 05:46:29 crc kubenswrapper[4594]: I1129 05:46:29.232099 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=7.232082309 podStartE2EDuration="7.232082309s" podCreationTimestamp="2025-11-29 05:46:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:46:29.221353932 +0000 UTC m=+1113.461863152" watchObservedRunningTime="2025-11-29 05:46:29.232082309 +0000 UTC m=+1113.472591529" Nov 29 05:46:29 crc kubenswrapper[4594]: I1129 05:46:29.252322 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-p6cqb" podStartSLOduration=2.435621773 podStartE2EDuration="14.252289794s" podCreationTimestamp="2025-11-29 05:46:15 +0000 UTC" firstStartedPulling="2025-11-29 05:46:16.406990789 +0000 UTC m=+1100.647500009" lastFinishedPulling="2025-11-29 05:46:28.22365881 +0000 UTC m=+1112.464168030" observedRunningTime="2025-11-29 05:46:29.238574469 +0000 UTC m=+1113.479083688" watchObservedRunningTime="2025-11-29 05:46:29.252289794 +0000 UTC m=+1113.492799014" Nov 29 05:46:29 crc kubenswrapper[4594]: I1129 05:46:29.498561 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 29 05:46:29 crc kubenswrapper[4594]: I1129 05:46:29.647896 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c12c56f3-363e-4998-9c43-51d5903ba7fe-sg-core-conf-yaml\") pod \"c12c56f3-363e-4998-9c43-51d5903ba7fe\" (UID: \"c12c56f3-363e-4998-9c43-51d5903ba7fe\") " Nov 29 05:46:29 crc kubenswrapper[4594]: I1129 05:46:29.647969 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c12c56f3-363e-4998-9c43-51d5903ba7fe-scripts\") pod \"c12c56f3-363e-4998-9c43-51d5903ba7fe\" (UID: \"c12c56f3-363e-4998-9c43-51d5903ba7fe\") " Nov 29 05:46:29 crc kubenswrapper[4594]: I1129 05:46:29.648006 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c12c56f3-363e-4998-9c43-51d5903ba7fe-config-data\") pod \"c12c56f3-363e-4998-9c43-51d5903ba7fe\" (UID: \"c12c56f3-363e-4998-9c43-51d5903ba7fe\") " Nov 29 05:46:29 crc kubenswrapper[4594]: I1129 05:46:29.648033 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c12c56f3-363e-4998-9c43-51d5903ba7fe-combined-ca-bundle\") pod \"c12c56f3-363e-4998-9c43-51d5903ba7fe\" (UID: \"c12c56f3-363e-4998-9c43-51d5903ba7fe\") " Nov 29 05:46:29 crc kubenswrapper[4594]: I1129 05:46:29.648065 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-629m8\" (UniqueName: \"kubernetes.io/projected/c12c56f3-363e-4998-9c43-51d5903ba7fe-kube-api-access-629m8\") pod \"c12c56f3-363e-4998-9c43-51d5903ba7fe\" (UID: \"c12c56f3-363e-4998-9c43-51d5903ba7fe\") " Nov 29 05:46:29 crc kubenswrapper[4594]: I1129 05:46:29.648118 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/c12c56f3-363e-4998-9c43-51d5903ba7fe-log-httpd\") pod \"c12c56f3-363e-4998-9c43-51d5903ba7fe\" (UID: \"c12c56f3-363e-4998-9c43-51d5903ba7fe\") " Nov 29 05:46:29 crc kubenswrapper[4594]: I1129 05:46:29.648158 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c12c56f3-363e-4998-9c43-51d5903ba7fe-run-httpd\") pod \"c12c56f3-363e-4998-9c43-51d5903ba7fe\" (UID: \"c12c56f3-363e-4998-9c43-51d5903ba7fe\") " Nov 29 05:46:29 crc kubenswrapper[4594]: I1129 05:46:29.648658 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c12c56f3-363e-4998-9c43-51d5903ba7fe-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c12c56f3-363e-4998-9c43-51d5903ba7fe" (UID: "c12c56f3-363e-4998-9c43-51d5903ba7fe"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 05:46:29 crc kubenswrapper[4594]: I1129 05:46:29.648941 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c12c56f3-363e-4998-9c43-51d5903ba7fe-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "c12c56f3-363e-4998-9c43-51d5903ba7fe" (UID: "c12c56f3-363e-4998-9c43-51d5903ba7fe"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 05:46:29 crc kubenswrapper[4594]: I1129 05:46:29.659387 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c12c56f3-363e-4998-9c43-51d5903ba7fe-scripts" (OuterVolumeSpecName: "scripts") pod "c12c56f3-363e-4998-9c43-51d5903ba7fe" (UID: "c12c56f3-363e-4998-9c43-51d5903ba7fe"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:46:29 crc kubenswrapper[4594]: I1129 05:46:29.659564 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c12c56f3-363e-4998-9c43-51d5903ba7fe-kube-api-access-629m8" (OuterVolumeSpecName: "kube-api-access-629m8") pod "c12c56f3-363e-4998-9c43-51d5903ba7fe" (UID: "c12c56f3-363e-4998-9c43-51d5903ba7fe"). InnerVolumeSpecName "kube-api-access-629m8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:46:29 crc kubenswrapper[4594]: I1129 05:46:29.678021 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c12c56f3-363e-4998-9c43-51d5903ba7fe-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "c12c56f3-363e-4998-9c43-51d5903ba7fe" (UID: "c12c56f3-363e-4998-9c43-51d5903ba7fe"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:46:29 crc kubenswrapper[4594]: I1129 05:46:29.712999 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c12c56f3-363e-4998-9c43-51d5903ba7fe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c12c56f3-363e-4998-9c43-51d5903ba7fe" (UID: "c12c56f3-363e-4998-9c43-51d5903ba7fe"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:46:29 crc kubenswrapper[4594]: I1129 05:46:29.728534 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c12c56f3-363e-4998-9c43-51d5903ba7fe-config-data" (OuterVolumeSpecName: "config-data") pod "c12c56f3-363e-4998-9c43-51d5903ba7fe" (UID: "c12c56f3-363e-4998-9c43-51d5903ba7fe"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:46:29 crc kubenswrapper[4594]: I1129 05:46:29.751741 4594 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c12c56f3-363e-4998-9c43-51d5903ba7fe-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 29 05:46:29 crc kubenswrapper[4594]: I1129 05:46:29.751772 4594 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c12c56f3-363e-4998-9c43-51d5903ba7fe-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 29 05:46:29 crc kubenswrapper[4594]: I1129 05:46:29.751792 4594 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c12c56f3-363e-4998-9c43-51d5903ba7fe-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 29 05:46:29 crc kubenswrapper[4594]: I1129 05:46:29.751807 4594 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c12c56f3-363e-4998-9c43-51d5903ba7fe-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 05:46:29 crc kubenswrapper[4594]: I1129 05:46:29.751819 4594 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c12c56f3-363e-4998-9c43-51d5903ba7fe-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 05:46:29 crc kubenswrapper[4594]: I1129 05:46:29.751832 4594 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c12c56f3-363e-4998-9c43-51d5903ba7fe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 05:46:29 crc kubenswrapper[4594]: I1129 05:46:29.751843 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-629m8\" (UniqueName: \"kubernetes.io/projected/c12c56f3-363e-4998-9c43-51d5903ba7fe-kube-api-access-629m8\") on node \"crc\" DevicePath \"\"" Nov 29 05:46:30 crc kubenswrapper[4594]: I1129 05:46:30.216737 4594 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c12c56f3-363e-4998-9c43-51d5903ba7fe","Type":"ContainerDied","Data":"03337fcc1359bbf8a03919e9740cd24d8d5dc69761d2a63511137bd6c61c7820"} Nov 29 05:46:30 crc kubenswrapper[4594]: I1129 05:46:30.216818 4594 scope.go:117] "RemoveContainer" containerID="8e18995c55407fd7afa205d990b5c5e332d9faf140edf57e6fa8da0a699b7ec6" Nov 29 05:46:30 crc kubenswrapper[4594]: I1129 05:46:30.216849 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 29 05:46:30 crc kubenswrapper[4594]: I1129 05:46:30.245493 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 29 05:46:30 crc kubenswrapper[4594]: I1129 05:46:30.255136 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 29 05:46:30 crc kubenswrapper[4594]: I1129 05:46:30.262734 4594 scope.go:117] "RemoveContainer" containerID="d94b05b53343aa758629f98158d0a2624ffcbbdf9f14205d7deced63722bb075" Nov 29 05:46:30 crc kubenswrapper[4594]: I1129 05:46:30.266773 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 29 05:46:30 crc kubenswrapper[4594]: E1129 05:46:30.267312 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c12c56f3-363e-4998-9c43-51d5903ba7fe" containerName="ceilometer-notification-agent" Nov 29 05:46:30 crc kubenswrapper[4594]: I1129 05:46:30.267337 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="c12c56f3-363e-4998-9c43-51d5903ba7fe" containerName="ceilometer-notification-agent" Nov 29 05:46:30 crc kubenswrapper[4594]: E1129 05:46:30.267361 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c12c56f3-363e-4998-9c43-51d5903ba7fe" containerName="ceilometer-central-agent" Nov 29 05:46:30 crc kubenswrapper[4594]: I1129 05:46:30.267370 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="c12c56f3-363e-4998-9c43-51d5903ba7fe" 
containerName="ceilometer-central-agent" Nov 29 05:46:30 crc kubenswrapper[4594]: E1129 05:46:30.267389 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c12c56f3-363e-4998-9c43-51d5903ba7fe" containerName="proxy-httpd" Nov 29 05:46:30 crc kubenswrapper[4594]: I1129 05:46:30.267395 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="c12c56f3-363e-4998-9c43-51d5903ba7fe" containerName="proxy-httpd" Nov 29 05:46:30 crc kubenswrapper[4594]: E1129 05:46:30.267417 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c12c56f3-363e-4998-9c43-51d5903ba7fe" containerName="sg-core" Nov 29 05:46:30 crc kubenswrapper[4594]: I1129 05:46:30.267422 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="c12c56f3-363e-4998-9c43-51d5903ba7fe" containerName="sg-core" Nov 29 05:46:30 crc kubenswrapper[4594]: I1129 05:46:30.267658 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="c12c56f3-363e-4998-9c43-51d5903ba7fe" containerName="ceilometer-notification-agent" Nov 29 05:46:30 crc kubenswrapper[4594]: I1129 05:46:30.267677 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="c12c56f3-363e-4998-9c43-51d5903ba7fe" containerName="sg-core" Nov 29 05:46:30 crc kubenswrapper[4594]: I1129 05:46:30.267687 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="c12c56f3-363e-4998-9c43-51d5903ba7fe" containerName="proxy-httpd" Nov 29 05:46:30 crc kubenswrapper[4594]: I1129 05:46:30.267702 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="c12c56f3-363e-4998-9c43-51d5903ba7fe" containerName="ceilometer-central-agent" Nov 29 05:46:30 crc kubenswrapper[4594]: I1129 05:46:30.269532 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 29 05:46:30 crc kubenswrapper[4594]: I1129 05:46:30.275532 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 29 05:46:30 crc kubenswrapper[4594]: I1129 05:46:30.275544 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 29 05:46:30 crc kubenswrapper[4594]: I1129 05:46:30.288107 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 29 05:46:30 crc kubenswrapper[4594]: I1129 05:46:30.294965 4594 scope.go:117] "RemoveContainer" containerID="e23bde88c62dae985d6ce355ec9b121be2e4f9e6056bb8b6525e1f64b9c47a4f" Nov 29 05:46:30 crc kubenswrapper[4594]: I1129 05:46:30.314469 4594 scope.go:117] "RemoveContainer" containerID="e4049b46aed09ed3eee3598ff47657ab9a7b329b93d5c7868e50be98d61c688c" Nov 29 05:46:30 crc kubenswrapper[4594]: I1129 05:46:30.365904 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/532aef10-6fe3-4820-b619-f94f07fa5d43-scripts\") pod \"ceilometer-0\" (UID: \"532aef10-6fe3-4820-b619-f94f07fa5d43\") " pod="openstack/ceilometer-0" Nov 29 05:46:30 crc kubenswrapper[4594]: I1129 05:46:30.366150 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/532aef10-6fe3-4820-b619-f94f07fa5d43-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"532aef10-6fe3-4820-b619-f94f07fa5d43\") " pod="openstack/ceilometer-0" Nov 29 05:46:30 crc kubenswrapper[4594]: I1129 05:46:30.366574 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/532aef10-6fe3-4820-b619-f94f07fa5d43-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"532aef10-6fe3-4820-b619-f94f07fa5d43\") " 
pod="openstack/ceilometer-0" Nov 29 05:46:30 crc kubenswrapper[4594]: I1129 05:46:30.366614 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/532aef10-6fe3-4820-b619-f94f07fa5d43-run-httpd\") pod \"ceilometer-0\" (UID: \"532aef10-6fe3-4820-b619-f94f07fa5d43\") " pod="openstack/ceilometer-0" Nov 29 05:46:30 crc kubenswrapper[4594]: I1129 05:46:30.366659 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrcj4\" (UniqueName: \"kubernetes.io/projected/532aef10-6fe3-4820-b619-f94f07fa5d43-kube-api-access-xrcj4\") pod \"ceilometer-0\" (UID: \"532aef10-6fe3-4820-b619-f94f07fa5d43\") " pod="openstack/ceilometer-0" Nov 29 05:46:30 crc kubenswrapper[4594]: I1129 05:46:30.366994 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/532aef10-6fe3-4820-b619-f94f07fa5d43-config-data\") pod \"ceilometer-0\" (UID: \"532aef10-6fe3-4820-b619-f94f07fa5d43\") " pod="openstack/ceilometer-0" Nov 29 05:46:30 crc kubenswrapper[4594]: I1129 05:46:30.367110 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/532aef10-6fe3-4820-b619-f94f07fa5d43-log-httpd\") pod \"ceilometer-0\" (UID: \"532aef10-6fe3-4820-b619-f94f07fa5d43\") " pod="openstack/ceilometer-0" Nov 29 05:46:30 crc kubenswrapper[4594]: I1129 05:46:30.400099 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Nov 29 05:46:30 crc kubenswrapper[4594]: I1129 05:46:30.432949 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0" Nov 29 05:46:30 crc kubenswrapper[4594]: I1129 05:46:30.469179 4594 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/532aef10-6fe3-4820-b619-f94f07fa5d43-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"532aef10-6fe3-4820-b619-f94f07fa5d43\") " pod="openstack/ceilometer-0" Nov 29 05:46:30 crc kubenswrapper[4594]: I1129 05:46:30.469227 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/532aef10-6fe3-4820-b619-f94f07fa5d43-run-httpd\") pod \"ceilometer-0\" (UID: \"532aef10-6fe3-4820-b619-f94f07fa5d43\") " pod="openstack/ceilometer-0" Nov 29 05:46:30 crc kubenswrapper[4594]: I1129 05:46:30.469280 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrcj4\" (UniqueName: \"kubernetes.io/projected/532aef10-6fe3-4820-b619-f94f07fa5d43-kube-api-access-xrcj4\") pod \"ceilometer-0\" (UID: \"532aef10-6fe3-4820-b619-f94f07fa5d43\") " pod="openstack/ceilometer-0" Nov 29 05:46:30 crc kubenswrapper[4594]: I1129 05:46:30.469369 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/532aef10-6fe3-4820-b619-f94f07fa5d43-config-data\") pod \"ceilometer-0\" (UID: \"532aef10-6fe3-4820-b619-f94f07fa5d43\") " pod="openstack/ceilometer-0" Nov 29 05:46:30 crc kubenswrapper[4594]: I1129 05:46:30.469407 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/532aef10-6fe3-4820-b619-f94f07fa5d43-log-httpd\") pod \"ceilometer-0\" (UID: \"532aef10-6fe3-4820-b619-f94f07fa5d43\") " pod="openstack/ceilometer-0" Nov 29 05:46:30 crc kubenswrapper[4594]: I1129 05:46:30.469439 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/532aef10-6fe3-4820-b619-f94f07fa5d43-scripts\") pod \"ceilometer-0\" (UID: \"532aef10-6fe3-4820-b619-f94f07fa5d43\") " 
pod="openstack/ceilometer-0" Nov 29 05:46:30 crc kubenswrapper[4594]: I1129 05:46:30.469520 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/532aef10-6fe3-4820-b619-f94f07fa5d43-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"532aef10-6fe3-4820-b619-f94f07fa5d43\") " pod="openstack/ceilometer-0" Nov 29 05:46:30 crc kubenswrapper[4594]: I1129 05:46:30.469880 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/532aef10-6fe3-4820-b619-f94f07fa5d43-run-httpd\") pod \"ceilometer-0\" (UID: \"532aef10-6fe3-4820-b619-f94f07fa5d43\") " pod="openstack/ceilometer-0" Nov 29 05:46:30 crc kubenswrapper[4594]: I1129 05:46:30.470284 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/532aef10-6fe3-4820-b619-f94f07fa5d43-log-httpd\") pod \"ceilometer-0\" (UID: \"532aef10-6fe3-4820-b619-f94f07fa5d43\") " pod="openstack/ceilometer-0" Nov 29 05:46:30 crc kubenswrapper[4594]: I1129 05:46:30.474453 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/532aef10-6fe3-4820-b619-f94f07fa5d43-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"532aef10-6fe3-4820-b619-f94f07fa5d43\") " pod="openstack/ceilometer-0" Nov 29 05:46:30 crc kubenswrapper[4594]: I1129 05:46:30.474638 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/532aef10-6fe3-4820-b619-f94f07fa5d43-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"532aef10-6fe3-4820-b619-f94f07fa5d43\") " pod="openstack/ceilometer-0" Nov 29 05:46:30 crc kubenswrapper[4594]: I1129 05:46:30.475188 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/532aef10-6fe3-4820-b619-f94f07fa5d43-config-data\") pod \"ceilometer-0\" (UID: \"532aef10-6fe3-4820-b619-f94f07fa5d43\") " pod="openstack/ceilometer-0" Nov 29 05:46:30 crc kubenswrapper[4594]: I1129 05:46:30.478512 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/532aef10-6fe3-4820-b619-f94f07fa5d43-scripts\") pod \"ceilometer-0\" (UID: \"532aef10-6fe3-4820-b619-f94f07fa5d43\") " pod="openstack/ceilometer-0" Nov 29 05:46:30 crc kubenswrapper[4594]: I1129 05:46:30.484876 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrcj4\" (UniqueName: \"kubernetes.io/projected/532aef10-6fe3-4820-b619-f94f07fa5d43-kube-api-access-xrcj4\") pod \"ceilometer-0\" (UID: \"532aef10-6fe3-4820-b619-f94f07fa5d43\") " pod="openstack/ceilometer-0" Nov 29 05:46:30 crc kubenswrapper[4594]: I1129 05:46:30.594632 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 29 05:46:31 crc kubenswrapper[4594]: I1129 05:46:31.027813 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 29 05:46:31 crc kubenswrapper[4594]: W1129 05:46:31.028220 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod532aef10_6fe3_4820_b619_f94f07fa5d43.slice/crio-69514fbeccb5dfabbb6b33d200c55dde1fb615cb266eddbf23ccf5b42ccb1382 WatchSource:0}: Error finding container 69514fbeccb5dfabbb6b33d200c55dde1fb615cb266eddbf23ccf5b42ccb1382: Status 404 returned error can't find the container with id 69514fbeccb5dfabbb6b33d200c55dde1fb615cb266eddbf23ccf5b42ccb1382 Nov 29 05:46:31 crc kubenswrapper[4594]: I1129 05:46:31.031039 4594 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 29 05:46:31 crc kubenswrapper[4594]: I1129 05:46:31.232958 4594 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/ceilometer-0" event={"ID":"532aef10-6fe3-4820-b619-f94f07fa5d43","Type":"ContainerStarted","Data":"69514fbeccb5dfabbb6b33d200c55dde1fb615cb266eddbf23ccf5b42ccb1382"} Nov 29 05:46:31 crc kubenswrapper[4594]: I1129 05:46:31.233012 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Nov 29 05:46:31 crc kubenswrapper[4594]: I1129 05:46:31.261554 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-decision-engine-0" Nov 29 05:46:31 crc kubenswrapper[4594]: I1129 05:46:31.552278 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Nov 29 05:46:31 crc kubenswrapper[4594]: I1129 05:46:31.552321 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Nov 29 05:46:31 crc kubenswrapper[4594]: I1129 05:46:31.581638 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Nov 29 05:46:31 crc kubenswrapper[4594]: I1129 05:46:31.586076 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Nov 29 05:46:32 crc kubenswrapper[4594]: I1129 05:46:32.094852 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c12c56f3-363e-4998-9c43-51d5903ba7fe" path="/var/lib/kubelet/pods/c12c56f3-363e-4998-9c43-51d5903ba7fe/volumes" Nov 29 05:46:32 crc kubenswrapper[4594]: I1129 05:46:32.256488 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"532aef10-6fe3-4820-b619-f94f07fa5d43","Type":"ContainerStarted","Data":"2ba957e4d359bd316048e313be1e1ab17a747f386fb91d77a2c9b962f262daf3"} Nov 29 05:46:32 crc kubenswrapper[4594]: I1129 05:46:32.257523 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/glance-default-external-api-0" Nov 29 05:46:32 crc kubenswrapper[4594]: I1129 05:46:32.257585 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Nov 29 05:46:32 crc kubenswrapper[4594]: I1129 05:46:32.469896 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Nov 29 05:46:32 crc kubenswrapper[4594]: I1129 05:46:32.470189 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Nov 29 05:46:32 crc kubenswrapper[4594]: I1129 05:46:32.504396 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Nov 29 05:46:32 crc kubenswrapper[4594]: I1129 05:46:32.513791 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Nov 29 05:46:33 crc kubenswrapper[4594]: I1129 05:46:33.265835 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"532aef10-6fe3-4820-b619-f94f07fa5d43","Type":"ContainerStarted","Data":"f3e9b7024c68ce7e91300e5ffa7c5e6e23948505fdc56db5f6e3ce5d0348ede5"} Nov 29 05:46:33 crc kubenswrapper[4594]: I1129 05:46:33.266685 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Nov 29 05:46:33 crc kubenswrapper[4594]: I1129 05:46:33.266725 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Nov 29 05:46:34 crc kubenswrapper[4594]: I1129 05:46:34.003797 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Nov 29 05:46:34 crc kubenswrapper[4594]: I1129 05:46:34.104347 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Nov 29 05:46:35 
crc kubenswrapper[4594]: I1129 05:46:35.079779 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Nov 29 05:46:35 crc kubenswrapper[4594]: I1129 05:46:35.084375 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Nov 29 05:46:35 crc kubenswrapper[4594]: I1129 05:46:35.290510 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"532aef10-6fe3-4820-b619-f94f07fa5d43","Type":"ContainerStarted","Data":"2c4814d9889fc0a1908ca851253ea5f09697f1552c79fd3bb12cd5b8980910ad"} Nov 29 05:46:36 crc kubenswrapper[4594]: I1129 05:46:36.306353 4594 generic.go:334] "Generic (PLEG): container finished" podID="de4685b4-8bf9-4a38-ba5e-7062994790c8" containerID="589add5bafc50b7a89058d95a6bae5bc2c81dfd4d598b6a855d23afdc0c03ff1" exitCode=0 Nov 29 05:46:36 crc kubenswrapper[4594]: I1129 05:46:36.306570 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-p6cqb" event={"ID":"de4685b4-8bf9-4a38-ba5e-7062994790c8","Type":"ContainerDied","Data":"589add5bafc50b7a89058d95a6bae5bc2c81dfd4d598b6a855d23afdc0c03ff1"} Nov 29 05:46:37 crc kubenswrapper[4594]: I1129 05:46:37.320803 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"532aef10-6fe3-4820-b619-f94f07fa5d43","Type":"ContainerStarted","Data":"9bc3f947b0549f72f61c69217bbb80101779f1d763878071f798b259e2c7b1a1"} Nov 29 05:46:37 crc kubenswrapper[4594]: I1129 05:46:37.321146 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 29 05:46:37 crc kubenswrapper[4594]: I1129 05:46:37.343481 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.261913425 podStartE2EDuration="7.343458784s" podCreationTimestamp="2025-11-29 05:46:30 +0000 UTC" firstStartedPulling="2025-11-29 
05:46:31.030837397 +0000 UTC m=+1115.271346606" lastFinishedPulling="2025-11-29 05:46:36.112382756 +0000 UTC m=+1120.352891965" observedRunningTime="2025-11-29 05:46:37.337151511 +0000 UTC m=+1121.577660732" watchObservedRunningTime="2025-11-29 05:46:37.343458784 +0000 UTC m=+1121.583968004" Nov 29 05:46:37 crc kubenswrapper[4594]: I1129 05:46:37.670858 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-p6cqb" Nov 29 05:46:37 crc kubenswrapper[4594]: I1129 05:46:37.748315 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5t9q\" (UniqueName: \"kubernetes.io/projected/de4685b4-8bf9-4a38-ba5e-7062994790c8-kube-api-access-m5t9q\") pod \"de4685b4-8bf9-4a38-ba5e-7062994790c8\" (UID: \"de4685b4-8bf9-4a38-ba5e-7062994790c8\") " Nov 29 05:46:37 crc kubenswrapper[4594]: I1129 05:46:37.748488 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de4685b4-8bf9-4a38-ba5e-7062994790c8-config-data\") pod \"de4685b4-8bf9-4a38-ba5e-7062994790c8\" (UID: \"de4685b4-8bf9-4a38-ba5e-7062994790c8\") " Nov 29 05:46:37 crc kubenswrapper[4594]: I1129 05:46:37.748550 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de4685b4-8bf9-4a38-ba5e-7062994790c8-combined-ca-bundle\") pod \"de4685b4-8bf9-4a38-ba5e-7062994790c8\" (UID: \"de4685b4-8bf9-4a38-ba5e-7062994790c8\") " Nov 29 05:46:37 crc kubenswrapper[4594]: I1129 05:46:37.748775 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de4685b4-8bf9-4a38-ba5e-7062994790c8-scripts\") pod \"de4685b4-8bf9-4a38-ba5e-7062994790c8\" (UID: \"de4685b4-8bf9-4a38-ba5e-7062994790c8\") " Nov 29 05:46:37 crc kubenswrapper[4594]: I1129 05:46:37.756715 4594 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de4685b4-8bf9-4a38-ba5e-7062994790c8-scripts" (OuterVolumeSpecName: "scripts") pod "de4685b4-8bf9-4a38-ba5e-7062994790c8" (UID: "de4685b4-8bf9-4a38-ba5e-7062994790c8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:46:37 crc kubenswrapper[4594]: I1129 05:46:37.761001 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de4685b4-8bf9-4a38-ba5e-7062994790c8-kube-api-access-m5t9q" (OuterVolumeSpecName: "kube-api-access-m5t9q") pod "de4685b4-8bf9-4a38-ba5e-7062994790c8" (UID: "de4685b4-8bf9-4a38-ba5e-7062994790c8"). InnerVolumeSpecName "kube-api-access-m5t9q". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:46:37 crc kubenswrapper[4594]: I1129 05:46:37.788045 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de4685b4-8bf9-4a38-ba5e-7062994790c8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "de4685b4-8bf9-4a38-ba5e-7062994790c8" (UID: "de4685b4-8bf9-4a38-ba5e-7062994790c8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:46:37 crc kubenswrapper[4594]: I1129 05:46:37.790366 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de4685b4-8bf9-4a38-ba5e-7062994790c8-config-data" (OuterVolumeSpecName: "config-data") pod "de4685b4-8bf9-4a38-ba5e-7062994790c8" (UID: "de4685b4-8bf9-4a38-ba5e-7062994790c8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:46:37 crc kubenswrapper[4594]: I1129 05:46:37.851389 4594 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de4685b4-8bf9-4a38-ba5e-7062994790c8-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 05:46:37 crc kubenswrapper[4594]: I1129 05:46:37.851420 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5t9q\" (UniqueName: \"kubernetes.io/projected/de4685b4-8bf9-4a38-ba5e-7062994790c8-kube-api-access-m5t9q\") on node \"crc\" DevicePath \"\"" Nov 29 05:46:37 crc kubenswrapper[4594]: I1129 05:46:37.851434 4594 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de4685b4-8bf9-4a38-ba5e-7062994790c8-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 05:46:37 crc kubenswrapper[4594]: I1129 05:46:37.851444 4594 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de4685b4-8bf9-4a38-ba5e-7062994790c8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 05:46:38 crc kubenswrapper[4594]: I1129 05:46:38.332530 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-p6cqb" Nov 29 05:46:38 crc kubenswrapper[4594]: I1129 05:46:38.333086 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-p6cqb" event={"ID":"de4685b4-8bf9-4a38-ba5e-7062994790c8","Type":"ContainerDied","Data":"4c35ab24a9357f350454b41fca8d92d74b3166e08c5799be0a16433bffaeae77"} Nov 29 05:46:38 crc kubenswrapper[4594]: I1129 05:46:38.333119 4594 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c35ab24a9357f350454b41fca8d92d74b3166e08c5799be0a16433bffaeae77" Nov 29 05:46:38 crc kubenswrapper[4594]: I1129 05:46:38.425445 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 29 05:46:38 crc kubenswrapper[4594]: E1129 05:46:38.426101 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de4685b4-8bf9-4a38-ba5e-7062994790c8" containerName="nova-cell0-conductor-db-sync" Nov 29 05:46:38 crc kubenswrapper[4594]: I1129 05:46:38.426129 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="de4685b4-8bf9-4a38-ba5e-7062994790c8" containerName="nova-cell0-conductor-db-sync" Nov 29 05:46:38 crc kubenswrapper[4594]: I1129 05:46:38.426455 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="de4685b4-8bf9-4a38-ba5e-7062994790c8" containerName="nova-cell0-conductor-db-sync" Nov 29 05:46:38 crc kubenswrapper[4594]: I1129 05:46:38.427331 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Nov 29 05:46:38 crc kubenswrapper[4594]: I1129 05:46:38.429675 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-m55vf" Nov 29 05:46:38 crc kubenswrapper[4594]: I1129 05:46:38.429706 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Nov 29 05:46:38 crc kubenswrapper[4594]: I1129 05:46:38.435837 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 29 05:46:38 crc kubenswrapper[4594]: I1129 05:46:38.571451 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81a5ed74-39d5-4b41-8083-04c1a6f6f119-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"81a5ed74-39d5-4b41-8083-04c1a6f6f119\") " pod="openstack/nova-cell0-conductor-0" Nov 29 05:46:38 crc kubenswrapper[4594]: I1129 05:46:38.571743 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81a5ed74-39d5-4b41-8083-04c1a6f6f119-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"81a5ed74-39d5-4b41-8083-04c1a6f6f119\") " pod="openstack/nova-cell0-conductor-0" Nov 29 05:46:38 crc kubenswrapper[4594]: I1129 05:46:38.571817 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vr4x6\" (UniqueName: \"kubernetes.io/projected/81a5ed74-39d5-4b41-8083-04c1a6f6f119-kube-api-access-vr4x6\") pod \"nova-cell0-conductor-0\" (UID: \"81a5ed74-39d5-4b41-8083-04c1a6f6f119\") " pod="openstack/nova-cell0-conductor-0" Nov 29 05:46:38 crc kubenswrapper[4594]: I1129 05:46:38.672869 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/81a5ed74-39d5-4b41-8083-04c1a6f6f119-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"81a5ed74-39d5-4b41-8083-04c1a6f6f119\") " pod="openstack/nova-cell0-conductor-0" Nov 29 05:46:38 crc kubenswrapper[4594]: I1129 05:46:38.672922 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81a5ed74-39d5-4b41-8083-04c1a6f6f119-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"81a5ed74-39d5-4b41-8083-04c1a6f6f119\") " pod="openstack/nova-cell0-conductor-0" Nov 29 05:46:38 crc kubenswrapper[4594]: I1129 05:46:38.672991 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vr4x6\" (UniqueName: \"kubernetes.io/projected/81a5ed74-39d5-4b41-8083-04c1a6f6f119-kube-api-access-vr4x6\") pod \"nova-cell0-conductor-0\" (UID: \"81a5ed74-39d5-4b41-8083-04c1a6f6f119\") " pod="openstack/nova-cell0-conductor-0" Nov 29 05:46:38 crc kubenswrapper[4594]: I1129 05:46:38.677514 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81a5ed74-39d5-4b41-8083-04c1a6f6f119-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"81a5ed74-39d5-4b41-8083-04c1a6f6f119\") " pod="openstack/nova-cell0-conductor-0" Nov 29 05:46:38 crc kubenswrapper[4594]: I1129 05:46:38.687205 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vr4x6\" (UniqueName: \"kubernetes.io/projected/81a5ed74-39d5-4b41-8083-04c1a6f6f119-kube-api-access-vr4x6\") pod \"nova-cell0-conductor-0\" (UID: \"81a5ed74-39d5-4b41-8083-04c1a6f6f119\") " pod="openstack/nova-cell0-conductor-0" Nov 29 05:46:38 crc kubenswrapper[4594]: I1129 05:46:38.687400 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81a5ed74-39d5-4b41-8083-04c1a6f6f119-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" 
(UID: \"81a5ed74-39d5-4b41-8083-04c1a6f6f119\") " pod="openstack/nova-cell0-conductor-0" Nov 29 05:46:38 crc kubenswrapper[4594]: I1129 05:46:38.745135 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Nov 29 05:46:39 crc kubenswrapper[4594]: I1129 05:46:39.182456 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 29 05:46:39 crc kubenswrapper[4594]: I1129 05:46:39.347657 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"81a5ed74-39d5-4b41-8083-04c1a6f6f119","Type":"ContainerStarted","Data":"7f06195843387fe182f04277f33987b3c0342f776a0531189974a3f398f77dc0"} Nov 29 05:46:40 crc kubenswrapper[4594]: I1129 05:46:40.359277 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"81a5ed74-39d5-4b41-8083-04c1a6f6f119","Type":"ContainerStarted","Data":"6b6d0794593032956515cbc420c7be165b1b77c94885524f48cb7e7391603c96"} Nov 29 05:46:40 crc kubenswrapper[4594]: I1129 05:46:40.360162 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Nov 29 05:46:48 crc kubenswrapper[4594]: I1129 05:46:48.773910 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Nov 29 05:46:48 crc kubenswrapper[4594]: I1129 05:46:48.794854 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=10.794838953 podStartE2EDuration="10.794838953s" podCreationTimestamp="2025-11-29 05:46:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:46:40.385129786 +0000 UTC m=+1124.625639006" watchObservedRunningTime="2025-11-29 05:46:48.794838953 +0000 UTC m=+1133.035348173" Nov 29 05:46:49 crc kubenswrapper[4594]: 
I1129 05:46:49.271991 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-cxpv7"] Nov 29 05:46:49 crc kubenswrapper[4594]: I1129 05:46:49.273484 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-cxpv7" Nov 29 05:46:49 crc kubenswrapper[4594]: I1129 05:46:49.275306 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Nov 29 05:46:49 crc kubenswrapper[4594]: I1129 05:46:49.275498 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Nov 29 05:46:49 crc kubenswrapper[4594]: I1129 05:46:49.291466 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-cxpv7"] Nov 29 05:46:49 crc kubenswrapper[4594]: I1129 05:46:49.390059 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3e5dae6-ceb8-4878-a37d-56f97d59d103-scripts\") pod \"nova-cell0-cell-mapping-cxpv7\" (UID: \"b3e5dae6-ceb8-4878-a37d-56f97d59d103\") " pod="openstack/nova-cell0-cell-mapping-cxpv7" Nov 29 05:46:49 crc kubenswrapper[4594]: I1129 05:46:49.390127 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3e5dae6-ceb8-4878-a37d-56f97d59d103-config-data\") pod \"nova-cell0-cell-mapping-cxpv7\" (UID: \"b3e5dae6-ceb8-4878-a37d-56f97d59d103\") " pod="openstack/nova-cell0-cell-mapping-cxpv7" Nov 29 05:46:49 crc kubenswrapper[4594]: I1129 05:46:49.390217 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvhlf\" (UniqueName: \"kubernetes.io/projected/b3e5dae6-ceb8-4878-a37d-56f97d59d103-kube-api-access-cvhlf\") pod \"nova-cell0-cell-mapping-cxpv7\" (UID: \"b3e5dae6-ceb8-4878-a37d-56f97d59d103\") " 
pod="openstack/nova-cell0-cell-mapping-cxpv7" Nov 29 05:46:49 crc kubenswrapper[4594]: I1129 05:46:49.390274 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3e5dae6-ceb8-4878-a37d-56f97d59d103-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-cxpv7\" (UID: \"b3e5dae6-ceb8-4878-a37d-56f97d59d103\") " pod="openstack/nova-cell0-cell-mapping-cxpv7" Nov 29 05:46:49 crc kubenswrapper[4594]: I1129 05:46:49.444730 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Nov 29 05:46:49 crc kubenswrapper[4594]: I1129 05:46:49.446732 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 29 05:46:49 crc kubenswrapper[4594]: I1129 05:46:49.450342 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Nov 29 05:46:49 crc kubenswrapper[4594]: I1129 05:46:49.456224 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 29 05:46:49 crc kubenswrapper[4594]: I1129 05:46:49.462690 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 29 05:46:49 crc kubenswrapper[4594]: I1129 05:46:49.465367 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Nov 29 05:46:49 crc kubenswrapper[4594]: I1129 05:46:49.484434 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 29 05:46:49 crc kubenswrapper[4594]: I1129 05:46:49.496412 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvhlf\" (UniqueName: \"kubernetes.io/projected/b3e5dae6-ceb8-4878-a37d-56f97d59d103-kube-api-access-cvhlf\") pod \"nova-cell0-cell-mapping-cxpv7\" (UID: \"b3e5dae6-ceb8-4878-a37d-56f97d59d103\") " pod="openstack/nova-cell0-cell-mapping-cxpv7" Nov 29 05:46:49 crc kubenswrapper[4594]: I1129 05:46:49.496530 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3e5dae6-ceb8-4878-a37d-56f97d59d103-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-cxpv7\" (UID: \"b3e5dae6-ceb8-4878-a37d-56f97d59d103\") " pod="openstack/nova-cell0-cell-mapping-cxpv7" Nov 29 05:46:49 crc kubenswrapper[4594]: I1129 05:46:49.496693 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3e5dae6-ceb8-4878-a37d-56f97d59d103-scripts\") pod \"nova-cell0-cell-mapping-cxpv7\" (UID: \"b3e5dae6-ceb8-4878-a37d-56f97d59d103\") " pod="openstack/nova-cell0-cell-mapping-cxpv7" Nov 29 05:46:49 crc kubenswrapper[4594]: I1129 05:46:49.496775 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3e5dae6-ceb8-4878-a37d-56f97d59d103-config-data\") pod \"nova-cell0-cell-mapping-cxpv7\" (UID: \"b3e5dae6-ceb8-4878-a37d-56f97d59d103\") " pod="openstack/nova-cell0-cell-mapping-cxpv7" Nov 29 05:46:49 crc kubenswrapper[4594]: I1129 
05:46:49.504854 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3e5dae6-ceb8-4878-a37d-56f97d59d103-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-cxpv7\" (UID: \"b3e5dae6-ceb8-4878-a37d-56f97d59d103\") " pod="openstack/nova-cell0-cell-mapping-cxpv7" Nov 29 05:46:49 crc kubenswrapper[4594]: I1129 05:46:49.505420 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3e5dae6-ceb8-4878-a37d-56f97d59d103-scripts\") pod \"nova-cell0-cell-mapping-cxpv7\" (UID: \"b3e5dae6-ceb8-4878-a37d-56f97d59d103\") " pod="openstack/nova-cell0-cell-mapping-cxpv7" Nov 29 05:46:49 crc kubenswrapper[4594]: I1129 05:46:49.506081 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3e5dae6-ceb8-4878-a37d-56f97d59d103-config-data\") pod \"nova-cell0-cell-mapping-cxpv7\" (UID: \"b3e5dae6-ceb8-4878-a37d-56f97d59d103\") " pod="openstack/nova-cell0-cell-mapping-cxpv7" Nov 29 05:46:49 crc kubenswrapper[4594]: I1129 05:46:49.527596 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvhlf\" (UniqueName: \"kubernetes.io/projected/b3e5dae6-ceb8-4878-a37d-56f97d59d103-kube-api-access-cvhlf\") pod \"nova-cell0-cell-mapping-cxpv7\" (UID: \"b3e5dae6-ceb8-4878-a37d-56f97d59d103\") " pod="openstack/nova-cell0-cell-mapping-cxpv7" Nov 29 05:46:49 crc kubenswrapper[4594]: I1129 05:46:49.573317 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 29 05:46:49 crc kubenswrapper[4594]: I1129 05:46:49.593736 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-cxpv7" Nov 29 05:46:49 crc kubenswrapper[4594]: I1129 05:46:49.600073 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/602dee9d-18e0-42f2-b23f-e74cc723e9ed-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"602dee9d-18e0-42f2-b23f-e74cc723e9ed\") " pod="openstack/nova-api-0" Nov 29 05:46:49 crc kubenswrapper[4594]: I1129 05:46:49.600125 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfvtg\" (UniqueName: \"kubernetes.io/projected/602dee9d-18e0-42f2-b23f-e74cc723e9ed-kube-api-access-cfvtg\") pod \"nova-api-0\" (UID: \"602dee9d-18e0-42f2-b23f-e74cc723e9ed\") " pod="openstack/nova-api-0" Nov 29 05:46:49 crc kubenswrapper[4594]: I1129 05:46:49.600169 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1df3200b-0773-4e3c-9110-fe893d7efc79-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"1df3200b-0773-4e3c-9110-fe893d7efc79\") " pod="openstack/nova-cell1-novncproxy-0" Nov 29 05:46:49 crc kubenswrapper[4594]: I1129 05:46:49.600307 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/602dee9d-18e0-42f2-b23f-e74cc723e9ed-logs\") pod \"nova-api-0\" (UID: \"602dee9d-18e0-42f2-b23f-e74cc723e9ed\") " pod="openstack/nova-api-0" Nov 29 05:46:49 crc kubenswrapper[4594]: I1129 05:46:49.600360 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngx5v\" (UniqueName: \"kubernetes.io/projected/1df3200b-0773-4e3c-9110-fe893d7efc79-kube-api-access-ngx5v\") pod \"nova-cell1-novncproxy-0\" (UID: \"1df3200b-0773-4e3c-9110-fe893d7efc79\") " 
pod="openstack/nova-cell1-novncproxy-0" Nov 29 05:46:49 crc kubenswrapper[4594]: I1129 05:46:49.600456 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1df3200b-0773-4e3c-9110-fe893d7efc79-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"1df3200b-0773-4e3c-9110-fe893d7efc79\") " pod="openstack/nova-cell1-novncproxy-0" Nov 29 05:46:49 crc kubenswrapper[4594]: I1129 05:46:49.600528 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/602dee9d-18e0-42f2-b23f-e74cc723e9ed-config-data\") pod \"nova-api-0\" (UID: \"602dee9d-18e0-42f2-b23f-e74cc723e9ed\") " pod="openstack/nova-api-0" Nov 29 05:46:49 crc kubenswrapper[4594]: I1129 05:46:49.702234 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Nov 29 05:46:49 crc kubenswrapper[4594]: I1129 05:46:49.704303 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/602dee9d-18e0-42f2-b23f-e74cc723e9ed-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"602dee9d-18e0-42f2-b23f-e74cc723e9ed\") " pod="openstack/nova-api-0" Nov 29 05:46:49 crc kubenswrapper[4594]: I1129 05:46:49.704434 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfvtg\" (UniqueName: \"kubernetes.io/projected/602dee9d-18e0-42f2-b23f-e74cc723e9ed-kube-api-access-cfvtg\") pod \"nova-api-0\" (UID: \"602dee9d-18e0-42f2-b23f-e74cc723e9ed\") " pod="openstack/nova-api-0" Nov 29 05:46:49 crc kubenswrapper[4594]: I1129 05:46:49.704547 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1df3200b-0773-4e3c-9110-fe893d7efc79-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"1df3200b-0773-4e3c-9110-fe893d7efc79\") " pod="openstack/nova-cell1-novncproxy-0" Nov 29 05:46:49 crc kubenswrapper[4594]: I1129 05:46:49.704747 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/602dee9d-18e0-42f2-b23f-e74cc723e9ed-logs\") pod \"nova-api-0\" (UID: \"602dee9d-18e0-42f2-b23f-e74cc723e9ed\") " pod="openstack/nova-api-0" Nov 29 05:46:49 crc kubenswrapper[4594]: I1129 05:46:49.704860 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngx5v\" (UniqueName: \"kubernetes.io/projected/1df3200b-0773-4e3c-9110-fe893d7efc79-kube-api-access-ngx5v\") pod \"nova-cell1-novncproxy-0\" (UID: \"1df3200b-0773-4e3c-9110-fe893d7efc79\") " pod="openstack/nova-cell1-novncproxy-0" Nov 29 05:46:49 crc kubenswrapper[4594]: I1129 05:46:49.705014 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1df3200b-0773-4e3c-9110-fe893d7efc79-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"1df3200b-0773-4e3c-9110-fe893d7efc79\") " pod="openstack/nova-cell1-novncproxy-0" Nov 29 05:46:49 crc kubenswrapper[4594]: I1129 05:46:49.705115 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/602dee9d-18e0-42f2-b23f-e74cc723e9ed-config-data\") pod \"nova-api-0\" (UID: \"602dee9d-18e0-42f2-b23f-e74cc723e9ed\") " pod="openstack/nova-api-0" Nov 29 05:46:49 crc kubenswrapper[4594]: I1129 05:46:49.710086 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 29 05:46:49 crc kubenswrapper[4594]: I1129 05:46:49.711171 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/602dee9d-18e0-42f2-b23f-e74cc723e9ed-logs\") pod \"nova-api-0\" (UID: \"602dee9d-18e0-42f2-b23f-e74cc723e9ed\") " pod="openstack/nova-api-0" Nov 29 05:46:49 crc kubenswrapper[4594]: I1129 05:46:49.713064 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/602dee9d-18e0-42f2-b23f-e74cc723e9ed-config-data\") pod \"nova-api-0\" (UID: \"602dee9d-18e0-42f2-b23f-e74cc723e9ed\") " pod="openstack/nova-api-0" Nov 29 05:46:49 crc kubenswrapper[4594]: I1129 05:46:49.716767 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/602dee9d-18e0-42f2-b23f-e74cc723e9ed-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"602dee9d-18e0-42f2-b23f-e74cc723e9ed\") " pod="openstack/nova-api-0" Nov 29 05:46:49 crc kubenswrapper[4594]: I1129 05:46:49.721322 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Nov 29 05:46:49 crc kubenswrapper[4594]: I1129 05:46:49.721375 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1df3200b-0773-4e3c-9110-fe893d7efc79-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"1df3200b-0773-4e3c-9110-fe893d7efc79\") " pod="openstack/nova-cell1-novncproxy-0" Nov 29 05:46:49 crc kubenswrapper[4594]: I1129 05:46:49.732875 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1df3200b-0773-4e3c-9110-fe893d7efc79-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"1df3200b-0773-4e3c-9110-fe893d7efc79\") " pod="openstack/nova-cell1-novncproxy-0" Nov 29 05:46:49 crc 
kubenswrapper[4594]: I1129 05:46:49.747911 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 29 05:46:49 crc kubenswrapper[4594]: I1129 05:46:49.760961 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngx5v\" (UniqueName: \"kubernetes.io/projected/1df3200b-0773-4e3c-9110-fe893d7efc79-kube-api-access-ngx5v\") pod \"nova-cell1-novncproxy-0\" (UID: \"1df3200b-0773-4e3c-9110-fe893d7efc79\") " pod="openstack/nova-cell1-novncproxy-0" Nov 29 05:46:49 crc kubenswrapper[4594]: I1129 05:46:49.780694 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfvtg\" (UniqueName: \"kubernetes.io/projected/602dee9d-18e0-42f2-b23f-e74cc723e9ed-kube-api-access-cfvtg\") pod \"nova-api-0\" (UID: \"602dee9d-18e0-42f2-b23f-e74cc723e9ed\") " pod="openstack/nova-api-0" Nov 29 05:46:49 crc kubenswrapper[4594]: I1129 05:46:49.785580 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 29 05:46:49 crc kubenswrapper[4594]: I1129 05:46:49.823997 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Nov 29 05:46:49 crc kubenswrapper[4594]: I1129 05:46:49.827344 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 29 05:46:49 crc kubenswrapper[4594]: I1129 05:46:49.829806 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcdb07b4-07da-4f88-acb5-4e28b684cb22-config-data\") pod \"nova-scheduler-0\" (UID: \"fcdb07b4-07da-4f88-acb5-4e28b684cb22\") " pod="openstack/nova-scheduler-0" Nov 29 05:46:49 crc kubenswrapper[4594]: I1129 05:46:49.830049 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvx6t\" (UniqueName: \"kubernetes.io/projected/fcdb07b4-07da-4f88-acb5-4e28b684cb22-kube-api-access-mvx6t\") pod \"nova-scheduler-0\" (UID: \"fcdb07b4-07da-4f88-acb5-4e28b684cb22\") " pod="openstack/nova-scheduler-0" Nov 29 05:46:49 crc kubenswrapper[4594]: I1129 05:46:49.830163 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcdb07b4-07da-4f88-acb5-4e28b684cb22-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fcdb07b4-07da-4f88-acb5-4e28b684cb22\") " pod="openstack/nova-scheduler-0" Nov 29 05:46:49 crc kubenswrapper[4594]: I1129 05:46:49.833436 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Nov 29 05:46:49 crc kubenswrapper[4594]: I1129 05:46:49.912143 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 29 05:46:49 crc kubenswrapper[4594]: I1129 05:46:49.941066 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvx6t\" (UniqueName: \"kubernetes.io/projected/fcdb07b4-07da-4f88-acb5-4e28b684cb22-kube-api-access-mvx6t\") pod \"nova-scheduler-0\" (UID: \"fcdb07b4-07da-4f88-acb5-4e28b684cb22\") " pod="openstack/nova-scheduler-0" Nov 29 05:46:49 crc kubenswrapper[4594]: I1129 05:46:49.941205 4594 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4a97d2d-e221-4d1e-8d54-04fe41f0bf81-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e4a97d2d-e221-4d1e-8d54-04fe41f0bf81\") " pod="openstack/nova-metadata-0" Nov 29 05:46:49 crc kubenswrapper[4594]: I1129 05:46:49.941264 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcdb07b4-07da-4f88-acb5-4e28b684cb22-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fcdb07b4-07da-4f88-acb5-4e28b684cb22\") " pod="openstack/nova-scheduler-0" Nov 29 05:46:49 crc kubenswrapper[4594]: I1129 05:46:49.941399 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4a97d2d-e221-4d1e-8d54-04fe41f0bf81-logs\") pod \"nova-metadata-0\" (UID: \"e4a97d2d-e221-4d1e-8d54-04fe41f0bf81\") " pod="openstack/nova-metadata-0" Nov 29 05:46:49 crc kubenswrapper[4594]: I1129 05:46:49.941442 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24j2t\" (UniqueName: \"kubernetes.io/projected/e4a97d2d-e221-4d1e-8d54-04fe41f0bf81-kube-api-access-24j2t\") pod \"nova-metadata-0\" (UID: \"e4a97d2d-e221-4d1e-8d54-04fe41f0bf81\") " pod="openstack/nova-metadata-0" Nov 29 05:46:49 crc kubenswrapper[4594]: I1129 05:46:49.941513 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcdb07b4-07da-4f88-acb5-4e28b684cb22-config-data\") pod \"nova-scheduler-0\" (UID: \"fcdb07b4-07da-4f88-acb5-4e28b684cb22\") " pod="openstack/nova-scheduler-0" Nov 29 05:46:49 crc kubenswrapper[4594]: I1129 05:46:49.942089 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e4a97d2d-e221-4d1e-8d54-04fe41f0bf81-config-data\") pod \"nova-metadata-0\" (UID: \"e4a97d2d-e221-4d1e-8d54-04fe41f0bf81\") " pod="openstack/nova-metadata-0" Nov 29 05:46:49 crc kubenswrapper[4594]: I1129 05:46:49.946848 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-844fc57f6f-56pgf"] Nov 29 05:46:49 crc kubenswrapper[4594]: I1129 05:46:49.947537 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcdb07b4-07da-4f88-acb5-4e28b684cb22-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fcdb07b4-07da-4f88-acb5-4e28b684cb22\") " pod="openstack/nova-scheduler-0" Nov 29 05:46:49 crc kubenswrapper[4594]: I1129 05:46:49.948901 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-844fc57f6f-56pgf" Nov 29 05:46:49 crc kubenswrapper[4594]: I1129 05:46:49.954877 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-844fc57f6f-56pgf"] Nov 29 05:46:49 crc kubenswrapper[4594]: I1129 05:46:49.961991 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcdb07b4-07da-4f88-acb5-4e28b684cb22-config-data\") pod \"nova-scheduler-0\" (UID: \"fcdb07b4-07da-4f88-acb5-4e28b684cb22\") " pod="openstack/nova-scheduler-0" Nov 29 05:46:49 crc kubenswrapper[4594]: I1129 05:46:49.977197 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvx6t\" (UniqueName: \"kubernetes.io/projected/fcdb07b4-07da-4f88-acb5-4e28b684cb22-kube-api-access-mvx6t\") pod \"nova-scheduler-0\" (UID: \"fcdb07b4-07da-4f88-acb5-4e28b684cb22\") " pod="openstack/nova-scheduler-0" Nov 29 05:46:50 crc kubenswrapper[4594]: I1129 05:46:50.044910 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxhcw\" (UniqueName: 
\"kubernetes.io/projected/e74d9bfd-1e04-4878-aa71-24403f17ebf5-kube-api-access-wxhcw\") pod \"dnsmasq-dns-844fc57f6f-56pgf\" (UID: \"e74d9bfd-1e04-4878-aa71-24403f17ebf5\") " pod="openstack/dnsmasq-dns-844fc57f6f-56pgf" Nov 29 05:46:50 crc kubenswrapper[4594]: I1129 05:46:50.045123 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e74d9bfd-1e04-4878-aa71-24403f17ebf5-config\") pod \"dnsmasq-dns-844fc57f6f-56pgf\" (UID: \"e74d9bfd-1e04-4878-aa71-24403f17ebf5\") " pod="openstack/dnsmasq-dns-844fc57f6f-56pgf" Nov 29 05:46:50 crc kubenswrapper[4594]: I1129 05:46:50.045334 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4a97d2d-e221-4d1e-8d54-04fe41f0bf81-config-data\") pod \"nova-metadata-0\" (UID: \"e4a97d2d-e221-4d1e-8d54-04fe41f0bf81\") " pod="openstack/nova-metadata-0" Nov 29 05:46:50 crc kubenswrapper[4594]: I1129 05:46:50.045565 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e74d9bfd-1e04-4878-aa71-24403f17ebf5-ovsdbserver-sb\") pod \"dnsmasq-dns-844fc57f6f-56pgf\" (UID: \"e74d9bfd-1e04-4878-aa71-24403f17ebf5\") " pod="openstack/dnsmasq-dns-844fc57f6f-56pgf" Nov 29 05:46:50 crc kubenswrapper[4594]: I1129 05:46:50.045611 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e74d9bfd-1e04-4878-aa71-24403f17ebf5-dns-swift-storage-0\") pod \"dnsmasq-dns-844fc57f6f-56pgf\" (UID: \"e74d9bfd-1e04-4878-aa71-24403f17ebf5\") " pod="openstack/dnsmasq-dns-844fc57f6f-56pgf" Nov 29 05:46:50 crc kubenswrapper[4594]: I1129 05:46:50.045833 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/e74d9bfd-1e04-4878-aa71-24403f17ebf5-dns-svc\") pod \"dnsmasq-dns-844fc57f6f-56pgf\" (UID: \"e74d9bfd-1e04-4878-aa71-24403f17ebf5\") " pod="openstack/dnsmasq-dns-844fc57f6f-56pgf" Nov 29 05:46:50 crc kubenswrapper[4594]: I1129 05:46:50.045901 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4a97d2d-e221-4d1e-8d54-04fe41f0bf81-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e4a97d2d-e221-4d1e-8d54-04fe41f0bf81\") " pod="openstack/nova-metadata-0" Nov 29 05:46:50 crc kubenswrapper[4594]: I1129 05:46:50.045958 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e74d9bfd-1e04-4878-aa71-24403f17ebf5-ovsdbserver-nb\") pod \"dnsmasq-dns-844fc57f6f-56pgf\" (UID: \"e74d9bfd-1e04-4878-aa71-24403f17ebf5\") " pod="openstack/dnsmasq-dns-844fc57f6f-56pgf" Nov 29 05:46:50 crc kubenswrapper[4594]: I1129 05:46:50.046013 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4a97d2d-e221-4d1e-8d54-04fe41f0bf81-logs\") pod \"nova-metadata-0\" (UID: \"e4a97d2d-e221-4d1e-8d54-04fe41f0bf81\") " pod="openstack/nova-metadata-0" Nov 29 05:46:50 crc kubenswrapper[4594]: I1129 05:46:50.046057 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24j2t\" (UniqueName: \"kubernetes.io/projected/e4a97d2d-e221-4d1e-8d54-04fe41f0bf81-kube-api-access-24j2t\") pod \"nova-metadata-0\" (UID: \"e4a97d2d-e221-4d1e-8d54-04fe41f0bf81\") " pod="openstack/nova-metadata-0" Nov 29 05:46:50 crc kubenswrapper[4594]: I1129 05:46:50.047696 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4a97d2d-e221-4d1e-8d54-04fe41f0bf81-logs\") pod \"nova-metadata-0\" (UID: 
\"e4a97d2d-e221-4d1e-8d54-04fe41f0bf81\") " pod="openstack/nova-metadata-0" Nov 29 05:46:50 crc kubenswrapper[4594]: I1129 05:46:50.050450 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4a97d2d-e221-4d1e-8d54-04fe41f0bf81-config-data\") pod \"nova-metadata-0\" (UID: \"e4a97d2d-e221-4d1e-8d54-04fe41f0bf81\") " pod="openstack/nova-metadata-0" Nov 29 05:46:50 crc kubenswrapper[4594]: I1129 05:46:50.052359 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4a97d2d-e221-4d1e-8d54-04fe41f0bf81-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e4a97d2d-e221-4d1e-8d54-04fe41f0bf81\") " pod="openstack/nova-metadata-0" Nov 29 05:46:50 crc kubenswrapper[4594]: I1129 05:46:50.063107 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24j2t\" (UniqueName: \"kubernetes.io/projected/e4a97d2d-e221-4d1e-8d54-04fe41f0bf81-kube-api-access-24j2t\") pod \"nova-metadata-0\" (UID: \"e4a97d2d-e221-4d1e-8d54-04fe41f0bf81\") " pod="openstack/nova-metadata-0" Nov 29 05:46:50 crc kubenswrapper[4594]: I1129 05:46:50.066695 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 29 05:46:50 crc kubenswrapper[4594]: I1129 05:46:50.086032 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 29 05:46:50 crc kubenswrapper[4594]: I1129 05:46:50.149365 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e74d9bfd-1e04-4878-aa71-24403f17ebf5-ovsdbserver-sb\") pod \"dnsmasq-dns-844fc57f6f-56pgf\" (UID: \"e74d9bfd-1e04-4878-aa71-24403f17ebf5\") " pod="openstack/dnsmasq-dns-844fc57f6f-56pgf" Nov 29 05:46:50 crc kubenswrapper[4594]: I1129 05:46:50.149413 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e74d9bfd-1e04-4878-aa71-24403f17ebf5-dns-swift-storage-0\") pod \"dnsmasq-dns-844fc57f6f-56pgf\" (UID: \"e74d9bfd-1e04-4878-aa71-24403f17ebf5\") " pod="openstack/dnsmasq-dns-844fc57f6f-56pgf" Nov 29 05:46:50 crc kubenswrapper[4594]: I1129 05:46:50.149486 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e74d9bfd-1e04-4878-aa71-24403f17ebf5-dns-svc\") pod \"dnsmasq-dns-844fc57f6f-56pgf\" (UID: \"e74d9bfd-1e04-4878-aa71-24403f17ebf5\") " pod="openstack/dnsmasq-dns-844fc57f6f-56pgf" Nov 29 05:46:50 crc kubenswrapper[4594]: I1129 05:46:50.149522 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e74d9bfd-1e04-4878-aa71-24403f17ebf5-ovsdbserver-nb\") pod \"dnsmasq-dns-844fc57f6f-56pgf\" (UID: \"e74d9bfd-1e04-4878-aa71-24403f17ebf5\") " pod="openstack/dnsmasq-dns-844fc57f6f-56pgf" Nov 29 05:46:50 crc kubenswrapper[4594]: I1129 05:46:50.149560 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxhcw\" (UniqueName: \"kubernetes.io/projected/e74d9bfd-1e04-4878-aa71-24403f17ebf5-kube-api-access-wxhcw\") pod \"dnsmasq-dns-844fc57f6f-56pgf\" (UID: \"e74d9bfd-1e04-4878-aa71-24403f17ebf5\") " 
pod="openstack/dnsmasq-dns-844fc57f6f-56pgf" Nov 29 05:46:50 crc kubenswrapper[4594]: I1129 05:46:50.149592 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e74d9bfd-1e04-4878-aa71-24403f17ebf5-config\") pod \"dnsmasq-dns-844fc57f6f-56pgf\" (UID: \"e74d9bfd-1e04-4878-aa71-24403f17ebf5\") " pod="openstack/dnsmasq-dns-844fc57f6f-56pgf" Nov 29 05:46:50 crc kubenswrapper[4594]: I1129 05:46:50.150737 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 29 05:46:50 crc kubenswrapper[4594]: I1129 05:46:50.151149 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e74d9bfd-1e04-4878-aa71-24403f17ebf5-dns-swift-storage-0\") pod \"dnsmasq-dns-844fc57f6f-56pgf\" (UID: \"e74d9bfd-1e04-4878-aa71-24403f17ebf5\") " pod="openstack/dnsmasq-dns-844fc57f6f-56pgf" Nov 29 05:46:50 crc kubenswrapper[4594]: I1129 05:46:50.151603 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e74d9bfd-1e04-4878-aa71-24403f17ebf5-dns-svc\") pod \"dnsmasq-dns-844fc57f6f-56pgf\" (UID: \"e74d9bfd-1e04-4878-aa71-24403f17ebf5\") " pod="openstack/dnsmasq-dns-844fc57f6f-56pgf" Nov 29 05:46:50 crc kubenswrapper[4594]: I1129 05:46:50.151932 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e74d9bfd-1e04-4878-aa71-24403f17ebf5-config\") pod \"dnsmasq-dns-844fc57f6f-56pgf\" (UID: \"e74d9bfd-1e04-4878-aa71-24403f17ebf5\") " pod="openstack/dnsmasq-dns-844fc57f6f-56pgf" Nov 29 05:46:50 crc kubenswrapper[4594]: I1129 05:46:50.152211 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e74d9bfd-1e04-4878-aa71-24403f17ebf5-ovsdbserver-nb\") pod \"dnsmasq-dns-844fc57f6f-56pgf\" 
(UID: \"e74d9bfd-1e04-4878-aa71-24403f17ebf5\") " pod="openstack/dnsmasq-dns-844fc57f6f-56pgf" Nov 29 05:46:50 crc kubenswrapper[4594]: I1129 05:46:50.154014 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e74d9bfd-1e04-4878-aa71-24403f17ebf5-ovsdbserver-sb\") pod \"dnsmasq-dns-844fc57f6f-56pgf\" (UID: \"e74d9bfd-1e04-4878-aa71-24403f17ebf5\") " pod="openstack/dnsmasq-dns-844fc57f6f-56pgf" Nov 29 05:46:50 crc kubenswrapper[4594]: I1129 05:46:50.166128 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxhcw\" (UniqueName: \"kubernetes.io/projected/e74d9bfd-1e04-4878-aa71-24403f17ebf5-kube-api-access-wxhcw\") pod \"dnsmasq-dns-844fc57f6f-56pgf\" (UID: \"e74d9bfd-1e04-4878-aa71-24403f17ebf5\") " pod="openstack/dnsmasq-dns-844fc57f6f-56pgf" Nov 29 05:46:50 crc kubenswrapper[4594]: I1129 05:46:50.292712 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-cxpv7"] Nov 29 05:46:50 crc kubenswrapper[4594]: I1129 05:46:50.293023 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-844fc57f6f-56pgf" Nov 29 05:46:50 crc kubenswrapper[4594]: I1129 05:46:50.404580 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 29 05:46:50 crc kubenswrapper[4594]: I1129 05:46:50.469323 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"1df3200b-0773-4e3c-9110-fe893d7efc79","Type":"ContainerStarted","Data":"dbe4dd50c6c19c8961061843021c0918c1e13e235f967ede8de61aa5d1726358"} Nov 29 05:46:50 crc kubenswrapper[4594]: I1129 05:46:50.470542 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-cxpv7" event={"ID":"b3e5dae6-ceb8-4878-a37d-56f97d59d103","Type":"ContainerStarted","Data":"0028ad06bd64594786fe6c9d23b77762582f440e53d5823952dc16080c073b87"} Nov 29 05:46:50 crc kubenswrapper[4594]: W1129 05:46:50.578632 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod602dee9d_18e0_42f2_b23f_e74cc723e9ed.slice/crio-776caf6b0b2609b3e203dc35b6b40e098ac043443f5f2a03ddb800fde47b38a7 WatchSource:0}: Error finding container 776caf6b0b2609b3e203dc35b6b40e098ac043443f5f2a03ddb800fde47b38a7: Status 404 returned error can't find the container with id 776caf6b0b2609b3e203dc35b6b40e098ac043443f5f2a03ddb800fde47b38a7 Nov 29 05:46:50 crc kubenswrapper[4594]: I1129 05:46:50.589013 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 29 05:46:50 crc kubenswrapper[4594]: I1129 05:46:50.609823 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 29 05:46:50 crc kubenswrapper[4594]: I1129 05:46:50.697830 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-7mb5g"] Nov 29 05:46:50 crc kubenswrapper[4594]: I1129 05:46:50.699219 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-7mb5g" Nov 29 05:46:50 crc kubenswrapper[4594]: I1129 05:46:50.702767 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Nov 29 05:46:50 crc kubenswrapper[4594]: I1129 05:46:50.703575 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Nov 29 05:46:50 crc kubenswrapper[4594]: I1129 05:46:50.708746 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-7mb5g"] Nov 29 05:46:50 crc kubenswrapper[4594]: I1129 05:46:50.772613 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 29 05:46:50 crc kubenswrapper[4594]: I1129 05:46:50.780373 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-844fc57f6f-56pgf"] Nov 29 05:46:50 crc kubenswrapper[4594]: I1129 05:46:50.868828 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19e2ee6f-48e6-470c-84c9-6b626fdefef7-config-data\") pod \"nova-cell1-conductor-db-sync-7mb5g\" (UID: \"19e2ee6f-48e6-470c-84c9-6b626fdefef7\") " pod="openstack/nova-cell1-conductor-db-sync-7mb5g" Nov 29 05:46:50 crc kubenswrapper[4594]: I1129 05:46:50.869028 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pc54\" (UniqueName: \"kubernetes.io/projected/19e2ee6f-48e6-470c-84c9-6b626fdefef7-kube-api-access-7pc54\") pod \"nova-cell1-conductor-db-sync-7mb5g\" (UID: \"19e2ee6f-48e6-470c-84c9-6b626fdefef7\") " pod="openstack/nova-cell1-conductor-db-sync-7mb5g" Nov 29 05:46:50 crc kubenswrapper[4594]: I1129 05:46:50.869314 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/19e2ee6f-48e6-470c-84c9-6b626fdefef7-scripts\") pod \"nova-cell1-conductor-db-sync-7mb5g\" (UID: \"19e2ee6f-48e6-470c-84c9-6b626fdefef7\") " pod="openstack/nova-cell1-conductor-db-sync-7mb5g" Nov 29 05:46:50 crc kubenswrapper[4594]: I1129 05:46:50.870024 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19e2ee6f-48e6-470c-84c9-6b626fdefef7-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-7mb5g\" (UID: \"19e2ee6f-48e6-470c-84c9-6b626fdefef7\") " pod="openstack/nova-cell1-conductor-db-sync-7mb5g" Nov 29 05:46:50 crc kubenswrapper[4594]: I1129 05:46:50.972440 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19e2ee6f-48e6-470c-84c9-6b626fdefef7-scripts\") pod \"nova-cell1-conductor-db-sync-7mb5g\" (UID: \"19e2ee6f-48e6-470c-84c9-6b626fdefef7\") " pod="openstack/nova-cell1-conductor-db-sync-7mb5g" Nov 29 05:46:50 crc kubenswrapper[4594]: I1129 05:46:50.972762 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19e2ee6f-48e6-470c-84c9-6b626fdefef7-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-7mb5g\" (UID: \"19e2ee6f-48e6-470c-84c9-6b626fdefef7\") " pod="openstack/nova-cell1-conductor-db-sync-7mb5g" Nov 29 05:46:50 crc kubenswrapper[4594]: I1129 05:46:50.972840 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19e2ee6f-48e6-470c-84c9-6b626fdefef7-config-data\") pod \"nova-cell1-conductor-db-sync-7mb5g\" (UID: \"19e2ee6f-48e6-470c-84c9-6b626fdefef7\") " pod="openstack/nova-cell1-conductor-db-sync-7mb5g" Nov 29 05:46:50 crc kubenswrapper[4594]: I1129 05:46:50.972898 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7pc54\" 
(UniqueName: \"kubernetes.io/projected/19e2ee6f-48e6-470c-84c9-6b626fdefef7-kube-api-access-7pc54\") pod \"nova-cell1-conductor-db-sync-7mb5g\" (UID: \"19e2ee6f-48e6-470c-84c9-6b626fdefef7\") " pod="openstack/nova-cell1-conductor-db-sync-7mb5g" Nov 29 05:46:50 crc kubenswrapper[4594]: I1129 05:46:50.977898 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19e2ee6f-48e6-470c-84c9-6b626fdefef7-scripts\") pod \"nova-cell1-conductor-db-sync-7mb5g\" (UID: \"19e2ee6f-48e6-470c-84c9-6b626fdefef7\") " pod="openstack/nova-cell1-conductor-db-sync-7mb5g" Nov 29 05:46:50 crc kubenswrapper[4594]: I1129 05:46:50.977947 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19e2ee6f-48e6-470c-84c9-6b626fdefef7-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-7mb5g\" (UID: \"19e2ee6f-48e6-470c-84c9-6b626fdefef7\") " pod="openstack/nova-cell1-conductor-db-sync-7mb5g" Nov 29 05:46:50 crc kubenswrapper[4594]: I1129 05:46:50.977970 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19e2ee6f-48e6-470c-84c9-6b626fdefef7-config-data\") pod \"nova-cell1-conductor-db-sync-7mb5g\" (UID: \"19e2ee6f-48e6-470c-84c9-6b626fdefef7\") " pod="openstack/nova-cell1-conductor-db-sync-7mb5g" Nov 29 05:46:50 crc kubenswrapper[4594]: I1129 05:46:50.992733 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pc54\" (UniqueName: \"kubernetes.io/projected/19e2ee6f-48e6-470c-84c9-6b626fdefef7-kube-api-access-7pc54\") pod \"nova-cell1-conductor-db-sync-7mb5g\" (UID: \"19e2ee6f-48e6-470c-84c9-6b626fdefef7\") " pod="openstack/nova-cell1-conductor-db-sync-7mb5g" Nov 29 05:46:51 crc kubenswrapper[4594]: I1129 05:46:51.027457 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-7mb5g" Nov 29 05:46:51 crc kubenswrapper[4594]: I1129 05:46:51.455781 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-7mb5g"] Nov 29 05:46:51 crc kubenswrapper[4594]: I1129 05:46:51.479442 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fcdb07b4-07da-4f88-acb5-4e28b684cb22","Type":"ContainerStarted","Data":"6bc8050008b83df69a7fd7ff5cb0f446e3f30d71ed58c59232cd53d0e75e1d5f"} Nov 29 05:46:51 crc kubenswrapper[4594]: I1129 05:46:51.480823 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"602dee9d-18e0-42f2-b23f-e74cc723e9ed","Type":"ContainerStarted","Data":"776caf6b0b2609b3e203dc35b6b40e098ac043443f5f2a03ddb800fde47b38a7"} Nov 29 05:46:51 crc kubenswrapper[4594]: I1129 05:46:51.488993 4594 generic.go:334] "Generic (PLEG): container finished" podID="e74d9bfd-1e04-4878-aa71-24403f17ebf5" containerID="267c596333c00cfa2ea8c7df63c3090b2225c0154df289306cac3a221dcc521e" exitCode=0 Nov 29 05:46:51 crc kubenswrapper[4594]: I1129 05:46:51.489352 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-844fc57f6f-56pgf" event={"ID":"e74d9bfd-1e04-4878-aa71-24403f17ebf5","Type":"ContainerDied","Data":"267c596333c00cfa2ea8c7df63c3090b2225c0154df289306cac3a221dcc521e"} Nov 29 05:46:51 crc kubenswrapper[4594]: I1129 05:46:51.489393 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-844fc57f6f-56pgf" event={"ID":"e74d9bfd-1e04-4878-aa71-24403f17ebf5","Type":"ContainerStarted","Data":"1c54ffbd04d1fc68aedb86cd5bff257adbd4b451e426f6b0490f667397aa9055"} Nov 29 05:46:51 crc kubenswrapper[4594]: I1129 05:46:51.491305 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-cxpv7" 
event={"ID":"b3e5dae6-ceb8-4878-a37d-56f97d59d103","Type":"ContainerStarted","Data":"256afc9e6233f981bc8a70f9cb1c17c099179d185e2c26760218524be5c1591d"} Nov 29 05:46:51 crc kubenswrapper[4594]: I1129 05:46:51.500915 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e4a97d2d-e221-4d1e-8d54-04fe41f0bf81","Type":"ContainerStarted","Data":"714eaf42d0f2387fd9947acb2078ba25410ad3393e55981b0f3b47558fbd1fe4"} Nov 29 05:46:51 crc kubenswrapper[4594]: I1129 05:46:51.509422 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-7mb5g" event={"ID":"19e2ee6f-48e6-470c-84c9-6b626fdefef7","Type":"ContainerStarted","Data":"8c238e3701278ef0bd9f3f081dfce2dbfa8876d0722feba3ae7de600e25f6de1"} Nov 29 05:46:51 crc kubenswrapper[4594]: I1129 05:46:51.532201 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-cxpv7" podStartSLOduration=2.53217769 podStartE2EDuration="2.53217769s" podCreationTimestamp="2025-11-29 05:46:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:46:51.526809743 +0000 UTC m=+1135.767318964" watchObservedRunningTime="2025-11-29 05:46:51.53217769 +0000 UTC m=+1135.772686929" Nov 29 05:46:52 crc kubenswrapper[4594]: I1129 05:46:52.528028 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-7mb5g" event={"ID":"19e2ee6f-48e6-470c-84c9-6b626fdefef7","Type":"ContainerStarted","Data":"0fbb4b9dd828c88480c72368e61ac2b3e7f03355bfdebce7ee3d3f05d453f2d1"} Nov 29 05:46:52 crc kubenswrapper[4594]: I1129 05:46:52.544831 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-844fc57f6f-56pgf" event={"ID":"e74d9bfd-1e04-4878-aa71-24403f17ebf5","Type":"ContainerStarted","Data":"71f23430873f351346ad0fc58f3e7b87545ce09c86854ea28d294c2767be8965"} Nov 29 05:46:52 crc 
kubenswrapper[4594]: I1129 05:46:52.544880 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-844fc57f6f-56pgf" Nov 29 05:46:52 crc kubenswrapper[4594]: I1129 05:46:52.564748 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-7mb5g" podStartSLOduration=2.564721182 podStartE2EDuration="2.564721182s" podCreationTimestamp="2025-11-29 05:46:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:46:52.542008305 +0000 UTC m=+1136.782517525" watchObservedRunningTime="2025-11-29 05:46:52.564721182 +0000 UTC m=+1136.805230402" Nov 29 05:46:52 crc kubenswrapper[4594]: I1129 05:46:52.583062 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-844fc57f6f-56pgf" podStartSLOduration=3.583042459 podStartE2EDuration="3.583042459s" podCreationTimestamp="2025-11-29 05:46:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:46:52.557349017 +0000 UTC m=+1136.797858238" watchObservedRunningTime="2025-11-29 05:46:52.583042459 +0000 UTC m=+1136.823551680" Nov 29 05:46:53 crc kubenswrapper[4594]: I1129 05:46:53.262976 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 29 05:46:53 crc kubenswrapper[4594]: I1129 05:46:53.275527 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 29 05:46:54 crc kubenswrapper[4594]: I1129 05:46:54.611056 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fcdb07b4-07da-4f88-acb5-4e28b684cb22","Type":"ContainerStarted","Data":"aba935b28d75cc52ee12171165bc73ed35d1909899b2a71ec23d305ccf8ff520"} Nov 29 05:46:54 crc kubenswrapper[4594]: I1129 05:46:54.625703 4594 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"602dee9d-18e0-42f2-b23f-e74cc723e9ed","Type":"ContainerStarted","Data":"a37b4498c59965f20f7818ac4d7defc36702a27dc00081b5a8549a86e3c067bb"} Nov 29 05:46:54 crc kubenswrapper[4594]: I1129 05:46:54.625850 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"602dee9d-18e0-42f2-b23f-e74cc723e9ed","Type":"ContainerStarted","Data":"cc22d717976dd8e182efc2f2d868ec5d33b54f2513be104be0c9a8008f68084c"} Nov 29 05:46:54 crc kubenswrapper[4594]: I1129 05:46:54.635726 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e4a97d2d-e221-4d1e-8d54-04fe41f0bf81","Type":"ContainerStarted","Data":"3eb8a07c3a474df9b065df0679149c1291334fc4e6c9b1e2093c980fd38fe37c"} Nov 29 05:46:54 crc kubenswrapper[4594]: I1129 05:46:54.635773 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e4a97d2d-e221-4d1e-8d54-04fe41f0bf81","Type":"ContainerStarted","Data":"6e228bb0b9384cd10aed8a4b2d7375f0e399fa6a0f228a3b1b154f5bc175f554"} Nov 29 05:46:54 crc kubenswrapper[4594]: I1129 05:46:54.635908 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e4a97d2d-e221-4d1e-8d54-04fe41f0bf81" containerName="nova-metadata-log" containerID="cri-o://6e228bb0b9384cd10aed8a4b2d7375f0e399fa6a0f228a3b1b154f5bc175f554" gracePeriod=30 Nov 29 05:46:54 crc kubenswrapper[4594]: I1129 05:46:54.636174 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e4a97d2d-e221-4d1e-8d54-04fe41f0bf81" containerName="nova-metadata-metadata" containerID="cri-o://3eb8a07c3a474df9b065df0679149c1291334fc4e6c9b1e2093c980fd38fe37c" gracePeriod=30 Nov 29 05:46:54 crc kubenswrapper[4594]: I1129 05:46:54.662617 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"1df3200b-0773-4e3c-9110-fe893d7efc79","Type":"ContainerStarted","Data":"a73a37116984c4c20c166844a6cc203d06cbfda6802e806de7d667587d225a89"} Nov 29 05:46:54 crc kubenswrapper[4594]: I1129 05:46:54.662646 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="1df3200b-0773-4e3c-9110-fe893d7efc79" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://a73a37116984c4c20c166844a6cc203d06cbfda6802e806de7d667587d225a89" gracePeriod=30 Nov 29 05:46:54 crc kubenswrapper[4594]: I1129 05:46:54.664345 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.330012423 podStartE2EDuration="5.664325765s" podCreationTimestamp="2025-11-29 05:46:49 +0000 UTC" firstStartedPulling="2025-11-29 05:46:50.581558975 +0000 UTC m=+1134.822068196" lastFinishedPulling="2025-11-29 05:46:53.915872318 +0000 UTC m=+1138.156381538" observedRunningTime="2025-11-29 05:46:54.642043639 +0000 UTC m=+1138.882552859" watchObservedRunningTime="2025-11-29 05:46:54.664325765 +0000 UTC m=+1138.904834985" Nov 29 05:46:54 crc kubenswrapper[4594]: I1129 05:46:54.664509 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.364564895 podStartE2EDuration="5.664505914s" podCreationTimestamp="2025-11-29 05:46:49 +0000 UTC" firstStartedPulling="2025-11-29 05:46:50.61238437 +0000 UTC m=+1134.852893589" lastFinishedPulling="2025-11-29 05:46:53.912325388 +0000 UTC m=+1138.152834608" observedRunningTime="2025-11-29 05:46:54.625545189 +0000 UTC m=+1138.866054409" watchObservedRunningTime="2025-11-29 05:46:54.664505914 +0000 UTC m=+1138.905015134" Nov 29 05:46:54 crc kubenswrapper[4594]: I1129 05:46:54.672243 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.548853571 podStartE2EDuration="5.672234659s" 
podCreationTimestamp="2025-11-29 05:46:49 +0000 UTC" firstStartedPulling="2025-11-29 05:46:50.788951004 +0000 UTC m=+1135.029460223" lastFinishedPulling="2025-11-29 05:46:53.912332091 +0000 UTC m=+1138.152841311" observedRunningTime="2025-11-29 05:46:54.664997629 +0000 UTC m=+1138.905506849" watchObservedRunningTime="2025-11-29 05:46:54.672234659 +0000 UTC m=+1138.912743879" Nov 29 05:46:54 crc kubenswrapper[4594]: I1129 05:46:54.691522 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.184915744 podStartE2EDuration="5.691512075s" podCreationTimestamp="2025-11-29 05:46:49 +0000 UTC" firstStartedPulling="2025-11-29 05:46:50.405646301 +0000 UTC m=+1134.646155521" lastFinishedPulling="2025-11-29 05:46:53.912242632 +0000 UTC m=+1138.152751852" observedRunningTime="2025-11-29 05:46:54.684547426 +0000 UTC m=+1138.925056647" watchObservedRunningTime="2025-11-29 05:46:54.691512075 +0000 UTC m=+1138.932021295" Nov 29 05:46:54 crc kubenswrapper[4594]: I1129 05:46:54.789333 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Nov 29 05:46:55 crc kubenswrapper[4594]: I1129 05:46:55.089250 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Nov 29 05:46:55 crc kubenswrapper[4594]: I1129 05:46:55.152267 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 29 05:46:55 crc kubenswrapper[4594]: I1129 05:46:55.152309 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 29 05:46:55 crc kubenswrapper[4594]: I1129 05:46:55.292932 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 29 05:46:55 crc kubenswrapper[4594]: I1129 05:46:55.395005 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4a97d2d-e221-4d1e-8d54-04fe41f0bf81-config-data\") pod \"e4a97d2d-e221-4d1e-8d54-04fe41f0bf81\" (UID: \"e4a97d2d-e221-4d1e-8d54-04fe41f0bf81\") " Nov 29 05:46:55 crc kubenswrapper[4594]: I1129 05:46:55.395108 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24j2t\" (UniqueName: \"kubernetes.io/projected/e4a97d2d-e221-4d1e-8d54-04fe41f0bf81-kube-api-access-24j2t\") pod \"e4a97d2d-e221-4d1e-8d54-04fe41f0bf81\" (UID: \"e4a97d2d-e221-4d1e-8d54-04fe41f0bf81\") " Nov 29 05:46:55 crc kubenswrapper[4594]: I1129 05:46:55.401277 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4a97d2d-e221-4d1e-8d54-04fe41f0bf81-kube-api-access-24j2t" (OuterVolumeSpecName: "kube-api-access-24j2t") pod "e4a97d2d-e221-4d1e-8d54-04fe41f0bf81" (UID: "e4a97d2d-e221-4d1e-8d54-04fe41f0bf81"). InnerVolumeSpecName "kube-api-access-24j2t". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:46:55 crc kubenswrapper[4594]: I1129 05:46:55.419227 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4a97d2d-e221-4d1e-8d54-04fe41f0bf81-config-data" (OuterVolumeSpecName: "config-data") pod "e4a97d2d-e221-4d1e-8d54-04fe41f0bf81" (UID: "e4a97d2d-e221-4d1e-8d54-04fe41f0bf81"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:46:55 crc kubenswrapper[4594]: I1129 05:46:55.496901 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4a97d2d-e221-4d1e-8d54-04fe41f0bf81-combined-ca-bundle\") pod \"e4a97d2d-e221-4d1e-8d54-04fe41f0bf81\" (UID: \"e4a97d2d-e221-4d1e-8d54-04fe41f0bf81\") " Nov 29 05:46:55 crc kubenswrapper[4594]: I1129 05:46:55.497019 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4a97d2d-e221-4d1e-8d54-04fe41f0bf81-logs\") pod \"e4a97d2d-e221-4d1e-8d54-04fe41f0bf81\" (UID: \"e4a97d2d-e221-4d1e-8d54-04fe41f0bf81\") " Nov 29 05:46:55 crc kubenswrapper[4594]: I1129 05:46:55.497343 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4a97d2d-e221-4d1e-8d54-04fe41f0bf81-logs" (OuterVolumeSpecName: "logs") pod "e4a97d2d-e221-4d1e-8d54-04fe41f0bf81" (UID: "e4a97d2d-e221-4d1e-8d54-04fe41f0bf81"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 05:46:55 crc kubenswrapper[4594]: I1129 05:46:55.497364 4594 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4a97d2d-e221-4d1e-8d54-04fe41f0bf81-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 05:46:55 crc kubenswrapper[4594]: I1129 05:46:55.497436 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-24j2t\" (UniqueName: \"kubernetes.io/projected/e4a97d2d-e221-4d1e-8d54-04fe41f0bf81-kube-api-access-24j2t\") on node \"crc\" DevicePath \"\"" Nov 29 05:46:55 crc kubenswrapper[4594]: I1129 05:46:55.522434 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4a97d2d-e221-4d1e-8d54-04fe41f0bf81-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e4a97d2d-e221-4d1e-8d54-04fe41f0bf81" (UID: "e4a97d2d-e221-4d1e-8d54-04fe41f0bf81"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:46:55 crc kubenswrapper[4594]: I1129 05:46:55.599739 4594 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4a97d2d-e221-4d1e-8d54-04fe41f0bf81-logs\") on node \"crc\" DevicePath \"\"" Nov 29 05:46:55 crc kubenswrapper[4594]: I1129 05:46:55.599778 4594 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4a97d2d-e221-4d1e-8d54-04fe41f0bf81-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 05:46:55 crc kubenswrapper[4594]: I1129 05:46:55.676359 4594 generic.go:334] "Generic (PLEG): container finished" podID="e4a97d2d-e221-4d1e-8d54-04fe41f0bf81" containerID="3eb8a07c3a474df9b065df0679149c1291334fc4e6c9b1e2093c980fd38fe37c" exitCode=0 Nov 29 05:46:55 crc kubenswrapper[4594]: I1129 05:46:55.676412 4594 generic.go:334] "Generic (PLEG): container finished" podID="e4a97d2d-e221-4d1e-8d54-04fe41f0bf81" 
containerID="6e228bb0b9384cd10aed8a4b2d7375f0e399fa6a0f228a3b1b154f5bc175f554" exitCode=143 Nov 29 05:46:55 crc kubenswrapper[4594]: I1129 05:46:55.676427 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e4a97d2d-e221-4d1e-8d54-04fe41f0bf81","Type":"ContainerDied","Data":"3eb8a07c3a474df9b065df0679149c1291334fc4e6c9b1e2093c980fd38fe37c"} Nov 29 05:46:55 crc kubenswrapper[4594]: I1129 05:46:55.676476 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 29 05:46:55 crc kubenswrapper[4594]: I1129 05:46:55.676490 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e4a97d2d-e221-4d1e-8d54-04fe41f0bf81","Type":"ContainerDied","Data":"6e228bb0b9384cd10aed8a4b2d7375f0e399fa6a0f228a3b1b154f5bc175f554"} Nov 29 05:46:55 crc kubenswrapper[4594]: I1129 05:46:55.676512 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e4a97d2d-e221-4d1e-8d54-04fe41f0bf81","Type":"ContainerDied","Data":"714eaf42d0f2387fd9947acb2078ba25410ad3393e55981b0f3b47558fbd1fe4"} Nov 29 05:46:55 crc kubenswrapper[4594]: I1129 05:46:55.676533 4594 scope.go:117] "RemoveContainer" containerID="3eb8a07c3a474df9b065df0679149c1291334fc4e6c9b1e2093c980fd38fe37c" Nov 29 05:46:55 crc kubenswrapper[4594]: I1129 05:46:55.719977 4594 scope.go:117] "RemoveContainer" containerID="6e228bb0b9384cd10aed8a4b2d7375f0e399fa6a0f228a3b1b154f5bc175f554" Nov 29 05:46:55 crc kubenswrapper[4594]: I1129 05:46:55.743608 4594 scope.go:117] "RemoveContainer" containerID="3eb8a07c3a474df9b065df0679149c1291334fc4e6c9b1e2093c980fd38fe37c" Nov 29 05:46:55 crc kubenswrapper[4594]: I1129 05:46:55.744941 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 29 05:46:55 crc kubenswrapper[4594]: E1129 05:46:55.745563 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code 
= NotFound desc = could not find container \"3eb8a07c3a474df9b065df0679149c1291334fc4e6c9b1e2093c980fd38fe37c\": container with ID starting with 3eb8a07c3a474df9b065df0679149c1291334fc4e6c9b1e2093c980fd38fe37c not found: ID does not exist" containerID="3eb8a07c3a474df9b065df0679149c1291334fc4e6c9b1e2093c980fd38fe37c" Nov 29 05:46:55 crc kubenswrapper[4594]: I1129 05:46:55.745670 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3eb8a07c3a474df9b065df0679149c1291334fc4e6c9b1e2093c980fd38fe37c"} err="failed to get container status \"3eb8a07c3a474df9b065df0679149c1291334fc4e6c9b1e2093c980fd38fe37c\": rpc error: code = NotFound desc = could not find container \"3eb8a07c3a474df9b065df0679149c1291334fc4e6c9b1e2093c980fd38fe37c\": container with ID starting with 3eb8a07c3a474df9b065df0679149c1291334fc4e6c9b1e2093c980fd38fe37c not found: ID does not exist" Nov 29 05:46:55 crc kubenswrapper[4594]: I1129 05:46:55.745749 4594 scope.go:117] "RemoveContainer" containerID="6e228bb0b9384cd10aed8a4b2d7375f0e399fa6a0f228a3b1b154f5bc175f554" Nov 29 05:46:55 crc kubenswrapper[4594]: E1129 05:46:55.746183 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e228bb0b9384cd10aed8a4b2d7375f0e399fa6a0f228a3b1b154f5bc175f554\": container with ID starting with 6e228bb0b9384cd10aed8a4b2d7375f0e399fa6a0f228a3b1b154f5bc175f554 not found: ID does not exist" containerID="6e228bb0b9384cd10aed8a4b2d7375f0e399fa6a0f228a3b1b154f5bc175f554" Nov 29 05:46:55 crc kubenswrapper[4594]: I1129 05:46:55.746284 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e228bb0b9384cd10aed8a4b2d7375f0e399fa6a0f228a3b1b154f5bc175f554"} err="failed to get container status \"6e228bb0b9384cd10aed8a4b2d7375f0e399fa6a0f228a3b1b154f5bc175f554\": rpc error: code = NotFound desc = could not find container 
\"6e228bb0b9384cd10aed8a4b2d7375f0e399fa6a0f228a3b1b154f5bc175f554\": container with ID starting with 6e228bb0b9384cd10aed8a4b2d7375f0e399fa6a0f228a3b1b154f5bc175f554 not found: ID does not exist" Nov 29 05:46:55 crc kubenswrapper[4594]: I1129 05:46:55.746350 4594 scope.go:117] "RemoveContainer" containerID="3eb8a07c3a474df9b065df0679149c1291334fc4e6c9b1e2093c980fd38fe37c" Nov 29 05:46:55 crc kubenswrapper[4594]: I1129 05:46:55.746867 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3eb8a07c3a474df9b065df0679149c1291334fc4e6c9b1e2093c980fd38fe37c"} err="failed to get container status \"3eb8a07c3a474df9b065df0679149c1291334fc4e6c9b1e2093c980fd38fe37c\": rpc error: code = NotFound desc = could not find container \"3eb8a07c3a474df9b065df0679149c1291334fc4e6c9b1e2093c980fd38fe37c\": container with ID starting with 3eb8a07c3a474df9b065df0679149c1291334fc4e6c9b1e2093c980fd38fe37c not found: ID does not exist" Nov 29 05:46:55 crc kubenswrapper[4594]: I1129 05:46:55.747022 4594 scope.go:117] "RemoveContainer" containerID="6e228bb0b9384cd10aed8a4b2d7375f0e399fa6a0f228a3b1b154f5bc175f554" Nov 29 05:46:55 crc kubenswrapper[4594]: I1129 05:46:55.748431 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e228bb0b9384cd10aed8a4b2d7375f0e399fa6a0f228a3b1b154f5bc175f554"} err="failed to get container status \"6e228bb0b9384cd10aed8a4b2d7375f0e399fa6a0f228a3b1b154f5bc175f554\": rpc error: code = NotFound desc = could not find container \"6e228bb0b9384cd10aed8a4b2d7375f0e399fa6a0f228a3b1b154f5bc175f554\": container with ID starting with 6e228bb0b9384cd10aed8a4b2d7375f0e399fa6a0f228a3b1b154f5bc175f554 not found: ID does not exist" Nov 29 05:46:55 crc kubenswrapper[4594]: I1129 05:46:55.756083 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Nov 29 05:46:55 crc kubenswrapper[4594]: I1129 05:46:55.763308 4594 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/nova-metadata-0"] Nov 29 05:46:55 crc kubenswrapper[4594]: E1129 05:46:55.763863 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4a97d2d-e221-4d1e-8d54-04fe41f0bf81" containerName="nova-metadata-log" Nov 29 05:46:55 crc kubenswrapper[4594]: I1129 05:46:55.763886 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4a97d2d-e221-4d1e-8d54-04fe41f0bf81" containerName="nova-metadata-log" Nov 29 05:46:55 crc kubenswrapper[4594]: E1129 05:46:55.763901 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4a97d2d-e221-4d1e-8d54-04fe41f0bf81" containerName="nova-metadata-metadata" Nov 29 05:46:55 crc kubenswrapper[4594]: I1129 05:46:55.763909 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4a97d2d-e221-4d1e-8d54-04fe41f0bf81" containerName="nova-metadata-metadata" Nov 29 05:46:55 crc kubenswrapper[4594]: I1129 05:46:55.764157 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4a97d2d-e221-4d1e-8d54-04fe41f0bf81" containerName="nova-metadata-log" Nov 29 05:46:55 crc kubenswrapper[4594]: I1129 05:46:55.764184 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4a97d2d-e221-4d1e-8d54-04fe41f0bf81" containerName="nova-metadata-metadata" Nov 29 05:46:55 crc kubenswrapper[4594]: I1129 05:46:55.765443 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 29 05:46:55 crc kubenswrapper[4594]: I1129 05:46:55.768781 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Nov 29 05:46:55 crc kubenswrapper[4594]: I1129 05:46:55.768860 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Nov 29 05:46:55 crc kubenswrapper[4594]: I1129 05:46:55.769752 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 29 05:46:55 crc kubenswrapper[4594]: I1129 05:46:55.924855 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cfb657a5-c2ba-46ab-a833-36ccea778f84-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"cfb657a5-c2ba-46ab-a833-36ccea778f84\") " pod="openstack/nova-metadata-0" Nov 29 05:46:55 crc kubenswrapper[4594]: I1129 05:46:55.924914 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfb657a5-c2ba-46ab-a833-36ccea778f84-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cfb657a5-c2ba-46ab-a833-36ccea778f84\") " pod="openstack/nova-metadata-0" Nov 29 05:46:55 crc kubenswrapper[4594]: I1129 05:46:55.925055 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwcdl\" (UniqueName: \"kubernetes.io/projected/cfb657a5-c2ba-46ab-a833-36ccea778f84-kube-api-access-gwcdl\") pod \"nova-metadata-0\" (UID: \"cfb657a5-c2ba-46ab-a833-36ccea778f84\") " pod="openstack/nova-metadata-0" Nov 29 05:46:55 crc kubenswrapper[4594]: I1129 05:46:55.925133 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/cfb657a5-c2ba-46ab-a833-36ccea778f84-config-data\") pod \"nova-metadata-0\" (UID: \"cfb657a5-c2ba-46ab-a833-36ccea778f84\") " pod="openstack/nova-metadata-0" Nov 29 05:46:55 crc kubenswrapper[4594]: I1129 05:46:55.925191 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cfb657a5-c2ba-46ab-a833-36ccea778f84-logs\") pod \"nova-metadata-0\" (UID: \"cfb657a5-c2ba-46ab-a833-36ccea778f84\") " pod="openstack/nova-metadata-0" Nov 29 05:46:56 crc kubenswrapper[4594]: I1129 05:46:56.026682 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cfb657a5-c2ba-46ab-a833-36ccea778f84-logs\") pod \"nova-metadata-0\" (UID: \"cfb657a5-c2ba-46ab-a833-36ccea778f84\") " pod="openstack/nova-metadata-0" Nov 29 05:46:56 crc kubenswrapper[4594]: I1129 05:46:56.027102 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cfb657a5-c2ba-46ab-a833-36ccea778f84-logs\") pod \"nova-metadata-0\" (UID: \"cfb657a5-c2ba-46ab-a833-36ccea778f84\") " pod="openstack/nova-metadata-0" Nov 29 05:46:56 crc kubenswrapper[4594]: I1129 05:46:56.027343 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cfb657a5-c2ba-46ab-a833-36ccea778f84-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"cfb657a5-c2ba-46ab-a833-36ccea778f84\") " pod="openstack/nova-metadata-0" Nov 29 05:46:56 crc kubenswrapper[4594]: I1129 05:46:56.027442 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfb657a5-c2ba-46ab-a833-36ccea778f84-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cfb657a5-c2ba-46ab-a833-36ccea778f84\") " pod="openstack/nova-metadata-0" Nov 29 05:46:56 crc 
kubenswrapper[4594]: I1129 05:46:56.027678 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwcdl\" (UniqueName: \"kubernetes.io/projected/cfb657a5-c2ba-46ab-a833-36ccea778f84-kube-api-access-gwcdl\") pod \"nova-metadata-0\" (UID: \"cfb657a5-c2ba-46ab-a833-36ccea778f84\") " pod="openstack/nova-metadata-0" Nov 29 05:46:56 crc kubenswrapper[4594]: I1129 05:46:56.028153 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfb657a5-c2ba-46ab-a833-36ccea778f84-config-data\") pod \"nova-metadata-0\" (UID: \"cfb657a5-c2ba-46ab-a833-36ccea778f84\") " pod="openstack/nova-metadata-0" Nov 29 05:46:56 crc kubenswrapper[4594]: I1129 05:46:56.031221 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cfb657a5-c2ba-46ab-a833-36ccea778f84-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"cfb657a5-c2ba-46ab-a833-36ccea778f84\") " pod="openstack/nova-metadata-0" Nov 29 05:46:56 crc kubenswrapper[4594]: I1129 05:46:56.031285 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfb657a5-c2ba-46ab-a833-36ccea778f84-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cfb657a5-c2ba-46ab-a833-36ccea778f84\") " pod="openstack/nova-metadata-0" Nov 29 05:46:56 crc kubenswrapper[4594]: I1129 05:46:56.032407 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfb657a5-c2ba-46ab-a833-36ccea778f84-config-data\") pod \"nova-metadata-0\" (UID: \"cfb657a5-c2ba-46ab-a833-36ccea778f84\") " pod="openstack/nova-metadata-0" Nov 29 05:46:56 crc kubenswrapper[4594]: I1129 05:46:56.044691 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwcdl\" (UniqueName: 
\"kubernetes.io/projected/cfb657a5-c2ba-46ab-a833-36ccea778f84-kube-api-access-gwcdl\") pod \"nova-metadata-0\" (UID: \"cfb657a5-c2ba-46ab-a833-36ccea778f84\") " pod="openstack/nova-metadata-0" Nov 29 05:46:56 crc kubenswrapper[4594]: I1129 05:46:56.082736 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 29 05:46:56 crc kubenswrapper[4594]: I1129 05:46:56.094127 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4a97d2d-e221-4d1e-8d54-04fe41f0bf81" path="/var/lib/kubelet/pods/e4a97d2d-e221-4d1e-8d54-04fe41f0bf81/volumes" Nov 29 05:46:56 crc kubenswrapper[4594]: I1129 05:46:56.524417 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 29 05:46:56 crc kubenswrapper[4594]: I1129 05:46:56.690179 4594 generic.go:334] "Generic (PLEG): container finished" podID="19e2ee6f-48e6-470c-84c9-6b626fdefef7" containerID="0fbb4b9dd828c88480c72368e61ac2b3e7f03355bfdebce7ee3d3f05d453f2d1" exitCode=0 Nov 29 05:46:56 crc kubenswrapper[4594]: I1129 05:46:56.690232 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-7mb5g" event={"ID":"19e2ee6f-48e6-470c-84c9-6b626fdefef7","Type":"ContainerDied","Data":"0fbb4b9dd828c88480c72368e61ac2b3e7f03355bfdebce7ee3d3f05d453f2d1"} Nov 29 05:46:56 crc kubenswrapper[4594]: I1129 05:46:56.692200 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cfb657a5-c2ba-46ab-a833-36ccea778f84","Type":"ContainerStarted","Data":"bf327543ad1191bbc7eda5715f7fea88731c5b80c9cf3018bf92b993a15ad5c6"} Nov 29 05:46:56 crc kubenswrapper[4594]: I1129 05:46:56.692279 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cfb657a5-c2ba-46ab-a833-36ccea778f84","Type":"ContainerStarted","Data":"91df1f0d5121602b9fe59a04332413023ebf6d5a457db6ee773c7672aa055ca8"} Nov 29 05:46:57 crc kubenswrapper[4594]: I1129 
05:46:57.704924 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cfb657a5-c2ba-46ab-a833-36ccea778f84","Type":"ContainerStarted","Data":"cea93a435bee808728ae36a37d5a424e9120ad774bd2f297859047c2fb4c6aab"} Nov 29 05:46:57 crc kubenswrapper[4594]: I1129 05:46:57.727839 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.727815573 podStartE2EDuration="2.727815573s" podCreationTimestamp="2025-11-29 05:46:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:46:57.724959081 +0000 UTC m=+1141.965468301" watchObservedRunningTime="2025-11-29 05:46:57.727815573 +0000 UTC m=+1141.968324784" Nov 29 05:46:58 crc kubenswrapper[4594]: I1129 05:46:58.044963 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-7mb5g" Nov 29 05:46:58 crc kubenswrapper[4594]: I1129 05:46:58.170009 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19e2ee6f-48e6-470c-84c9-6b626fdefef7-config-data\") pod \"19e2ee6f-48e6-470c-84c9-6b626fdefef7\" (UID: \"19e2ee6f-48e6-470c-84c9-6b626fdefef7\") " Nov 29 05:46:58 crc kubenswrapper[4594]: I1129 05:46:58.170089 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7pc54\" (UniqueName: \"kubernetes.io/projected/19e2ee6f-48e6-470c-84c9-6b626fdefef7-kube-api-access-7pc54\") pod \"19e2ee6f-48e6-470c-84c9-6b626fdefef7\" (UID: \"19e2ee6f-48e6-470c-84c9-6b626fdefef7\") " Nov 29 05:46:58 crc kubenswrapper[4594]: I1129 05:46:58.170290 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19e2ee6f-48e6-470c-84c9-6b626fdefef7-combined-ca-bundle\") pod 
\"19e2ee6f-48e6-470c-84c9-6b626fdefef7\" (UID: \"19e2ee6f-48e6-470c-84c9-6b626fdefef7\") " Nov 29 05:46:58 crc kubenswrapper[4594]: I1129 05:46:58.170409 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19e2ee6f-48e6-470c-84c9-6b626fdefef7-scripts\") pod \"19e2ee6f-48e6-470c-84c9-6b626fdefef7\" (UID: \"19e2ee6f-48e6-470c-84c9-6b626fdefef7\") " Nov 29 05:46:58 crc kubenswrapper[4594]: I1129 05:46:58.178002 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19e2ee6f-48e6-470c-84c9-6b626fdefef7-kube-api-access-7pc54" (OuterVolumeSpecName: "kube-api-access-7pc54") pod "19e2ee6f-48e6-470c-84c9-6b626fdefef7" (UID: "19e2ee6f-48e6-470c-84c9-6b626fdefef7"). InnerVolumeSpecName "kube-api-access-7pc54". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:46:58 crc kubenswrapper[4594]: I1129 05:46:58.178053 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19e2ee6f-48e6-470c-84c9-6b626fdefef7-scripts" (OuterVolumeSpecName: "scripts") pod "19e2ee6f-48e6-470c-84c9-6b626fdefef7" (UID: "19e2ee6f-48e6-470c-84c9-6b626fdefef7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:46:58 crc kubenswrapper[4594]: E1129 05:46:58.198668 4594 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19e2ee6f-48e6-470c-84c9-6b626fdefef7-combined-ca-bundle podName:19e2ee6f-48e6-470c-84c9-6b626fdefef7 nodeName:}" failed. No retries permitted until 2025-11-29 05:46:58.698634606 +0000 UTC m=+1142.939143826 (durationBeforeRetry 500ms). 
Error: error cleaning subPath mounts for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/19e2ee6f-48e6-470c-84c9-6b626fdefef7-combined-ca-bundle") pod "19e2ee6f-48e6-470c-84c9-6b626fdefef7" (UID: "19e2ee6f-48e6-470c-84c9-6b626fdefef7") : error deleting /var/lib/kubelet/pods/19e2ee6f-48e6-470c-84c9-6b626fdefef7/volume-subpaths: remove /var/lib/kubelet/pods/19e2ee6f-48e6-470c-84c9-6b626fdefef7/volume-subpaths: no such file or directory Nov 29 05:46:58 crc kubenswrapper[4594]: I1129 05:46:58.201236 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19e2ee6f-48e6-470c-84c9-6b626fdefef7-config-data" (OuterVolumeSpecName: "config-data") pod "19e2ee6f-48e6-470c-84c9-6b626fdefef7" (UID: "19e2ee6f-48e6-470c-84c9-6b626fdefef7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:46:58 crc kubenswrapper[4594]: I1129 05:46:58.273830 4594 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19e2ee6f-48e6-470c-84c9-6b626fdefef7-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 05:46:58 crc kubenswrapper[4594]: I1129 05:46:58.273946 4594 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19e2ee6f-48e6-470c-84c9-6b626fdefef7-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 05:46:58 crc kubenswrapper[4594]: I1129 05:46:58.274017 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7pc54\" (UniqueName: \"kubernetes.io/projected/19e2ee6f-48e6-470c-84c9-6b626fdefef7-kube-api-access-7pc54\") on node \"crc\" DevicePath \"\"" Nov 29 05:46:58 crc kubenswrapper[4594]: I1129 05:46:58.717136 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-7mb5g" event={"ID":"19e2ee6f-48e6-470c-84c9-6b626fdefef7","Type":"ContainerDied","Data":"8c238e3701278ef0bd9f3f081dfce2dbfa8876d0722feba3ae7de600e25f6de1"} Nov 29 
05:46:58 crc kubenswrapper[4594]: I1129 05:46:58.717199 4594 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8c238e3701278ef0bd9f3f081dfce2dbfa8876d0722feba3ae7de600e25f6de1" Nov 29 05:46:58 crc kubenswrapper[4594]: I1129 05:46:58.717198 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-7mb5g" Nov 29 05:46:58 crc kubenswrapper[4594]: I1129 05:46:58.719765 4594 generic.go:334] "Generic (PLEG): container finished" podID="b3e5dae6-ceb8-4878-a37d-56f97d59d103" containerID="256afc9e6233f981bc8a70f9cb1c17c099179d185e2c26760218524be5c1591d" exitCode=0 Nov 29 05:46:58 crc kubenswrapper[4594]: I1129 05:46:58.719864 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-cxpv7" event={"ID":"b3e5dae6-ceb8-4878-a37d-56f97d59d103","Type":"ContainerDied","Data":"256afc9e6233f981bc8a70f9cb1c17c099179d185e2c26760218524be5c1591d"} Nov 29 05:46:58 crc kubenswrapper[4594]: I1129 05:46:58.774659 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 29 05:46:58 crc kubenswrapper[4594]: E1129 05:46:58.775521 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19e2ee6f-48e6-470c-84c9-6b626fdefef7" containerName="nova-cell1-conductor-db-sync" Nov 29 05:46:58 crc kubenswrapper[4594]: I1129 05:46:58.775543 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="19e2ee6f-48e6-470c-84c9-6b626fdefef7" containerName="nova-cell1-conductor-db-sync" Nov 29 05:46:58 crc kubenswrapper[4594]: I1129 05:46:58.775840 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="19e2ee6f-48e6-470c-84c9-6b626fdefef7" containerName="nova-cell1-conductor-db-sync" Nov 29 05:46:58 crc kubenswrapper[4594]: I1129 05:46:58.776646 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Nov 29 05:46:58 crc kubenswrapper[4594]: I1129 05:46:58.784119 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19e2ee6f-48e6-470c-84c9-6b626fdefef7-combined-ca-bundle\") pod \"19e2ee6f-48e6-470c-84c9-6b626fdefef7\" (UID: \"19e2ee6f-48e6-470c-84c9-6b626fdefef7\") " Nov 29 05:46:58 crc kubenswrapper[4594]: I1129 05:46:58.788455 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19e2ee6f-48e6-470c-84c9-6b626fdefef7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "19e2ee6f-48e6-470c-84c9-6b626fdefef7" (UID: "19e2ee6f-48e6-470c-84c9-6b626fdefef7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:46:58 crc kubenswrapper[4594]: I1129 05:46:58.789518 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 29 05:46:58 crc kubenswrapper[4594]: I1129 05:46:58.887557 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n582n\" (UniqueName: \"kubernetes.io/projected/f69f0fb6-b307-4c01-a90e-edf23e3858e1-kube-api-access-n582n\") pod \"nova-cell1-conductor-0\" (UID: \"f69f0fb6-b307-4c01-a90e-edf23e3858e1\") " pod="openstack/nova-cell1-conductor-0" Nov 29 05:46:58 crc kubenswrapper[4594]: I1129 05:46:58.887653 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f69f0fb6-b307-4c01-a90e-edf23e3858e1-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"f69f0fb6-b307-4c01-a90e-edf23e3858e1\") " pod="openstack/nova-cell1-conductor-0" Nov 29 05:46:58 crc kubenswrapper[4594]: I1129 05:46:58.887745 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f69f0fb6-b307-4c01-a90e-edf23e3858e1-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"f69f0fb6-b307-4c01-a90e-edf23e3858e1\") " pod="openstack/nova-cell1-conductor-0" Nov 29 05:46:58 crc kubenswrapper[4594]: I1129 05:46:58.887865 4594 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19e2ee6f-48e6-470c-84c9-6b626fdefef7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 05:46:58 crc kubenswrapper[4594]: I1129 05:46:58.990318 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f69f0fb6-b307-4c01-a90e-edf23e3858e1-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"f69f0fb6-b307-4c01-a90e-edf23e3858e1\") " pod="openstack/nova-cell1-conductor-0" Nov 29 05:46:58 crc kubenswrapper[4594]: I1129 05:46:58.990479 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n582n\" (UniqueName: \"kubernetes.io/projected/f69f0fb6-b307-4c01-a90e-edf23e3858e1-kube-api-access-n582n\") pod \"nova-cell1-conductor-0\" (UID: \"f69f0fb6-b307-4c01-a90e-edf23e3858e1\") " pod="openstack/nova-cell1-conductor-0" Nov 29 05:46:58 crc kubenswrapper[4594]: I1129 05:46:58.990528 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f69f0fb6-b307-4c01-a90e-edf23e3858e1-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"f69f0fb6-b307-4c01-a90e-edf23e3858e1\") " pod="openstack/nova-cell1-conductor-0" Nov 29 05:46:58 crc kubenswrapper[4594]: I1129 05:46:58.994622 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f69f0fb6-b307-4c01-a90e-edf23e3858e1-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"f69f0fb6-b307-4c01-a90e-edf23e3858e1\") " 
pod="openstack/nova-cell1-conductor-0" Nov 29 05:46:58 crc kubenswrapper[4594]: I1129 05:46:58.994984 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f69f0fb6-b307-4c01-a90e-edf23e3858e1-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"f69f0fb6-b307-4c01-a90e-edf23e3858e1\") " pod="openstack/nova-cell1-conductor-0" Nov 29 05:46:59 crc kubenswrapper[4594]: I1129 05:46:59.006829 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n582n\" (UniqueName: \"kubernetes.io/projected/f69f0fb6-b307-4c01-a90e-edf23e3858e1-kube-api-access-n582n\") pod \"nova-cell1-conductor-0\" (UID: \"f69f0fb6-b307-4c01-a90e-edf23e3858e1\") " pod="openstack/nova-cell1-conductor-0" Nov 29 05:46:59 crc kubenswrapper[4594]: I1129 05:46:59.132052 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Nov 29 05:46:59 crc kubenswrapper[4594]: I1129 05:46:59.558952 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 29 05:46:59 crc kubenswrapper[4594]: W1129 05:46:59.560525 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf69f0fb6_b307_4c01_a90e_edf23e3858e1.slice/crio-aaa98a507098ad7adb6c419fdf44531f2b558c94077df3cba5ee1fa6b162823d WatchSource:0}: Error finding container aaa98a507098ad7adb6c419fdf44531f2b558c94077df3cba5ee1fa6b162823d: Status 404 returned error can't find the container with id aaa98a507098ad7adb6c419fdf44531f2b558c94077df3cba5ee1fa6b162823d Nov 29 05:46:59 crc kubenswrapper[4594]: I1129 05:46:59.735307 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"f69f0fb6-b307-4c01-a90e-edf23e3858e1","Type":"ContainerStarted","Data":"aaa98a507098ad7adb6c419fdf44531f2b558c94077df3cba5ee1fa6b162823d"} Nov 29 05:46:59 
crc kubenswrapper[4594]: I1129 05:46:59.981219 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-cxpv7" Nov 29 05:47:00 crc kubenswrapper[4594]: I1129 05:47:00.067976 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 29 05:47:00 crc kubenswrapper[4594]: I1129 05:47:00.068059 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 29 05:47:00 crc kubenswrapper[4594]: I1129 05:47:00.099560 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Nov 29 05:47:00 crc kubenswrapper[4594]: I1129 05:47:00.120498 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Nov 29 05:47:00 crc kubenswrapper[4594]: I1129 05:47:00.121073 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3e5dae6-ceb8-4878-a37d-56f97d59d103-scripts\") pod \"b3e5dae6-ceb8-4878-a37d-56f97d59d103\" (UID: \"b3e5dae6-ceb8-4878-a37d-56f97d59d103\") " Nov 29 05:47:00 crc kubenswrapper[4594]: I1129 05:47:00.121152 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3e5dae6-ceb8-4878-a37d-56f97d59d103-config-data\") pod \"b3e5dae6-ceb8-4878-a37d-56f97d59d103\" (UID: \"b3e5dae6-ceb8-4878-a37d-56f97d59d103\") " Nov 29 05:47:00 crc kubenswrapper[4594]: I1129 05:47:00.121281 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3e5dae6-ceb8-4878-a37d-56f97d59d103-combined-ca-bundle\") pod \"b3e5dae6-ceb8-4878-a37d-56f97d59d103\" (UID: \"b3e5dae6-ceb8-4878-a37d-56f97d59d103\") " Nov 29 05:47:00 crc kubenswrapper[4594]: I1129 05:47:00.121378 4594 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-cvhlf\" (UniqueName: \"kubernetes.io/projected/b3e5dae6-ceb8-4878-a37d-56f97d59d103-kube-api-access-cvhlf\") pod \"b3e5dae6-ceb8-4878-a37d-56f97d59d103\" (UID: \"b3e5dae6-ceb8-4878-a37d-56f97d59d103\") " Nov 29 05:47:00 crc kubenswrapper[4594]: I1129 05:47:00.129396 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3e5dae6-ceb8-4878-a37d-56f97d59d103-kube-api-access-cvhlf" (OuterVolumeSpecName: "kube-api-access-cvhlf") pod "b3e5dae6-ceb8-4878-a37d-56f97d59d103" (UID: "b3e5dae6-ceb8-4878-a37d-56f97d59d103"). InnerVolumeSpecName "kube-api-access-cvhlf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:47:00 crc kubenswrapper[4594]: I1129 05:47:00.130508 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3e5dae6-ceb8-4878-a37d-56f97d59d103-scripts" (OuterVolumeSpecName: "scripts") pod "b3e5dae6-ceb8-4878-a37d-56f97d59d103" (UID: "b3e5dae6-ceb8-4878-a37d-56f97d59d103"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:47:00 crc kubenswrapper[4594]: I1129 05:47:00.153559 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3e5dae6-ceb8-4878-a37d-56f97d59d103-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b3e5dae6-ceb8-4878-a37d-56f97d59d103" (UID: "b3e5dae6-ceb8-4878-a37d-56f97d59d103"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:47:00 crc kubenswrapper[4594]: I1129 05:47:00.156692 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3e5dae6-ceb8-4878-a37d-56f97d59d103-config-data" (OuterVolumeSpecName: "config-data") pod "b3e5dae6-ceb8-4878-a37d-56f97d59d103" (UID: "b3e5dae6-ceb8-4878-a37d-56f97d59d103"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:47:00 crc kubenswrapper[4594]: I1129 05:47:00.225542 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cvhlf\" (UniqueName: \"kubernetes.io/projected/b3e5dae6-ceb8-4878-a37d-56f97d59d103-kube-api-access-cvhlf\") on node \"crc\" DevicePath \"\"" Nov 29 05:47:00 crc kubenswrapper[4594]: I1129 05:47:00.225575 4594 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3e5dae6-ceb8-4878-a37d-56f97d59d103-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 05:47:00 crc kubenswrapper[4594]: I1129 05:47:00.225586 4594 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3e5dae6-ceb8-4878-a37d-56f97d59d103-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 05:47:00 crc kubenswrapper[4594]: I1129 05:47:00.225599 4594 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3e5dae6-ceb8-4878-a37d-56f97d59d103-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 05:47:00 crc kubenswrapper[4594]: I1129 05:47:00.295537 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-844fc57f6f-56pgf" Nov 29 05:47:00 crc kubenswrapper[4594]: I1129 05:47:00.383290 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75958fc765-4tc8j"] Nov 29 05:47:00 crc kubenswrapper[4594]: I1129 05:47:00.601726 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Nov 29 05:47:00 crc kubenswrapper[4594]: I1129 05:47:00.761870 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"f69f0fb6-b307-4c01-a90e-edf23e3858e1","Type":"ContainerStarted","Data":"a10bb1f5e218779ac32d7e5ca1f585c657c262e63591a73ed98d992a46d2c449"} Nov 29 05:47:00 crc kubenswrapper[4594]: I1129 
05:47:00.763429 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Nov 29 05:47:00 crc kubenswrapper[4594]: I1129 05:47:00.771308 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-cxpv7" event={"ID":"b3e5dae6-ceb8-4878-a37d-56f97d59d103","Type":"ContainerDied","Data":"0028ad06bd64594786fe6c9d23b77762582f440e53d5823952dc16080c073b87"} Nov 29 05:47:00 crc kubenswrapper[4594]: I1129 05:47:00.771343 4594 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0028ad06bd64594786fe6c9d23b77762582f440e53d5823952dc16080c073b87" Nov 29 05:47:00 crc kubenswrapper[4594]: I1129 05:47:00.771392 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-cxpv7" Nov 29 05:47:00 crc kubenswrapper[4594]: I1129 05:47:00.772142 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-75958fc765-4tc8j" podUID="1cf50aee-5c49-4d8b-9fd9-8b7afd376f41" containerName="dnsmasq-dns" containerID="cri-o://a3e0c234e95de47b62615205f877e5484af6665111fbf38df64e2dc1f3b3e095" gracePeriod=10 Nov 29 05:47:00 crc kubenswrapper[4594]: I1129 05:47:00.796454 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.7964367020000003 podStartE2EDuration="2.796436702s" podCreationTimestamp="2025-11-29 05:46:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:47:00.787426786 +0000 UTC m=+1145.027936006" watchObservedRunningTime="2025-11-29 05:47:00.796436702 +0000 UTC m=+1145.036945922" Nov 29 05:47:00 crc kubenswrapper[4594]: I1129 05:47:00.838784 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Nov 29 05:47:00 crc kubenswrapper[4594]: I1129 
05:47:00.840311 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 29 05:47:00 crc kubenswrapper[4594]: I1129 05:47:00.840628 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="602dee9d-18e0-42f2-b23f-e74cc723e9ed" containerName="nova-api-log" containerID="cri-o://cc22d717976dd8e182efc2f2d868ec5d33b54f2513be104be0c9a8008f68084c" gracePeriod=30 Nov 29 05:47:00 crc kubenswrapper[4594]: I1129 05:47:00.840880 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="602dee9d-18e0-42f2-b23f-e74cc723e9ed" containerName="nova-api-api" containerID="cri-o://a37b4498c59965f20f7818ac4d7defc36702a27dc00081b5a8549a86e3c067bb" gracePeriod=30 Nov 29 05:47:00 crc kubenswrapper[4594]: I1129 05:47:00.852560 4594 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="602dee9d-18e0-42f2-b23f-e74cc723e9ed" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.203:8774/\": EOF" Nov 29 05:47:00 crc kubenswrapper[4594]: I1129 05:47:00.852625 4594 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="602dee9d-18e0-42f2-b23f-e74cc723e9ed" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.203:8774/\": EOF" Nov 29 05:47:00 crc kubenswrapper[4594]: I1129 05:47:00.861688 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 29 05:47:00 crc kubenswrapper[4594]: I1129 05:47:00.873675 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 29 05:47:00 crc kubenswrapper[4594]: I1129 05:47:00.873925 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="cfb657a5-c2ba-46ab-a833-36ccea778f84" containerName="nova-metadata-log" 
containerID="cri-o://bf327543ad1191bbc7eda5715f7fea88731c5b80c9cf3018bf92b993a15ad5c6" gracePeriod=30 Nov 29 05:47:00 crc kubenswrapper[4594]: I1129 05:47:00.874074 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="cfb657a5-c2ba-46ab-a833-36ccea778f84" containerName="nova-metadata-metadata" containerID="cri-o://cea93a435bee808728ae36a37d5a424e9120ad774bd2f297859047c2fb4c6aab" gracePeriod=30 Nov 29 05:47:01 crc kubenswrapper[4594]: I1129 05:47:01.089019 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 29 05:47:01 crc kubenswrapper[4594]: I1129 05:47:01.093033 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 29 05:47:01 crc kubenswrapper[4594]: I1129 05:47:01.252687 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75958fc765-4tc8j" Nov 29 05:47:01 crc kubenswrapper[4594]: I1129 05:47:01.256238 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1cf50aee-5c49-4d8b-9fd9-8b7afd376f41-dns-swift-storage-0\") pod \"1cf50aee-5c49-4d8b-9fd9-8b7afd376f41\" (UID: \"1cf50aee-5c49-4d8b-9fd9-8b7afd376f41\") " Nov 29 05:47:01 crc kubenswrapper[4594]: I1129 05:47:01.256346 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1cf50aee-5c49-4d8b-9fd9-8b7afd376f41-config\") pod \"1cf50aee-5c49-4d8b-9fd9-8b7afd376f41\" (UID: \"1cf50aee-5c49-4d8b-9fd9-8b7afd376f41\") " Nov 29 05:47:01 crc kubenswrapper[4594]: I1129 05:47:01.256385 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1cf50aee-5c49-4d8b-9fd9-8b7afd376f41-ovsdbserver-nb\") pod \"1cf50aee-5c49-4d8b-9fd9-8b7afd376f41\" (UID: 
\"1cf50aee-5c49-4d8b-9fd9-8b7afd376f41\") " Nov 29 05:47:01 crc kubenswrapper[4594]: I1129 05:47:01.256429 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1cf50aee-5c49-4d8b-9fd9-8b7afd376f41-ovsdbserver-sb\") pod \"1cf50aee-5c49-4d8b-9fd9-8b7afd376f41\" (UID: \"1cf50aee-5c49-4d8b-9fd9-8b7afd376f41\") " Nov 29 05:47:01 crc kubenswrapper[4594]: I1129 05:47:01.256453 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zzxsl\" (UniqueName: \"kubernetes.io/projected/1cf50aee-5c49-4d8b-9fd9-8b7afd376f41-kube-api-access-zzxsl\") pod \"1cf50aee-5c49-4d8b-9fd9-8b7afd376f41\" (UID: \"1cf50aee-5c49-4d8b-9fd9-8b7afd376f41\") " Nov 29 05:47:01 crc kubenswrapper[4594]: I1129 05:47:01.256518 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1cf50aee-5c49-4d8b-9fd9-8b7afd376f41-dns-svc\") pod \"1cf50aee-5c49-4d8b-9fd9-8b7afd376f41\" (UID: \"1cf50aee-5c49-4d8b-9fd9-8b7afd376f41\") " Nov 29 05:47:01 crc kubenswrapper[4594]: I1129 05:47:01.296584 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1cf50aee-5c49-4d8b-9fd9-8b7afd376f41-kube-api-access-zzxsl" (OuterVolumeSpecName: "kube-api-access-zzxsl") pod "1cf50aee-5c49-4d8b-9fd9-8b7afd376f41" (UID: "1cf50aee-5c49-4d8b-9fd9-8b7afd376f41"). InnerVolumeSpecName "kube-api-access-zzxsl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:47:01 crc kubenswrapper[4594]: I1129 05:47:01.359218 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zzxsl\" (UniqueName: \"kubernetes.io/projected/1cf50aee-5c49-4d8b-9fd9-8b7afd376f41-kube-api-access-zzxsl\") on node \"crc\" DevicePath \"\"" Nov 29 05:47:01 crc kubenswrapper[4594]: I1129 05:47:01.363523 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1cf50aee-5c49-4d8b-9fd9-8b7afd376f41-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1cf50aee-5c49-4d8b-9fd9-8b7afd376f41" (UID: "1cf50aee-5c49-4d8b-9fd9-8b7afd376f41"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:47:01 crc kubenswrapper[4594]: I1129 05:47:01.387817 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1cf50aee-5c49-4d8b-9fd9-8b7afd376f41-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1cf50aee-5c49-4d8b-9fd9-8b7afd376f41" (UID: "1cf50aee-5c49-4d8b-9fd9-8b7afd376f41"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:47:01 crc kubenswrapper[4594]: I1129 05:47:01.393079 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1cf50aee-5c49-4d8b-9fd9-8b7afd376f41-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1cf50aee-5c49-4d8b-9fd9-8b7afd376f41" (UID: "1cf50aee-5c49-4d8b-9fd9-8b7afd376f41"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:47:01 crc kubenswrapper[4594]: I1129 05:47:01.395232 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1cf50aee-5c49-4d8b-9fd9-8b7afd376f41-config" (OuterVolumeSpecName: "config") pod "1cf50aee-5c49-4d8b-9fd9-8b7afd376f41" (UID: "1cf50aee-5c49-4d8b-9fd9-8b7afd376f41"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:47:01 crc kubenswrapper[4594]: I1129 05:47:01.419515 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1cf50aee-5c49-4d8b-9fd9-8b7afd376f41-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "1cf50aee-5c49-4d8b-9fd9-8b7afd376f41" (UID: "1cf50aee-5c49-4d8b-9fd9-8b7afd376f41"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:47:01 crc kubenswrapper[4594]: I1129 05:47:01.420503 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 29 05:47:01 crc kubenswrapper[4594]: I1129 05:47:01.492610 4594 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1cf50aee-5c49-4d8b-9fd9-8b7afd376f41-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 29 05:47:01 crc kubenswrapper[4594]: I1129 05:47:01.492680 4594 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1cf50aee-5c49-4d8b-9fd9-8b7afd376f41-config\") on node \"crc\" DevicePath \"\"" Nov 29 05:47:01 crc kubenswrapper[4594]: I1129 05:47:01.492737 4594 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1cf50aee-5c49-4d8b-9fd9-8b7afd376f41-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 29 05:47:01 crc kubenswrapper[4594]: I1129 05:47:01.492757 4594 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1cf50aee-5c49-4d8b-9fd9-8b7afd376f41-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 29 05:47:01 crc kubenswrapper[4594]: I1129 05:47:01.492804 4594 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/1cf50aee-5c49-4d8b-9fd9-8b7afd376f41-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 29 05:47:01 crc kubenswrapper[4594]: I1129 05:47:01.594321 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfb657a5-c2ba-46ab-a833-36ccea778f84-combined-ca-bundle\") pod \"cfb657a5-c2ba-46ab-a833-36ccea778f84\" (UID: \"cfb657a5-c2ba-46ab-a833-36ccea778f84\") " Nov 29 05:47:01 crc kubenswrapper[4594]: I1129 05:47:01.594403 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwcdl\" (UniqueName: \"kubernetes.io/projected/cfb657a5-c2ba-46ab-a833-36ccea778f84-kube-api-access-gwcdl\") pod \"cfb657a5-c2ba-46ab-a833-36ccea778f84\" (UID: \"cfb657a5-c2ba-46ab-a833-36ccea778f84\") " Nov 29 05:47:01 crc kubenswrapper[4594]: I1129 05:47:01.594550 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cfb657a5-c2ba-46ab-a833-36ccea778f84-logs\") pod \"cfb657a5-c2ba-46ab-a833-36ccea778f84\" (UID: \"cfb657a5-c2ba-46ab-a833-36ccea778f84\") " Nov 29 05:47:01 crc kubenswrapper[4594]: I1129 05:47:01.594655 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfb657a5-c2ba-46ab-a833-36ccea778f84-config-data\") pod \"cfb657a5-c2ba-46ab-a833-36ccea778f84\" (UID: \"cfb657a5-c2ba-46ab-a833-36ccea778f84\") " Nov 29 05:47:01 crc kubenswrapper[4594]: I1129 05:47:01.594859 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cfb657a5-c2ba-46ab-a833-36ccea778f84-nova-metadata-tls-certs\") pod \"cfb657a5-c2ba-46ab-a833-36ccea778f84\" (UID: \"cfb657a5-c2ba-46ab-a833-36ccea778f84\") " Nov 29 05:47:01 crc kubenswrapper[4594]: I1129 05:47:01.595340 4594 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cfb657a5-c2ba-46ab-a833-36ccea778f84-logs" (OuterVolumeSpecName: "logs") pod "cfb657a5-c2ba-46ab-a833-36ccea778f84" (UID: "cfb657a5-c2ba-46ab-a833-36ccea778f84"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 05:47:01 crc kubenswrapper[4594]: I1129 05:47:01.596106 4594 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cfb657a5-c2ba-46ab-a833-36ccea778f84-logs\") on node \"crc\" DevicePath \"\"" Nov 29 05:47:01 crc kubenswrapper[4594]: I1129 05:47:01.602599 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfb657a5-c2ba-46ab-a833-36ccea778f84-kube-api-access-gwcdl" (OuterVolumeSpecName: "kube-api-access-gwcdl") pod "cfb657a5-c2ba-46ab-a833-36ccea778f84" (UID: "cfb657a5-c2ba-46ab-a833-36ccea778f84"). InnerVolumeSpecName "kube-api-access-gwcdl". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:47:01 crc kubenswrapper[4594]: I1129 05:47:01.625752 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfb657a5-c2ba-46ab-a833-36ccea778f84-config-data" (OuterVolumeSpecName: "config-data") pod "cfb657a5-c2ba-46ab-a833-36ccea778f84" (UID: "cfb657a5-c2ba-46ab-a833-36ccea778f84"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:47:01 crc kubenswrapper[4594]: I1129 05:47:01.628157 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfb657a5-c2ba-46ab-a833-36ccea778f84-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cfb657a5-c2ba-46ab-a833-36ccea778f84" (UID: "cfb657a5-c2ba-46ab-a833-36ccea778f84"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:47:01 crc kubenswrapper[4594]: I1129 05:47:01.650544 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfb657a5-c2ba-46ab-a833-36ccea778f84-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "cfb657a5-c2ba-46ab-a833-36ccea778f84" (UID: "cfb657a5-c2ba-46ab-a833-36ccea778f84"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:47:01 crc kubenswrapper[4594]: I1129 05:47:01.698430 4594 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cfb657a5-c2ba-46ab-a833-36ccea778f84-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 29 05:47:01 crc kubenswrapper[4594]: I1129 05:47:01.698474 4594 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfb657a5-c2ba-46ab-a833-36ccea778f84-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 05:47:01 crc kubenswrapper[4594]: I1129 05:47:01.698489 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gwcdl\" (UniqueName: \"kubernetes.io/projected/cfb657a5-c2ba-46ab-a833-36ccea778f84-kube-api-access-gwcdl\") on node \"crc\" DevicePath \"\"" Nov 29 05:47:01 crc kubenswrapper[4594]: I1129 05:47:01.698502 4594 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfb657a5-c2ba-46ab-a833-36ccea778f84-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 05:47:01 crc kubenswrapper[4594]: I1129 05:47:01.784788 4594 generic.go:334] "Generic (PLEG): container finished" podID="1cf50aee-5c49-4d8b-9fd9-8b7afd376f41" containerID="a3e0c234e95de47b62615205f877e5484af6665111fbf38df64e2dc1f3b3e095" exitCode=0 Nov 29 05:47:01 crc kubenswrapper[4594]: I1129 05:47:01.784965 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-75958fc765-4tc8j" event={"ID":"1cf50aee-5c49-4d8b-9fd9-8b7afd376f41","Type":"ContainerDied","Data":"a3e0c234e95de47b62615205f877e5484af6665111fbf38df64e2dc1f3b3e095"} Nov 29 05:47:01 crc kubenswrapper[4594]: I1129 05:47:01.785050 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75958fc765-4tc8j" event={"ID":"1cf50aee-5c49-4d8b-9fd9-8b7afd376f41","Type":"ContainerDied","Data":"5cefd7ae7bc42d9e45e7828df0cf6c9feea056c97e2f31c8268febef3b7503fd"} Nov 29 05:47:01 crc kubenswrapper[4594]: I1129 05:47:01.785073 4594 scope.go:117] "RemoveContainer" containerID="a3e0c234e95de47b62615205f877e5484af6665111fbf38df64e2dc1f3b3e095" Nov 29 05:47:01 crc kubenswrapper[4594]: I1129 05:47:01.785862 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75958fc765-4tc8j" Nov 29 05:47:01 crc kubenswrapper[4594]: I1129 05:47:01.795893 4594 generic.go:334] "Generic (PLEG): container finished" podID="cfb657a5-c2ba-46ab-a833-36ccea778f84" containerID="cea93a435bee808728ae36a37d5a424e9120ad774bd2f297859047c2fb4c6aab" exitCode=0 Nov 29 05:47:01 crc kubenswrapper[4594]: I1129 05:47:01.795986 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 29 05:47:01 crc kubenswrapper[4594]: I1129 05:47:01.796029 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cfb657a5-c2ba-46ab-a833-36ccea778f84","Type":"ContainerDied","Data":"cea93a435bee808728ae36a37d5a424e9120ad774bd2f297859047c2fb4c6aab"} Nov 29 05:47:01 crc kubenswrapper[4594]: I1129 05:47:01.796132 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cfb657a5-c2ba-46ab-a833-36ccea778f84","Type":"ContainerDied","Data":"bf327543ad1191bbc7eda5715f7fea88731c5b80c9cf3018bf92b993a15ad5c6"} Nov 29 05:47:01 crc kubenswrapper[4594]: I1129 05:47:01.795994 4594 generic.go:334] "Generic (PLEG): container finished" podID="cfb657a5-c2ba-46ab-a833-36ccea778f84" containerID="bf327543ad1191bbc7eda5715f7fea88731c5b80c9cf3018bf92b993a15ad5c6" exitCode=143 Nov 29 05:47:01 crc kubenswrapper[4594]: I1129 05:47:01.796351 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cfb657a5-c2ba-46ab-a833-36ccea778f84","Type":"ContainerDied","Data":"91df1f0d5121602b9fe59a04332413023ebf6d5a457db6ee773c7672aa055ca8"} Nov 29 05:47:01 crc kubenswrapper[4594]: I1129 05:47:01.801401 4594 generic.go:334] "Generic (PLEG): container finished" podID="602dee9d-18e0-42f2-b23f-e74cc723e9ed" containerID="cc22d717976dd8e182efc2f2d868ec5d33b54f2513be104be0c9a8008f68084c" exitCode=143 Nov 29 05:47:01 crc kubenswrapper[4594]: I1129 05:47:01.801621 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"602dee9d-18e0-42f2-b23f-e74cc723e9ed","Type":"ContainerDied","Data":"cc22d717976dd8e182efc2f2d868ec5d33b54f2513be104be0c9a8008f68084c"} Nov 29 05:47:01 crc kubenswrapper[4594]: I1129 05:47:01.832224 4594 scope.go:117] "RemoveContainer" containerID="170ac117cdf78269bcb9a2c3c6edbba842e165a1dd5f2c25b3de78a089bb7197" Nov 29 05:47:01 crc kubenswrapper[4594]: I1129 
05:47:01.840826 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75958fc765-4tc8j"] Nov 29 05:47:01 crc kubenswrapper[4594]: I1129 05:47:01.856152 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-75958fc765-4tc8j"] Nov 29 05:47:01 crc kubenswrapper[4594]: I1129 05:47:01.870008 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 29 05:47:01 crc kubenswrapper[4594]: I1129 05:47:01.871527 4594 scope.go:117] "RemoveContainer" containerID="a3e0c234e95de47b62615205f877e5484af6665111fbf38df64e2dc1f3b3e095" Nov 29 05:47:01 crc kubenswrapper[4594]: E1129 05:47:01.871951 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3e0c234e95de47b62615205f877e5484af6665111fbf38df64e2dc1f3b3e095\": container with ID starting with a3e0c234e95de47b62615205f877e5484af6665111fbf38df64e2dc1f3b3e095 not found: ID does not exist" containerID="a3e0c234e95de47b62615205f877e5484af6665111fbf38df64e2dc1f3b3e095" Nov 29 05:47:01 crc kubenswrapper[4594]: I1129 05:47:01.871998 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3e0c234e95de47b62615205f877e5484af6665111fbf38df64e2dc1f3b3e095"} err="failed to get container status \"a3e0c234e95de47b62615205f877e5484af6665111fbf38df64e2dc1f3b3e095\": rpc error: code = NotFound desc = could not find container \"a3e0c234e95de47b62615205f877e5484af6665111fbf38df64e2dc1f3b3e095\": container with ID starting with a3e0c234e95de47b62615205f877e5484af6665111fbf38df64e2dc1f3b3e095 not found: ID does not exist" Nov 29 05:47:01 crc kubenswrapper[4594]: I1129 05:47:01.872036 4594 scope.go:117] "RemoveContainer" containerID="170ac117cdf78269bcb9a2c3c6edbba842e165a1dd5f2c25b3de78a089bb7197" Nov 29 05:47:01 crc kubenswrapper[4594]: E1129 05:47:01.872677 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"170ac117cdf78269bcb9a2c3c6edbba842e165a1dd5f2c25b3de78a089bb7197\": container with ID starting with 170ac117cdf78269bcb9a2c3c6edbba842e165a1dd5f2c25b3de78a089bb7197 not found: ID does not exist" containerID="170ac117cdf78269bcb9a2c3c6edbba842e165a1dd5f2c25b3de78a089bb7197" Nov 29 05:47:01 crc kubenswrapper[4594]: I1129 05:47:01.872705 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"170ac117cdf78269bcb9a2c3c6edbba842e165a1dd5f2c25b3de78a089bb7197"} err="failed to get container status \"170ac117cdf78269bcb9a2c3c6edbba842e165a1dd5f2c25b3de78a089bb7197\": rpc error: code = NotFound desc = could not find container \"170ac117cdf78269bcb9a2c3c6edbba842e165a1dd5f2c25b3de78a089bb7197\": container with ID starting with 170ac117cdf78269bcb9a2c3c6edbba842e165a1dd5f2c25b3de78a089bb7197 not found: ID does not exist" Nov 29 05:47:01 crc kubenswrapper[4594]: I1129 05:47:01.872722 4594 scope.go:117] "RemoveContainer" containerID="cea93a435bee808728ae36a37d5a424e9120ad774bd2f297859047c2fb4c6aab" Nov 29 05:47:01 crc kubenswrapper[4594]: I1129 05:47:01.877097 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Nov 29 05:47:01 crc kubenswrapper[4594]: I1129 05:47:01.887692 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Nov 29 05:47:01 crc kubenswrapper[4594]: E1129 05:47:01.888177 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfb657a5-c2ba-46ab-a833-36ccea778f84" containerName="nova-metadata-log" Nov 29 05:47:01 crc kubenswrapper[4594]: I1129 05:47:01.888199 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfb657a5-c2ba-46ab-a833-36ccea778f84" containerName="nova-metadata-log" Nov 29 05:47:01 crc kubenswrapper[4594]: E1129 05:47:01.888214 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cf50aee-5c49-4d8b-9fd9-8b7afd376f41" containerName="init" Nov 29 05:47:01 
crc kubenswrapper[4594]: I1129 05:47:01.888220 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cf50aee-5c49-4d8b-9fd9-8b7afd376f41" containerName="init" Nov 29 05:47:01 crc kubenswrapper[4594]: E1129 05:47:01.888267 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cf50aee-5c49-4d8b-9fd9-8b7afd376f41" containerName="dnsmasq-dns" Nov 29 05:47:01 crc kubenswrapper[4594]: I1129 05:47:01.888274 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cf50aee-5c49-4d8b-9fd9-8b7afd376f41" containerName="dnsmasq-dns" Nov 29 05:47:01 crc kubenswrapper[4594]: E1129 05:47:01.888292 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3e5dae6-ceb8-4878-a37d-56f97d59d103" containerName="nova-manage" Nov 29 05:47:01 crc kubenswrapper[4594]: I1129 05:47:01.888297 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3e5dae6-ceb8-4878-a37d-56f97d59d103" containerName="nova-manage" Nov 29 05:47:01 crc kubenswrapper[4594]: E1129 05:47:01.888309 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfb657a5-c2ba-46ab-a833-36ccea778f84" containerName="nova-metadata-metadata" Nov 29 05:47:01 crc kubenswrapper[4594]: I1129 05:47:01.888316 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfb657a5-c2ba-46ab-a833-36ccea778f84" containerName="nova-metadata-metadata" Nov 29 05:47:01 crc kubenswrapper[4594]: I1129 05:47:01.888526 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="1cf50aee-5c49-4d8b-9fd9-8b7afd376f41" containerName="dnsmasq-dns" Nov 29 05:47:01 crc kubenswrapper[4594]: I1129 05:47:01.888545 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3e5dae6-ceb8-4878-a37d-56f97d59d103" containerName="nova-manage" Nov 29 05:47:01 crc kubenswrapper[4594]: I1129 05:47:01.888555 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfb657a5-c2ba-46ab-a833-36ccea778f84" containerName="nova-metadata-log" Nov 29 05:47:01 crc kubenswrapper[4594]: 
I1129 05:47:01.888574 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfb657a5-c2ba-46ab-a833-36ccea778f84" containerName="nova-metadata-metadata" Nov 29 05:47:01 crc kubenswrapper[4594]: I1129 05:47:01.889752 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 29 05:47:01 crc kubenswrapper[4594]: I1129 05:47:01.891847 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Nov 29 05:47:01 crc kubenswrapper[4594]: I1129 05:47:01.893117 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Nov 29 05:47:01 crc kubenswrapper[4594]: I1129 05:47:01.893884 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 29 05:47:01 crc kubenswrapper[4594]: I1129 05:47:01.905429 4594 scope.go:117] "RemoveContainer" containerID="bf327543ad1191bbc7eda5715f7fea88731c5b80c9cf3018bf92b993a15ad5c6" Nov 29 05:47:01 crc kubenswrapper[4594]: I1129 05:47:01.906728 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/241daf99-de92-4580-bec5-138d9356b784-logs\") pod \"nova-metadata-0\" (UID: \"241daf99-de92-4580-bec5-138d9356b784\") " pod="openstack/nova-metadata-0" Nov 29 05:47:01 crc kubenswrapper[4594]: I1129 05:47:01.906942 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/241daf99-de92-4580-bec5-138d9356b784-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"241daf99-de92-4580-bec5-138d9356b784\") " pod="openstack/nova-metadata-0" Nov 29 05:47:01 crc kubenswrapper[4594]: I1129 05:47:01.906984 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/241daf99-de92-4580-bec5-138d9356b784-config-data\") pod \"nova-metadata-0\" (UID: \"241daf99-de92-4580-bec5-138d9356b784\") " pod="openstack/nova-metadata-0" Nov 29 05:47:01 crc kubenswrapper[4594]: I1129 05:47:01.907028 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5msjf\" (UniqueName: \"kubernetes.io/projected/241daf99-de92-4580-bec5-138d9356b784-kube-api-access-5msjf\") pod \"nova-metadata-0\" (UID: \"241daf99-de92-4580-bec5-138d9356b784\") " pod="openstack/nova-metadata-0" Nov 29 05:47:01 crc kubenswrapper[4594]: I1129 05:47:01.907075 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/241daf99-de92-4580-bec5-138d9356b784-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"241daf99-de92-4580-bec5-138d9356b784\") " pod="openstack/nova-metadata-0" Nov 29 05:47:01 crc kubenswrapper[4594]: I1129 05:47:01.927707 4594 scope.go:117] "RemoveContainer" containerID="cea93a435bee808728ae36a37d5a424e9120ad774bd2f297859047c2fb4c6aab" Nov 29 05:47:01 crc kubenswrapper[4594]: E1129 05:47:01.928209 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cea93a435bee808728ae36a37d5a424e9120ad774bd2f297859047c2fb4c6aab\": container with ID starting with cea93a435bee808728ae36a37d5a424e9120ad774bd2f297859047c2fb4c6aab not found: ID does not exist" containerID="cea93a435bee808728ae36a37d5a424e9120ad774bd2f297859047c2fb4c6aab" Nov 29 05:47:01 crc kubenswrapper[4594]: I1129 05:47:01.928472 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cea93a435bee808728ae36a37d5a424e9120ad774bd2f297859047c2fb4c6aab"} err="failed to get container status \"cea93a435bee808728ae36a37d5a424e9120ad774bd2f297859047c2fb4c6aab\": rpc error: code = NotFound desc = could not 
find container \"cea93a435bee808728ae36a37d5a424e9120ad774bd2f297859047c2fb4c6aab\": container with ID starting with cea93a435bee808728ae36a37d5a424e9120ad774bd2f297859047c2fb4c6aab not found: ID does not exist" Nov 29 05:47:01 crc kubenswrapper[4594]: I1129 05:47:01.928559 4594 scope.go:117] "RemoveContainer" containerID="bf327543ad1191bbc7eda5715f7fea88731c5b80c9cf3018bf92b993a15ad5c6" Nov 29 05:47:01 crc kubenswrapper[4594]: E1129 05:47:01.929164 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf327543ad1191bbc7eda5715f7fea88731c5b80c9cf3018bf92b993a15ad5c6\": container with ID starting with bf327543ad1191bbc7eda5715f7fea88731c5b80c9cf3018bf92b993a15ad5c6 not found: ID does not exist" containerID="bf327543ad1191bbc7eda5715f7fea88731c5b80c9cf3018bf92b993a15ad5c6" Nov 29 05:47:01 crc kubenswrapper[4594]: I1129 05:47:01.929203 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf327543ad1191bbc7eda5715f7fea88731c5b80c9cf3018bf92b993a15ad5c6"} err="failed to get container status \"bf327543ad1191bbc7eda5715f7fea88731c5b80c9cf3018bf92b993a15ad5c6\": rpc error: code = NotFound desc = could not find container \"bf327543ad1191bbc7eda5715f7fea88731c5b80c9cf3018bf92b993a15ad5c6\": container with ID starting with bf327543ad1191bbc7eda5715f7fea88731c5b80c9cf3018bf92b993a15ad5c6 not found: ID does not exist" Nov 29 05:47:01 crc kubenswrapper[4594]: I1129 05:47:01.929227 4594 scope.go:117] "RemoveContainer" containerID="cea93a435bee808728ae36a37d5a424e9120ad774bd2f297859047c2fb4c6aab" Nov 29 05:47:01 crc kubenswrapper[4594]: I1129 05:47:01.929548 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cea93a435bee808728ae36a37d5a424e9120ad774bd2f297859047c2fb4c6aab"} err="failed to get container status \"cea93a435bee808728ae36a37d5a424e9120ad774bd2f297859047c2fb4c6aab\": rpc error: code = NotFound desc = 
could not find container \"cea93a435bee808728ae36a37d5a424e9120ad774bd2f297859047c2fb4c6aab\": container with ID starting with cea93a435bee808728ae36a37d5a424e9120ad774bd2f297859047c2fb4c6aab not found: ID does not exist" Nov 29 05:47:01 crc kubenswrapper[4594]: I1129 05:47:01.929630 4594 scope.go:117] "RemoveContainer" containerID="bf327543ad1191bbc7eda5715f7fea88731c5b80c9cf3018bf92b993a15ad5c6" Nov 29 05:47:01 crc kubenswrapper[4594]: I1129 05:47:01.930385 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf327543ad1191bbc7eda5715f7fea88731c5b80c9cf3018bf92b993a15ad5c6"} err="failed to get container status \"bf327543ad1191bbc7eda5715f7fea88731c5b80c9cf3018bf92b993a15ad5c6\": rpc error: code = NotFound desc = could not find container \"bf327543ad1191bbc7eda5715f7fea88731c5b80c9cf3018bf92b993a15ad5c6\": container with ID starting with bf327543ad1191bbc7eda5715f7fea88731c5b80c9cf3018bf92b993a15ad5c6 not found: ID does not exist" Nov 29 05:47:02 crc kubenswrapper[4594]: I1129 05:47:02.010266 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5msjf\" (UniqueName: \"kubernetes.io/projected/241daf99-de92-4580-bec5-138d9356b784-kube-api-access-5msjf\") pod \"nova-metadata-0\" (UID: \"241daf99-de92-4580-bec5-138d9356b784\") " pod="openstack/nova-metadata-0" Nov 29 05:47:02 crc kubenswrapper[4594]: I1129 05:47:02.010589 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/241daf99-de92-4580-bec5-138d9356b784-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"241daf99-de92-4580-bec5-138d9356b784\") " pod="openstack/nova-metadata-0" Nov 29 05:47:02 crc kubenswrapper[4594]: I1129 05:47:02.010699 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/241daf99-de92-4580-bec5-138d9356b784-logs\") pod 
\"nova-metadata-0\" (UID: \"241daf99-de92-4580-bec5-138d9356b784\") " pod="openstack/nova-metadata-0" Nov 29 05:47:02 crc kubenswrapper[4594]: I1129 05:47:02.010781 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/241daf99-de92-4580-bec5-138d9356b784-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"241daf99-de92-4580-bec5-138d9356b784\") " pod="openstack/nova-metadata-0" Nov 29 05:47:02 crc kubenswrapper[4594]: I1129 05:47:02.010821 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/241daf99-de92-4580-bec5-138d9356b784-config-data\") pod \"nova-metadata-0\" (UID: \"241daf99-de92-4580-bec5-138d9356b784\") " pod="openstack/nova-metadata-0" Nov 29 05:47:02 crc kubenswrapper[4594]: I1129 05:47:02.012779 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/241daf99-de92-4580-bec5-138d9356b784-logs\") pod \"nova-metadata-0\" (UID: \"241daf99-de92-4580-bec5-138d9356b784\") " pod="openstack/nova-metadata-0" Nov 29 05:47:02 crc kubenswrapper[4594]: I1129 05:47:02.017058 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/241daf99-de92-4580-bec5-138d9356b784-config-data\") pod \"nova-metadata-0\" (UID: \"241daf99-de92-4580-bec5-138d9356b784\") " pod="openstack/nova-metadata-0" Nov 29 05:47:02 crc kubenswrapper[4594]: I1129 05:47:02.018858 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/241daf99-de92-4580-bec5-138d9356b784-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"241daf99-de92-4580-bec5-138d9356b784\") " pod="openstack/nova-metadata-0" Nov 29 05:47:02 crc kubenswrapper[4594]: I1129 05:47:02.019443 4594 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/241daf99-de92-4580-bec5-138d9356b784-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"241daf99-de92-4580-bec5-138d9356b784\") " pod="openstack/nova-metadata-0" Nov 29 05:47:02 crc kubenswrapper[4594]: I1129 05:47:02.058786 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5msjf\" (UniqueName: \"kubernetes.io/projected/241daf99-de92-4580-bec5-138d9356b784-kube-api-access-5msjf\") pod \"nova-metadata-0\" (UID: \"241daf99-de92-4580-bec5-138d9356b784\") " pod="openstack/nova-metadata-0" Nov 29 05:47:02 crc kubenswrapper[4594]: I1129 05:47:02.097312 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1cf50aee-5c49-4d8b-9fd9-8b7afd376f41" path="/var/lib/kubelet/pods/1cf50aee-5c49-4d8b-9fd9-8b7afd376f41/volumes" Nov 29 05:47:02 crc kubenswrapper[4594]: I1129 05:47:02.098234 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfb657a5-c2ba-46ab-a833-36ccea778f84" path="/var/lib/kubelet/pods/cfb657a5-c2ba-46ab-a833-36ccea778f84/volumes" Nov 29 05:47:02 crc kubenswrapper[4594]: I1129 05:47:02.212926 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 29 05:47:02 crc kubenswrapper[4594]: I1129 05:47:02.671858 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 29 05:47:02 crc kubenswrapper[4594]: I1129 05:47:02.866303 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"241daf99-de92-4580-bec5-138d9356b784","Type":"ContainerStarted","Data":"3427377646fe8af6965886edf00e3197e65ce8fd41000e0d76ffb3f74835e053"} Nov 29 05:47:02 crc kubenswrapper[4594]: I1129 05:47:02.866484 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="fcdb07b4-07da-4f88-acb5-4e28b684cb22" containerName="nova-scheduler-scheduler" containerID="cri-o://aba935b28d75cc52ee12171165bc73ed35d1909899b2a71ec23d305ccf8ff520" gracePeriod=30 Nov 29 05:47:03 crc kubenswrapper[4594]: I1129 05:47:03.876544 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"241daf99-de92-4580-bec5-138d9356b784","Type":"ContainerStarted","Data":"9f48c153d0e592cdd9839d42d29322fa2a29306b09ab03122a334e58a2841e1f"} Nov 29 05:47:03 crc kubenswrapper[4594]: I1129 05:47:03.876854 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"241daf99-de92-4580-bec5-138d9356b784","Type":"ContainerStarted","Data":"e9f13975e833b1e2021391fb9c17d6e7a6d0ede6b1bbc0a86daeebe15b1087f4"} Nov 29 05:47:03 crc kubenswrapper[4594]: I1129 05:47:03.901134 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.901111809 podStartE2EDuration="2.901111809s" podCreationTimestamp="2025-11-29 05:47:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:47:03.894761745 +0000 UTC m=+1148.135270965" watchObservedRunningTime="2025-11-29 
05:47:03.901111809 +0000 UTC m=+1148.141621018" Nov 29 05:47:04 crc kubenswrapper[4594]: I1129 05:47:04.170053 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Nov 29 05:47:04 crc kubenswrapper[4594]: I1129 05:47:04.424623 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 29 05:47:04 crc kubenswrapper[4594]: I1129 05:47:04.573549 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/602dee9d-18e0-42f2-b23f-e74cc723e9ed-config-data\") pod \"602dee9d-18e0-42f2-b23f-e74cc723e9ed\" (UID: \"602dee9d-18e0-42f2-b23f-e74cc723e9ed\") " Nov 29 05:47:04 crc kubenswrapper[4594]: I1129 05:47:04.573846 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/602dee9d-18e0-42f2-b23f-e74cc723e9ed-logs\") pod \"602dee9d-18e0-42f2-b23f-e74cc723e9ed\" (UID: \"602dee9d-18e0-42f2-b23f-e74cc723e9ed\") " Nov 29 05:47:04 crc kubenswrapper[4594]: I1129 05:47:04.573910 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/602dee9d-18e0-42f2-b23f-e74cc723e9ed-combined-ca-bundle\") pod \"602dee9d-18e0-42f2-b23f-e74cc723e9ed\" (UID: \"602dee9d-18e0-42f2-b23f-e74cc723e9ed\") " Nov 29 05:47:04 crc kubenswrapper[4594]: I1129 05:47:04.574098 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfvtg\" (UniqueName: \"kubernetes.io/projected/602dee9d-18e0-42f2-b23f-e74cc723e9ed-kube-api-access-cfvtg\") pod \"602dee9d-18e0-42f2-b23f-e74cc723e9ed\" (UID: \"602dee9d-18e0-42f2-b23f-e74cc723e9ed\") " Nov 29 05:47:04 crc kubenswrapper[4594]: I1129 05:47:04.574353 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/602dee9d-18e0-42f2-b23f-e74cc723e9ed-logs" (OuterVolumeSpecName: "logs") pod "602dee9d-18e0-42f2-b23f-e74cc723e9ed" (UID: "602dee9d-18e0-42f2-b23f-e74cc723e9ed"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 05:47:04 crc kubenswrapper[4594]: I1129 05:47:04.574680 4594 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/602dee9d-18e0-42f2-b23f-e74cc723e9ed-logs\") on node \"crc\" DevicePath \"\"" Nov 29 05:47:04 crc kubenswrapper[4594]: I1129 05:47:04.579831 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/602dee9d-18e0-42f2-b23f-e74cc723e9ed-kube-api-access-cfvtg" (OuterVolumeSpecName: "kube-api-access-cfvtg") pod "602dee9d-18e0-42f2-b23f-e74cc723e9ed" (UID: "602dee9d-18e0-42f2-b23f-e74cc723e9ed"). InnerVolumeSpecName "kube-api-access-cfvtg". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:47:04 crc kubenswrapper[4594]: I1129 05:47:04.602868 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/602dee9d-18e0-42f2-b23f-e74cc723e9ed-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "602dee9d-18e0-42f2-b23f-e74cc723e9ed" (UID: "602dee9d-18e0-42f2-b23f-e74cc723e9ed"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:47:04 crc kubenswrapper[4594]: I1129 05:47:04.606567 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/602dee9d-18e0-42f2-b23f-e74cc723e9ed-config-data" (OuterVolumeSpecName: "config-data") pod "602dee9d-18e0-42f2-b23f-e74cc723e9ed" (UID: "602dee9d-18e0-42f2-b23f-e74cc723e9ed"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:47:04 crc kubenswrapper[4594]: I1129 05:47:04.645085 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 29 05:47:04 crc kubenswrapper[4594]: I1129 05:47:04.645345 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="9c233f96-4ed2-4e1f-b408-1b75092f366a" containerName="kube-state-metrics" containerID="cri-o://052d685992b51e74e61432ec2ec4b5534a4822ad4dfc532dacba3735a6fdb5e3" gracePeriod=30 Nov 29 05:47:04 crc kubenswrapper[4594]: I1129 05:47:04.677053 4594 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/602dee9d-18e0-42f2-b23f-e74cc723e9ed-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 05:47:04 crc kubenswrapper[4594]: I1129 05:47:04.677088 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfvtg\" (UniqueName: \"kubernetes.io/projected/602dee9d-18e0-42f2-b23f-e74cc723e9ed-kube-api-access-cfvtg\") on node \"crc\" DevicePath \"\"" Nov 29 05:47:04 crc kubenswrapper[4594]: I1129 05:47:04.677100 4594 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/602dee9d-18e0-42f2-b23f-e74cc723e9ed-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 05:47:04 crc kubenswrapper[4594]: I1129 05:47:04.885662 4594 generic.go:334] "Generic (PLEG): container finished" podID="602dee9d-18e0-42f2-b23f-e74cc723e9ed" containerID="a37b4498c59965f20f7818ac4d7defc36702a27dc00081b5a8549a86e3c067bb" exitCode=0 Nov 29 05:47:04 crc kubenswrapper[4594]: I1129 05:47:04.885723 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 29 05:47:04 crc kubenswrapper[4594]: I1129 05:47:04.885727 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"602dee9d-18e0-42f2-b23f-e74cc723e9ed","Type":"ContainerDied","Data":"a37b4498c59965f20f7818ac4d7defc36702a27dc00081b5a8549a86e3c067bb"} Nov 29 05:47:04 crc kubenswrapper[4594]: I1129 05:47:04.885789 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"602dee9d-18e0-42f2-b23f-e74cc723e9ed","Type":"ContainerDied","Data":"776caf6b0b2609b3e203dc35b6b40e098ac043443f5f2a03ddb800fde47b38a7"} Nov 29 05:47:04 crc kubenswrapper[4594]: I1129 05:47:04.885817 4594 scope.go:117] "RemoveContainer" containerID="a37b4498c59965f20f7818ac4d7defc36702a27dc00081b5a8549a86e3c067bb" Nov 29 05:47:04 crc kubenswrapper[4594]: I1129 05:47:04.888691 4594 generic.go:334] "Generic (PLEG): container finished" podID="9c233f96-4ed2-4e1f-b408-1b75092f366a" containerID="052d685992b51e74e61432ec2ec4b5534a4822ad4dfc532dacba3735a6fdb5e3" exitCode=2 Nov 29 05:47:04 crc kubenswrapper[4594]: I1129 05:47:04.888722 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"9c233f96-4ed2-4e1f-b408-1b75092f366a","Type":"ContainerDied","Data":"052d685992b51e74e61432ec2ec4b5534a4822ad4dfc532dacba3735a6fdb5e3"} Nov 29 05:47:04 crc kubenswrapper[4594]: I1129 05:47:04.913694 4594 scope.go:117] "RemoveContainer" containerID="cc22d717976dd8e182efc2f2d868ec5d33b54f2513be104be0c9a8008f68084c" Nov 29 05:47:04 crc kubenswrapper[4594]: I1129 05:47:04.939566 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 29 05:47:04 crc kubenswrapper[4594]: I1129 05:47:04.947575 4594 scope.go:117] "RemoveContainer" containerID="a37b4498c59965f20f7818ac4d7defc36702a27dc00081b5a8549a86e3c067bb" Nov 29 05:47:04 crc kubenswrapper[4594]: E1129 05:47:04.948071 4594 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"a37b4498c59965f20f7818ac4d7defc36702a27dc00081b5a8549a86e3c067bb\": container with ID starting with a37b4498c59965f20f7818ac4d7defc36702a27dc00081b5a8549a86e3c067bb not found: ID does not exist" containerID="a37b4498c59965f20f7818ac4d7defc36702a27dc00081b5a8549a86e3c067bb" Nov 29 05:47:04 crc kubenswrapper[4594]: I1129 05:47:04.948109 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a37b4498c59965f20f7818ac4d7defc36702a27dc00081b5a8549a86e3c067bb"} err="failed to get container status \"a37b4498c59965f20f7818ac4d7defc36702a27dc00081b5a8549a86e3c067bb\": rpc error: code = NotFound desc = could not find container \"a37b4498c59965f20f7818ac4d7defc36702a27dc00081b5a8549a86e3c067bb\": container with ID starting with a37b4498c59965f20f7818ac4d7defc36702a27dc00081b5a8549a86e3c067bb not found: ID does not exist" Nov 29 05:47:04 crc kubenswrapper[4594]: I1129 05:47:04.948136 4594 scope.go:117] "RemoveContainer" containerID="cc22d717976dd8e182efc2f2d868ec5d33b54f2513be104be0c9a8008f68084c" Nov 29 05:47:04 crc kubenswrapper[4594]: E1129 05:47:04.948587 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc22d717976dd8e182efc2f2d868ec5d33b54f2513be104be0c9a8008f68084c\": container with ID starting with cc22d717976dd8e182efc2f2d868ec5d33b54f2513be104be0c9a8008f68084c not found: ID does not exist" containerID="cc22d717976dd8e182efc2f2d868ec5d33b54f2513be104be0c9a8008f68084c" Nov 29 05:47:04 crc kubenswrapper[4594]: I1129 05:47:04.948620 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc22d717976dd8e182efc2f2d868ec5d33b54f2513be104be0c9a8008f68084c"} err="failed to get container status \"cc22d717976dd8e182efc2f2d868ec5d33b54f2513be104be0c9a8008f68084c\": rpc error: code = NotFound desc = could not find container 
\"cc22d717976dd8e182efc2f2d868ec5d33b54f2513be104be0c9a8008f68084c\": container with ID starting with cc22d717976dd8e182efc2f2d868ec5d33b54f2513be104be0c9a8008f68084c not found: ID does not exist" Nov 29 05:47:04 crc kubenswrapper[4594]: I1129 05:47:04.948644 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Nov 29 05:47:04 crc kubenswrapper[4594]: I1129 05:47:04.974642 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Nov 29 05:47:04 crc kubenswrapper[4594]: E1129 05:47:04.975421 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="602dee9d-18e0-42f2-b23f-e74cc723e9ed" containerName="nova-api-log" Nov 29 05:47:04 crc kubenswrapper[4594]: I1129 05:47:04.975434 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="602dee9d-18e0-42f2-b23f-e74cc723e9ed" containerName="nova-api-log" Nov 29 05:47:04 crc kubenswrapper[4594]: E1129 05:47:04.975461 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="602dee9d-18e0-42f2-b23f-e74cc723e9ed" containerName="nova-api-api" Nov 29 05:47:04 crc kubenswrapper[4594]: I1129 05:47:04.975467 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="602dee9d-18e0-42f2-b23f-e74cc723e9ed" containerName="nova-api-api" Nov 29 05:47:04 crc kubenswrapper[4594]: I1129 05:47:04.975701 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="602dee9d-18e0-42f2-b23f-e74cc723e9ed" containerName="nova-api-log" Nov 29 05:47:04 crc kubenswrapper[4594]: I1129 05:47:04.975711 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="602dee9d-18e0-42f2-b23f-e74cc723e9ed" containerName="nova-api-api" Nov 29 05:47:04 crc kubenswrapper[4594]: I1129 05:47:04.976880 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 29 05:47:04 crc kubenswrapper[4594]: I1129 05:47:04.979544 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Nov 29 05:47:04 crc kubenswrapper[4594]: I1129 05:47:04.989586 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 29 05:47:05 crc kubenswrapper[4594]: I1129 05:47:05.084819 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gsjn\" (UniqueName: \"kubernetes.io/projected/5e95c8b5-5c50-40fb-8e08-62364581c443-kube-api-access-2gsjn\") pod \"nova-api-0\" (UID: \"5e95c8b5-5c50-40fb-8e08-62364581c443\") " pod="openstack/nova-api-0" Nov 29 05:47:05 crc kubenswrapper[4594]: I1129 05:47:05.085031 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e95c8b5-5c50-40fb-8e08-62364581c443-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5e95c8b5-5c50-40fb-8e08-62364581c443\") " pod="openstack/nova-api-0" Nov 29 05:47:05 crc kubenswrapper[4594]: I1129 05:47:05.085070 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e95c8b5-5c50-40fb-8e08-62364581c443-logs\") pod \"nova-api-0\" (UID: \"5e95c8b5-5c50-40fb-8e08-62364581c443\") " pod="openstack/nova-api-0" Nov 29 05:47:05 crc kubenswrapper[4594]: I1129 05:47:05.085155 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e95c8b5-5c50-40fb-8e08-62364581c443-config-data\") pod \"nova-api-0\" (UID: \"5e95c8b5-5c50-40fb-8e08-62364581c443\") " pod="openstack/nova-api-0" Nov 29 05:47:05 crc kubenswrapper[4594]: E1129 05:47:05.089002 4594 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc 
= command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="aba935b28d75cc52ee12171165bc73ed35d1909899b2a71ec23d305ccf8ff520" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 29 05:47:05 crc kubenswrapper[4594]: E1129 05:47:05.090830 4594 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="aba935b28d75cc52ee12171165bc73ed35d1909899b2a71ec23d305ccf8ff520" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 29 05:47:05 crc kubenswrapper[4594]: E1129 05:47:05.092283 4594 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="aba935b28d75cc52ee12171165bc73ed35d1909899b2a71ec23d305ccf8ff520" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 29 05:47:05 crc kubenswrapper[4594]: E1129 05:47:05.092318 4594 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="fcdb07b4-07da-4f88-acb5-4e28b684cb22" containerName="nova-scheduler-scheduler" Nov 29 05:47:05 crc kubenswrapper[4594]: I1129 05:47:05.141749 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 29 05:47:05 crc kubenswrapper[4594]: I1129 05:47:05.188678 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e95c8b5-5c50-40fb-8e08-62364581c443-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5e95c8b5-5c50-40fb-8e08-62364581c443\") " pod="openstack/nova-api-0" Nov 29 05:47:05 crc kubenswrapper[4594]: I1129 05:47:05.188732 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e95c8b5-5c50-40fb-8e08-62364581c443-logs\") pod \"nova-api-0\" (UID: \"5e95c8b5-5c50-40fb-8e08-62364581c443\") " pod="openstack/nova-api-0" Nov 29 05:47:05 crc kubenswrapper[4594]: I1129 05:47:05.188848 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e95c8b5-5c50-40fb-8e08-62364581c443-config-data\") pod \"nova-api-0\" (UID: \"5e95c8b5-5c50-40fb-8e08-62364581c443\") " pod="openstack/nova-api-0" Nov 29 05:47:05 crc kubenswrapper[4594]: I1129 05:47:05.189092 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gsjn\" (UniqueName: \"kubernetes.io/projected/5e95c8b5-5c50-40fb-8e08-62364581c443-kube-api-access-2gsjn\") pod \"nova-api-0\" (UID: \"5e95c8b5-5c50-40fb-8e08-62364581c443\") " pod="openstack/nova-api-0" Nov 29 05:47:05 crc kubenswrapper[4594]: I1129 05:47:05.189907 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e95c8b5-5c50-40fb-8e08-62364581c443-logs\") pod \"nova-api-0\" (UID: \"5e95c8b5-5c50-40fb-8e08-62364581c443\") " pod="openstack/nova-api-0" Nov 29 05:47:05 crc kubenswrapper[4594]: I1129 05:47:05.193874 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5e95c8b5-5c50-40fb-8e08-62364581c443-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5e95c8b5-5c50-40fb-8e08-62364581c443\") " pod="openstack/nova-api-0" Nov 29 05:47:05 crc kubenswrapper[4594]: I1129 05:47:05.194388 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e95c8b5-5c50-40fb-8e08-62364581c443-config-data\") pod \"nova-api-0\" (UID: \"5e95c8b5-5c50-40fb-8e08-62364581c443\") " pod="openstack/nova-api-0" Nov 29 05:47:05 crc kubenswrapper[4594]: I1129 05:47:05.203678 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gsjn\" (UniqueName: \"kubernetes.io/projected/5e95c8b5-5c50-40fb-8e08-62364581c443-kube-api-access-2gsjn\") pod \"nova-api-0\" (UID: \"5e95c8b5-5c50-40fb-8e08-62364581c443\") " pod="openstack/nova-api-0" Nov 29 05:47:05 crc kubenswrapper[4594]: I1129 05:47:05.290647 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wvrb\" (UniqueName: \"kubernetes.io/projected/9c233f96-4ed2-4e1f-b408-1b75092f366a-kube-api-access-8wvrb\") pod \"9c233f96-4ed2-4e1f-b408-1b75092f366a\" (UID: \"9c233f96-4ed2-4e1f-b408-1b75092f366a\") " Nov 29 05:47:05 crc kubenswrapper[4594]: I1129 05:47:05.291949 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 29 05:47:05 crc kubenswrapper[4594]: I1129 05:47:05.294934 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c233f96-4ed2-4e1f-b408-1b75092f366a-kube-api-access-8wvrb" (OuterVolumeSpecName: "kube-api-access-8wvrb") pod "9c233f96-4ed2-4e1f-b408-1b75092f366a" (UID: "9c233f96-4ed2-4e1f-b408-1b75092f366a"). InnerVolumeSpecName "kube-api-access-8wvrb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:47:05 crc kubenswrapper[4594]: I1129 05:47:05.393960 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wvrb\" (UniqueName: \"kubernetes.io/projected/9c233f96-4ed2-4e1f-b408-1b75092f366a-kube-api-access-8wvrb\") on node \"crc\" DevicePath \"\"" Nov 29 05:47:06 crc kubenswrapper[4594]: I1129 05:47:05.657495 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 29 05:47:06 crc kubenswrapper[4594]: I1129 05:47:05.900626 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5e95c8b5-5c50-40fb-8e08-62364581c443","Type":"ContainerStarted","Data":"bbf24758c41d9f0a53285a8a6ce43ca911ea4e425d6c1af057cc06d6e0c9e486"} Nov 29 05:47:06 crc kubenswrapper[4594]: I1129 05:47:05.900841 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5e95c8b5-5c50-40fb-8e08-62364581c443","Type":"ContainerStarted","Data":"eb7aeaa97d61cca22e891792c3fe017fa40ff4423a8dccfbaf8614d58223e908"} Nov 29 05:47:06 crc kubenswrapper[4594]: I1129 05:47:05.903281 4594 generic.go:334] "Generic (PLEG): container finished" podID="fcdb07b4-07da-4f88-acb5-4e28b684cb22" containerID="aba935b28d75cc52ee12171165bc73ed35d1909899b2a71ec23d305ccf8ff520" exitCode=0 Nov 29 05:47:06 crc kubenswrapper[4594]: I1129 05:47:05.903323 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fcdb07b4-07da-4f88-acb5-4e28b684cb22","Type":"ContainerDied","Data":"aba935b28d75cc52ee12171165bc73ed35d1909899b2a71ec23d305ccf8ff520"} Nov 29 05:47:06 crc kubenswrapper[4594]: I1129 05:47:05.913592 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"9c233f96-4ed2-4e1f-b408-1b75092f366a","Type":"ContainerDied","Data":"48ab3e19937506faa5597efb437e916793575c2a0932d36d83f31c05ebfab741"} Nov 29 05:47:06 crc kubenswrapper[4594]: I1129 
05:47:05.913640 4594 scope.go:117] "RemoveContainer" containerID="052d685992b51e74e61432ec2ec4b5534a4822ad4dfc532dacba3735a6fdb5e3" Nov 29 05:47:06 crc kubenswrapper[4594]: I1129 05:47:05.913756 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 29 05:47:06 crc kubenswrapper[4594]: I1129 05:47:05.956483 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 29 05:47:06 crc kubenswrapper[4594]: I1129 05:47:05.970495 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 29 05:47:06 crc kubenswrapper[4594]: I1129 05:47:05.980487 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Nov 29 05:47:06 crc kubenswrapper[4594]: E1129 05:47:05.980930 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c233f96-4ed2-4e1f-b408-1b75092f366a" containerName="kube-state-metrics" Nov 29 05:47:06 crc kubenswrapper[4594]: I1129 05:47:05.980946 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c233f96-4ed2-4e1f-b408-1b75092f366a" containerName="kube-state-metrics" Nov 29 05:47:06 crc kubenswrapper[4594]: I1129 05:47:05.981153 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c233f96-4ed2-4e1f-b408-1b75092f366a" containerName="kube-state-metrics" Nov 29 05:47:06 crc kubenswrapper[4594]: I1129 05:47:05.981909 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 29 05:47:06 crc kubenswrapper[4594]: I1129 05:47:05.983226 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Nov 29 05:47:06 crc kubenswrapper[4594]: I1129 05:47:05.983871 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Nov 29 05:47:06 crc kubenswrapper[4594]: I1129 05:47:06.000119 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 29 05:47:06 crc kubenswrapper[4594]: I1129 05:47:06.105848 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="602dee9d-18e0-42f2-b23f-e74cc723e9ed" path="/var/lib/kubelet/pods/602dee9d-18e0-42f2-b23f-e74cc723e9ed/volumes" Nov 29 05:47:06 crc kubenswrapper[4594]: I1129 05:47:06.106499 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c233f96-4ed2-4e1f-b408-1b75092f366a" path="/var/lib/kubelet/pods/9c233f96-4ed2-4e1f-b408-1b75092f366a/volumes" Nov 29 05:47:06 crc kubenswrapper[4594]: I1129 05:47:06.114974 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkdmk\" (UniqueName: \"kubernetes.io/projected/6a4026a4-228e-4aa5-be23-c9b7e203c011-kube-api-access-hkdmk\") pod \"kube-state-metrics-0\" (UID: \"6a4026a4-228e-4aa5-be23-c9b7e203c011\") " pod="openstack/kube-state-metrics-0" Nov 29 05:47:06 crc kubenswrapper[4594]: I1129 05:47:06.115068 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/6a4026a4-228e-4aa5-be23-c9b7e203c011-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"6a4026a4-228e-4aa5-be23-c9b7e203c011\") " pod="openstack/kube-state-metrics-0" Nov 29 05:47:06 crc kubenswrapper[4594]: I1129 05:47:06.115213 4594 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a4026a4-228e-4aa5-be23-c9b7e203c011-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"6a4026a4-228e-4aa5-be23-c9b7e203c011\") " pod="openstack/kube-state-metrics-0" Nov 29 05:47:06 crc kubenswrapper[4594]: I1129 05:47:06.115280 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a4026a4-228e-4aa5-be23-c9b7e203c011-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"6a4026a4-228e-4aa5-be23-c9b7e203c011\") " pod="openstack/kube-state-metrics-0" Nov 29 05:47:06 crc kubenswrapper[4594]: I1129 05:47:06.219729 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a4026a4-228e-4aa5-be23-c9b7e203c011-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"6a4026a4-228e-4aa5-be23-c9b7e203c011\") " pod="openstack/kube-state-metrics-0" Nov 29 05:47:06 crc kubenswrapper[4594]: I1129 05:47:06.220481 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a4026a4-228e-4aa5-be23-c9b7e203c011-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"6a4026a4-228e-4aa5-be23-c9b7e203c011\") " pod="openstack/kube-state-metrics-0" Nov 29 05:47:06 crc kubenswrapper[4594]: I1129 05:47:06.220813 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkdmk\" (UniqueName: \"kubernetes.io/projected/6a4026a4-228e-4aa5-be23-c9b7e203c011-kube-api-access-hkdmk\") pod \"kube-state-metrics-0\" (UID: \"6a4026a4-228e-4aa5-be23-c9b7e203c011\") " pod="openstack/kube-state-metrics-0" Nov 29 05:47:06 crc kubenswrapper[4594]: I1129 05:47:06.220895 4594 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/6a4026a4-228e-4aa5-be23-c9b7e203c011-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"6a4026a4-228e-4aa5-be23-c9b7e203c011\") " pod="openstack/kube-state-metrics-0" Nov 29 05:47:06 crc kubenswrapper[4594]: I1129 05:47:06.225172 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a4026a4-228e-4aa5-be23-c9b7e203c011-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"6a4026a4-228e-4aa5-be23-c9b7e203c011\") " pod="openstack/kube-state-metrics-0" Nov 29 05:47:06 crc kubenswrapper[4594]: I1129 05:47:06.226934 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/6a4026a4-228e-4aa5-be23-c9b7e203c011-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"6a4026a4-228e-4aa5-be23-c9b7e203c011\") " pod="openstack/kube-state-metrics-0" Nov 29 05:47:06 crc kubenswrapper[4594]: I1129 05:47:06.229135 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a4026a4-228e-4aa5-be23-c9b7e203c011-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"6a4026a4-228e-4aa5-be23-c9b7e203c011\") " pod="openstack/kube-state-metrics-0" Nov 29 05:47:06 crc kubenswrapper[4594]: I1129 05:47:06.246692 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkdmk\" (UniqueName: \"kubernetes.io/projected/6a4026a4-228e-4aa5-be23-c9b7e203c011-kube-api-access-hkdmk\") pod \"kube-state-metrics-0\" (UID: \"6a4026a4-228e-4aa5-be23-c9b7e203c011\") " pod="openstack/kube-state-metrics-0" Nov 29 05:47:06 crc kubenswrapper[4594]: I1129 05:47:06.302223 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 29 05:47:06 crc kubenswrapper[4594]: I1129 05:47:06.316152 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 29 05:47:06 crc kubenswrapper[4594]: I1129 05:47:06.426284 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcdb07b4-07da-4f88-acb5-4e28b684cb22-config-data\") pod \"fcdb07b4-07da-4f88-acb5-4e28b684cb22\" (UID: \"fcdb07b4-07da-4f88-acb5-4e28b684cb22\") " Nov 29 05:47:06 crc kubenswrapper[4594]: I1129 05:47:06.426329 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mvx6t\" (UniqueName: \"kubernetes.io/projected/fcdb07b4-07da-4f88-acb5-4e28b684cb22-kube-api-access-mvx6t\") pod \"fcdb07b4-07da-4f88-acb5-4e28b684cb22\" (UID: \"fcdb07b4-07da-4f88-acb5-4e28b684cb22\") " Nov 29 05:47:06 crc kubenswrapper[4594]: I1129 05:47:06.426452 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcdb07b4-07da-4f88-acb5-4e28b684cb22-combined-ca-bundle\") pod \"fcdb07b4-07da-4f88-acb5-4e28b684cb22\" (UID: \"fcdb07b4-07da-4f88-acb5-4e28b684cb22\") " Nov 29 05:47:06 crc kubenswrapper[4594]: I1129 05:47:06.429664 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcdb07b4-07da-4f88-acb5-4e28b684cb22-kube-api-access-mvx6t" (OuterVolumeSpecName: "kube-api-access-mvx6t") pod "fcdb07b4-07da-4f88-acb5-4e28b684cb22" (UID: "fcdb07b4-07da-4f88-acb5-4e28b684cb22"). InnerVolumeSpecName "kube-api-access-mvx6t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:47:06 crc kubenswrapper[4594]: I1129 05:47:06.451508 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcdb07b4-07da-4f88-acb5-4e28b684cb22-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fcdb07b4-07da-4f88-acb5-4e28b684cb22" (UID: "fcdb07b4-07da-4f88-acb5-4e28b684cb22"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:47:06 crc kubenswrapper[4594]: I1129 05:47:06.459868 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcdb07b4-07da-4f88-acb5-4e28b684cb22-config-data" (OuterVolumeSpecName: "config-data") pod "fcdb07b4-07da-4f88-acb5-4e28b684cb22" (UID: "fcdb07b4-07da-4f88-acb5-4e28b684cb22"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:47:06 crc kubenswrapper[4594]: I1129 05:47:06.532789 4594 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcdb07b4-07da-4f88-acb5-4e28b684cb22-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 05:47:06 crc kubenswrapper[4594]: I1129 05:47:06.533090 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mvx6t\" (UniqueName: \"kubernetes.io/projected/fcdb07b4-07da-4f88-acb5-4e28b684cb22-kube-api-access-mvx6t\") on node \"crc\" DevicePath \"\"" Nov 29 05:47:06 crc kubenswrapper[4594]: I1129 05:47:06.533105 4594 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcdb07b4-07da-4f88-acb5-4e28b684cb22-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 05:47:06 crc kubenswrapper[4594]: I1129 05:47:06.646589 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 29 05:47:06 crc kubenswrapper[4594]: I1129 05:47:06.646997 4594 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/ceilometer-0" podUID="532aef10-6fe3-4820-b619-f94f07fa5d43" containerName="ceilometer-central-agent" containerID="cri-o://2ba957e4d359bd316048e313be1e1ab17a747f386fb91d77a2c9b962f262daf3" gracePeriod=30 Nov 29 05:47:06 crc kubenswrapper[4594]: I1129 05:47:06.647708 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="532aef10-6fe3-4820-b619-f94f07fa5d43" containerName="proxy-httpd" containerID="cri-o://9bc3f947b0549f72f61c69217bbb80101779f1d763878071f798b259e2c7b1a1" gracePeriod=30 Nov 29 05:47:06 crc kubenswrapper[4594]: I1129 05:47:06.647790 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="532aef10-6fe3-4820-b619-f94f07fa5d43" containerName="sg-core" containerID="cri-o://2c4814d9889fc0a1908ca851253ea5f09697f1552c79fd3bb12cd5b8980910ad" gracePeriod=30 Nov 29 05:47:06 crc kubenswrapper[4594]: I1129 05:47:06.647855 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="532aef10-6fe3-4820-b619-f94f07fa5d43" containerName="ceilometer-notification-agent" containerID="cri-o://f3e9b7024c68ce7e91300e5ffa7c5e6e23948505fdc56db5f6e3ce5d0348ede5" gracePeriod=30 Nov 29 05:47:06 crc kubenswrapper[4594]: I1129 05:47:06.718645 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 29 05:47:06 crc kubenswrapper[4594]: W1129 05:47:06.720692 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a4026a4_228e_4aa5_be23_c9b7e203c011.slice/crio-fe9553482ff1360584c2ce35e194ac1f974cbbaef737798afa92ef16ea0f2497 WatchSource:0}: Error finding container fe9553482ff1360584c2ce35e194ac1f974cbbaef737798afa92ef16ea0f2497: Status 404 returned error can't find the container with id fe9553482ff1360584c2ce35e194ac1f974cbbaef737798afa92ef16ea0f2497 Nov 29 
05:47:06 crc kubenswrapper[4594]: I1129 05:47:06.927505 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"6a4026a4-228e-4aa5-be23-c9b7e203c011","Type":"ContainerStarted","Data":"fe9553482ff1360584c2ce35e194ac1f974cbbaef737798afa92ef16ea0f2497"} Nov 29 05:47:06 crc kubenswrapper[4594]: I1129 05:47:06.932781 4594 generic.go:334] "Generic (PLEG): container finished" podID="532aef10-6fe3-4820-b619-f94f07fa5d43" containerID="9bc3f947b0549f72f61c69217bbb80101779f1d763878071f798b259e2c7b1a1" exitCode=0 Nov 29 05:47:06 crc kubenswrapper[4594]: I1129 05:47:06.932892 4594 generic.go:334] "Generic (PLEG): container finished" podID="532aef10-6fe3-4820-b619-f94f07fa5d43" containerID="2c4814d9889fc0a1908ca851253ea5f09697f1552c79fd3bb12cd5b8980910ad" exitCode=2 Nov 29 05:47:06 crc kubenswrapper[4594]: I1129 05:47:06.932970 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"532aef10-6fe3-4820-b619-f94f07fa5d43","Type":"ContainerDied","Data":"9bc3f947b0549f72f61c69217bbb80101779f1d763878071f798b259e2c7b1a1"} Nov 29 05:47:06 crc kubenswrapper[4594]: I1129 05:47:06.933115 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"532aef10-6fe3-4820-b619-f94f07fa5d43","Type":"ContainerDied","Data":"2c4814d9889fc0a1908ca851253ea5f09697f1552c79fd3bb12cd5b8980910ad"} Nov 29 05:47:06 crc kubenswrapper[4594]: I1129 05:47:06.935916 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5e95c8b5-5c50-40fb-8e08-62364581c443","Type":"ContainerStarted","Data":"129b708614fdc255084b4e1843ede61f086273b325f88b0f4de3146a7b3a15aa"} Nov 29 05:47:06 crc kubenswrapper[4594]: I1129 05:47:06.942027 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"fcdb07b4-07da-4f88-acb5-4e28b684cb22","Type":"ContainerDied","Data":"6bc8050008b83df69a7fd7ff5cb0f446e3f30d71ed58c59232cd53d0e75e1d5f"} Nov 29 05:47:06 crc kubenswrapper[4594]: I1129 05:47:06.942109 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 29 05:47:06 crc kubenswrapper[4594]: I1129 05:47:06.942136 4594 scope.go:117] "RemoveContainer" containerID="aba935b28d75cc52ee12171165bc73ed35d1909899b2a71ec23d305ccf8ff520" Nov 29 05:47:06 crc kubenswrapper[4594]: I1129 05:47:06.960679 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.960655655 podStartE2EDuration="2.960655655s" podCreationTimestamp="2025-11-29 05:47:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:47:06.956928284 +0000 UTC m=+1151.197437505" watchObservedRunningTime="2025-11-29 05:47:06.960655655 +0000 UTC m=+1151.201164875" Nov 29 05:47:06 crc kubenswrapper[4594]: I1129 05:47:06.980077 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 29 05:47:06 crc kubenswrapper[4594]: I1129 05:47:06.985656 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Nov 29 05:47:07 crc kubenswrapper[4594]: I1129 05:47:07.012395 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Nov 29 05:47:07 crc kubenswrapper[4594]: E1129 05:47:07.012961 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcdb07b4-07da-4f88-acb5-4e28b684cb22" containerName="nova-scheduler-scheduler" Nov 29 05:47:07 crc kubenswrapper[4594]: I1129 05:47:07.012981 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcdb07b4-07da-4f88-acb5-4e28b684cb22" containerName="nova-scheduler-scheduler" Nov 29 05:47:07 crc kubenswrapper[4594]: I1129 05:47:07.013174 4594 
memory_manager.go:354] "RemoveStaleState removing state" podUID="fcdb07b4-07da-4f88-acb5-4e28b684cb22" containerName="nova-scheduler-scheduler" Nov 29 05:47:07 crc kubenswrapper[4594]: I1129 05:47:07.014052 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 29 05:47:07 crc kubenswrapper[4594]: I1129 05:47:07.016793 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Nov 29 05:47:07 crc kubenswrapper[4594]: I1129 05:47:07.025716 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 29 05:47:07 crc kubenswrapper[4594]: I1129 05:47:07.045719 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e1f7ae3-c18c-4c09-aab4-d8f30450b730-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0e1f7ae3-c18c-4c09-aab4-d8f30450b730\") " pod="openstack/nova-scheduler-0" Nov 29 05:47:07 crc kubenswrapper[4594]: I1129 05:47:07.045824 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jmzk\" (UniqueName: \"kubernetes.io/projected/0e1f7ae3-c18c-4c09-aab4-d8f30450b730-kube-api-access-8jmzk\") pod \"nova-scheduler-0\" (UID: \"0e1f7ae3-c18c-4c09-aab4-d8f30450b730\") " pod="openstack/nova-scheduler-0" Nov 29 05:47:07 crc kubenswrapper[4594]: I1129 05:47:07.045970 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e1f7ae3-c18c-4c09-aab4-d8f30450b730-config-data\") pod \"nova-scheduler-0\" (UID: \"0e1f7ae3-c18c-4c09-aab4-d8f30450b730\") " pod="openstack/nova-scheduler-0" Nov 29 05:47:07 crc kubenswrapper[4594]: I1129 05:47:07.148689 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/0e1f7ae3-c18c-4c09-aab4-d8f30450b730-config-data\") pod \"nova-scheduler-0\" (UID: \"0e1f7ae3-c18c-4c09-aab4-d8f30450b730\") " pod="openstack/nova-scheduler-0" Nov 29 05:47:07 crc kubenswrapper[4594]: I1129 05:47:07.149058 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e1f7ae3-c18c-4c09-aab4-d8f30450b730-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0e1f7ae3-c18c-4c09-aab4-d8f30450b730\") " pod="openstack/nova-scheduler-0" Nov 29 05:47:07 crc kubenswrapper[4594]: I1129 05:47:07.149235 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jmzk\" (UniqueName: \"kubernetes.io/projected/0e1f7ae3-c18c-4c09-aab4-d8f30450b730-kube-api-access-8jmzk\") pod \"nova-scheduler-0\" (UID: \"0e1f7ae3-c18c-4c09-aab4-d8f30450b730\") " pod="openstack/nova-scheduler-0" Nov 29 05:47:07 crc kubenswrapper[4594]: I1129 05:47:07.153347 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e1f7ae3-c18c-4c09-aab4-d8f30450b730-config-data\") pod \"nova-scheduler-0\" (UID: \"0e1f7ae3-c18c-4c09-aab4-d8f30450b730\") " pod="openstack/nova-scheduler-0" Nov 29 05:47:07 crc kubenswrapper[4594]: I1129 05:47:07.153597 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e1f7ae3-c18c-4c09-aab4-d8f30450b730-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0e1f7ae3-c18c-4c09-aab4-d8f30450b730\") " pod="openstack/nova-scheduler-0" Nov 29 05:47:07 crc kubenswrapper[4594]: I1129 05:47:07.163901 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jmzk\" (UniqueName: \"kubernetes.io/projected/0e1f7ae3-c18c-4c09-aab4-d8f30450b730-kube-api-access-8jmzk\") pod \"nova-scheduler-0\" (UID: \"0e1f7ae3-c18c-4c09-aab4-d8f30450b730\") " 
pod="openstack/nova-scheduler-0" Nov 29 05:47:07 crc kubenswrapper[4594]: I1129 05:47:07.213735 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 29 05:47:07 crc kubenswrapper[4594]: I1129 05:47:07.213899 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 29 05:47:07 crc kubenswrapper[4594]: I1129 05:47:07.334994 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 29 05:47:07 crc kubenswrapper[4594]: I1129 05:47:07.776482 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 29 05:47:07 crc kubenswrapper[4594]: I1129 05:47:07.952050 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0e1f7ae3-c18c-4c09-aab4-d8f30450b730","Type":"ContainerStarted","Data":"10ca4ad4b70b3c49c150138aaf360fbe30110a2357b932311894d8989130e937"} Nov 29 05:47:07 crc kubenswrapper[4594]: I1129 05:47:07.952099 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0e1f7ae3-c18c-4c09-aab4-d8f30450b730","Type":"ContainerStarted","Data":"b8c7be565aba34d7f932e7eb25603adf96ade2deae8af18e88eaec06207caf0d"} Nov 29 05:47:07 crc kubenswrapper[4594]: I1129 05:47:07.955456 4594 generic.go:334] "Generic (PLEG): container finished" podID="532aef10-6fe3-4820-b619-f94f07fa5d43" containerID="2ba957e4d359bd316048e313be1e1ab17a747f386fb91d77a2c9b962f262daf3" exitCode=0 Nov 29 05:47:07 crc kubenswrapper[4594]: I1129 05:47:07.955535 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"532aef10-6fe3-4820-b619-f94f07fa5d43","Type":"ContainerDied","Data":"2ba957e4d359bd316048e313be1e1ab17a747f386fb91d77a2c9b962f262daf3"} Nov 29 05:47:07 crc kubenswrapper[4594]: I1129 05:47:07.958478 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" 
event={"ID":"6a4026a4-228e-4aa5-be23-c9b7e203c011","Type":"ContainerStarted","Data":"4cfdac721e89d65f1b54af296d4ed76c39d48bcff89dc27b7f93c16e5c00350f"} Nov 29 05:47:07 crc kubenswrapper[4594]: I1129 05:47:07.958634 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Nov 29 05:47:07 crc kubenswrapper[4594]: I1129 05:47:07.978415 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.978402529 podStartE2EDuration="1.978402529s" podCreationTimestamp="2025-11-29 05:47:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:47:07.965426895 +0000 UTC m=+1152.205936115" watchObservedRunningTime="2025-11-29 05:47:07.978402529 +0000 UTC m=+1152.218911749" Nov 29 05:47:07 crc kubenswrapper[4594]: I1129 05:47:07.986752 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.705611072 podStartE2EDuration="2.986730512s" podCreationTimestamp="2025-11-29 05:47:05 +0000 UTC" firstStartedPulling="2025-11-29 05:47:06.723509938 +0000 UTC m=+1150.964019158" lastFinishedPulling="2025-11-29 05:47:07.004629377 +0000 UTC m=+1151.245138598" observedRunningTime="2025-11-29 05:47:07.983039349 +0000 UTC m=+1152.223548569" watchObservedRunningTime="2025-11-29 05:47:07.986730512 +0000 UTC m=+1152.227239732" Nov 29 05:47:08 crc kubenswrapper[4594]: I1129 05:47:08.097681 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fcdb07b4-07da-4f88-acb5-4e28b684cb22" path="/var/lib/kubelet/pods/fcdb07b4-07da-4f88-acb5-4e28b684cb22/volumes" Nov 29 05:47:08 crc kubenswrapper[4594]: I1129 05:47:08.964673 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 29 05:47:08 crc kubenswrapper[4594]: I1129 05:47:08.970935 4594 generic.go:334] "Generic (PLEG): container finished" podID="532aef10-6fe3-4820-b619-f94f07fa5d43" containerID="f3e9b7024c68ce7e91300e5ffa7c5e6e23948505fdc56db5f6e3ce5d0348ede5" exitCode=0 Nov 29 05:47:08 crc kubenswrapper[4594]: I1129 05:47:08.971029 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 29 05:47:08 crc kubenswrapper[4594]: I1129 05:47:08.971043 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"532aef10-6fe3-4820-b619-f94f07fa5d43","Type":"ContainerDied","Data":"f3e9b7024c68ce7e91300e5ffa7c5e6e23948505fdc56db5f6e3ce5d0348ede5"} Nov 29 05:47:08 crc kubenswrapper[4594]: I1129 05:47:08.971111 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"532aef10-6fe3-4820-b619-f94f07fa5d43","Type":"ContainerDied","Data":"69514fbeccb5dfabbb6b33d200c55dde1fb615cb266eddbf23ccf5b42ccb1382"} Nov 29 05:47:08 crc kubenswrapper[4594]: I1129 05:47:08.971133 4594 scope.go:117] "RemoveContainer" containerID="9bc3f947b0549f72f61c69217bbb80101779f1d763878071f798b259e2c7b1a1" Nov 29 05:47:08 crc kubenswrapper[4594]: I1129 05:47:08.997644 4594 scope.go:117] "RemoveContainer" containerID="2c4814d9889fc0a1908ca851253ea5f09697f1552c79fd3bb12cd5b8980910ad" Nov 29 05:47:09 crc kubenswrapper[4594]: I1129 05:47:09.102626 4594 scope.go:117] "RemoveContainer" containerID="f3e9b7024c68ce7e91300e5ffa7c5e6e23948505fdc56db5f6e3ce5d0348ede5" Nov 29 05:47:09 crc kubenswrapper[4594]: I1129 05:47:09.103403 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/532aef10-6fe3-4820-b619-f94f07fa5d43-log-httpd\") pod \"532aef10-6fe3-4820-b619-f94f07fa5d43\" (UID: \"532aef10-6fe3-4820-b619-f94f07fa5d43\") " Nov 29 05:47:09 crc kubenswrapper[4594]: 
I1129 05:47:09.103472 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/532aef10-6fe3-4820-b619-f94f07fa5d43-sg-core-conf-yaml\") pod \"532aef10-6fe3-4820-b619-f94f07fa5d43\" (UID: \"532aef10-6fe3-4820-b619-f94f07fa5d43\") " Nov 29 05:47:09 crc kubenswrapper[4594]: I1129 05:47:09.103524 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/532aef10-6fe3-4820-b619-f94f07fa5d43-run-httpd\") pod \"532aef10-6fe3-4820-b619-f94f07fa5d43\" (UID: \"532aef10-6fe3-4820-b619-f94f07fa5d43\") " Nov 29 05:47:09 crc kubenswrapper[4594]: I1129 05:47:09.103596 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xrcj4\" (UniqueName: \"kubernetes.io/projected/532aef10-6fe3-4820-b619-f94f07fa5d43-kube-api-access-xrcj4\") pod \"532aef10-6fe3-4820-b619-f94f07fa5d43\" (UID: \"532aef10-6fe3-4820-b619-f94f07fa5d43\") " Nov 29 05:47:09 crc kubenswrapper[4594]: I1129 05:47:09.104491 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/532aef10-6fe3-4820-b619-f94f07fa5d43-config-data\") pod \"532aef10-6fe3-4820-b619-f94f07fa5d43\" (UID: \"532aef10-6fe3-4820-b619-f94f07fa5d43\") " Nov 29 05:47:09 crc kubenswrapper[4594]: I1129 05:47:09.104569 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/532aef10-6fe3-4820-b619-f94f07fa5d43-combined-ca-bundle\") pod \"532aef10-6fe3-4820-b619-f94f07fa5d43\" (UID: \"532aef10-6fe3-4820-b619-f94f07fa5d43\") " Nov 29 05:47:09 crc kubenswrapper[4594]: I1129 05:47:09.104665 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/532aef10-6fe3-4820-b619-f94f07fa5d43-scripts\") pod 
\"532aef10-6fe3-4820-b619-f94f07fa5d43\" (UID: \"532aef10-6fe3-4820-b619-f94f07fa5d43\") " Nov 29 05:47:09 crc kubenswrapper[4594]: I1129 05:47:09.107314 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/532aef10-6fe3-4820-b619-f94f07fa5d43-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "532aef10-6fe3-4820-b619-f94f07fa5d43" (UID: "532aef10-6fe3-4820-b619-f94f07fa5d43"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 05:47:09 crc kubenswrapper[4594]: I1129 05:47:09.119304 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/532aef10-6fe3-4820-b619-f94f07fa5d43-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "532aef10-6fe3-4820-b619-f94f07fa5d43" (UID: "532aef10-6fe3-4820-b619-f94f07fa5d43"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 05:47:09 crc kubenswrapper[4594]: I1129 05:47:09.138683 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/532aef10-6fe3-4820-b619-f94f07fa5d43-scripts" (OuterVolumeSpecName: "scripts") pod "532aef10-6fe3-4820-b619-f94f07fa5d43" (UID: "532aef10-6fe3-4820-b619-f94f07fa5d43"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:47:09 crc kubenswrapper[4594]: I1129 05:47:09.148626 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/532aef10-6fe3-4820-b619-f94f07fa5d43-kube-api-access-xrcj4" (OuterVolumeSpecName: "kube-api-access-xrcj4") pod "532aef10-6fe3-4820-b619-f94f07fa5d43" (UID: "532aef10-6fe3-4820-b619-f94f07fa5d43"). InnerVolumeSpecName "kube-api-access-xrcj4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:47:09 crc kubenswrapper[4594]: I1129 05:47:09.175451 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/532aef10-6fe3-4820-b619-f94f07fa5d43-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "532aef10-6fe3-4820-b619-f94f07fa5d43" (UID: "532aef10-6fe3-4820-b619-f94f07fa5d43"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:47:09 crc kubenswrapper[4594]: I1129 05:47:09.208771 4594 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/532aef10-6fe3-4820-b619-f94f07fa5d43-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 05:47:09 crc kubenswrapper[4594]: I1129 05:47:09.210830 4594 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/532aef10-6fe3-4820-b619-f94f07fa5d43-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 29 05:47:09 crc kubenswrapper[4594]: I1129 05:47:09.210916 4594 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/532aef10-6fe3-4820-b619-f94f07fa5d43-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 29 05:47:09 crc kubenswrapper[4594]: I1129 05:47:09.210971 4594 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/532aef10-6fe3-4820-b619-f94f07fa5d43-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 29 05:47:09 crc kubenswrapper[4594]: I1129 05:47:09.211034 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xrcj4\" (UniqueName: \"kubernetes.io/projected/532aef10-6fe3-4820-b619-f94f07fa5d43-kube-api-access-xrcj4\") on node \"crc\" DevicePath \"\"" Nov 29 05:47:09 crc kubenswrapper[4594]: I1129 05:47:09.246184 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/532aef10-6fe3-4820-b619-f94f07fa5d43-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "532aef10-6fe3-4820-b619-f94f07fa5d43" (UID: "532aef10-6fe3-4820-b619-f94f07fa5d43"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:47:09 crc kubenswrapper[4594]: I1129 05:47:09.265511 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/532aef10-6fe3-4820-b619-f94f07fa5d43-config-data" (OuterVolumeSpecName: "config-data") pod "532aef10-6fe3-4820-b619-f94f07fa5d43" (UID: "532aef10-6fe3-4820-b619-f94f07fa5d43"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:47:09 crc kubenswrapper[4594]: I1129 05:47:09.301866 4594 scope.go:117] "RemoveContainer" containerID="2ba957e4d359bd316048e313be1e1ab17a747f386fb91d77a2c9b962f262daf3" Nov 29 05:47:09 crc kubenswrapper[4594]: I1129 05:47:09.307659 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 29 05:47:09 crc kubenswrapper[4594]: I1129 05:47:09.315476 4594 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/532aef10-6fe3-4820-b619-f94f07fa5d43-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 05:47:09 crc kubenswrapper[4594]: I1129 05:47:09.315505 4594 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/532aef10-6fe3-4820-b619-f94f07fa5d43-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 05:47:09 crc kubenswrapper[4594]: I1129 05:47:09.327571 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 29 05:47:09 crc kubenswrapper[4594]: I1129 05:47:09.339564 4594 scope.go:117] "RemoveContainer" containerID="9bc3f947b0549f72f61c69217bbb80101779f1d763878071f798b259e2c7b1a1" Nov 29 05:47:09 crc kubenswrapper[4594]: E1129 05:47:09.340914 4594 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9bc3f947b0549f72f61c69217bbb80101779f1d763878071f798b259e2c7b1a1\": container with ID starting with 9bc3f947b0549f72f61c69217bbb80101779f1d763878071f798b259e2c7b1a1 not found: ID does not exist" containerID="9bc3f947b0549f72f61c69217bbb80101779f1d763878071f798b259e2c7b1a1" Nov 29 05:47:09 crc kubenswrapper[4594]: I1129 05:47:09.340990 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9bc3f947b0549f72f61c69217bbb80101779f1d763878071f798b259e2c7b1a1"} err="failed to get container status \"9bc3f947b0549f72f61c69217bbb80101779f1d763878071f798b259e2c7b1a1\": rpc error: code = NotFound desc = could not find container \"9bc3f947b0549f72f61c69217bbb80101779f1d763878071f798b259e2c7b1a1\": container with ID starting with 9bc3f947b0549f72f61c69217bbb80101779f1d763878071f798b259e2c7b1a1 not found: ID does not exist" Nov 29 05:47:09 crc kubenswrapper[4594]: I1129 05:47:09.341023 4594 scope.go:117] "RemoveContainer" containerID="2c4814d9889fc0a1908ca851253ea5f09697f1552c79fd3bb12cd5b8980910ad" Nov 29 05:47:09 crc kubenswrapper[4594]: E1129 05:47:09.341527 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c4814d9889fc0a1908ca851253ea5f09697f1552c79fd3bb12cd5b8980910ad\": container with ID starting with 2c4814d9889fc0a1908ca851253ea5f09697f1552c79fd3bb12cd5b8980910ad not found: ID does not exist" containerID="2c4814d9889fc0a1908ca851253ea5f09697f1552c79fd3bb12cd5b8980910ad" Nov 29 05:47:09 crc kubenswrapper[4594]: I1129 05:47:09.341550 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c4814d9889fc0a1908ca851253ea5f09697f1552c79fd3bb12cd5b8980910ad"} err="failed to get container status \"2c4814d9889fc0a1908ca851253ea5f09697f1552c79fd3bb12cd5b8980910ad\": rpc error: code = NotFound desc = could 
not find container \"2c4814d9889fc0a1908ca851253ea5f09697f1552c79fd3bb12cd5b8980910ad\": container with ID starting with 2c4814d9889fc0a1908ca851253ea5f09697f1552c79fd3bb12cd5b8980910ad not found: ID does not exist" Nov 29 05:47:09 crc kubenswrapper[4594]: I1129 05:47:09.341563 4594 scope.go:117] "RemoveContainer" containerID="f3e9b7024c68ce7e91300e5ffa7c5e6e23948505fdc56db5f6e3ce5d0348ede5" Nov 29 05:47:09 crc kubenswrapper[4594]: E1129 05:47:09.341970 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3e9b7024c68ce7e91300e5ffa7c5e6e23948505fdc56db5f6e3ce5d0348ede5\": container with ID starting with f3e9b7024c68ce7e91300e5ffa7c5e6e23948505fdc56db5f6e3ce5d0348ede5 not found: ID does not exist" containerID="f3e9b7024c68ce7e91300e5ffa7c5e6e23948505fdc56db5f6e3ce5d0348ede5" Nov 29 05:47:09 crc kubenswrapper[4594]: I1129 05:47:09.341999 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3e9b7024c68ce7e91300e5ffa7c5e6e23948505fdc56db5f6e3ce5d0348ede5"} err="failed to get container status \"f3e9b7024c68ce7e91300e5ffa7c5e6e23948505fdc56db5f6e3ce5d0348ede5\": rpc error: code = NotFound desc = could not find container \"f3e9b7024c68ce7e91300e5ffa7c5e6e23948505fdc56db5f6e3ce5d0348ede5\": container with ID starting with f3e9b7024c68ce7e91300e5ffa7c5e6e23948505fdc56db5f6e3ce5d0348ede5 not found: ID does not exist" Nov 29 05:47:09 crc kubenswrapper[4594]: I1129 05:47:09.342014 4594 scope.go:117] "RemoveContainer" containerID="2ba957e4d359bd316048e313be1e1ab17a747f386fb91d77a2c9b962f262daf3" Nov 29 05:47:09 crc kubenswrapper[4594]: E1129 05:47:09.342531 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ba957e4d359bd316048e313be1e1ab17a747f386fb91d77a2c9b962f262daf3\": container with ID starting with 2ba957e4d359bd316048e313be1e1ab17a747f386fb91d77a2c9b962f262daf3 not found: 
ID does not exist" containerID="2ba957e4d359bd316048e313be1e1ab17a747f386fb91d77a2c9b962f262daf3" Nov 29 05:47:09 crc kubenswrapper[4594]: I1129 05:47:09.342575 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ba957e4d359bd316048e313be1e1ab17a747f386fb91d77a2c9b962f262daf3"} err="failed to get container status \"2ba957e4d359bd316048e313be1e1ab17a747f386fb91d77a2c9b962f262daf3\": rpc error: code = NotFound desc = could not find container \"2ba957e4d359bd316048e313be1e1ab17a747f386fb91d77a2c9b962f262daf3\": container with ID starting with 2ba957e4d359bd316048e313be1e1ab17a747f386fb91d77a2c9b962f262daf3 not found: ID does not exist" Nov 29 05:47:09 crc kubenswrapper[4594]: I1129 05:47:09.344795 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 29 05:47:09 crc kubenswrapper[4594]: E1129 05:47:09.345340 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="532aef10-6fe3-4820-b619-f94f07fa5d43" containerName="proxy-httpd" Nov 29 05:47:09 crc kubenswrapper[4594]: I1129 05:47:09.345359 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="532aef10-6fe3-4820-b619-f94f07fa5d43" containerName="proxy-httpd" Nov 29 05:47:09 crc kubenswrapper[4594]: E1129 05:47:09.345377 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="532aef10-6fe3-4820-b619-f94f07fa5d43" containerName="ceilometer-notification-agent" Nov 29 05:47:09 crc kubenswrapper[4594]: I1129 05:47:09.345384 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="532aef10-6fe3-4820-b619-f94f07fa5d43" containerName="ceilometer-notification-agent" Nov 29 05:47:09 crc kubenswrapper[4594]: E1129 05:47:09.345411 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="532aef10-6fe3-4820-b619-f94f07fa5d43" containerName="ceilometer-central-agent" Nov 29 05:47:09 crc kubenswrapper[4594]: I1129 05:47:09.345417 4594 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="532aef10-6fe3-4820-b619-f94f07fa5d43" containerName="ceilometer-central-agent" Nov 29 05:47:09 crc kubenswrapper[4594]: E1129 05:47:09.345427 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="532aef10-6fe3-4820-b619-f94f07fa5d43" containerName="sg-core" Nov 29 05:47:09 crc kubenswrapper[4594]: I1129 05:47:09.345432 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="532aef10-6fe3-4820-b619-f94f07fa5d43" containerName="sg-core" Nov 29 05:47:09 crc kubenswrapper[4594]: I1129 05:47:09.345603 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="532aef10-6fe3-4820-b619-f94f07fa5d43" containerName="proxy-httpd" Nov 29 05:47:09 crc kubenswrapper[4594]: I1129 05:47:09.345616 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="532aef10-6fe3-4820-b619-f94f07fa5d43" containerName="ceilometer-central-agent" Nov 29 05:47:09 crc kubenswrapper[4594]: I1129 05:47:09.345629 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="532aef10-6fe3-4820-b619-f94f07fa5d43" containerName="ceilometer-notification-agent" Nov 29 05:47:09 crc kubenswrapper[4594]: I1129 05:47:09.345637 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="532aef10-6fe3-4820-b619-f94f07fa5d43" containerName="sg-core" Nov 29 05:47:09 crc kubenswrapper[4594]: I1129 05:47:09.347486 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 29 05:47:09 crc kubenswrapper[4594]: I1129 05:47:09.353060 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Nov 29 05:47:09 crc kubenswrapper[4594]: I1129 05:47:09.353165 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 29 05:47:09 crc kubenswrapper[4594]: I1129 05:47:09.353503 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 29 05:47:09 crc kubenswrapper[4594]: I1129 05:47:09.367160 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 29 05:47:09 crc kubenswrapper[4594]: I1129 05:47:09.519005 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e52f06c3-9f29-498b-8518-a31673a30616-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e52f06c3-9f29-498b-8518-a31673a30616\") " pod="openstack/ceilometer-0" Nov 29 05:47:09 crc kubenswrapper[4594]: I1129 05:47:09.519092 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e52f06c3-9f29-498b-8518-a31673a30616-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e52f06c3-9f29-498b-8518-a31673a30616\") " pod="openstack/ceilometer-0" Nov 29 05:47:09 crc kubenswrapper[4594]: I1129 05:47:09.519212 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4bx4\" (UniqueName: \"kubernetes.io/projected/e52f06c3-9f29-498b-8518-a31673a30616-kube-api-access-l4bx4\") pod \"ceilometer-0\" (UID: \"e52f06c3-9f29-498b-8518-a31673a30616\") " pod="openstack/ceilometer-0" Nov 29 05:47:09 crc kubenswrapper[4594]: I1129 05:47:09.519267 4594 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e52f06c3-9f29-498b-8518-a31673a30616-run-httpd\") pod \"ceilometer-0\" (UID: \"e52f06c3-9f29-498b-8518-a31673a30616\") " pod="openstack/ceilometer-0" Nov 29 05:47:09 crc kubenswrapper[4594]: I1129 05:47:09.519324 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e52f06c3-9f29-498b-8518-a31673a30616-config-data\") pod \"ceilometer-0\" (UID: \"e52f06c3-9f29-498b-8518-a31673a30616\") " pod="openstack/ceilometer-0" Nov 29 05:47:09 crc kubenswrapper[4594]: I1129 05:47:09.519410 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e52f06c3-9f29-498b-8518-a31673a30616-log-httpd\") pod \"ceilometer-0\" (UID: \"e52f06c3-9f29-498b-8518-a31673a30616\") " pod="openstack/ceilometer-0" Nov 29 05:47:09 crc kubenswrapper[4594]: I1129 05:47:09.519503 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e52f06c3-9f29-498b-8518-a31673a30616-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e52f06c3-9f29-498b-8518-a31673a30616\") " pod="openstack/ceilometer-0" Nov 29 05:47:09 crc kubenswrapper[4594]: I1129 05:47:09.519567 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e52f06c3-9f29-498b-8518-a31673a30616-scripts\") pod \"ceilometer-0\" (UID: \"e52f06c3-9f29-498b-8518-a31673a30616\") " pod="openstack/ceilometer-0" Nov 29 05:47:09 crc kubenswrapper[4594]: I1129 05:47:09.622070 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/e52f06c3-9f29-498b-8518-a31673a30616-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e52f06c3-9f29-498b-8518-a31673a30616\") " pod="openstack/ceilometer-0" Nov 29 05:47:09 crc kubenswrapper[4594]: I1129 05:47:09.622211 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e52f06c3-9f29-498b-8518-a31673a30616-scripts\") pod \"ceilometer-0\" (UID: \"e52f06c3-9f29-498b-8518-a31673a30616\") " pod="openstack/ceilometer-0" Nov 29 05:47:09 crc kubenswrapper[4594]: I1129 05:47:09.622296 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e52f06c3-9f29-498b-8518-a31673a30616-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e52f06c3-9f29-498b-8518-a31673a30616\") " pod="openstack/ceilometer-0" Nov 29 05:47:09 crc kubenswrapper[4594]: I1129 05:47:09.622352 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e52f06c3-9f29-498b-8518-a31673a30616-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e52f06c3-9f29-498b-8518-a31673a30616\") " pod="openstack/ceilometer-0" Nov 29 05:47:09 crc kubenswrapper[4594]: I1129 05:47:09.623225 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4bx4\" (UniqueName: \"kubernetes.io/projected/e52f06c3-9f29-498b-8518-a31673a30616-kube-api-access-l4bx4\") pod \"ceilometer-0\" (UID: \"e52f06c3-9f29-498b-8518-a31673a30616\") " pod="openstack/ceilometer-0" Nov 29 05:47:09 crc kubenswrapper[4594]: I1129 05:47:09.623394 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e52f06c3-9f29-498b-8518-a31673a30616-run-httpd\") pod \"ceilometer-0\" (UID: \"e52f06c3-9f29-498b-8518-a31673a30616\") " pod="openstack/ceilometer-0" Nov 29 05:47:09 crc 
kubenswrapper[4594]: I1129 05:47:09.623831 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e52f06c3-9f29-498b-8518-a31673a30616-run-httpd\") pod \"ceilometer-0\" (UID: \"e52f06c3-9f29-498b-8518-a31673a30616\") " pod="openstack/ceilometer-0" Nov 29 05:47:09 crc kubenswrapper[4594]: I1129 05:47:09.623920 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e52f06c3-9f29-498b-8518-a31673a30616-config-data\") pod \"ceilometer-0\" (UID: \"e52f06c3-9f29-498b-8518-a31673a30616\") " pod="openstack/ceilometer-0" Nov 29 05:47:09 crc kubenswrapper[4594]: I1129 05:47:09.624671 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e52f06c3-9f29-498b-8518-a31673a30616-log-httpd\") pod \"ceilometer-0\" (UID: \"e52f06c3-9f29-498b-8518-a31673a30616\") " pod="openstack/ceilometer-0" Nov 29 05:47:09 crc kubenswrapper[4594]: I1129 05:47:09.625276 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e52f06c3-9f29-498b-8518-a31673a30616-log-httpd\") pod \"ceilometer-0\" (UID: \"e52f06c3-9f29-498b-8518-a31673a30616\") " pod="openstack/ceilometer-0" Nov 29 05:47:09 crc kubenswrapper[4594]: I1129 05:47:09.628734 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e52f06c3-9f29-498b-8518-a31673a30616-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e52f06c3-9f29-498b-8518-a31673a30616\") " pod="openstack/ceilometer-0" Nov 29 05:47:09 crc kubenswrapper[4594]: I1129 05:47:09.628752 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e52f06c3-9f29-498b-8518-a31673a30616-scripts\") pod \"ceilometer-0\" (UID: 
\"e52f06c3-9f29-498b-8518-a31673a30616\") " pod="openstack/ceilometer-0" Nov 29 05:47:09 crc kubenswrapper[4594]: I1129 05:47:09.628853 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e52f06c3-9f29-498b-8518-a31673a30616-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e52f06c3-9f29-498b-8518-a31673a30616\") " pod="openstack/ceilometer-0" Nov 29 05:47:09 crc kubenswrapper[4594]: I1129 05:47:09.628887 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e52f06c3-9f29-498b-8518-a31673a30616-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e52f06c3-9f29-498b-8518-a31673a30616\") " pod="openstack/ceilometer-0" Nov 29 05:47:09 crc kubenswrapper[4594]: I1129 05:47:09.631121 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e52f06c3-9f29-498b-8518-a31673a30616-config-data\") pod \"ceilometer-0\" (UID: \"e52f06c3-9f29-498b-8518-a31673a30616\") " pod="openstack/ceilometer-0" Nov 29 05:47:09 crc kubenswrapper[4594]: I1129 05:47:09.641443 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4bx4\" (UniqueName: \"kubernetes.io/projected/e52f06c3-9f29-498b-8518-a31673a30616-kube-api-access-l4bx4\") pod \"ceilometer-0\" (UID: \"e52f06c3-9f29-498b-8518-a31673a30616\") " pod="openstack/ceilometer-0" Nov 29 05:47:09 crc kubenswrapper[4594]: I1129 05:47:09.671053 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 29 05:47:10 crc kubenswrapper[4594]: I1129 05:47:10.076028 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 29 05:47:10 crc kubenswrapper[4594]: I1129 05:47:10.094020 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="532aef10-6fe3-4820-b619-f94f07fa5d43" path="/var/lib/kubelet/pods/532aef10-6fe3-4820-b619-f94f07fa5d43/volumes" Nov 29 05:47:10 crc kubenswrapper[4594]: I1129 05:47:10.995942 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e52f06c3-9f29-498b-8518-a31673a30616","Type":"ContainerStarted","Data":"371aad41d52c16b9546dc46fb3af3658b95c7ae958c28c6b22f54b2deffa9bdd"} Nov 29 05:47:10 crc kubenswrapper[4594]: I1129 05:47:10.996213 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e52f06c3-9f29-498b-8518-a31673a30616","Type":"ContainerStarted","Data":"9e45d99d7bc4a19336e7ba450bf1bd15e78cd2ff6ce32d31d6cd78c7d1bfacb7"} Nov 29 05:47:12 crc kubenswrapper[4594]: I1129 05:47:12.006408 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e52f06c3-9f29-498b-8518-a31673a30616","Type":"ContainerStarted","Data":"404bc01342b6b39f78e99296b674aea0700199548bba3bde45cfcdc942c2626b"} Nov 29 05:47:12 crc kubenswrapper[4594]: I1129 05:47:12.213737 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Nov 29 05:47:12 crc kubenswrapper[4594]: I1129 05:47:12.214180 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Nov 29 05:47:12 crc kubenswrapper[4594]: I1129 05:47:12.335996 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Nov 29 05:47:13 crc kubenswrapper[4594]: I1129 05:47:13.020326 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"e52f06c3-9f29-498b-8518-a31673a30616","Type":"ContainerStarted","Data":"353ab464dfc32ff8f13bb887eb85f5b00b6dfdc3104efa73e6d52e3a9d110023"} Nov 29 05:47:13 crc kubenswrapper[4594]: I1129 05:47:13.229387 4594 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="241daf99-de92-4580-bec5-138d9356b784" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.211:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Nov 29 05:47:13 crc kubenswrapper[4594]: I1129 05:47:13.229411 4594 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="241daf99-de92-4580-bec5-138d9356b784" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.211:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Nov 29 05:47:15 crc kubenswrapper[4594]: I1129 05:47:15.046293 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e52f06c3-9f29-498b-8518-a31673a30616","Type":"ContainerStarted","Data":"56ad0e12e45f5db8c752a7e05c7bb41fa8620e05720406c30067c88dcdaca33d"} Nov 29 05:47:15 crc kubenswrapper[4594]: I1129 05:47:15.048466 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 29 05:47:15 crc kubenswrapper[4594]: I1129 05:47:15.071314 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.200298464 podStartE2EDuration="6.071300193s" podCreationTimestamp="2025-11-29 05:47:09 +0000 UTC" firstStartedPulling="2025-11-29 05:47:10.083814834 +0000 UTC m=+1154.324324054" lastFinishedPulling="2025-11-29 05:47:13.954816562 +0000 UTC m=+1158.195325783" observedRunningTime="2025-11-29 05:47:15.068231622 +0000 UTC m=+1159.308740841" watchObservedRunningTime="2025-11-29 05:47:15.071300193 +0000 UTC 
m=+1159.311809412" Nov 29 05:47:15 crc kubenswrapper[4594]: I1129 05:47:15.292583 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 29 05:47:15 crc kubenswrapper[4594]: I1129 05:47:15.292637 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 29 05:47:16 crc kubenswrapper[4594]: I1129 05:47:16.376446 4594 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="5e95c8b5-5c50-40fb-8e08-62364581c443" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.212:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 29 05:47:16 crc kubenswrapper[4594]: I1129 05:47:16.376562 4594 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="5e95c8b5-5c50-40fb-8e08-62364581c443" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.212:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 29 05:47:16 crc kubenswrapper[4594]: I1129 05:47:16.390079 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Nov 29 05:47:17 crc kubenswrapper[4594]: I1129 05:47:17.336080 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Nov 29 05:47:17 crc kubenswrapper[4594]: I1129 05:47:17.386320 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Nov 29 05:47:18 crc kubenswrapper[4594]: I1129 05:47:18.106097 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Nov 29 05:47:22 crc kubenswrapper[4594]: I1129 05:47:22.218072 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Nov 29 05:47:22 crc kubenswrapper[4594]: I1129 05:47:22.220717 4594 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Nov 29 05:47:22 crc kubenswrapper[4594]: I1129 05:47:22.227114 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Nov 29 05:47:23 crc kubenswrapper[4594]: I1129 05:47:23.132413 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Nov 29 05:47:25 crc kubenswrapper[4594]: I1129 05:47:25.064441 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 29 05:47:25 crc kubenswrapper[4594]: I1129 05:47:25.146732 4594 generic.go:334] "Generic (PLEG): container finished" podID="1df3200b-0773-4e3c-9110-fe893d7efc79" containerID="a73a37116984c4c20c166844a6cc203d06cbfda6802e806de7d667587d225a89" exitCode=137 Nov 29 05:47:25 crc kubenswrapper[4594]: I1129 05:47:25.146791 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 29 05:47:25 crc kubenswrapper[4594]: I1129 05:47:25.146799 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"1df3200b-0773-4e3c-9110-fe893d7efc79","Type":"ContainerDied","Data":"a73a37116984c4c20c166844a6cc203d06cbfda6802e806de7d667587d225a89"} Nov 29 05:47:25 crc kubenswrapper[4594]: I1129 05:47:25.146869 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"1df3200b-0773-4e3c-9110-fe893d7efc79","Type":"ContainerDied","Data":"dbe4dd50c6c19c8961061843021c0918c1e13e235f967ede8de61aa5d1726358"} Nov 29 05:47:25 crc kubenswrapper[4594]: I1129 05:47:25.146893 4594 scope.go:117] "RemoveContainer" containerID="a73a37116984c4c20c166844a6cc203d06cbfda6802e806de7d667587d225a89" Nov 29 05:47:25 crc kubenswrapper[4594]: I1129 05:47:25.164314 4594 scope.go:117] "RemoveContainer" containerID="a73a37116984c4c20c166844a6cc203d06cbfda6802e806de7d667587d225a89" Nov 29 05:47:25 crc kubenswrapper[4594]: E1129 05:47:25.164774 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a73a37116984c4c20c166844a6cc203d06cbfda6802e806de7d667587d225a89\": container with ID starting with a73a37116984c4c20c166844a6cc203d06cbfda6802e806de7d667587d225a89 not found: ID does not exist" containerID="a73a37116984c4c20c166844a6cc203d06cbfda6802e806de7d667587d225a89" Nov 29 05:47:25 crc kubenswrapper[4594]: I1129 05:47:25.164818 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a73a37116984c4c20c166844a6cc203d06cbfda6802e806de7d667587d225a89"} err="failed to get container status \"a73a37116984c4c20c166844a6cc203d06cbfda6802e806de7d667587d225a89\": rpc error: code = NotFound desc = could not find container \"a73a37116984c4c20c166844a6cc203d06cbfda6802e806de7d667587d225a89\": container with ID starting with 
a73a37116984c4c20c166844a6cc203d06cbfda6802e806de7d667587d225a89 not found: ID does not exist" Nov 29 05:47:25 crc kubenswrapper[4594]: I1129 05:47:25.166367 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1df3200b-0773-4e3c-9110-fe893d7efc79-config-data\") pod \"1df3200b-0773-4e3c-9110-fe893d7efc79\" (UID: \"1df3200b-0773-4e3c-9110-fe893d7efc79\") " Nov 29 05:47:25 crc kubenswrapper[4594]: I1129 05:47:25.166723 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngx5v\" (UniqueName: \"kubernetes.io/projected/1df3200b-0773-4e3c-9110-fe893d7efc79-kube-api-access-ngx5v\") pod \"1df3200b-0773-4e3c-9110-fe893d7efc79\" (UID: \"1df3200b-0773-4e3c-9110-fe893d7efc79\") " Nov 29 05:47:25 crc kubenswrapper[4594]: I1129 05:47:25.166995 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1df3200b-0773-4e3c-9110-fe893d7efc79-combined-ca-bundle\") pod \"1df3200b-0773-4e3c-9110-fe893d7efc79\" (UID: \"1df3200b-0773-4e3c-9110-fe893d7efc79\") " Nov 29 05:47:25 crc kubenswrapper[4594]: I1129 05:47:25.171458 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1df3200b-0773-4e3c-9110-fe893d7efc79-kube-api-access-ngx5v" (OuterVolumeSpecName: "kube-api-access-ngx5v") pod "1df3200b-0773-4e3c-9110-fe893d7efc79" (UID: "1df3200b-0773-4e3c-9110-fe893d7efc79"). InnerVolumeSpecName "kube-api-access-ngx5v". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:47:25 crc kubenswrapper[4594]: I1129 05:47:25.193456 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1df3200b-0773-4e3c-9110-fe893d7efc79-config-data" (OuterVolumeSpecName: "config-data") pod "1df3200b-0773-4e3c-9110-fe893d7efc79" (UID: "1df3200b-0773-4e3c-9110-fe893d7efc79"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:47:25 crc kubenswrapper[4594]: I1129 05:47:25.195109 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1df3200b-0773-4e3c-9110-fe893d7efc79-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1df3200b-0773-4e3c-9110-fe893d7efc79" (UID: "1df3200b-0773-4e3c-9110-fe893d7efc79"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:47:25 crc kubenswrapper[4594]: I1129 05:47:25.270897 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngx5v\" (UniqueName: \"kubernetes.io/projected/1df3200b-0773-4e3c-9110-fe893d7efc79-kube-api-access-ngx5v\") on node \"crc\" DevicePath \"\"" Nov 29 05:47:25 crc kubenswrapper[4594]: I1129 05:47:25.270926 4594 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1df3200b-0773-4e3c-9110-fe893d7efc79-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 05:47:25 crc kubenswrapper[4594]: I1129 05:47:25.270936 4594 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1df3200b-0773-4e3c-9110-fe893d7efc79-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 05:47:25 crc kubenswrapper[4594]: I1129 05:47:25.301576 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Nov 29 05:47:25 crc kubenswrapper[4594]: I1129 05:47:25.302103 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Nov 29 05:47:25 crc kubenswrapper[4594]: I1129 05:47:25.302639 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Nov 29 05:47:25 crc kubenswrapper[4594]: I1129 05:47:25.302703 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Nov 
29 05:47:25 crc kubenswrapper[4594]: I1129 05:47:25.307053 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Nov 29 05:47:25 crc kubenswrapper[4594]: I1129 05:47:25.309899 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Nov 29 05:47:25 crc kubenswrapper[4594]: I1129 05:47:25.472027 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-54599d8f7-sp4l2"] Nov 29 05:47:25 crc kubenswrapper[4594]: E1129 05:47:25.472753 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1df3200b-0773-4e3c-9110-fe893d7efc79" containerName="nova-cell1-novncproxy-novncproxy" Nov 29 05:47:25 crc kubenswrapper[4594]: I1129 05:47:25.472773 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="1df3200b-0773-4e3c-9110-fe893d7efc79" containerName="nova-cell1-novncproxy-novncproxy" Nov 29 05:47:25 crc kubenswrapper[4594]: I1129 05:47:25.472988 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="1df3200b-0773-4e3c-9110-fe893d7efc79" containerName="nova-cell1-novncproxy-novncproxy" Nov 29 05:47:25 crc kubenswrapper[4594]: I1129 05:47:25.474299 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-54599d8f7-sp4l2" Nov 29 05:47:25 crc kubenswrapper[4594]: I1129 05:47:25.497361 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54599d8f7-sp4l2"] Nov 29 05:47:25 crc kubenswrapper[4594]: I1129 05:47:25.521900 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 29 05:47:25 crc kubenswrapper[4594]: I1129 05:47:25.545704 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 29 05:47:25 crc kubenswrapper[4594]: I1129 05:47:25.555772 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 29 05:47:25 crc kubenswrapper[4594]: I1129 05:47:25.557134 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 29 05:47:25 crc kubenswrapper[4594]: I1129 05:47:25.559877 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Nov 29 05:47:25 crc kubenswrapper[4594]: I1129 05:47:25.560051 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Nov 29 05:47:25 crc kubenswrapper[4594]: I1129 05:47:25.560076 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Nov 29 05:47:25 crc kubenswrapper[4594]: I1129 05:47:25.583005 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0ad2733e-97bd-4b40-87e3-cd5b16a8479e-ovsdbserver-sb\") pod \"dnsmasq-dns-54599d8f7-sp4l2\" (UID: \"0ad2733e-97bd-4b40-87e3-cd5b16a8479e\") " pod="openstack/dnsmasq-dns-54599d8f7-sp4l2" Nov 29 05:47:25 crc kubenswrapper[4594]: I1129 05:47:25.583243 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0ad2733e-97bd-4b40-87e3-cd5b16a8479e-ovsdbserver-nb\") pod \"dnsmasq-dns-54599d8f7-sp4l2\" (UID: \"0ad2733e-97bd-4b40-87e3-cd5b16a8479e\") " pod="openstack/dnsmasq-dns-54599d8f7-sp4l2" Nov 29 05:47:25 crc kubenswrapper[4594]: I1129 05:47:25.583333 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0ad2733e-97bd-4b40-87e3-cd5b16a8479e-dns-swift-storage-0\") pod \"dnsmasq-dns-54599d8f7-sp4l2\" (UID: \"0ad2733e-97bd-4b40-87e3-cd5b16a8479e\") " pod="openstack/dnsmasq-dns-54599d8f7-sp4l2" Nov 29 05:47:25 crc kubenswrapper[4594]: I1129 05:47:25.583357 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ad2733e-97bd-4b40-87e3-cd5b16a8479e-config\") pod \"dnsmasq-dns-54599d8f7-sp4l2\" (UID: \"0ad2733e-97bd-4b40-87e3-cd5b16a8479e\") " pod="openstack/dnsmasq-dns-54599d8f7-sp4l2" Nov 29 05:47:25 crc kubenswrapper[4594]: I1129 05:47:25.583447 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ad2733e-97bd-4b40-87e3-cd5b16a8479e-dns-svc\") pod \"dnsmasq-dns-54599d8f7-sp4l2\" (UID: \"0ad2733e-97bd-4b40-87e3-cd5b16a8479e\") " pod="openstack/dnsmasq-dns-54599d8f7-sp4l2" Nov 29 05:47:25 crc kubenswrapper[4594]: I1129 05:47:25.583513 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bmpc\" (UniqueName: \"kubernetes.io/projected/0ad2733e-97bd-4b40-87e3-cd5b16a8479e-kube-api-access-8bmpc\") pod \"dnsmasq-dns-54599d8f7-sp4l2\" (UID: \"0ad2733e-97bd-4b40-87e3-cd5b16a8479e\") " pod="openstack/dnsmasq-dns-54599d8f7-sp4l2" Nov 29 05:47:25 crc kubenswrapper[4594]: I1129 05:47:25.586493 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/nova-cell1-novncproxy-0"] Nov 29 05:47:25 crc kubenswrapper[4594]: I1129 05:47:25.686670 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmmqr\" (UniqueName: \"kubernetes.io/projected/9cbd6039-37fe-4ad5-9149-441d6e5d1812-kube-api-access-vmmqr\") pod \"nova-cell1-novncproxy-0\" (UID: \"9cbd6039-37fe-4ad5-9149-441d6e5d1812\") " pod="openstack/nova-cell1-novncproxy-0" Nov 29 05:47:25 crc kubenswrapper[4594]: I1129 05:47:25.686853 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0ad2733e-97bd-4b40-87e3-cd5b16a8479e-ovsdbserver-nb\") pod \"dnsmasq-dns-54599d8f7-sp4l2\" (UID: \"0ad2733e-97bd-4b40-87e3-cd5b16a8479e\") " pod="openstack/dnsmasq-dns-54599d8f7-sp4l2" Nov 29 05:47:25 crc kubenswrapper[4594]: I1129 05:47:25.686890 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cbd6039-37fe-4ad5-9149-441d6e5d1812-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"9cbd6039-37fe-4ad5-9149-441d6e5d1812\") " pod="openstack/nova-cell1-novncproxy-0" Nov 29 05:47:25 crc kubenswrapper[4594]: I1129 05:47:25.686999 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0ad2733e-97bd-4b40-87e3-cd5b16a8479e-dns-swift-storage-0\") pod \"dnsmasq-dns-54599d8f7-sp4l2\" (UID: \"0ad2733e-97bd-4b40-87e3-cd5b16a8479e\") " pod="openstack/dnsmasq-dns-54599d8f7-sp4l2" Nov 29 05:47:25 crc kubenswrapper[4594]: I1129 05:47:25.687037 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ad2733e-97bd-4b40-87e3-cd5b16a8479e-config\") pod \"dnsmasq-dns-54599d8f7-sp4l2\" (UID: \"0ad2733e-97bd-4b40-87e3-cd5b16a8479e\") " 
pod="openstack/dnsmasq-dns-54599d8f7-sp4l2" Nov 29 05:47:25 crc kubenswrapper[4594]: I1129 05:47:25.687176 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ad2733e-97bd-4b40-87e3-cd5b16a8479e-dns-svc\") pod \"dnsmasq-dns-54599d8f7-sp4l2\" (UID: \"0ad2733e-97bd-4b40-87e3-cd5b16a8479e\") " pod="openstack/dnsmasq-dns-54599d8f7-sp4l2" Nov 29 05:47:25 crc kubenswrapper[4594]: I1129 05:47:25.687297 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bmpc\" (UniqueName: \"kubernetes.io/projected/0ad2733e-97bd-4b40-87e3-cd5b16a8479e-kube-api-access-8bmpc\") pod \"dnsmasq-dns-54599d8f7-sp4l2\" (UID: \"0ad2733e-97bd-4b40-87e3-cd5b16a8479e\") " pod="openstack/dnsmasq-dns-54599d8f7-sp4l2" Nov 29 05:47:25 crc kubenswrapper[4594]: I1129 05:47:25.687403 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cbd6039-37fe-4ad5-9149-441d6e5d1812-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"9cbd6039-37fe-4ad5-9149-441d6e5d1812\") " pod="openstack/nova-cell1-novncproxy-0" Nov 29 05:47:25 crc kubenswrapper[4594]: I1129 05:47:25.687450 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cbd6039-37fe-4ad5-9149-441d6e5d1812-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"9cbd6039-37fe-4ad5-9149-441d6e5d1812\") " pod="openstack/nova-cell1-novncproxy-0" Nov 29 05:47:25 crc kubenswrapper[4594]: I1129 05:47:25.687488 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0ad2733e-97bd-4b40-87e3-cd5b16a8479e-ovsdbserver-sb\") pod \"dnsmasq-dns-54599d8f7-sp4l2\" (UID: \"0ad2733e-97bd-4b40-87e3-cd5b16a8479e\") " pod="openstack/dnsmasq-dns-54599d8f7-sp4l2" 
Nov 29 05:47:25 crc kubenswrapper[4594]: I1129 05:47:25.687554 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cbd6039-37fe-4ad5-9149-441d6e5d1812-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"9cbd6039-37fe-4ad5-9149-441d6e5d1812\") " pod="openstack/nova-cell1-novncproxy-0" Nov 29 05:47:25 crc kubenswrapper[4594]: I1129 05:47:25.688134 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ad2733e-97bd-4b40-87e3-cd5b16a8479e-config\") pod \"dnsmasq-dns-54599d8f7-sp4l2\" (UID: \"0ad2733e-97bd-4b40-87e3-cd5b16a8479e\") " pod="openstack/dnsmasq-dns-54599d8f7-sp4l2" Nov 29 05:47:25 crc kubenswrapper[4594]: I1129 05:47:25.688161 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0ad2733e-97bd-4b40-87e3-cd5b16a8479e-dns-swift-storage-0\") pod \"dnsmasq-dns-54599d8f7-sp4l2\" (UID: \"0ad2733e-97bd-4b40-87e3-cd5b16a8479e\") " pod="openstack/dnsmasq-dns-54599d8f7-sp4l2" Nov 29 05:47:25 crc kubenswrapper[4594]: I1129 05:47:25.688549 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0ad2733e-97bd-4b40-87e3-cd5b16a8479e-ovsdbserver-nb\") pod \"dnsmasq-dns-54599d8f7-sp4l2\" (UID: \"0ad2733e-97bd-4b40-87e3-cd5b16a8479e\") " pod="openstack/dnsmasq-dns-54599d8f7-sp4l2" Nov 29 05:47:25 crc kubenswrapper[4594]: I1129 05:47:25.688629 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ad2733e-97bd-4b40-87e3-cd5b16a8479e-dns-svc\") pod \"dnsmasq-dns-54599d8f7-sp4l2\" (UID: \"0ad2733e-97bd-4b40-87e3-cd5b16a8479e\") " pod="openstack/dnsmasq-dns-54599d8f7-sp4l2" Nov 29 05:47:25 crc kubenswrapper[4594]: I1129 05:47:25.689031 4594 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0ad2733e-97bd-4b40-87e3-cd5b16a8479e-ovsdbserver-sb\") pod \"dnsmasq-dns-54599d8f7-sp4l2\" (UID: \"0ad2733e-97bd-4b40-87e3-cd5b16a8479e\") " pod="openstack/dnsmasq-dns-54599d8f7-sp4l2" Nov 29 05:47:25 crc kubenswrapper[4594]: I1129 05:47:25.703824 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bmpc\" (UniqueName: \"kubernetes.io/projected/0ad2733e-97bd-4b40-87e3-cd5b16a8479e-kube-api-access-8bmpc\") pod \"dnsmasq-dns-54599d8f7-sp4l2\" (UID: \"0ad2733e-97bd-4b40-87e3-cd5b16a8479e\") " pod="openstack/dnsmasq-dns-54599d8f7-sp4l2" Nov 29 05:47:25 crc kubenswrapper[4594]: I1129 05:47:25.790435 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cbd6039-37fe-4ad5-9149-441d6e5d1812-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"9cbd6039-37fe-4ad5-9149-441d6e5d1812\") " pod="openstack/nova-cell1-novncproxy-0" Nov 29 05:47:25 crc kubenswrapper[4594]: I1129 05:47:25.790545 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cbd6039-37fe-4ad5-9149-441d6e5d1812-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"9cbd6039-37fe-4ad5-9149-441d6e5d1812\") " pod="openstack/nova-cell1-novncproxy-0" Nov 29 05:47:25 crc kubenswrapper[4594]: I1129 05:47:25.790673 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cbd6039-37fe-4ad5-9149-441d6e5d1812-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"9cbd6039-37fe-4ad5-9149-441d6e5d1812\") " pod="openstack/nova-cell1-novncproxy-0" Nov 29 05:47:25 crc kubenswrapper[4594]: I1129 05:47:25.790875 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-vmmqr\" (UniqueName: \"kubernetes.io/projected/9cbd6039-37fe-4ad5-9149-441d6e5d1812-kube-api-access-vmmqr\") pod \"nova-cell1-novncproxy-0\" (UID: \"9cbd6039-37fe-4ad5-9149-441d6e5d1812\") " pod="openstack/nova-cell1-novncproxy-0" Nov 29 05:47:25 crc kubenswrapper[4594]: I1129 05:47:25.790983 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cbd6039-37fe-4ad5-9149-441d6e5d1812-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"9cbd6039-37fe-4ad5-9149-441d6e5d1812\") " pod="openstack/nova-cell1-novncproxy-0" Nov 29 05:47:25 crc kubenswrapper[4594]: I1129 05:47:25.795605 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cbd6039-37fe-4ad5-9149-441d6e5d1812-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"9cbd6039-37fe-4ad5-9149-441d6e5d1812\") " pod="openstack/nova-cell1-novncproxy-0" Nov 29 05:47:25 crc kubenswrapper[4594]: I1129 05:47:25.810914 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-54599d8f7-sp4l2" Nov 29 05:47:25 crc kubenswrapper[4594]: I1129 05:47:25.832752 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cbd6039-37fe-4ad5-9149-441d6e5d1812-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"9cbd6039-37fe-4ad5-9149-441d6e5d1812\") " pod="openstack/nova-cell1-novncproxy-0" Nov 29 05:47:25 crc kubenswrapper[4594]: I1129 05:47:25.833528 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cbd6039-37fe-4ad5-9149-441d6e5d1812-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"9cbd6039-37fe-4ad5-9149-441d6e5d1812\") " pod="openstack/nova-cell1-novncproxy-0" Nov 29 05:47:25 crc kubenswrapper[4594]: I1129 05:47:25.836106 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmmqr\" (UniqueName: \"kubernetes.io/projected/9cbd6039-37fe-4ad5-9149-441d6e5d1812-kube-api-access-vmmqr\") pod \"nova-cell1-novncproxy-0\" (UID: \"9cbd6039-37fe-4ad5-9149-441d6e5d1812\") " pod="openstack/nova-cell1-novncproxy-0" Nov 29 05:47:25 crc kubenswrapper[4594]: I1129 05:47:25.845095 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cbd6039-37fe-4ad5-9149-441d6e5d1812-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"9cbd6039-37fe-4ad5-9149-441d6e5d1812\") " pod="openstack/nova-cell1-novncproxy-0" Nov 29 05:47:25 crc kubenswrapper[4594]: I1129 05:47:25.883339 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 29 05:47:26 crc kubenswrapper[4594]: I1129 05:47:26.096930 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1df3200b-0773-4e3c-9110-fe893d7efc79" path="/var/lib/kubelet/pods/1df3200b-0773-4e3c-9110-fe893d7efc79/volumes" Nov 29 05:47:26 crc kubenswrapper[4594]: I1129 05:47:26.288508 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54599d8f7-sp4l2"] Nov 29 05:47:26 crc kubenswrapper[4594]: W1129 05:47:26.427953 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9cbd6039_37fe_4ad5_9149_441d6e5d1812.slice/crio-332cc1e66d529607ec49f9a039bbd47edc45899ab8f5f22f1c5e8be2a2bb04f6 WatchSource:0}: Error finding container 332cc1e66d529607ec49f9a039bbd47edc45899ab8f5f22f1c5e8be2a2bb04f6: Status 404 returned error can't find the container with id 332cc1e66d529607ec49f9a039bbd47edc45899ab8f5f22f1c5e8be2a2bb04f6 Nov 29 05:47:26 crc kubenswrapper[4594]: I1129 05:47:26.433311 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 29 05:47:27 crc kubenswrapper[4594]: I1129 05:47:27.184868 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"9cbd6039-37fe-4ad5-9149-441d6e5d1812","Type":"ContainerStarted","Data":"a38b420f85003d669240706324d1a2c78f48d7ed85bff90dcbb1040886a969c2"} Nov 29 05:47:27 crc kubenswrapper[4594]: I1129 05:47:27.185210 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"9cbd6039-37fe-4ad5-9149-441d6e5d1812","Type":"ContainerStarted","Data":"332cc1e66d529607ec49f9a039bbd47edc45899ab8f5f22f1c5e8be2a2bb04f6"} Nov 29 05:47:27 crc kubenswrapper[4594]: I1129 05:47:27.187047 4594 generic.go:334] "Generic (PLEG): container finished" podID="0ad2733e-97bd-4b40-87e3-cd5b16a8479e" 
containerID="5bd374b8d188d0e28ccb8782a8fa16b261c7e1ef10aef3406971f60dd6dedfab" exitCode=0 Nov 29 05:47:27 crc kubenswrapper[4594]: I1129 05:47:27.187141 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54599d8f7-sp4l2" event={"ID":"0ad2733e-97bd-4b40-87e3-cd5b16a8479e","Type":"ContainerDied","Data":"5bd374b8d188d0e28ccb8782a8fa16b261c7e1ef10aef3406971f60dd6dedfab"} Nov 29 05:47:27 crc kubenswrapper[4594]: I1129 05:47:27.187182 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54599d8f7-sp4l2" event={"ID":"0ad2733e-97bd-4b40-87e3-cd5b16a8479e","Type":"ContainerStarted","Data":"d5ef8ba38824cbdb5e58ac8c0fba1f50f1a893a56b9d3d7bd7129a858b752a21"} Nov 29 05:47:27 crc kubenswrapper[4594]: I1129 05:47:27.215144 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.215124842 podStartE2EDuration="2.215124842s" podCreationTimestamp="2025-11-29 05:47:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:47:27.213503743 +0000 UTC m=+1171.454012963" watchObservedRunningTime="2025-11-29 05:47:27.215124842 +0000 UTC m=+1171.455634062" Nov 29 05:47:27 crc kubenswrapper[4594]: I1129 05:47:27.561492 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 29 05:47:27 crc kubenswrapper[4594]: I1129 05:47:27.562209 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e52f06c3-9f29-498b-8518-a31673a30616" containerName="sg-core" containerID="cri-o://353ab464dfc32ff8f13bb887eb85f5b00b6dfdc3104efa73e6d52e3a9d110023" gracePeriod=30 Nov 29 05:47:27 crc kubenswrapper[4594]: I1129 05:47:27.562428 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e52f06c3-9f29-498b-8518-a31673a30616" 
containerName="proxy-httpd" containerID="cri-o://56ad0e12e45f5db8c752a7e05c7bb41fa8620e05720406c30067c88dcdaca33d" gracePeriod=30 Nov 29 05:47:27 crc kubenswrapper[4594]: I1129 05:47:27.562735 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e52f06c3-9f29-498b-8518-a31673a30616" containerName="ceilometer-notification-agent" containerID="cri-o://404bc01342b6b39f78e99296b674aea0700199548bba3bde45cfcdc942c2626b" gracePeriod=30 Nov 29 05:47:27 crc kubenswrapper[4594]: I1129 05:47:27.563093 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e52f06c3-9f29-498b-8518-a31673a30616" containerName="ceilometer-central-agent" containerID="cri-o://371aad41d52c16b9546dc46fb3af3658b95c7ae958c28c6b22f54b2deffa9bdd" gracePeriod=30 Nov 29 05:47:27 crc kubenswrapper[4594]: I1129 05:47:27.576496 4594 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="e52f06c3-9f29-498b-8518-a31673a30616" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.215:3000/\": EOF" Nov 29 05:47:27 crc kubenswrapper[4594]: I1129 05:47:27.724937 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 29 05:47:28 crc kubenswrapper[4594]: I1129 05:47:28.197618 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54599d8f7-sp4l2" event={"ID":"0ad2733e-97bd-4b40-87e3-cd5b16a8479e","Type":"ContainerStarted","Data":"a2df68ae0304176b1714506c9cb937a93d6aac56aa0ca3336147889e2aae241a"} Nov 29 05:47:28 crc kubenswrapper[4594]: I1129 05:47:28.198057 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-54599d8f7-sp4l2" Nov 29 05:47:28 crc kubenswrapper[4594]: I1129 05:47:28.200767 4594 generic.go:334] "Generic (PLEG): container finished" podID="e52f06c3-9f29-498b-8518-a31673a30616" 
containerID="56ad0e12e45f5db8c752a7e05c7bb41fa8620e05720406c30067c88dcdaca33d" exitCode=0 Nov 29 05:47:28 crc kubenswrapper[4594]: I1129 05:47:28.200805 4594 generic.go:334] "Generic (PLEG): container finished" podID="e52f06c3-9f29-498b-8518-a31673a30616" containerID="353ab464dfc32ff8f13bb887eb85f5b00b6dfdc3104efa73e6d52e3a9d110023" exitCode=2 Nov 29 05:47:28 crc kubenswrapper[4594]: I1129 05:47:28.200825 4594 generic.go:334] "Generic (PLEG): container finished" podID="e52f06c3-9f29-498b-8518-a31673a30616" containerID="371aad41d52c16b9546dc46fb3af3658b95c7ae958c28c6b22f54b2deffa9bdd" exitCode=0 Nov 29 05:47:28 crc kubenswrapper[4594]: I1129 05:47:28.201081 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e52f06c3-9f29-498b-8518-a31673a30616","Type":"ContainerDied","Data":"56ad0e12e45f5db8c752a7e05c7bb41fa8620e05720406c30067c88dcdaca33d"} Nov 29 05:47:28 crc kubenswrapper[4594]: I1129 05:47:28.201116 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e52f06c3-9f29-498b-8518-a31673a30616","Type":"ContainerDied","Data":"353ab464dfc32ff8f13bb887eb85f5b00b6dfdc3104efa73e6d52e3a9d110023"} Nov 29 05:47:28 crc kubenswrapper[4594]: I1129 05:47:28.201131 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e52f06c3-9f29-498b-8518-a31673a30616","Type":"ContainerDied","Data":"371aad41d52c16b9546dc46fb3af3658b95c7ae958c28c6b22f54b2deffa9bdd"} Nov 29 05:47:28 crc kubenswrapper[4594]: I1129 05:47:28.201317 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="5e95c8b5-5c50-40fb-8e08-62364581c443" containerName="nova-api-log" containerID="cri-o://bbf24758c41d9f0a53285a8a6ce43ca911ea4e425d6c1af057cc06d6e0c9e486" gracePeriod=30 Nov 29 05:47:28 crc kubenswrapper[4594]: I1129 05:47:28.201443 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" 
podUID="5e95c8b5-5c50-40fb-8e08-62364581c443" containerName="nova-api-api" containerID="cri-o://129b708614fdc255084b4e1843ede61f086273b325f88b0f4de3146a7b3a15aa" gracePeriod=30 Nov 29 05:47:28 crc kubenswrapper[4594]: I1129 05:47:28.230042 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-54599d8f7-sp4l2" podStartSLOduration=3.230010604 podStartE2EDuration="3.230010604s" podCreationTimestamp="2025-11-29 05:47:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:47:28.218123789 +0000 UTC m=+1172.458633009" watchObservedRunningTime="2025-11-29 05:47:28.230010604 +0000 UTC m=+1172.470519825" Nov 29 05:47:29 crc kubenswrapper[4594]: I1129 05:47:29.215780 4594 generic.go:334] "Generic (PLEG): container finished" podID="5e95c8b5-5c50-40fb-8e08-62364581c443" containerID="129b708614fdc255084b4e1843ede61f086273b325f88b0f4de3146a7b3a15aa" exitCode=0 Nov 29 05:47:29 crc kubenswrapper[4594]: I1129 05:47:29.216123 4594 generic.go:334] "Generic (PLEG): container finished" podID="5e95c8b5-5c50-40fb-8e08-62364581c443" containerID="bbf24758c41d9f0a53285a8a6ce43ca911ea4e425d6c1af057cc06d6e0c9e486" exitCode=143 Nov 29 05:47:29 crc kubenswrapper[4594]: I1129 05:47:29.215864 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5e95c8b5-5c50-40fb-8e08-62364581c443","Type":"ContainerDied","Data":"129b708614fdc255084b4e1843ede61f086273b325f88b0f4de3146a7b3a15aa"} Nov 29 05:47:29 crc kubenswrapper[4594]: I1129 05:47:29.216177 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5e95c8b5-5c50-40fb-8e08-62364581c443","Type":"ContainerDied","Data":"bbf24758c41d9f0a53285a8a6ce43ca911ea4e425d6c1af057cc06d6e0c9e486"} Nov 29 05:47:29 crc kubenswrapper[4594]: I1129 05:47:29.509187 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 29 05:47:29 crc kubenswrapper[4594]: I1129 05:47:29.588320 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e95c8b5-5c50-40fb-8e08-62364581c443-logs\") pod \"5e95c8b5-5c50-40fb-8e08-62364581c443\" (UID: \"5e95c8b5-5c50-40fb-8e08-62364581c443\") " Nov 29 05:47:29 crc kubenswrapper[4594]: I1129 05:47:29.588398 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2gsjn\" (UniqueName: \"kubernetes.io/projected/5e95c8b5-5c50-40fb-8e08-62364581c443-kube-api-access-2gsjn\") pod \"5e95c8b5-5c50-40fb-8e08-62364581c443\" (UID: \"5e95c8b5-5c50-40fb-8e08-62364581c443\") " Nov 29 05:47:29 crc kubenswrapper[4594]: I1129 05:47:29.588427 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e95c8b5-5c50-40fb-8e08-62364581c443-config-data\") pod \"5e95c8b5-5c50-40fb-8e08-62364581c443\" (UID: \"5e95c8b5-5c50-40fb-8e08-62364581c443\") " Nov 29 05:47:29 crc kubenswrapper[4594]: I1129 05:47:29.588557 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e95c8b5-5c50-40fb-8e08-62364581c443-combined-ca-bundle\") pod \"5e95c8b5-5c50-40fb-8e08-62364581c443\" (UID: \"5e95c8b5-5c50-40fb-8e08-62364581c443\") " Nov 29 05:47:29 crc kubenswrapper[4594]: I1129 05:47:29.590442 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e95c8b5-5c50-40fb-8e08-62364581c443-logs" (OuterVolumeSpecName: "logs") pod "5e95c8b5-5c50-40fb-8e08-62364581c443" (UID: "5e95c8b5-5c50-40fb-8e08-62364581c443"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 05:47:29 crc kubenswrapper[4594]: I1129 05:47:29.595660 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e95c8b5-5c50-40fb-8e08-62364581c443-kube-api-access-2gsjn" (OuterVolumeSpecName: "kube-api-access-2gsjn") pod "5e95c8b5-5c50-40fb-8e08-62364581c443" (UID: "5e95c8b5-5c50-40fb-8e08-62364581c443"). InnerVolumeSpecName "kube-api-access-2gsjn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:47:29 crc kubenswrapper[4594]: I1129 05:47:29.620099 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e95c8b5-5c50-40fb-8e08-62364581c443-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5e95c8b5-5c50-40fb-8e08-62364581c443" (UID: "5e95c8b5-5c50-40fb-8e08-62364581c443"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:47:29 crc kubenswrapper[4594]: I1129 05:47:29.623889 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e95c8b5-5c50-40fb-8e08-62364581c443-config-data" (OuterVolumeSpecName: "config-data") pod "5e95c8b5-5c50-40fb-8e08-62364581c443" (UID: "5e95c8b5-5c50-40fb-8e08-62364581c443"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:47:29 crc kubenswrapper[4594]: I1129 05:47:29.692781 4594 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e95c8b5-5c50-40fb-8e08-62364581c443-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 05:47:29 crc kubenswrapper[4594]: I1129 05:47:29.692834 4594 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e95c8b5-5c50-40fb-8e08-62364581c443-logs\") on node \"crc\" DevicePath \"\"" Nov 29 05:47:29 crc kubenswrapper[4594]: I1129 05:47:29.692848 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2gsjn\" (UniqueName: \"kubernetes.io/projected/5e95c8b5-5c50-40fb-8e08-62364581c443-kube-api-access-2gsjn\") on node \"crc\" DevicePath \"\"" Nov 29 05:47:29 crc kubenswrapper[4594]: I1129 05:47:29.692864 4594 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e95c8b5-5c50-40fb-8e08-62364581c443-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 05:47:30 crc kubenswrapper[4594]: I1129 05:47:30.224957 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5e95c8b5-5c50-40fb-8e08-62364581c443","Type":"ContainerDied","Data":"eb7aeaa97d61cca22e891792c3fe017fa40ff4423a8dccfbaf8614d58223e908"} Nov 29 05:47:30 crc kubenswrapper[4594]: I1129 05:47:30.225005 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 29 05:47:30 crc kubenswrapper[4594]: I1129 05:47:30.225288 4594 scope.go:117] "RemoveContainer" containerID="129b708614fdc255084b4e1843ede61f086273b325f88b0f4de3146a7b3a15aa" Nov 29 05:47:30 crc kubenswrapper[4594]: I1129 05:47:30.247368 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 29 05:47:30 crc kubenswrapper[4594]: I1129 05:47:30.260362 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Nov 29 05:47:30 crc kubenswrapper[4594]: I1129 05:47:30.267346 4594 scope.go:117] "RemoveContainer" containerID="bbf24758c41d9f0a53285a8a6ce43ca911ea4e425d6c1af057cc06d6e0c9e486" Nov 29 05:47:30 crc kubenswrapper[4594]: I1129 05:47:30.281754 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Nov 29 05:47:30 crc kubenswrapper[4594]: E1129 05:47:30.283431 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e95c8b5-5c50-40fb-8e08-62364581c443" containerName="nova-api-api" Nov 29 05:47:30 crc kubenswrapper[4594]: I1129 05:47:30.283451 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e95c8b5-5c50-40fb-8e08-62364581c443" containerName="nova-api-api" Nov 29 05:47:30 crc kubenswrapper[4594]: E1129 05:47:30.283502 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e95c8b5-5c50-40fb-8e08-62364581c443" containerName="nova-api-log" Nov 29 05:47:30 crc kubenswrapper[4594]: I1129 05:47:30.283509 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e95c8b5-5c50-40fb-8e08-62364581c443" containerName="nova-api-log" Nov 29 05:47:30 crc kubenswrapper[4594]: I1129 05:47:30.283718 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e95c8b5-5c50-40fb-8e08-62364581c443" containerName="nova-api-log" Nov 29 05:47:30 crc kubenswrapper[4594]: I1129 05:47:30.283741 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e95c8b5-5c50-40fb-8e08-62364581c443" 
containerName="nova-api-api" Nov 29 05:47:30 crc kubenswrapper[4594]: I1129 05:47:30.284914 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 29 05:47:30 crc kubenswrapper[4594]: I1129 05:47:30.287517 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Nov 29 05:47:30 crc kubenswrapper[4594]: I1129 05:47:30.287855 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Nov 29 05:47:30 crc kubenswrapper[4594]: I1129 05:47:30.288000 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Nov 29 05:47:30 crc kubenswrapper[4594]: I1129 05:47:30.311570 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 29 05:47:30 crc kubenswrapper[4594]: I1129 05:47:30.408349 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d02a9e40-3da0-4d17-8694-4f67b1bb50b6-config-data\") pod \"nova-api-0\" (UID: \"d02a9e40-3da0-4d17-8694-4f67b1bb50b6\") " pod="openstack/nova-api-0" Nov 29 05:47:30 crc kubenswrapper[4594]: I1129 05:47:30.408410 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d02a9e40-3da0-4d17-8694-4f67b1bb50b6-public-tls-certs\") pod \"nova-api-0\" (UID: \"d02a9e40-3da0-4d17-8694-4f67b1bb50b6\") " pod="openstack/nova-api-0" Nov 29 05:47:30 crc kubenswrapper[4594]: I1129 05:47:30.408698 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d02a9e40-3da0-4d17-8694-4f67b1bb50b6-internal-tls-certs\") pod \"nova-api-0\" (UID: \"d02a9e40-3da0-4d17-8694-4f67b1bb50b6\") " pod="openstack/nova-api-0" Nov 29 05:47:30 crc kubenswrapper[4594]: I1129 
05:47:30.408765 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d02a9e40-3da0-4d17-8694-4f67b1bb50b6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d02a9e40-3da0-4d17-8694-4f67b1bb50b6\") " pod="openstack/nova-api-0" Nov 29 05:47:30 crc kubenswrapper[4594]: I1129 05:47:30.408931 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d02a9e40-3da0-4d17-8694-4f67b1bb50b6-logs\") pod \"nova-api-0\" (UID: \"d02a9e40-3da0-4d17-8694-4f67b1bb50b6\") " pod="openstack/nova-api-0" Nov 29 05:47:30 crc kubenswrapper[4594]: I1129 05:47:30.409002 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m24fx\" (UniqueName: \"kubernetes.io/projected/d02a9e40-3da0-4d17-8694-4f67b1bb50b6-kube-api-access-m24fx\") pod \"nova-api-0\" (UID: \"d02a9e40-3da0-4d17-8694-4f67b1bb50b6\") " pod="openstack/nova-api-0" Nov 29 05:47:30 crc kubenswrapper[4594]: I1129 05:47:30.511271 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d02a9e40-3da0-4d17-8694-4f67b1bb50b6-internal-tls-certs\") pod \"nova-api-0\" (UID: \"d02a9e40-3da0-4d17-8694-4f67b1bb50b6\") " pod="openstack/nova-api-0" Nov 29 05:47:30 crc kubenswrapper[4594]: I1129 05:47:30.511334 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d02a9e40-3da0-4d17-8694-4f67b1bb50b6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d02a9e40-3da0-4d17-8694-4f67b1bb50b6\") " pod="openstack/nova-api-0" Nov 29 05:47:30 crc kubenswrapper[4594]: I1129 05:47:30.511385 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/d02a9e40-3da0-4d17-8694-4f67b1bb50b6-logs\") pod \"nova-api-0\" (UID: \"d02a9e40-3da0-4d17-8694-4f67b1bb50b6\") " pod="openstack/nova-api-0" Nov 29 05:47:30 crc kubenswrapper[4594]: I1129 05:47:30.511442 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m24fx\" (UniqueName: \"kubernetes.io/projected/d02a9e40-3da0-4d17-8694-4f67b1bb50b6-kube-api-access-m24fx\") pod \"nova-api-0\" (UID: \"d02a9e40-3da0-4d17-8694-4f67b1bb50b6\") " pod="openstack/nova-api-0" Nov 29 05:47:30 crc kubenswrapper[4594]: I1129 05:47:30.511940 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d02a9e40-3da0-4d17-8694-4f67b1bb50b6-logs\") pod \"nova-api-0\" (UID: \"d02a9e40-3da0-4d17-8694-4f67b1bb50b6\") " pod="openstack/nova-api-0" Nov 29 05:47:30 crc kubenswrapper[4594]: I1129 05:47:30.512036 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d02a9e40-3da0-4d17-8694-4f67b1bb50b6-config-data\") pod \"nova-api-0\" (UID: \"d02a9e40-3da0-4d17-8694-4f67b1bb50b6\") " pod="openstack/nova-api-0" Nov 29 05:47:30 crc kubenswrapper[4594]: I1129 05:47:30.512739 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d02a9e40-3da0-4d17-8694-4f67b1bb50b6-public-tls-certs\") pod \"nova-api-0\" (UID: \"d02a9e40-3da0-4d17-8694-4f67b1bb50b6\") " pod="openstack/nova-api-0" Nov 29 05:47:30 crc kubenswrapper[4594]: I1129 05:47:30.516807 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d02a9e40-3da0-4d17-8694-4f67b1bb50b6-internal-tls-certs\") pod \"nova-api-0\" (UID: \"d02a9e40-3da0-4d17-8694-4f67b1bb50b6\") " pod="openstack/nova-api-0" Nov 29 05:47:30 crc kubenswrapper[4594]: I1129 05:47:30.517730 4594 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d02a9e40-3da0-4d17-8694-4f67b1bb50b6-config-data\") pod \"nova-api-0\" (UID: \"d02a9e40-3da0-4d17-8694-4f67b1bb50b6\") " pod="openstack/nova-api-0" Nov 29 05:47:30 crc kubenswrapper[4594]: I1129 05:47:30.518312 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d02a9e40-3da0-4d17-8694-4f67b1bb50b6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d02a9e40-3da0-4d17-8694-4f67b1bb50b6\") " pod="openstack/nova-api-0" Nov 29 05:47:30 crc kubenswrapper[4594]: I1129 05:47:30.518643 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d02a9e40-3da0-4d17-8694-4f67b1bb50b6-public-tls-certs\") pod \"nova-api-0\" (UID: \"d02a9e40-3da0-4d17-8694-4f67b1bb50b6\") " pod="openstack/nova-api-0" Nov 29 05:47:30 crc kubenswrapper[4594]: I1129 05:47:30.538794 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m24fx\" (UniqueName: \"kubernetes.io/projected/d02a9e40-3da0-4d17-8694-4f67b1bb50b6-kube-api-access-m24fx\") pod \"nova-api-0\" (UID: \"d02a9e40-3da0-4d17-8694-4f67b1bb50b6\") " pod="openstack/nova-api-0" Nov 29 05:47:30 crc kubenswrapper[4594]: I1129 05:47:30.605547 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 29 05:47:30 crc kubenswrapper[4594]: I1129 05:47:30.867858 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 29 05:47:30 crc kubenswrapper[4594]: I1129 05:47:30.884405 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Nov 29 05:47:31 crc kubenswrapper[4594]: I1129 05:47:31.266347 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d02a9e40-3da0-4d17-8694-4f67b1bb50b6","Type":"ContainerStarted","Data":"09e82956123a0bdf123549891b98d61d5c628e8cbd806667e129f60b90cedcf3"} Nov 29 05:47:31 crc kubenswrapper[4594]: I1129 05:47:31.266881 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d02a9e40-3da0-4d17-8694-4f67b1bb50b6","Type":"ContainerStarted","Data":"a3f24e5456d2f677492f6d792212f1682e5d00bcff4ac9cc9dfe9b59ba6bfcfa"} Nov 29 05:47:31 crc kubenswrapper[4594]: I1129 05:47:31.267020 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d02a9e40-3da0-4d17-8694-4f67b1bb50b6","Type":"ContainerStarted","Data":"e51d1993b3c44d2be750f41405d59f5b5ff990c76c2f9636ccb0f78adc84c6c5"} Nov 29 05:47:31 crc kubenswrapper[4594]: I1129 05:47:31.293869 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=1.293851393 podStartE2EDuration="1.293851393s" podCreationTimestamp="2025-11-29 05:47:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:47:31.285857889 +0000 UTC m=+1175.526367109" watchObservedRunningTime="2025-11-29 05:47:31.293851393 +0000 UTC m=+1175.534360613" Nov 29 05:47:32 crc kubenswrapper[4594]: I1129 05:47:32.069834 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 29 05:47:32 crc kubenswrapper[4594]: I1129 05:47:32.095751 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e95c8b5-5c50-40fb-8e08-62364581c443" path="/var/lib/kubelet/pods/5e95c8b5-5c50-40fb-8e08-62364581c443/volumes" Nov 29 05:47:32 crc kubenswrapper[4594]: I1129 05:47:32.153306 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e52f06c3-9f29-498b-8518-a31673a30616-ceilometer-tls-certs\") pod \"e52f06c3-9f29-498b-8518-a31673a30616\" (UID: \"e52f06c3-9f29-498b-8518-a31673a30616\") " Nov 29 05:47:32 crc kubenswrapper[4594]: I1129 05:47:32.153396 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e52f06c3-9f29-498b-8518-a31673a30616-sg-core-conf-yaml\") pod \"e52f06c3-9f29-498b-8518-a31673a30616\" (UID: \"e52f06c3-9f29-498b-8518-a31673a30616\") " Nov 29 05:47:32 crc kubenswrapper[4594]: I1129 05:47:32.153548 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e52f06c3-9f29-498b-8518-a31673a30616-combined-ca-bundle\") pod \"e52f06c3-9f29-498b-8518-a31673a30616\" (UID: \"e52f06c3-9f29-498b-8518-a31673a30616\") " Nov 29 05:47:32 crc kubenswrapper[4594]: I1129 05:47:32.153761 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e52f06c3-9f29-498b-8518-a31673a30616-config-data\") pod \"e52f06c3-9f29-498b-8518-a31673a30616\" (UID: \"e52f06c3-9f29-498b-8518-a31673a30616\") " Nov 29 05:47:32 crc kubenswrapper[4594]: I1129 05:47:32.153802 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e52f06c3-9f29-498b-8518-a31673a30616-run-httpd\") pod 
\"e52f06c3-9f29-498b-8518-a31673a30616\" (UID: \"e52f06c3-9f29-498b-8518-a31673a30616\") " Nov 29 05:47:32 crc kubenswrapper[4594]: I1129 05:47:32.153898 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l4bx4\" (UniqueName: \"kubernetes.io/projected/e52f06c3-9f29-498b-8518-a31673a30616-kube-api-access-l4bx4\") pod \"e52f06c3-9f29-498b-8518-a31673a30616\" (UID: \"e52f06c3-9f29-498b-8518-a31673a30616\") " Nov 29 05:47:32 crc kubenswrapper[4594]: I1129 05:47:32.153941 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e52f06c3-9f29-498b-8518-a31673a30616-log-httpd\") pod \"e52f06c3-9f29-498b-8518-a31673a30616\" (UID: \"e52f06c3-9f29-498b-8518-a31673a30616\") " Nov 29 05:47:32 crc kubenswrapper[4594]: I1129 05:47:32.153972 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e52f06c3-9f29-498b-8518-a31673a30616-scripts\") pod \"e52f06c3-9f29-498b-8518-a31673a30616\" (UID: \"e52f06c3-9f29-498b-8518-a31673a30616\") " Nov 29 05:47:32 crc kubenswrapper[4594]: I1129 05:47:32.157917 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e52f06c3-9f29-498b-8518-a31673a30616-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e52f06c3-9f29-498b-8518-a31673a30616" (UID: "e52f06c3-9f29-498b-8518-a31673a30616"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 05:47:32 crc kubenswrapper[4594]: I1129 05:47:32.160020 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e52f06c3-9f29-498b-8518-a31673a30616-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e52f06c3-9f29-498b-8518-a31673a30616" (UID: "e52f06c3-9f29-498b-8518-a31673a30616"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 05:47:32 crc kubenswrapper[4594]: I1129 05:47:32.162461 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e52f06c3-9f29-498b-8518-a31673a30616-scripts" (OuterVolumeSpecName: "scripts") pod "e52f06c3-9f29-498b-8518-a31673a30616" (UID: "e52f06c3-9f29-498b-8518-a31673a30616"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:47:32 crc kubenswrapper[4594]: I1129 05:47:32.162728 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e52f06c3-9f29-498b-8518-a31673a30616-kube-api-access-l4bx4" (OuterVolumeSpecName: "kube-api-access-l4bx4") pod "e52f06c3-9f29-498b-8518-a31673a30616" (UID: "e52f06c3-9f29-498b-8518-a31673a30616"). InnerVolumeSpecName "kube-api-access-l4bx4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:47:32 crc kubenswrapper[4594]: I1129 05:47:32.195346 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e52f06c3-9f29-498b-8518-a31673a30616-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e52f06c3-9f29-498b-8518-a31673a30616" (UID: "e52f06c3-9f29-498b-8518-a31673a30616"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:47:32 crc kubenswrapper[4594]: I1129 05:47:32.214431 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e52f06c3-9f29-498b-8518-a31673a30616-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "e52f06c3-9f29-498b-8518-a31673a30616" (UID: "e52f06c3-9f29-498b-8518-a31673a30616"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:47:32 crc kubenswrapper[4594]: I1129 05:47:32.247607 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e52f06c3-9f29-498b-8518-a31673a30616-config-data" (OuterVolumeSpecName: "config-data") pod "e52f06c3-9f29-498b-8518-a31673a30616" (UID: "e52f06c3-9f29-498b-8518-a31673a30616"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:47:32 crc kubenswrapper[4594]: I1129 05:47:32.255549 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e52f06c3-9f29-498b-8518-a31673a30616-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e52f06c3-9f29-498b-8518-a31673a30616" (UID: "e52f06c3-9f29-498b-8518-a31673a30616"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:47:32 crc kubenswrapper[4594]: I1129 05:47:32.256651 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e52f06c3-9f29-498b-8518-a31673a30616-combined-ca-bundle\") pod \"e52f06c3-9f29-498b-8518-a31673a30616\" (UID: \"e52f06c3-9f29-498b-8518-a31673a30616\") " Nov 29 05:47:32 crc kubenswrapper[4594]: W1129 05:47:32.256847 4594 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/e52f06c3-9f29-498b-8518-a31673a30616/volumes/kubernetes.io~secret/combined-ca-bundle Nov 29 05:47:32 crc kubenswrapper[4594]: I1129 05:47:32.256935 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e52f06c3-9f29-498b-8518-a31673a30616-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e52f06c3-9f29-498b-8518-a31673a30616" (UID: "e52f06c3-9f29-498b-8518-a31673a30616"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:47:32 crc kubenswrapper[4594]: I1129 05:47:32.257601 4594 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e52f06c3-9f29-498b-8518-a31673a30616-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 29 05:47:32 crc kubenswrapper[4594]: I1129 05:47:32.257620 4594 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e52f06c3-9f29-498b-8518-a31673a30616-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 29 05:47:32 crc kubenswrapper[4594]: I1129 05:47:32.257631 4594 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e52f06c3-9f29-498b-8518-a31673a30616-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 05:47:32 crc kubenswrapper[4594]: I1129 05:47:32.257641 4594 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e52f06c3-9f29-498b-8518-a31673a30616-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 05:47:32 crc kubenswrapper[4594]: I1129 05:47:32.257650 4594 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e52f06c3-9f29-498b-8518-a31673a30616-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 29 05:47:32 crc kubenswrapper[4594]: I1129 05:47:32.257660 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l4bx4\" (UniqueName: \"kubernetes.io/projected/e52f06c3-9f29-498b-8518-a31673a30616-kube-api-access-l4bx4\") on node \"crc\" DevicePath \"\"" Nov 29 05:47:32 crc kubenswrapper[4594]: I1129 05:47:32.257673 4594 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e52f06c3-9f29-498b-8518-a31673a30616-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 29 05:47:32 crc kubenswrapper[4594]: I1129 05:47:32.257687 4594 
reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e52f06c3-9f29-498b-8518-a31673a30616-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 05:47:32 crc kubenswrapper[4594]: I1129 05:47:32.280556 4594 generic.go:334] "Generic (PLEG): container finished" podID="e52f06c3-9f29-498b-8518-a31673a30616" containerID="404bc01342b6b39f78e99296b674aea0700199548bba3bde45cfcdc942c2626b" exitCode=0 Nov 29 05:47:32 crc kubenswrapper[4594]: I1129 05:47:32.280652 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e52f06c3-9f29-498b-8518-a31673a30616","Type":"ContainerDied","Data":"404bc01342b6b39f78e99296b674aea0700199548bba3bde45cfcdc942c2626b"} Nov 29 05:47:32 crc kubenswrapper[4594]: I1129 05:47:32.280706 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e52f06c3-9f29-498b-8518-a31673a30616","Type":"ContainerDied","Data":"9e45d99d7bc4a19336e7ba450bf1bd15e78cd2ff6ce32d31d6cd78c7d1bfacb7"} Nov 29 05:47:32 crc kubenswrapper[4594]: I1129 05:47:32.280731 4594 scope.go:117] "RemoveContainer" containerID="56ad0e12e45f5db8c752a7e05c7bb41fa8620e05720406c30067c88dcdaca33d" Nov 29 05:47:32 crc kubenswrapper[4594]: I1129 05:47:32.281628 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 29 05:47:32 crc kubenswrapper[4594]: I1129 05:47:32.302792 4594 scope.go:117] "RemoveContainer" containerID="353ab464dfc32ff8f13bb887eb85f5b00b6dfdc3104efa73e6d52e3a9d110023" Nov 29 05:47:32 crc kubenswrapper[4594]: I1129 05:47:32.323662 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 29 05:47:32 crc kubenswrapper[4594]: I1129 05:47:32.327994 4594 scope.go:117] "RemoveContainer" containerID="404bc01342b6b39f78e99296b674aea0700199548bba3bde45cfcdc942c2626b" Nov 29 05:47:32 crc kubenswrapper[4594]: I1129 05:47:32.343582 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 29 05:47:32 crc kubenswrapper[4594]: I1129 05:47:32.356313 4594 scope.go:117] "RemoveContainer" containerID="371aad41d52c16b9546dc46fb3af3658b95c7ae958c28c6b22f54b2deffa9bdd" Nov 29 05:47:32 crc kubenswrapper[4594]: I1129 05:47:32.361555 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 29 05:47:32 crc kubenswrapper[4594]: E1129 05:47:32.362000 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e52f06c3-9f29-498b-8518-a31673a30616" containerName="sg-core" Nov 29 05:47:32 crc kubenswrapper[4594]: I1129 05:47:32.362019 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="e52f06c3-9f29-498b-8518-a31673a30616" containerName="sg-core" Nov 29 05:47:32 crc kubenswrapper[4594]: E1129 05:47:32.362034 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e52f06c3-9f29-498b-8518-a31673a30616" containerName="ceilometer-central-agent" Nov 29 05:47:32 crc kubenswrapper[4594]: I1129 05:47:32.362041 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="e52f06c3-9f29-498b-8518-a31673a30616" containerName="ceilometer-central-agent" Nov 29 05:47:32 crc kubenswrapper[4594]: E1129 05:47:32.362050 4594 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e52f06c3-9f29-498b-8518-a31673a30616" containerName="ceilometer-notification-agent" Nov 29 05:47:32 crc kubenswrapper[4594]: I1129 05:47:32.362055 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="e52f06c3-9f29-498b-8518-a31673a30616" containerName="ceilometer-notification-agent" Nov 29 05:47:32 crc kubenswrapper[4594]: E1129 05:47:32.362083 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e52f06c3-9f29-498b-8518-a31673a30616" containerName="proxy-httpd" Nov 29 05:47:32 crc kubenswrapper[4594]: I1129 05:47:32.362089 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="e52f06c3-9f29-498b-8518-a31673a30616" containerName="proxy-httpd" Nov 29 05:47:32 crc kubenswrapper[4594]: I1129 05:47:32.362268 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="e52f06c3-9f29-498b-8518-a31673a30616" containerName="ceilometer-notification-agent" Nov 29 05:47:32 crc kubenswrapper[4594]: I1129 05:47:32.362283 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="e52f06c3-9f29-498b-8518-a31673a30616" containerName="ceilometer-central-agent" Nov 29 05:47:32 crc kubenswrapper[4594]: I1129 05:47:32.362295 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="e52f06c3-9f29-498b-8518-a31673a30616" containerName="proxy-httpd" Nov 29 05:47:32 crc kubenswrapper[4594]: I1129 05:47:32.362303 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="e52f06c3-9f29-498b-8518-a31673a30616" containerName="sg-core" Nov 29 05:47:32 crc kubenswrapper[4594]: I1129 05:47:32.364057 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 29 05:47:32 crc kubenswrapper[4594]: I1129 05:47:32.370195 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 29 05:47:32 crc kubenswrapper[4594]: I1129 05:47:32.370625 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 29 05:47:32 crc kubenswrapper[4594]: I1129 05:47:32.370762 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Nov 29 05:47:32 crc kubenswrapper[4594]: I1129 05:47:32.390279 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 29 05:47:32 crc kubenswrapper[4594]: I1129 05:47:32.409550 4594 scope.go:117] "RemoveContainer" containerID="56ad0e12e45f5db8c752a7e05c7bb41fa8620e05720406c30067c88dcdaca33d" Nov 29 05:47:32 crc kubenswrapper[4594]: E1129 05:47:32.409956 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56ad0e12e45f5db8c752a7e05c7bb41fa8620e05720406c30067c88dcdaca33d\": container with ID starting with 56ad0e12e45f5db8c752a7e05c7bb41fa8620e05720406c30067c88dcdaca33d not found: ID does not exist" containerID="56ad0e12e45f5db8c752a7e05c7bb41fa8620e05720406c30067c88dcdaca33d" Nov 29 05:47:32 crc kubenswrapper[4594]: I1129 05:47:32.410003 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56ad0e12e45f5db8c752a7e05c7bb41fa8620e05720406c30067c88dcdaca33d"} err="failed to get container status \"56ad0e12e45f5db8c752a7e05c7bb41fa8620e05720406c30067c88dcdaca33d\": rpc error: code = NotFound desc = could not find container \"56ad0e12e45f5db8c752a7e05c7bb41fa8620e05720406c30067c88dcdaca33d\": container with ID starting with 56ad0e12e45f5db8c752a7e05c7bb41fa8620e05720406c30067c88dcdaca33d not found: ID does not exist" Nov 29 05:47:32 crc kubenswrapper[4594]: I1129 
05:47:32.410034 4594 scope.go:117] "RemoveContainer" containerID="353ab464dfc32ff8f13bb887eb85f5b00b6dfdc3104efa73e6d52e3a9d110023" Nov 29 05:47:32 crc kubenswrapper[4594]: E1129 05:47:32.410464 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"353ab464dfc32ff8f13bb887eb85f5b00b6dfdc3104efa73e6d52e3a9d110023\": container with ID starting with 353ab464dfc32ff8f13bb887eb85f5b00b6dfdc3104efa73e6d52e3a9d110023 not found: ID does not exist" containerID="353ab464dfc32ff8f13bb887eb85f5b00b6dfdc3104efa73e6d52e3a9d110023" Nov 29 05:47:32 crc kubenswrapper[4594]: I1129 05:47:32.410492 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"353ab464dfc32ff8f13bb887eb85f5b00b6dfdc3104efa73e6d52e3a9d110023"} err="failed to get container status \"353ab464dfc32ff8f13bb887eb85f5b00b6dfdc3104efa73e6d52e3a9d110023\": rpc error: code = NotFound desc = could not find container \"353ab464dfc32ff8f13bb887eb85f5b00b6dfdc3104efa73e6d52e3a9d110023\": container with ID starting with 353ab464dfc32ff8f13bb887eb85f5b00b6dfdc3104efa73e6d52e3a9d110023 not found: ID does not exist" Nov 29 05:47:32 crc kubenswrapper[4594]: I1129 05:47:32.410508 4594 scope.go:117] "RemoveContainer" containerID="404bc01342b6b39f78e99296b674aea0700199548bba3bde45cfcdc942c2626b" Nov 29 05:47:32 crc kubenswrapper[4594]: E1129 05:47:32.412242 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"404bc01342b6b39f78e99296b674aea0700199548bba3bde45cfcdc942c2626b\": container with ID starting with 404bc01342b6b39f78e99296b674aea0700199548bba3bde45cfcdc942c2626b not found: ID does not exist" containerID="404bc01342b6b39f78e99296b674aea0700199548bba3bde45cfcdc942c2626b" Nov 29 05:47:32 crc kubenswrapper[4594]: I1129 05:47:32.412331 4594 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"404bc01342b6b39f78e99296b674aea0700199548bba3bde45cfcdc942c2626b"} err="failed to get container status \"404bc01342b6b39f78e99296b674aea0700199548bba3bde45cfcdc942c2626b\": rpc error: code = NotFound desc = could not find container \"404bc01342b6b39f78e99296b674aea0700199548bba3bde45cfcdc942c2626b\": container with ID starting with 404bc01342b6b39f78e99296b674aea0700199548bba3bde45cfcdc942c2626b not found: ID does not exist" Nov 29 05:47:32 crc kubenswrapper[4594]: I1129 05:47:32.412362 4594 scope.go:117] "RemoveContainer" containerID="371aad41d52c16b9546dc46fb3af3658b95c7ae958c28c6b22f54b2deffa9bdd" Nov 29 05:47:32 crc kubenswrapper[4594]: E1129 05:47:32.412709 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"371aad41d52c16b9546dc46fb3af3658b95c7ae958c28c6b22f54b2deffa9bdd\": container with ID starting with 371aad41d52c16b9546dc46fb3af3658b95c7ae958c28c6b22f54b2deffa9bdd not found: ID does not exist" containerID="371aad41d52c16b9546dc46fb3af3658b95c7ae958c28c6b22f54b2deffa9bdd" Nov 29 05:47:32 crc kubenswrapper[4594]: I1129 05:47:32.412732 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"371aad41d52c16b9546dc46fb3af3658b95c7ae958c28c6b22f54b2deffa9bdd"} err="failed to get container status \"371aad41d52c16b9546dc46fb3af3658b95c7ae958c28c6b22f54b2deffa9bdd\": rpc error: code = NotFound desc = could not find container \"371aad41d52c16b9546dc46fb3af3658b95c7ae958c28c6b22f54b2deffa9bdd\": container with ID starting with 371aad41d52c16b9546dc46fb3af3658b95c7ae958c28c6b22f54b2deffa9bdd not found: ID does not exist" Nov 29 05:47:32 crc kubenswrapper[4594]: I1129 05:47:32.462669 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c118ff62-66e3-4359-9122-ebc78d1a1f3d-ceilometer-tls-certs\") pod 
\"ceilometer-0\" (UID: \"c118ff62-66e3-4359-9122-ebc78d1a1f3d\") " pod="openstack/ceilometer-0" Nov 29 05:47:32 crc kubenswrapper[4594]: I1129 05:47:32.462752 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzwl8\" (UniqueName: \"kubernetes.io/projected/c118ff62-66e3-4359-9122-ebc78d1a1f3d-kube-api-access-rzwl8\") pod \"ceilometer-0\" (UID: \"c118ff62-66e3-4359-9122-ebc78d1a1f3d\") " pod="openstack/ceilometer-0" Nov 29 05:47:32 crc kubenswrapper[4594]: I1129 05:47:32.462952 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c118ff62-66e3-4359-9122-ebc78d1a1f3d-scripts\") pod \"ceilometer-0\" (UID: \"c118ff62-66e3-4359-9122-ebc78d1a1f3d\") " pod="openstack/ceilometer-0" Nov 29 05:47:32 crc kubenswrapper[4594]: I1129 05:47:32.463061 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c118ff62-66e3-4359-9122-ebc78d1a1f3d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c118ff62-66e3-4359-9122-ebc78d1a1f3d\") " pod="openstack/ceilometer-0" Nov 29 05:47:32 crc kubenswrapper[4594]: I1129 05:47:32.463280 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c118ff62-66e3-4359-9122-ebc78d1a1f3d-config-data\") pod \"ceilometer-0\" (UID: \"c118ff62-66e3-4359-9122-ebc78d1a1f3d\") " pod="openstack/ceilometer-0" Nov 29 05:47:32 crc kubenswrapper[4594]: I1129 05:47:32.463428 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c118ff62-66e3-4359-9122-ebc78d1a1f3d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c118ff62-66e3-4359-9122-ebc78d1a1f3d\") " pod="openstack/ceilometer-0" Nov 29 05:47:32 crc 
kubenswrapper[4594]: I1129 05:47:32.463550 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c118ff62-66e3-4359-9122-ebc78d1a1f3d-run-httpd\") pod \"ceilometer-0\" (UID: \"c118ff62-66e3-4359-9122-ebc78d1a1f3d\") " pod="openstack/ceilometer-0" Nov 29 05:47:32 crc kubenswrapper[4594]: I1129 05:47:32.463608 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c118ff62-66e3-4359-9122-ebc78d1a1f3d-log-httpd\") pod \"ceilometer-0\" (UID: \"c118ff62-66e3-4359-9122-ebc78d1a1f3d\") " pod="openstack/ceilometer-0" Nov 29 05:47:32 crc kubenswrapper[4594]: I1129 05:47:32.566355 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c118ff62-66e3-4359-9122-ebc78d1a1f3d-run-httpd\") pod \"ceilometer-0\" (UID: \"c118ff62-66e3-4359-9122-ebc78d1a1f3d\") " pod="openstack/ceilometer-0" Nov 29 05:47:32 crc kubenswrapper[4594]: I1129 05:47:32.566411 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c118ff62-66e3-4359-9122-ebc78d1a1f3d-log-httpd\") pod \"ceilometer-0\" (UID: \"c118ff62-66e3-4359-9122-ebc78d1a1f3d\") " pod="openstack/ceilometer-0" Nov 29 05:47:32 crc kubenswrapper[4594]: I1129 05:47:32.566459 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c118ff62-66e3-4359-9122-ebc78d1a1f3d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c118ff62-66e3-4359-9122-ebc78d1a1f3d\") " pod="openstack/ceilometer-0" Nov 29 05:47:32 crc kubenswrapper[4594]: I1129 05:47:32.566493 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzwl8\" (UniqueName: 
\"kubernetes.io/projected/c118ff62-66e3-4359-9122-ebc78d1a1f3d-kube-api-access-rzwl8\") pod \"ceilometer-0\" (UID: \"c118ff62-66e3-4359-9122-ebc78d1a1f3d\") " pod="openstack/ceilometer-0" Nov 29 05:47:32 crc kubenswrapper[4594]: I1129 05:47:32.566527 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c118ff62-66e3-4359-9122-ebc78d1a1f3d-scripts\") pod \"ceilometer-0\" (UID: \"c118ff62-66e3-4359-9122-ebc78d1a1f3d\") " pod="openstack/ceilometer-0" Nov 29 05:47:32 crc kubenswrapper[4594]: I1129 05:47:32.566562 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c118ff62-66e3-4359-9122-ebc78d1a1f3d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c118ff62-66e3-4359-9122-ebc78d1a1f3d\") " pod="openstack/ceilometer-0" Nov 29 05:47:32 crc kubenswrapper[4594]: I1129 05:47:32.566637 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c118ff62-66e3-4359-9122-ebc78d1a1f3d-config-data\") pod \"ceilometer-0\" (UID: \"c118ff62-66e3-4359-9122-ebc78d1a1f3d\") " pod="openstack/ceilometer-0" Nov 29 05:47:32 crc kubenswrapper[4594]: I1129 05:47:32.566690 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c118ff62-66e3-4359-9122-ebc78d1a1f3d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c118ff62-66e3-4359-9122-ebc78d1a1f3d\") " pod="openstack/ceilometer-0" Nov 29 05:47:32 crc kubenswrapper[4594]: I1129 05:47:32.566951 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c118ff62-66e3-4359-9122-ebc78d1a1f3d-run-httpd\") pod \"ceilometer-0\" (UID: \"c118ff62-66e3-4359-9122-ebc78d1a1f3d\") " pod="openstack/ceilometer-0" Nov 29 05:47:32 crc kubenswrapper[4594]: I1129 05:47:32.567065 
4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c118ff62-66e3-4359-9122-ebc78d1a1f3d-log-httpd\") pod \"ceilometer-0\" (UID: \"c118ff62-66e3-4359-9122-ebc78d1a1f3d\") " pod="openstack/ceilometer-0" Nov 29 05:47:32 crc kubenswrapper[4594]: I1129 05:47:32.571202 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c118ff62-66e3-4359-9122-ebc78d1a1f3d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c118ff62-66e3-4359-9122-ebc78d1a1f3d\") " pod="openstack/ceilometer-0" Nov 29 05:47:32 crc kubenswrapper[4594]: I1129 05:47:32.572687 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c118ff62-66e3-4359-9122-ebc78d1a1f3d-scripts\") pod \"ceilometer-0\" (UID: \"c118ff62-66e3-4359-9122-ebc78d1a1f3d\") " pod="openstack/ceilometer-0" Nov 29 05:47:32 crc kubenswrapper[4594]: I1129 05:47:32.572766 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c118ff62-66e3-4359-9122-ebc78d1a1f3d-config-data\") pod \"ceilometer-0\" (UID: \"c118ff62-66e3-4359-9122-ebc78d1a1f3d\") " pod="openstack/ceilometer-0" Nov 29 05:47:32 crc kubenswrapper[4594]: I1129 05:47:32.573118 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c118ff62-66e3-4359-9122-ebc78d1a1f3d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c118ff62-66e3-4359-9122-ebc78d1a1f3d\") " pod="openstack/ceilometer-0" Nov 29 05:47:32 crc kubenswrapper[4594]: I1129 05:47:32.573296 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c118ff62-66e3-4359-9122-ebc78d1a1f3d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c118ff62-66e3-4359-9122-ebc78d1a1f3d\") " 
pod="openstack/ceilometer-0" Nov 29 05:47:32 crc kubenswrapper[4594]: I1129 05:47:32.583113 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzwl8\" (UniqueName: \"kubernetes.io/projected/c118ff62-66e3-4359-9122-ebc78d1a1f3d-kube-api-access-rzwl8\") pod \"ceilometer-0\" (UID: \"c118ff62-66e3-4359-9122-ebc78d1a1f3d\") " pod="openstack/ceilometer-0" Nov 29 05:47:32 crc kubenswrapper[4594]: I1129 05:47:32.680661 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 29 05:47:33 crc kubenswrapper[4594]: I1129 05:47:33.106243 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 29 05:47:33 crc kubenswrapper[4594]: W1129 05:47:33.112030 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc118ff62_66e3_4359_9122_ebc78d1a1f3d.slice/crio-f7d56f577abcc3238d2f362a27d6f4c3c22f3fbce1c4677ef49b9f88c91abc45 WatchSource:0}: Error finding container f7d56f577abcc3238d2f362a27d6f4c3c22f3fbce1c4677ef49b9f88c91abc45: Status 404 returned error can't find the container with id f7d56f577abcc3238d2f362a27d6f4c3c22f3fbce1c4677ef49b9f88c91abc45 Nov 29 05:47:33 crc kubenswrapper[4594]: I1129 05:47:33.294939 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c118ff62-66e3-4359-9122-ebc78d1a1f3d","Type":"ContainerStarted","Data":"f7d56f577abcc3238d2f362a27d6f4c3c22f3fbce1c4677ef49b9f88c91abc45"} Nov 29 05:47:34 crc kubenswrapper[4594]: I1129 05:47:34.095638 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e52f06c3-9f29-498b-8518-a31673a30616" path="/var/lib/kubelet/pods/e52f06c3-9f29-498b-8518-a31673a30616/volumes" Nov 29 05:47:34 crc kubenswrapper[4594]: I1129 05:47:34.310358 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"c118ff62-66e3-4359-9122-ebc78d1a1f3d","Type":"ContainerStarted","Data":"240bc9fb5f3822f96efc682915db82a7dedbb5693cfb5544264d00ccc35eafb8"} Nov 29 05:47:35 crc kubenswrapper[4594]: I1129 05:47:35.321578 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c118ff62-66e3-4359-9122-ebc78d1a1f3d","Type":"ContainerStarted","Data":"3de5a6fab71eb4654c91a6ee7896df3925155caa7baadb20517042bb157e78db"} Nov 29 05:47:35 crc kubenswrapper[4594]: I1129 05:47:35.813539 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-54599d8f7-sp4l2" Nov 29 05:47:35 crc kubenswrapper[4594]: I1129 05:47:35.871553 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-844fc57f6f-56pgf"] Nov 29 05:47:35 crc kubenswrapper[4594]: I1129 05:47:35.871810 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-844fc57f6f-56pgf" podUID="e74d9bfd-1e04-4878-aa71-24403f17ebf5" containerName="dnsmasq-dns" containerID="cri-o://71f23430873f351346ad0fc58f3e7b87545ce09c86854ea28d294c2767be8965" gracePeriod=10 Nov 29 05:47:35 crc kubenswrapper[4594]: I1129 05:47:35.886142 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Nov 29 05:47:35 crc kubenswrapper[4594]: I1129 05:47:35.911041 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Nov 29 05:47:36 crc kubenswrapper[4594]: I1129 05:47:36.337976 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-844fc57f6f-56pgf" Nov 29 05:47:36 crc kubenswrapper[4594]: I1129 05:47:36.340632 4594 generic.go:334] "Generic (PLEG): container finished" podID="e74d9bfd-1e04-4878-aa71-24403f17ebf5" containerID="71f23430873f351346ad0fc58f3e7b87545ce09c86854ea28d294c2767be8965" exitCode=0 Nov 29 05:47:36 crc kubenswrapper[4594]: I1129 05:47:36.340679 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-844fc57f6f-56pgf" event={"ID":"e74d9bfd-1e04-4878-aa71-24403f17ebf5","Type":"ContainerDied","Data":"71f23430873f351346ad0fc58f3e7b87545ce09c86854ea28d294c2767be8965"} Nov 29 05:47:36 crc kubenswrapper[4594]: I1129 05:47:36.340770 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-844fc57f6f-56pgf" event={"ID":"e74d9bfd-1e04-4878-aa71-24403f17ebf5","Type":"ContainerDied","Data":"1c54ffbd04d1fc68aedb86cd5bff257adbd4b451e426f6b0490f667397aa9055"} Nov 29 05:47:36 crc kubenswrapper[4594]: I1129 05:47:36.340798 4594 scope.go:117] "RemoveContainer" containerID="71f23430873f351346ad0fc58f3e7b87545ce09c86854ea28d294c2767be8965" Nov 29 05:47:36 crc kubenswrapper[4594]: I1129 05:47:36.347850 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c118ff62-66e3-4359-9122-ebc78d1a1f3d","Type":"ContainerStarted","Data":"030d3ada0e13688a06d0c2661475270187f2a63035000fa8e12615efbf7e4ca9"} Nov 29 05:47:36 crc kubenswrapper[4594]: I1129 05:47:36.375143 4594 scope.go:117] "RemoveContainer" containerID="267c596333c00cfa2ea8c7df63c3090b2225c0154df289306cac3a221dcc521e" Nov 29 05:47:36 crc kubenswrapper[4594]: I1129 05:47:36.391764 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Nov 29 05:47:36 crc kubenswrapper[4594]: I1129 05:47:36.428754 4594 scope.go:117] "RemoveContainer" containerID="71f23430873f351346ad0fc58f3e7b87545ce09c86854ea28d294c2767be8965" Nov 29 05:47:36 crc 
kubenswrapper[4594]: E1129 05:47:36.429602 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71f23430873f351346ad0fc58f3e7b87545ce09c86854ea28d294c2767be8965\": container with ID starting with 71f23430873f351346ad0fc58f3e7b87545ce09c86854ea28d294c2767be8965 not found: ID does not exist" containerID="71f23430873f351346ad0fc58f3e7b87545ce09c86854ea28d294c2767be8965" Nov 29 05:47:36 crc kubenswrapper[4594]: I1129 05:47:36.429659 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71f23430873f351346ad0fc58f3e7b87545ce09c86854ea28d294c2767be8965"} err="failed to get container status \"71f23430873f351346ad0fc58f3e7b87545ce09c86854ea28d294c2767be8965\": rpc error: code = NotFound desc = could not find container \"71f23430873f351346ad0fc58f3e7b87545ce09c86854ea28d294c2767be8965\": container with ID starting with 71f23430873f351346ad0fc58f3e7b87545ce09c86854ea28d294c2767be8965 not found: ID does not exist" Nov 29 05:47:36 crc kubenswrapper[4594]: I1129 05:47:36.429689 4594 scope.go:117] "RemoveContainer" containerID="267c596333c00cfa2ea8c7df63c3090b2225c0154df289306cac3a221dcc521e" Nov 29 05:47:36 crc kubenswrapper[4594]: E1129 05:47:36.430082 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"267c596333c00cfa2ea8c7df63c3090b2225c0154df289306cac3a221dcc521e\": container with ID starting with 267c596333c00cfa2ea8c7df63c3090b2225c0154df289306cac3a221dcc521e not found: ID does not exist" containerID="267c596333c00cfa2ea8c7df63c3090b2225c0154df289306cac3a221dcc521e" Nov 29 05:47:36 crc kubenswrapper[4594]: I1129 05:47:36.430110 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"267c596333c00cfa2ea8c7df63c3090b2225c0154df289306cac3a221dcc521e"} err="failed to get container status 
\"267c596333c00cfa2ea8c7df63c3090b2225c0154df289306cac3a221dcc521e\": rpc error: code = NotFound desc = could not find container \"267c596333c00cfa2ea8c7df63c3090b2225c0154df289306cac3a221dcc521e\": container with ID starting with 267c596333c00cfa2ea8c7df63c3090b2225c0154df289306cac3a221dcc521e not found: ID does not exist" Nov 29 05:47:36 crc kubenswrapper[4594]: I1129 05:47:36.467157 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e74d9bfd-1e04-4878-aa71-24403f17ebf5-ovsdbserver-sb\") pod \"e74d9bfd-1e04-4878-aa71-24403f17ebf5\" (UID: \"e74d9bfd-1e04-4878-aa71-24403f17ebf5\") " Nov 29 05:47:36 crc kubenswrapper[4594]: I1129 05:47:36.467325 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxhcw\" (UniqueName: \"kubernetes.io/projected/e74d9bfd-1e04-4878-aa71-24403f17ebf5-kube-api-access-wxhcw\") pod \"e74d9bfd-1e04-4878-aa71-24403f17ebf5\" (UID: \"e74d9bfd-1e04-4878-aa71-24403f17ebf5\") " Nov 29 05:47:36 crc kubenswrapper[4594]: I1129 05:47:36.467428 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e74d9bfd-1e04-4878-aa71-24403f17ebf5-dns-swift-storage-0\") pod \"e74d9bfd-1e04-4878-aa71-24403f17ebf5\" (UID: \"e74d9bfd-1e04-4878-aa71-24403f17ebf5\") " Nov 29 05:47:36 crc kubenswrapper[4594]: I1129 05:47:36.467670 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e74d9bfd-1e04-4878-aa71-24403f17ebf5-ovsdbserver-nb\") pod \"e74d9bfd-1e04-4878-aa71-24403f17ebf5\" (UID: \"e74d9bfd-1e04-4878-aa71-24403f17ebf5\") " Nov 29 05:47:36 crc kubenswrapper[4594]: I1129 05:47:36.468019 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/e74d9bfd-1e04-4878-aa71-24403f17ebf5-dns-svc\") pod \"e74d9bfd-1e04-4878-aa71-24403f17ebf5\" (UID: \"e74d9bfd-1e04-4878-aa71-24403f17ebf5\") " Nov 29 05:47:36 crc kubenswrapper[4594]: I1129 05:47:36.468138 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e74d9bfd-1e04-4878-aa71-24403f17ebf5-config\") pod \"e74d9bfd-1e04-4878-aa71-24403f17ebf5\" (UID: \"e74d9bfd-1e04-4878-aa71-24403f17ebf5\") " Nov 29 05:47:36 crc kubenswrapper[4594]: I1129 05:47:36.481419 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e74d9bfd-1e04-4878-aa71-24403f17ebf5-kube-api-access-wxhcw" (OuterVolumeSpecName: "kube-api-access-wxhcw") pod "e74d9bfd-1e04-4878-aa71-24403f17ebf5" (UID: "e74d9bfd-1e04-4878-aa71-24403f17ebf5"). InnerVolumeSpecName "kube-api-access-wxhcw". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:47:36 crc kubenswrapper[4594]: I1129 05:47:36.530147 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e74d9bfd-1e04-4878-aa71-24403f17ebf5-config" (OuterVolumeSpecName: "config") pod "e74d9bfd-1e04-4878-aa71-24403f17ebf5" (UID: "e74d9bfd-1e04-4878-aa71-24403f17ebf5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:47:36 crc kubenswrapper[4594]: I1129 05:47:36.530411 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e74d9bfd-1e04-4878-aa71-24403f17ebf5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e74d9bfd-1e04-4878-aa71-24403f17ebf5" (UID: "e74d9bfd-1e04-4878-aa71-24403f17ebf5"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:47:36 crc kubenswrapper[4594]: I1129 05:47:36.544673 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e74d9bfd-1e04-4878-aa71-24403f17ebf5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e74d9bfd-1e04-4878-aa71-24403f17ebf5" (UID: "e74d9bfd-1e04-4878-aa71-24403f17ebf5"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:47:36 crc kubenswrapper[4594]: I1129 05:47:36.559807 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e74d9bfd-1e04-4878-aa71-24403f17ebf5-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e74d9bfd-1e04-4878-aa71-24403f17ebf5" (UID: "e74d9bfd-1e04-4878-aa71-24403f17ebf5"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:47:36 crc kubenswrapper[4594]: I1129 05:47:36.567613 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-6vm99"] Nov 29 05:47:36 crc kubenswrapper[4594]: E1129 05:47:36.568169 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e74d9bfd-1e04-4878-aa71-24403f17ebf5" containerName="dnsmasq-dns" Nov 29 05:47:36 crc kubenswrapper[4594]: I1129 05:47:36.568189 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="e74d9bfd-1e04-4878-aa71-24403f17ebf5" containerName="dnsmasq-dns" Nov 29 05:47:36 crc kubenswrapper[4594]: E1129 05:47:36.568206 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e74d9bfd-1e04-4878-aa71-24403f17ebf5" containerName="init" Nov 29 05:47:36 crc kubenswrapper[4594]: I1129 05:47:36.568213 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="e74d9bfd-1e04-4878-aa71-24403f17ebf5" containerName="init" Nov 29 05:47:36 crc kubenswrapper[4594]: I1129 05:47:36.568717 4594 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="e74d9bfd-1e04-4878-aa71-24403f17ebf5" containerName="dnsmasq-dns" Nov 29 05:47:36 crc kubenswrapper[4594]: I1129 05:47:36.570159 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-6vm99" Nov 29 05:47:36 crc kubenswrapper[4594]: I1129 05:47:36.571751 4594 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e74d9bfd-1e04-4878-aa71-24403f17ebf5-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 29 05:47:36 crc kubenswrapper[4594]: I1129 05:47:36.571781 4594 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e74d9bfd-1e04-4878-aa71-24403f17ebf5-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 29 05:47:36 crc kubenswrapper[4594]: I1129 05:47:36.571792 4594 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e74d9bfd-1e04-4878-aa71-24403f17ebf5-config\") on node \"crc\" DevicePath \"\"" Nov 29 05:47:36 crc kubenswrapper[4594]: I1129 05:47:36.571802 4594 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e74d9bfd-1e04-4878-aa71-24403f17ebf5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 29 05:47:36 crc kubenswrapper[4594]: I1129 05:47:36.571812 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxhcw\" (UniqueName: \"kubernetes.io/projected/e74d9bfd-1e04-4878-aa71-24403f17ebf5-kube-api-access-wxhcw\") on node \"crc\" DevicePath \"\"" Nov 29 05:47:36 crc kubenswrapper[4594]: I1129 05:47:36.572134 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Nov 29 05:47:36 crc kubenswrapper[4594]: I1129 05:47:36.572302 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Nov 29 05:47:36 crc kubenswrapper[4594]: I1129 
05:47:36.582229 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-6vm99"] Nov 29 05:47:36 crc kubenswrapper[4594]: I1129 05:47:36.583383 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e74d9bfd-1e04-4878-aa71-24403f17ebf5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e74d9bfd-1e04-4878-aa71-24403f17ebf5" (UID: "e74d9bfd-1e04-4878-aa71-24403f17ebf5"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:47:36 crc kubenswrapper[4594]: I1129 05:47:36.674536 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a289c272-abe9-4098-951e-d6a10ce647ab-scripts\") pod \"nova-cell1-cell-mapping-6vm99\" (UID: \"a289c272-abe9-4098-951e-d6a10ce647ab\") " pod="openstack/nova-cell1-cell-mapping-6vm99" Nov 29 05:47:36 crc kubenswrapper[4594]: I1129 05:47:36.674949 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a289c272-abe9-4098-951e-d6a10ce647ab-config-data\") pod \"nova-cell1-cell-mapping-6vm99\" (UID: \"a289c272-abe9-4098-951e-d6a10ce647ab\") " pod="openstack/nova-cell1-cell-mapping-6vm99" Nov 29 05:47:36 crc kubenswrapper[4594]: I1129 05:47:36.675040 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a289c272-abe9-4098-951e-d6a10ce647ab-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-6vm99\" (UID: \"a289c272-abe9-4098-951e-d6a10ce647ab\") " pod="openstack/nova-cell1-cell-mapping-6vm99" Nov 29 05:47:36 crc kubenswrapper[4594]: I1129 05:47:36.675224 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lscn\" (UniqueName: 
\"kubernetes.io/projected/a289c272-abe9-4098-951e-d6a10ce647ab-kube-api-access-2lscn\") pod \"nova-cell1-cell-mapping-6vm99\" (UID: \"a289c272-abe9-4098-951e-d6a10ce647ab\") " pod="openstack/nova-cell1-cell-mapping-6vm99" Nov 29 05:47:36 crc kubenswrapper[4594]: I1129 05:47:36.675527 4594 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e74d9bfd-1e04-4878-aa71-24403f17ebf5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 29 05:47:36 crc kubenswrapper[4594]: I1129 05:47:36.777008 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lscn\" (UniqueName: \"kubernetes.io/projected/a289c272-abe9-4098-951e-d6a10ce647ab-kube-api-access-2lscn\") pod \"nova-cell1-cell-mapping-6vm99\" (UID: \"a289c272-abe9-4098-951e-d6a10ce647ab\") " pod="openstack/nova-cell1-cell-mapping-6vm99" Nov 29 05:47:36 crc kubenswrapper[4594]: I1129 05:47:36.777107 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a289c272-abe9-4098-951e-d6a10ce647ab-scripts\") pod \"nova-cell1-cell-mapping-6vm99\" (UID: \"a289c272-abe9-4098-951e-d6a10ce647ab\") " pod="openstack/nova-cell1-cell-mapping-6vm99" Nov 29 05:47:36 crc kubenswrapper[4594]: I1129 05:47:36.777200 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a289c272-abe9-4098-951e-d6a10ce647ab-config-data\") pod \"nova-cell1-cell-mapping-6vm99\" (UID: \"a289c272-abe9-4098-951e-d6a10ce647ab\") " pod="openstack/nova-cell1-cell-mapping-6vm99" Nov 29 05:47:36 crc kubenswrapper[4594]: I1129 05:47:36.777229 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a289c272-abe9-4098-951e-d6a10ce647ab-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-6vm99\" (UID: \"a289c272-abe9-4098-951e-d6a10ce647ab\") 
" pod="openstack/nova-cell1-cell-mapping-6vm99" Nov 29 05:47:36 crc kubenswrapper[4594]: I1129 05:47:36.781543 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a289c272-abe9-4098-951e-d6a10ce647ab-scripts\") pod \"nova-cell1-cell-mapping-6vm99\" (UID: \"a289c272-abe9-4098-951e-d6a10ce647ab\") " pod="openstack/nova-cell1-cell-mapping-6vm99" Nov 29 05:47:36 crc kubenswrapper[4594]: I1129 05:47:36.783203 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a289c272-abe9-4098-951e-d6a10ce647ab-config-data\") pod \"nova-cell1-cell-mapping-6vm99\" (UID: \"a289c272-abe9-4098-951e-d6a10ce647ab\") " pod="openstack/nova-cell1-cell-mapping-6vm99" Nov 29 05:47:36 crc kubenswrapper[4594]: I1129 05:47:36.783637 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a289c272-abe9-4098-951e-d6a10ce647ab-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-6vm99\" (UID: \"a289c272-abe9-4098-951e-d6a10ce647ab\") " pod="openstack/nova-cell1-cell-mapping-6vm99" Nov 29 05:47:36 crc kubenswrapper[4594]: I1129 05:47:36.797551 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lscn\" (UniqueName: \"kubernetes.io/projected/a289c272-abe9-4098-951e-d6a10ce647ab-kube-api-access-2lscn\") pod \"nova-cell1-cell-mapping-6vm99\" (UID: \"a289c272-abe9-4098-951e-d6a10ce647ab\") " pod="openstack/nova-cell1-cell-mapping-6vm99" Nov 29 05:47:36 crc kubenswrapper[4594]: I1129 05:47:36.889126 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-6vm99" Nov 29 05:47:37 crc kubenswrapper[4594]: I1129 05:47:37.334950 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-6vm99"] Nov 29 05:47:37 crc kubenswrapper[4594]: I1129 05:47:37.372166 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c118ff62-66e3-4359-9122-ebc78d1a1f3d","Type":"ContainerStarted","Data":"5ff5c76284547d21007ddf9d6ada37c0395ae3c4744d87bd8f4883ebd3a659e5"} Nov 29 05:47:37 crc kubenswrapper[4594]: I1129 05:47:37.372652 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 29 05:47:37 crc kubenswrapper[4594]: I1129 05:47:37.378969 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-6vm99" event={"ID":"a289c272-abe9-4098-951e-d6a10ce647ab","Type":"ContainerStarted","Data":"11c392f54fabbf11507a89c616953d85aa65e91dc8fcb3c116cdd34fe56d5dcd"} Nov 29 05:47:37 crc kubenswrapper[4594]: I1129 05:47:37.382796 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-844fc57f6f-56pgf" Nov 29 05:47:37 crc kubenswrapper[4594]: I1129 05:47:37.391587 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.38770547 podStartE2EDuration="5.391568894s" podCreationTimestamp="2025-11-29 05:47:32 +0000 UTC" firstStartedPulling="2025-11-29 05:47:33.115728416 +0000 UTC m=+1177.356237636" lastFinishedPulling="2025-11-29 05:47:37.11959184 +0000 UTC m=+1181.360101060" observedRunningTime="2025-11-29 05:47:37.38860461 +0000 UTC m=+1181.629113830" watchObservedRunningTime="2025-11-29 05:47:37.391568894 +0000 UTC m=+1181.632078115" Nov 29 05:47:37 crc kubenswrapper[4594]: I1129 05:47:37.440215 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-844fc57f6f-56pgf"] Nov 29 05:47:37 crc kubenswrapper[4594]: I1129 05:47:37.450211 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-844fc57f6f-56pgf"] Nov 29 05:47:38 crc kubenswrapper[4594]: I1129 05:47:38.094515 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e74d9bfd-1e04-4878-aa71-24403f17ebf5" path="/var/lib/kubelet/pods/e74d9bfd-1e04-4878-aa71-24403f17ebf5/volumes" Nov 29 05:47:38 crc kubenswrapper[4594]: I1129 05:47:38.397335 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-6vm99" event={"ID":"a289c272-abe9-4098-951e-d6a10ce647ab","Type":"ContainerStarted","Data":"3b020809d6a5dc264b4813f8f63cbb564324adb8cb346eaff676dea96997bb5a"} Nov 29 05:47:38 crc kubenswrapper[4594]: I1129 05:47:38.414547 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-6vm99" podStartSLOduration=2.414522371 podStartE2EDuration="2.414522371s" podCreationTimestamp="2025-11-29 05:47:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-11-29 05:47:38.412604264 +0000 UTC m=+1182.653113484" watchObservedRunningTime="2025-11-29 05:47:38.414522371 +0000 UTC m=+1182.655031591" Nov 29 05:47:40 crc kubenswrapper[4594]: I1129 05:47:40.606798 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 29 05:47:40 crc kubenswrapper[4594]: I1129 05:47:40.608991 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 29 05:47:41 crc kubenswrapper[4594]: I1129 05:47:41.618469 4594 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d02a9e40-3da0-4d17-8694-4f67b1bb50b6" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.218:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Nov 29 05:47:41 crc kubenswrapper[4594]: I1129 05:47:41.629386 4594 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d02a9e40-3da0-4d17-8694-4f67b1bb50b6" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.218:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Nov 29 05:47:42 crc kubenswrapper[4594]: I1129 05:47:42.463794 4594 generic.go:334] "Generic (PLEG): container finished" podID="a289c272-abe9-4098-951e-d6a10ce647ab" containerID="3b020809d6a5dc264b4813f8f63cbb564324adb8cb346eaff676dea96997bb5a" exitCode=0 Nov 29 05:47:42 crc kubenswrapper[4594]: I1129 05:47:42.463846 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-6vm99" event={"ID":"a289c272-abe9-4098-951e-d6a10ce647ab","Type":"ContainerDied","Data":"3b020809d6a5dc264b4813f8f63cbb564324adb8cb346eaff676dea96997bb5a"} Nov 29 05:47:43 crc kubenswrapper[4594]: I1129 05:47:43.790784 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-6vm99" Nov 29 05:47:43 crc kubenswrapper[4594]: I1129 05:47:43.947769 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a289c272-abe9-4098-951e-d6a10ce647ab-config-data\") pod \"a289c272-abe9-4098-951e-d6a10ce647ab\" (UID: \"a289c272-abe9-4098-951e-d6a10ce647ab\") " Nov 29 05:47:43 crc kubenswrapper[4594]: I1129 05:47:43.948323 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a289c272-abe9-4098-951e-d6a10ce647ab-scripts\") pod \"a289c272-abe9-4098-951e-d6a10ce647ab\" (UID: \"a289c272-abe9-4098-951e-d6a10ce647ab\") " Nov 29 05:47:43 crc kubenswrapper[4594]: I1129 05:47:43.948714 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a289c272-abe9-4098-951e-d6a10ce647ab-combined-ca-bundle\") pod \"a289c272-abe9-4098-951e-d6a10ce647ab\" (UID: \"a289c272-abe9-4098-951e-d6a10ce647ab\") " Nov 29 05:47:43 crc kubenswrapper[4594]: I1129 05:47:43.948895 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2lscn\" (UniqueName: \"kubernetes.io/projected/a289c272-abe9-4098-951e-d6a10ce647ab-kube-api-access-2lscn\") pod \"a289c272-abe9-4098-951e-d6a10ce647ab\" (UID: \"a289c272-abe9-4098-951e-d6a10ce647ab\") " Nov 29 05:47:43 crc kubenswrapper[4594]: I1129 05:47:43.954731 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a289c272-abe9-4098-951e-d6a10ce647ab-kube-api-access-2lscn" (OuterVolumeSpecName: "kube-api-access-2lscn") pod "a289c272-abe9-4098-951e-d6a10ce647ab" (UID: "a289c272-abe9-4098-951e-d6a10ce647ab"). InnerVolumeSpecName "kube-api-access-2lscn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:47:43 crc kubenswrapper[4594]: I1129 05:47:43.954868 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a289c272-abe9-4098-951e-d6a10ce647ab-scripts" (OuterVolumeSpecName: "scripts") pod "a289c272-abe9-4098-951e-d6a10ce647ab" (UID: "a289c272-abe9-4098-951e-d6a10ce647ab"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:47:43 crc kubenswrapper[4594]: I1129 05:47:43.974865 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a289c272-abe9-4098-951e-d6a10ce647ab-config-data" (OuterVolumeSpecName: "config-data") pod "a289c272-abe9-4098-951e-d6a10ce647ab" (UID: "a289c272-abe9-4098-951e-d6a10ce647ab"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:47:43 crc kubenswrapper[4594]: I1129 05:47:43.976057 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a289c272-abe9-4098-951e-d6a10ce647ab-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a289c272-abe9-4098-951e-d6a10ce647ab" (UID: "a289c272-abe9-4098-951e-d6a10ce647ab"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:47:44 crc kubenswrapper[4594]: I1129 05:47:44.052634 4594 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a289c272-abe9-4098-951e-d6a10ce647ab-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 05:47:44 crc kubenswrapper[4594]: I1129 05:47:44.052787 4594 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a289c272-abe9-4098-951e-d6a10ce647ab-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 05:47:44 crc kubenswrapper[4594]: I1129 05:47:44.052857 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2lscn\" (UniqueName: \"kubernetes.io/projected/a289c272-abe9-4098-951e-d6a10ce647ab-kube-api-access-2lscn\") on node \"crc\" DevicePath \"\"" Nov 29 05:47:44 crc kubenswrapper[4594]: I1129 05:47:44.052908 4594 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a289c272-abe9-4098-951e-d6a10ce647ab-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 05:47:44 crc kubenswrapper[4594]: I1129 05:47:44.488387 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-6vm99" event={"ID":"a289c272-abe9-4098-951e-d6a10ce647ab","Type":"ContainerDied","Data":"11c392f54fabbf11507a89c616953d85aa65e91dc8fcb3c116cdd34fe56d5dcd"} Nov 29 05:47:44 crc kubenswrapper[4594]: I1129 05:47:44.488444 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-6vm99" Nov 29 05:47:44 crc kubenswrapper[4594]: I1129 05:47:44.488456 4594 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="11c392f54fabbf11507a89c616953d85aa65e91dc8fcb3c116cdd34fe56d5dcd" Nov 29 05:47:44 crc kubenswrapper[4594]: I1129 05:47:44.565836 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 29 05:47:44 crc kubenswrapper[4594]: I1129 05:47:44.566325 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d02a9e40-3da0-4d17-8694-4f67b1bb50b6" containerName="nova-api-log" containerID="cri-o://a3f24e5456d2f677492f6d792212f1682e5d00bcff4ac9cc9dfe9b59ba6bfcfa" gracePeriod=30 Nov 29 05:47:44 crc kubenswrapper[4594]: I1129 05:47:44.566420 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d02a9e40-3da0-4d17-8694-4f67b1bb50b6" containerName="nova-api-api" containerID="cri-o://09e82956123a0bdf123549891b98d61d5c628e8cbd806667e129f60b90cedcf3" gracePeriod=30 Nov 29 05:47:44 crc kubenswrapper[4594]: I1129 05:47:44.580998 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 29 05:47:44 crc kubenswrapper[4594]: I1129 05:47:44.581245 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="0e1f7ae3-c18c-4c09-aab4-d8f30450b730" containerName="nova-scheduler-scheduler" containerID="cri-o://10ca4ad4b70b3c49c150138aaf360fbe30110a2357b932311894d8989130e937" gracePeriod=30 Nov 29 05:47:44 crc kubenswrapper[4594]: I1129 05:47:44.615501 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 29 05:47:44 crc kubenswrapper[4594]: I1129 05:47:44.615962 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="241daf99-de92-4580-bec5-138d9356b784" 
containerName="nova-metadata-log" containerID="cri-o://e9f13975e833b1e2021391fb9c17d6e7a6d0ede6b1bbc0a86daeebe15b1087f4" gracePeriod=30 Nov 29 05:47:44 crc kubenswrapper[4594]: I1129 05:47:44.616331 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="241daf99-de92-4580-bec5-138d9356b784" containerName="nova-metadata-metadata" containerID="cri-o://9f48c153d0e592cdd9839d42d29322fa2a29306b09ab03122a334e58a2841e1f" gracePeriod=30 Nov 29 05:47:45 crc kubenswrapper[4594]: I1129 05:47:45.507005 4594 generic.go:334] "Generic (PLEG): container finished" podID="241daf99-de92-4580-bec5-138d9356b784" containerID="e9f13975e833b1e2021391fb9c17d6e7a6d0ede6b1bbc0a86daeebe15b1087f4" exitCode=143 Nov 29 05:47:45 crc kubenswrapper[4594]: I1129 05:47:45.507097 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"241daf99-de92-4580-bec5-138d9356b784","Type":"ContainerDied","Data":"e9f13975e833b1e2021391fb9c17d6e7a6d0ede6b1bbc0a86daeebe15b1087f4"} Nov 29 05:47:45 crc kubenswrapper[4594]: I1129 05:47:45.511544 4594 generic.go:334] "Generic (PLEG): container finished" podID="d02a9e40-3da0-4d17-8694-4f67b1bb50b6" containerID="09e82956123a0bdf123549891b98d61d5c628e8cbd806667e129f60b90cedcf3" exitCode=0 Nov 29 05:47:45 crc kubenswrapper[4594]: I1129 05:47:45.511600 4594 generic.go:334] "Generic (PLEG): container finished" podID="d02a9e40-3da0-4d17-8694-4f67b1bb50b6" containerID="a3f24e5456d2f677492f6d792212f1682e5d00bcff4ac9cc9dfe9b59ba6bfcfa" exitCode=143 Nov 29 05:47:45 crc kubenswrapper[4594]: I1129 05:47:45.511634 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d02a9e40-3da0-4d17-8694-4f67b1bb50b6","Type":"ContainerDied","Data":"09e82956123a0bdf123549891b98d61d5c628e8cbd806667e129f60b90cedcf3"} Nov 29 05:47:45 crc kubenswrapper[4594]: I1129 05:47:45.511669 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-api-0" event={"ID":"d02a9e40-3da0-4d17-8694-4f67b1bb50b6","Type":"ContainerDied","Data":"a3f24e5456d2f677492f6d792212f1682e5d00bcff4ac9cc9dfe9b59ba6bfcfa"} Nov 29 05:47:45 crc kubenswrapper[4594]: I1129 05:47:45.947078 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 29 05:47:45 crc kubenswrapper[4594]: I1129 05:47:45.954552 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.122997 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/241daf99-de92-4580-bec5-138d9356b784-nova-metadata-tls-certs\") pod \"241daf99-de92-4580-bec5-138d9356b784\" (UID: \"241daf99-de92-4580-bec5-138d9356b784\") " Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.123132 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d02a9e40-3da0-4d17-8694-4f67b1bb50b6-logs\") pod \"d02a9e40-3da0-4d17-8694-4f67b1bb50b6\" (UID: \"d02a9e40-3da0-4d17-8694-4f67b1bb50b6\") " Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.123268 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5msjf\" (UniqueName: \"kubernetes.io/projected/241daf99-de92-4580-bec5-138d9356b784-kube-api-access-5msjf\") pod \"241daf99-de92-4580-bec5-138d9356b784\" (UID: \"241daf99-de92-4580-bec5-138d9356b784\") " Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.123321 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/241daf99-de92-4580-bec5-138d9356b784-config-data\") pod \"241daf99-de92-4580-bec5-138d9356b784\" (UID: \"241daf99-de92-4580-bec5-138d9356b784\") " Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 
05:47:46.123366 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/241daf99-de92-4580-bec5-138d9356b784-combined-ca-bundle\") pod \"241daf99-de92-4580-bec5-138d9356b784\" (UID: \"241daf99-de92-4580-bec5-138d9356b784\") " Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.123428 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d02a9e40-3da0-4d17-8694-4f67b1bb50b6-public-tls-certs\") pod \"d02a9e40-3da0-4d17-8694-4f67b1bb50b6\" (UID: \"d02a9e40-3da0-4d17-8694-4f67b1bb50b6\") " Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.123443 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m24fx\" (UniqueName: \"kubernetes.io/projected/d02a9e40-3da0-4d17-8694-4f67b1bb50b6-kube-api-access-m24fx\") pod \"d02a9e40-3da0-4d17-8694-4f67b1bb50b6\" (UID: \"d02a9e40-3da0-4d17-8694-4f67b1bb50b6\") " Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.123499 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/241daf99-de92-4580-bec5-138d9356b784-logs\") pod \"241daf99-de92-4580-bec5-138d9356b784\" (UID: \"241daf99-de92-4580-bec5-138d9356b784\") " Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.123535 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d02a9e40-3da0-4d17-8694-4f67b1bb50b6-internal-tls-certs\") pod \"d02a9e40-3da0-4d17-8694-4f67b1bb50b6\" (UID: \"d02a9e40-3da0-4d17-8694-4f67b1bb50b6\") " Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.123559 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d02a9e40-3da0-4d17-8694-4f67b1bb50b6-combined-ca-bundle\") pod 
\"d02a9e40-3da0-4d17-8694-4f67b1bb50b6\" (UID: \"d02a9e40-3da0-4d17-8694-4f67b1bb50b6\") " Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.123610 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d02a9e40-3da0-4d17-8694-4f67b1bb50b6-config-data\") pod \"d02a9e40-3da0-4d17-8694-4f67b1bb50b6\" (UID: \"d02a9e40-3da0-4d17-8694-4f67b1bb50b6\") " Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.128474 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d02a9e40-3da0-4d17-8694-4f67b1bb50b6-logs" (OuterVolumeSpecName: "logs") pod "d02a9e40-3da0-4d17-8694-4f67b1bb50b6" (UID: "d02a9e40-3da0-4d17-8694-4f67b1bb50b6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.131681 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/241daf99-de92-4580-bec5-138d9356b784-logs" (OuterVolumeSpecName: "logs") pod "241daf99-de92-4580-bec5-138d9356b784" (UID: "241daf99-de92-4580-bec5-138d9356b784"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.167393 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/241daf99-de92-4580-bec5-138d9356b784-kube-api-access-5msjf" (OuterVolumeSpecName: "kube-api-access-5msjf") pod "241daf99-de92-4580-bec5-138d9356b784" (UID: "241daf99-de92-4580-bec5-138d9356b784"). InnerVolumeSpecName "kube-api-access-5msjf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.168410 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d02a9e40-3da0-4d17-8694-4f67b1bb50b6-kube-api-access-m24fx" (OuterVolumeSpecName: "kube-api-access-m24fx") pod "d02a9e40-3da0-4d17-8694-4f67b1bb50b6" (UID: "d02a9e40-3da0-4d17-8694-4f67b1bb50b6"). InnerVolumeSpecName "kube-api-access-m24fx". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.226179 4594 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/241daf99-de92-4580-bec5-138d9356b784-logs\") on node \"crc\" DevicePath \"\"" Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.226506 4594 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d02a9e40-3da0-4d17-8694-4f67b1bb50b6-logs\") on node \"crc\" DevicePath \"\"" Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.226522 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5msjf\" (UniqueName: \"kubernetes.io/projected/241daf99-de92-4580-bec5-138d9356b784-kube-api-access-5msjf\") on node \"crc\" DevicePath \"\"" Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.226536 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m24fx\" (UniqueName: \"kubernetes.io/projected/d02a9e40-3da0-4d17-8694-4f67b1bb50b6-kube-api-access-m24fx\") on node \"crc\" DevicePath \"\"" Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.227535 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/241daf99-de92-4580-bec5-138d9356b784-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "241daf99-de92-4580-bec5-138d9356b784" (UID: "241daf99-de92-4580-bec5-138d9356b784"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.239433 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d02a9e40-3da0-4d17-8694-4f67b1bb50b6-config-data" (OuterVolumeSpecName: "config-data") pod "d02a9e40-3da0-4d17-8694-4f67b1bb50b6" (UID: "d02a9e40-3da0-4d17-8694-4f67b1bb50b6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.289619 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d02a9e40-3da0-4d17-8694-4f67b1bb50b6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d02a9e40-3da0-4d17-8694-4f67b1bb50b6" (UID: "d02a9e40-3da0-4d17-8694-4f67b1bb50b6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.289899 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/241daf99-de92-4580-bec5-138d9356b784-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "241daf99-de92-4580-bec5-138d9356b784" (UID: "241daf99-de92-4580-bec5-138d9356b784"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.290422 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/241daf99-de92-4580-bec5-138d9356b784-config-data" (OuterVolumeSpecName: "config-data") pod "241daf99-de92-4580-bec5-138d9356b784" (UID: "241daf99-de92-4580-bec5-138d9356b784"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.313652 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d02a9e40-3da0-4d17-8694-4f67b1bb50b6-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "d02a9e40-3da0-4d17-8694-4f67b1bb50b6" (UID: "d02a9e40-3da0-4d17-8694-4f67b1bb50b6"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.314161 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d02a9e40-3da0-4d17-8694-4f67b1bb50b6-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "d02a9e40-3da0-4d17-8694-4f67b1bb50b6" (UID: "d02a9e40-3da0-4d17-8694-4f67b1bb50b6"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.328712 4594 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d02a9e40-3da0-4d17-8694-4f67b1bb50b6-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.328739 4594 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/241daf99-de92-4580-bec5-138d9356b784-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.328750 4594 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/241daf99-de92-4580-bec5-138d9356b784-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.328761 4594 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/241daf99-de92-4580-bec5-138d9356b784-combined-ca-bundle\") on node \"crc\" 
DevicePath \"\"" Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.328772 4594 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d02a9e40-3da0-4d17-8694-4f67b1bb50b6-public-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.328781 4594 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d02a9e40-3da0-4d17-8694-4f67b1bb50b6-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.328790 4594 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d02a9e40-3da0-4d17-8694-4f67b1bb50b6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.525864 4594 generic.go:334] "Generic (PLEG): container finished" podID="0e1f7ae3-c18c-4c09-aab4-d8f30450b730" containerID="10ca4ad4b70b3c49c150138aaf360fbe30110a2357b932311894d8989130e937" exitCode=0 Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.526038 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0e1f7ae3-c18c-4c09-aab4-d8f30450b730","Type":"ContainerDied","Data":"10ca4ad4b70b3c49c150138aaf360fbe30110a2357b932311894d8989130e937"} Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.529968 4594 generic.go:334] "Generic (PLEG): container finished" podID="241daf99-de92-4580-bec5-138d9356b784" containerID="9f48c153d0e592cdd9839d42d29322fa2a29306b09ab03122a334e58a2841e1f" exitCode=0 Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.530066 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"241daf99-de92-4580-bec5-138d9356b784","Type":"ContainerDied","Data":"9f48c153d0e592cdd9839d42d29322fa2a29306b09ab03122a334e58a2841e1f"} Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 
05:47:46.530111 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"241daf99-de92-4580-bec5-138d9356b784","Type":"ContainerDied","Data":"3427377646fe8af6965886edf00e3197e65ce8fd41000e0d76ffb3f74835e053"} Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.530145 4594 scope.go:117] "RemoveContainer" containerID="9f48c153d0e592cdd9839d42d29322fa2a29306b09ab03122a334e58a2841e1f" Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.530292 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.533175 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d02a9e40-3da0-4d17-8694-4f67b1bb50b6","Type":"ContainerDied","Data":"e51d1993b3c44d2be750f41405d59f5b5ff990c76c2f9636ccb0f78adc84c6c5"} Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.533322 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.568487 4594 scope.go:117] "RemoveContainer" containerID="e9f13975e833b1e2021391fb9c17d6e7a6d0ede6b1bbc0a86daeebe15b1087f4" Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.571288 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.579753 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.586624 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.596465 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.596814 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.600480 4594 scope.go:117] "RemoveContainer" containerID="9f48c153d0e592cdd9839d42d29322fa2a29306b09ab03122a334e58a2841e1f" Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.601930 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Nov 29 05:47:46 crc kubenswrapper[4594]: E1129 05:47:46.602339 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d02a9e40-3da0-4d17-8694-4f67b1bb50b6" containerName="nova-api-api" Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.602352 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="d02a9e40-3da0-4d17-8694-4f67b1bb50b6" containerName="nova-api-api" Nov 29 05:47:46 crc kubenswrapper[4594]: E1129 05:47:46.602377 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d02a9e40-3da0-4d17-8694-4f67b1bb50b6" containerName="nova-api-log" Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.602383 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="d02a9e40-3da0-4d17-8694-4f67b1bb50b6" containerName="nova-api-log" Nov 29 05:47:46 crc kubenswrapper[4594]: E1129 05:47:46.602391 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a289c272-abe9-4098-951e-d6a10ce647ab" containerName="nova-manage" Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.602397 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="a289c272-abe9-4098-951e-d6a10ce647ab" containerName="nova-manage" Nov 29 05:47:46 crc kubenswrapper[4594]: E1129 05:47:46.602425 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e1f7ae3-c18c-4c09-aab4-d8f30450b730" containerName="nova-scheduler-scheduler" Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.602431 4594 
state_mem.go:107] "Deleted CPUSet assignment" podUID="0e1f7ae3-c18c-4c09-aab4-d8f30450b730" containerName="nova-scheduler-scheduler" Nov 29 05:47:46 crc kubenswrapper[4594]: E1129 05:47:46.602440 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="241daf99-de92-4580-bec5-138d9356b784" containerName="nova-metadata-metadata" Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.602445 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="241daf99-de92-4580-bec5-138d9356b784" containerName="nova-metadata-metadata" Nov 29 05:47:46 crc kubenswrapper[4594]: E1129 05:47:46.602452 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="241daf99-de92-4580-bec5-138d9356b784" containerName="nova-metadata-log" Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.602459 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="241daf99-de92-4580-bec5-138d9356b784" containerName="nova-metadata-log" Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.602661 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="241daf99-de92-4580-bec5-138d9356b784" containerName="nova-metadata-metadata" Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.602681 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e1f7ae3-c18c-4c09-aab4-d8f30450b730" containerName="nova-scheduler-scheduler" Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.602691 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="241daf99-de92-4580-bec5-138d9356b784" containerName="nova-metadata-log" Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.602699 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="d02a9e40-3da0-4d17-8694-4f67b1bb50b6" containerName="nova-api-api" Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.602709 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="a289c272-abe9-4098-951e-d6a10ce647ab" containerName="nova-manage" Nov 29 05:47:46 crc 
kubenswrapper[4594]: I1129 05:47:46.602718 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="d02a9e40-3da0-4d17-8694-4f67b1bb50b6" containerName="nova-api-log" Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.604839 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 29 05:47:46 crc kubenswrapper[4594]: E1129 05:47:46.605500 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f48c153d0e592cdd9839d42d29322fa2a29306b09ab03122a334e58a2841e1f\": container with ID starting with 9f48c153d0e592cdd9839d42d29322fa2a29306b09ab03122a334e58a2841e1f not found: ID does not exist" containerID="9f48c153d0e592cdd9839d42d29322fa2a29306b09ab03122a334e58a2841e1f" Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.605531 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f48c153d0e592cdd9839d42d29322fa2a29306b09ab03122a334e58a2841e1f"} err="failed to get container status \"9f48c153d0e592cdd9839d42d29322fa2a29306b09ab03122a334e58a2841e1f\": rpc error: code = NotFound desc = could not find container \"9f48c153d0e592cdd9839d42d29322fa2a29306b09ab03122a334e58a2841e1f\": container with ID starting with 9f48c153d0e592cdd9839d42d29322fa2a29306b09ab03122a334e58a2841e1f not found: ID does not exist" Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.605553 4594 scope.go:117] "RemoveContainer" containerID="e9f13975e833b1e2021391fb9c17d6e7a6d0ede6b1bbc0a86daeebe15b1087f4" Nov 29 05:47:46 crc kubenswrapper[4594]: E1129 05:47:46.605971 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9f13975e833b1e2021391fb9c17d6e7a6d0ede6b1bbc0a86daeebe15b1087f4\": container with ID starting with e9f13975e833b1e2021391fb9c17d6e7a6d0ede6b1bbc0a86daeebe15b1087f4 not found: ID does not exist" 
containerID="e9f13975e833b1e2021391fb9c17d6e7a6d0ede6b1bbc0a86daeebe15b1087f4" Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.605997 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9f13975e833b1e2021391fb9c17d6e7a6d0ede6b1bbc0a86daeebe15b1087f4"} err="failed to get container status \"e9f13975e833b1e2021391fb9c17d6e7a6d0ede6b1bbc0a86daeebe15b1087f4\": rpc error: code = NotFound desc = could not find container \"e9f13975e833b1e2021391fb9c17d6e7a6d0ede6b1bbc0a86daeebe15b1087f4\": container with ID starting with e9f13975e833b1e2021391fb9c17d6e7a6d0ede6b1bbc0a86daeebe15b1087f4 not found: ID does not exist" Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.606017 4594 scope.go:117] "RemoveContainer" containerID="09e82956123a0bdf123549891b98d61d5c628e8cbd806667e129f60b90cedcf3" Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.615550 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.615704 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.616035 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.617305 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.619674 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.619853 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.634221 4594 scope.go:117] "RemoveContainer" containerID="a3f24e5456d2f677492f6d792212f1682e5d00bcff4ac9cc9dfe9b59ba6bfcfa" Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.635921 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.636795 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.649734 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.735638 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e1f7ae3-c18c-4c09-aab4-d8f30450b730-combined-ca-bundle\") pod \"0e1f7ae3-c18c-4c09-aab4-d8f30450b730\" (UID: \"0e1f7ae3-c18c-4c09-aab4-d8f30450b730\") " Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.735739 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e1f7ae3-c18c-4c09-aab4-d8f30450b730-config-data\") pod \"0e1f7ae3-c18c-4c09-aab4-d8f30450b730\" (UID: \"0e1f7ae3-c18c-4c09-aab4-d8f30450b730\") " Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.735986 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8jmzk\" (UniqueName: 
\"kubernetes.io/projected/0e1f7ae3-c18c-4c09-aab4-d8f30450b730-kube-api-access-8jmzk\") pod \"0e1f7ae3-c18c-4c09-aab4-d8f30450b730\" (UID: \"0e1f7ae3-c18c-4c09-aab4-d8f30450b730\") " Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.736576 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76860e6e-cd4c-4f9e-ad65-f1cec5dfe17e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"76860e6e-cd4c-4f9e-ad65-f1cec5dfe17e\") " pod="openstack/nova-api-0" Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.736635 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c168bec-2ad5-431a-ad8e-ef04de7635b4-config-data\") pod \"nova-metadata-0\" (UID: \"7c168bec-2ad5-431a-ad8e-ef04de7635b4\") " pod="openstack/nova-metadata-0" Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.736689 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c168bec-2ad5-431a-ad8e-ef04de7635b4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7c168bec-2ad5-431a-ad8e-ef04de7635b4\") " pod="openstack/nova-metadata-0" Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.736713 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbrql\" (UniqueName: \"kubernetes.io/projected/76860e6e-cd4c-4f9e-ad65-f1cec5dfe17e-kube-api-access-dbrql\") pod \"nova-api-0\" (UID: \"76860e6e-cd4c-4f9e-ad65-f1cec5dfe17e\") " pod="openstack/nova-api-0" Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.736731 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/76860e6e-cd4c-4f9e-ad65-f1cec5dfe17e-internal-tls-certs\") 
pod \"nova-api-0\" (UID: \"76860e6e-cd4c-4f9e-ad65-f1cec5dfe17e\") " pod="openstack/nova-api-0" Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.736799 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76860e6e-cd4c-4f9e-ad65-f1cec5dfe17e-config-data\") pod \"nova-api-0\" (UID: \"76860e6e-cd4c-4f9e-ad65-f1cec5dfe17e\") " pod="openstack/nova-api-0" Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.736893 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c168bec-2ad5-431a-ad8e-ef04de7635b4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7c168bec-2ad5-431a-ad8e-ef04de7635b4\") " pod="openstack/nova-metadata-0" Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.736965 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wm88\" (UniqueName: \"kubernetes.io/projected/7c168bec-2ad5-431a-ad8e-ef04de7635b4-kube-api-access-7wm88\") pod \"nova-metadata-0\" (UID: \"7c168bec-2ad5-431a-ad8e-ef04de7635b4\") " pod="openstack/nova-metadata-0" Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.737004 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76860e6e-cd4c-4f9e-ad65-f1cec5dfe17e-logs\") pod \"nova-api-0\" (UID: \"76860e6e-cd4c-4f9e-ad65-f1cec5dfe17e\") " pod="openstack/nova-api-0" Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.737096 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7c168bec-2ad5-431a-ad8e-ef04de7635b4-logs\") pod \"nova-metadata-0\" (UID: \"7c168bec-2ad5-431a-ad8e-ef04de7635b4\") " pod="openstack/nova-metadata-0" Nov 29 05:47:46 crc kubenswrapper[4594]: 
I1129 05:47:46.737165 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/76860e6e-cd4c-4f9e-ad65-f1cec5dfe17e-public-tls-certs\") pod \"nova-api-0\" (UID: \"76860e6e-cd4c-4f9e-ad65-f1cec5dfe17e\") " pod="openstack/nova-api-0" Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.738804 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e1f7ae3-c18c-4c09-aab4-d8f30450b730-kube-api-access-8jmzk" (OuterVolumeSpecName: "kube-api-access-8jmzk") pod "0e1f7ae3-c18c-4c09-aab4-d8f30450b730" (UID: "0e1f7ae3-c18c-4c09-aab4-d8f30450b730"). InnerVolumeSpecName "kube-api-access-8jmzk". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.757391 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e1f7ae3-c18c-4c09-aab4-d8f30450b730-config-data" (OuterVolumeSpecName: "config-data") pod "0e1f7ae3-c18c-4c09-aab4-d8f30450b730" (UID: "0e1f7ae3-c18c-4c09-aab4-d8f30450b730"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.774175 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e1f7ae3-c18c-4c09-aab4-d8f30450b730-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0e1f7ae3-c18c-4c09-aab4-d8f30450b730" (UID: "0e1f7ae3-c18c-4c09-aab4-d8f30450b730"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.841801 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76860e6e-cd4c-4f9e-ad65-f1cec5dfe17e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"76860e6e-cd4c-4f9e-ad65-f1cec5dfe17e\") " pod="openstack/nova-api-0" Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.841970 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c168bec-2ad5-431a-ad8e-ef04de7635b4-config-data\") pod \"nova-metadata-0\" (UID: \"7c168bec-2ad5-431a-ad8e-ef04de7635b4\") " pod="openstack/nova-metadata-0" Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.842107 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c168bec-2ad5-431a-ad8e-ef04de7635b4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7c168bec-2ad5-431a-ad8e-ef04de7635b4\") " pod="openstack/nova-metadata-0" Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.842176 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbrql\" (UniqueName: \"kubernetes.io/projected/76860e6e-cd4c-4f9e-ad65-f1cec5dfe17e-kube-api-access-dbrql\") pod \"nova-api-0\" (UID: \"76860e6e-cd4c-4f9e-ad65-f1cec5dfe17e\") " pod="openstack/nova-api-0" Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.842281 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/76860e6e-cd4c-4f9e-ad65-f1cec5dfe17e-internal-tls-certs\") pod \"nova-api-0\" (UID: \"76860e6e-cd4c-4f9e-ad65-f1cec5dfe17e\") " pod="openstack/nova-api-0" Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.842426 4594 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76860e6e-cd4c-4f9e-ad65-f1cec5dfe17e-config-data\") pod \"nova-api-0\" (UID: \"76860e6e-cd4c-4f9e-ad65-f1cec5dfe17e\") " pod="openstack/nova-api-0" Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.842533 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c168bec-2ad5-431a-ad8e-ef04de7635b4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7c168bec-2ad5-431a-ad8e-ef04de7635b4\") " pod="openstack/nova-metadata-0" Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.842662 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wm88\" (UniqueName: \"kubernetes.io/projected/7c168bec-2ad5-431a-ad8e-ef04de7635b4-kube-api-access-7wm88\") pod \"nova-metadata-0\" (UID: \"7c168bec-2ad5-431a-ad8e-ef04de7635b4\") " pod="openstack/nova-metadata-0" Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.842749 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76860e6e-cd4c-4f9e-ad65-f1cec5dfe17e-logs\") pod \"nova-api-0\" (UID: \"76860e6e-cd4c-4f9e-ad65-f1cec5dfe17e\") " pod="openstack/nova-api-0" Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.842919 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7c168bec-2ad5-431a-ad8e-ef04de7635b4-logs\") pod \"nova-metadata-0\" (UID: \"7c168bec-2ad5-431a-ad8e-ef04de7635b4\") " pod="openstack/nova-metadata-0" Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.843043 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/76860e6e-cd4c-4f9e-ad65-f1cec5dfe17e-public-tls-certs\") pod \"nova-api-0\" (UID: \"76860e6e-cd4c-4f9e-ad65-f1cec5dfe17e\") " pod="openstack/nova-api-0" Nov 
29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.843268 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8jmzk\" (UniqueName: \"kubernetes.io/projected/0e1f7ae3-c18c-4c09-aab4-d8f30450b730-kube-api-access-8jmzk\") on node \"crc\" DevicePath \"\"" Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.843451 4594 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e1f7ae3-c18c-4c09-aab4-d8f30450b730-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.843541 4594 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e1f7ae3-c18c-4c09-aab4-d8f30450b730-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.843371 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76860e6e-cd4c-4f9e-ad65-f1cec5dfe17e-logs\") pod \"nova-api-0\" (UID: \"76860e6e-cd4c-4f9e-ad65-f1cec5dfe17e\") " pod="openstack/nova-api-0" Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.845285 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7c168bec-2ad5-431a-ad8e-ef04de7635b4-logs\") pod \"nova-metadata-0\" (UID: \"7c168bec-2ad5-431a-ad8e-ef04de7635b4\") " pod="openstack/nova-metadata-0" Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.846960 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/76860e6e-cd4c-4f9e-ad65-f1cec5dfe17e-internal-tls-certs\") pod \"nova-api-0\" (UID: \"76860e6e-cd4c-4f9e-ad65-f1cec5dfe17e\") " pod="openstack/nova-api-0" Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.848147 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/7c168bec-2ad5-431a-ad8e-ef04de7635b4-config-data\") pod \"nova-metadata-0\" (UID: \"7c168bec-2ad5-431a-ad8e-ef04de7635b4\") " pod="openstack/nova-metadata-0" Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.849870 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c168bec-2ad5-431a-ad8e-ef04de7635b4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7c168bec-2ad5-431a-ad8e-ef04de7635b4\") " pod="openstack/nova-metadata-0" Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.850678 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76860e6e-cd4c-4f9e-ad65-f1cec5dfe17e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"76860e6e-cd4c-4f9e-ad65-f1cec5dfe17e\") " pod="openstack/nova-api-0" Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.850767 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76860e6e-cd4c-4f9e-ad65-f1cec5dfe17e-config-data\") pod \"nova-api-0\" (UID: \"76860e6e-cd4c-4f9e-ad65-f1cec5dfe17e\") " pod="openstack/nova-api-0" Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.856506 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/76860e6e-cd4c-4f9e-ad65-f1cec5dfe17e-public-tls-certs\") pod \"nova-api-0\" (UID: \"76860e6e-cd4c-4f9e-ad65-f1cec5dfe17e\") " pod="openstack/nova-api-0" Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.858722 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c168bec-2ad5-431a-ad8e-ef04de7635b4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7c168bec-2ad5-431a-ad8e-ef04de7635b4\") " pod="openstack/nova-metadata-0" Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 
05:47:46.860728 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbrql\" (UniqueName: \"kubernetes.io/projected/76860e6e-cd4c-4f9e-ad65-f1cec5dfe17e-kube-api-access-dbrql\") pod \"nova-api-0\" (UID: \"76860e6e-cd4c-4f9e-ad65-f1cec5dfe17e\") " pod="openstack/nova-api-0" Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.861590 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wm88\" (UniqueName: \"kubernetes.io/projected/7c168bec-2ad5-431a-ad8e-ef04de7635b4-kube-api-access-7wm88\") pod \"nova-metadata-0\" (UID: \"7c168bec-2ad5-431a-ad8e-ef04de7635b4\") " pod="openstack/nova-metadata-0" Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.927644 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 29 05:47:46 crc kubenswrapper[4594]: I1129 05:47:46.936878 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 29 05:47:47 crc kubenswrapper[4594]: I1129 05:47:47.426531 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 29 05:47:47 crc kubenswrapper[4594]: I1129 05:47:47.438348 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 29 05:47:47 crc kubenswrapper[4594]: W1129 05:47:47.454073 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7c168bec_2ad5_431a_ad8e_ef04de7635b4.slice/crio-990b1d7d5bab922e99afebe5d181f25bbaf1237c82f0459cdc427921c6008994 WatchSource:0}: Error finding container 990b1d7d5bab922e99afebe5d181f25bbaf1237c82f0459cdc427921c6008994: Status 404 returned error can't find the container with id 990b1d7d5bab922e99afebe5d181f25bbaf1237c82f0459cdc427921c6008994 Nov 29 05:47:47 crc kubenswrapper[4594]: I1129 05:47:47.548959 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-api-0" event={"ID":"76860e6e-cd4c-4f9e-ad65-f1cec5dfe17e","Type":"ContainerStarted","Data":"d62a422ac067d7d0c3bae032e5bcf978c1b845ee600af239302ed2d1f8443e37"} Nov 29 05:47:47 crc kubenswrapper[4594]: I1129 05:47:47.551842 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0e1f7ae3-c18c-4c09-aab4-d8f30450b730","Type":"ContainerDied","Data":"b8c7be565aba34d7f932e7eb25603adf96ade2deae8af18e88eaec06207caf0d"} Nov 29 05:47:47 crc kubenswrapper[4594]: I1129 05:47:47.551885 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 29 05:47:47 crc kubenswrapper[4594]: I1129 05:47:47.551920 4594 scope.go:117] "RemoveContainer" containerID="10ca4ad4b70b3c49c150138aaf360fbe30110a2357b932311894d8989130e937" Nov 29 05:47:47 crc kubenswrapper[4594]: I1129 05:47:47.553731 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7c168bec-2ad5-431a-ad8e-ef04de7635b4","Type":"ContainerStarted","Data":"990b1d7d5bab922e99afebe5d181f25bbaf1237c82f0459cdc427921c6008994"} Nov 29 05:47:47 crc kubenswrapper[4594]: I1129 05:47:47.596696 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 29 05:47:47 crc kubenswrapper[4594]: I1129 05:47:47.610413 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Nov 29 05:47:47 crc kubenswrapper[4594]: I1129 05:47:47.624673 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Nov 29 05:47:47 crc kubenswrapper[4594]: I1129 05:47:47.626208 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 29 05:47:47 crc kubenswrapper[4594]: I1129 05:47:47.626842 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 29 05:47:47 crc kubenswrapper[4594]: I1129 05:47:47.634430 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Nov 29 05:47:47 crc kubenswrapper[4594]: I1129 05:47:47.668997 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae32608f-19a0-4825-8fab-36c89e217b50-config-data\") pod \"nova-scheduler-0\" (UID: \"ae32608f-19a0-4825-8fab-36c89e217b50\") " pod="openstack/nova-scheduler-0" Nov 29 05:47:47 crc kubenswrapper[4594]: I1129 05:47:47.669057 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae32608f-19a0-4825-8fab-36c89e217b50-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ae32608f-19a0-4825-8fab-36c89e217b50\") " pod="openstack/nova-scheduler-0" Nov 29 05:47:47 crc kubenswrapper[4594]: I1129 05:47:47.669315 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7vkm\" (UniqueName: \"kubernetes.io/projected/ae32608f-19a0-4825-8fab-36c89e217b50-kube-api-access-s7vkm\") pod \"nova-scheduler-0\" (UID: \"ae32608f-19a0-4825-8fab-36c89e217b50\") " pod="openstack/nova-scheduler-0" Nov 29 05:47:47 crc kubenswrapper[4594]: I1129 05:47:47.772243 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae32608f-19a0-4825-8fab-36c89e217b50-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ae32608f-19a0-4825-8fab-36c89e217b50\") " pod="openstack/nova-scheduler-0" Nov 29 05:47:47 crc kubenswrapper[4594]: I1129 05:47:47.772519 4594 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7vkm\" (UniqueName: \"kubernetes.io/projected/ae32608f-19a0-4825-8fab-36c89e217b50-kube-api-access-s7vkm\") pod \"nova-scheduler-0\" (UID: \"ae32608f-19a0-4825-8fab-36c89e217b50\") " pod="openstack/nova-scheduler-0" Nov 29 05:47:47 crc kubenswrapper[4594]: I1129 05:47:47.772561 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae32608f-19a0-4825-8fab-36c89e217b50-config-data\") pod \"nova-scheduler-0\" (UID: \"ae32608f-19a0-4825-8fab-36c89e217b50\") " pod="openstack/nova-scheduler-0" Nov 29 05:47:47 crc kubenswrapper[4594]: I1129 05:47:47.776801 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae32608f-19a0-4825-8fab-36c89e217b50-config-data\") pod \"nova-scheduler-0\" (UID: \"ae32608f-19a0-4825-8fab-36c89e217b50\") " pod="openstack/nova-scheduler-0" Nov 29 05:47:47 crc kubenswrapper[4594]: I1129 05:47:47.778838 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae32608f-19a0-4825-8fab-36c89e217b50-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ae32608f-19a0-4825-8fab-36c89e217b50\") " pod="openstack/nova-scheduler-0" Nov 29 05:47:47 crc kubenswrapper[4594]: I1129 05:47:47.787940 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7vkm\" (UniqueName: \"kubernetes.io/projected/ae32608f-19a0-4825-8fab-36c89e217b50-kube-api-access-s7vkm\") pod \"nova-scheduler-0\" (UID: \"ae32608f-19a0-4825-8fab-36c89e217b50\") " pod="openstack/nova-scheduler-0" Nov 29 05:47:47 crc kubenswrapper[4594]: I1129 05:47:47.946997 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 29 05:47:48 crc kubenswrapper[4594]: I1129 05:47:48.096354 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e1f7ae3-c18c-4c09-aab4-d8f30450b730" path="/var/lib/kubelet/pods/0e1f7ae3-c18c-4c09-aab4-d8f30450b730/volumes" Nov 29 05:47:48 crc kubenswrapper[4594]: I1129 05:47:48.097314 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="241daf99-de92-4580-bec5-138d9356b784" path="/var/lib/kubelet/pods/241daf99-de92-4580-bec5-138d9356b784/volumes" Nov 29 05:47:48 crc kubenswrapper[4594]: I1129 05:47:48.097922 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d02a9e40-3da0-4d17-8694-4f67b1bb50b6" path="/var/lib/kubelet/pods/d02a9e40-3da0-4d17-8694-4f67b1bb50b6/volumes" Nov 29 05:47:48 crc kubenswrapper[4594]: I1129 05:47:48.364863 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 29 05:47:48 crc kubenswrapper[4594]: W1129 05:47:48.365330 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podae32608f_19a0_4825_8fab_36c89e217b50.slice/crio-dddbb9fa48c043f44de92cdb538079cba4dabc4d3cc439a2fd1b43ca7a173fd1 WatchSource:0}: Error finding container dddbb9fa48c043f44de92cdb538079cba4dabc4d3cc439a2fd1b43ca7a173fd1: Status 404 returned error can't find the container with id dddbb9fa48c043f44de92cdb538079cba4dabc4d3cc439a2fd1b43ca7a173fd1 Nov 29 05:47:48 crc kubenswrapper[4594]: I1129 05:47:48.571465 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7c168bec-2ad5-431a-ad8e-ef04de7635b4","Type":"ContainerStarted","Data":"97ea61284217d208e4d3604f50e61c941dc8cf2ee0d98b75925912778c0556ca"} Nov 29 05:47:48 crc kubenswrapper[4594]: I1129 05:47:48.571671 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"7c168bec-2ad5-431a-ad8e-ef04de7635b4","Type":"ContainerStarted","Data":"8e41870f6508f6a10d6766d0003ba009c16e5c363d6cdb02f9aa79b156f17a3d"} Nov 29 05:47:48 crc kubenswrapper[4594]: I1129 05:47:48.572930 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ae32608f-19a0-4825-8fab-36c89e217b50","Type":"ContainerStarted","Data":"3f1e9c4789c5ecd6eb98ad3b4132cec56964a90b3d56b6c5ad7504c65b8c7840"} Nov 29 05:47:48 crc kubenswrapper[4594]: I1129 05:47:48.572956 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ae32608f-19a0-4825-8fab-36c89e217b50","Type":"ContainerStarted","Data":"dddbb9fa48c043f44de92cdb538079cba4dabc4d3cc439a2fd1b43ca7a173fd1"} Nov 29 05:47:48 crc kubenswrapper[4594]: I1129 05:47:48.574743 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"76860e6e-cd4c-4f9e-ad65-f1cec5dfe17e","Type":"ContainerStarted","Data":"2e0b2eaedb3ddde8ad0ff7682066b4d8fd8c893784fe4660eef9f51eb13a628b"} Nov 29 05:47:48 crc kubenswrapper[4594]: I1129 05:47:48.574794 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"76860e6e-cd4c-4f9e-ad65-f1cec5dfe17e","Type":"ContainerStarted","Data":"8a5c47dd86abf18954cd36a5bfc72f503f4195d7f352252f7deeac79e2d1f1e1"} Nov 29 05:47:48 crc kubenswrapper[4594]: I1129 05:47:48.594013 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.594000119 podStartE2EDuration="2.594000119s" podCreationTimestamp="2025-11-29 05:47:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:47:48.585906918 +0000 UTC m=+1192.826416138" watchObservedRunningTime="2025-11-29 05:47:48.594000119 +0000 UTC m=+1192.834509339" Nov 29 05:47:48 crc kubenswrapper[4594]: I1129 05:47:48.618998 4594 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.61897995 podStartE2EDuration="2.61897995s" podCreationTimestamp="2025-11-29 05:47:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:47:48.604744327 +0000 UTC m=+1192.845253546" watchObservedRunningTime="2025-11-29 05:47:48.61897995 +0000 UTC m=+1192.859489170" Nov 29 05:47:48 crc kubenswrapper[4594]: I1129 05:47:48.619545 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.619540383 podStartE2EDuration="1.619540383s" podCreationTimestamp="2025-11-29 05:47:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:47:48.615925816 +0000 UTC m=+1192.856435036" watchObservedRunningTime="2025-11-29 05:47:48.619540383 +0000 UTC m=+1192.860049604" Nov 29 05:47:51 crc kubenswrapper[4594]: I1129 05:47:51.928936 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 29 05:47:51 crc kubenswrapper[4594]: I1129 05:47:51.930417 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 29 05:47:52 crc kubenswrapper[4594]: I1129 05:47:52.947840 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Nov 29 05:47:56 crc kubenswrapper[4594]: I1129 05:47:56.929037 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Nov 29 05:47:56 crc kubenswrapper[4594]: I1129 05:47:56.929746 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Nov 29 05:47:56 crc kubenswrapper[4594]: I1129 05:47:56.937711 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openstack/nova-api-0" Nov 29 05:47:56 crc kubenswrapper[4594]: I1129 05:47:56.937761 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 29 05:47:57 crc kubenswrapper[4594]: I1129 05:47:57.937494 4594 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="7c168bec-2ad5-431a-ad8e-ef04de7635b4" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.221:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Nov 29 05:47:57 crc kubenswrapper[4594]: I1129 05:47:57.947941 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Nov 29 05:47:57 crc kubenswrapper[4594]: I1129 05:47:57.957490 4594 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="76860e6e-cd4c-4f9e-ad65-f1cec5dfe17e" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.222:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Nov 29 05:47:57 crc kubenswrapper[4594]: I1129 05:47:57.957510 4594 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="7c168bec-2ad5-431a-ad8e-ef04de7635b4" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.221:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 29 05:47:57 crc kubenswrapper[4594]: I1129 05:47:57.957637 4594 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="76860e6e-cd4c-4f9e-ad65-f1cec5dfe17e" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.222:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Nov 29 05:47:57 crc kubenswrapper[4594]: I1129 05:47:57.976804 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/nova-scheduler-0" Nov 29 05:47:58 crc kubenswrapper[4594]: I1129 05:47:58.711352 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Nov 29 05:48:02 crc kubenswrapper[4594]: I1129 05:48:02.688449 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Nov 29 05:48:06 crc kubenswrapper[4594]: I1129 05:48:06.934368 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Nov 29 05:48:06 crc kubenswrapper[4594]: I1129 05:48:06.935319 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Nov 29 05:48:06 crc kubenswrapper[4594]: I1129 05:48:06.939735 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Nov 29 05:48:06 crc kubenswrapper[4594]: I1129 05:48:06.945562 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Nov 29 05:48:06 crc kubenswrapper[4594]: I1129 05:48:06.945958 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Nov 29 05:48:06 crc kubenswrapper[4594]: I1129 05:48:06.946577 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Nov 29 05:48:06 crc kubenswrapper[4594]: I1129 05:48:06.955992 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Nov 29 05:48:07 crc kubenswrapper[4594]: I1129 05:48:07.812708 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Nov 29 05:48:07 crc kubenswrapper[4594]: I1129 05:48:07.817717 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Nov 29 05:48:07 crc kubenswrapper[4594]: I1129 05:48:07.820116 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/nova-api-0" Nov 29 05:48:14 crc kubenswrapper[4594]: I1129 05:48:14.638387 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 29 05:48:15 crc kubenswrapper[4594]: I1129 05:48:15.661019 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 29 05:48:15 crc kubenswrapper[4594]: I1129 05:48:15.800110 4594 patch_prober.go:28] interesting pod/machine-config-daemon-ggz4n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 05:48:15 crc kubenswrapper[4594]: I1129 05:48:15.800175 4594 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 05:48:17 crc kubenswrapper[4594]: I1129 05:48:17.794325 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="1ff736d8-8719-402e-95c9-1d790c1dff5e" containerName="rabbitmq" containerID="cri-o://3a62ad72db403bb720849a88b9b2000a5a92e83e4cbb2f26b9b5510384634882" gracePeriod=604797 Nov 29 05:48:18 crc kubenswrapper[4594]: I1129 05:48:18.231068 4594 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="1ff736d8-8719-402e-95c9-1d790c1dff5e" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.101:5671: connect: connection refused" Nov 29 05:48:18 crc kubenswrapper[4594]: I1129 05:48:18.754504 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="47b6950d-0d97-486d-aaec-2a9eaaf74027" containerName="rabbitmq" 
containerID="cri-o://8d71481afa04edc87a81fb7329f95a03dd0d6110ec5d8956af3e7a934aa4fb1c" gracePeriod=604797 Nov 29 05:48:18 crc kubenswrapper[4594]: I1129 05:48:18.763596 4594 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="47b6950d-0d97-486d-aaec-2a9eaaf74027" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.102:5671: connect: connection refused" Nov 29 05:48:19 crc kubenswrapper[4594]: I1129 05:48:19.444544 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 29 05:48:19 crc kubenswrapper[4594]: I1129 05:48:19.457486 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1ff736d8-8719-402e-95c9-1d790c1dff5e-rabbitmq-confd\") pod \"1ff736d8-8719-402e-95c9-1d790c1dff5e\" (UID: \"1ff736d8-8719-402e-95c9-1d790c1dff5e\") " Nov 29 05:48:19 crc kubenswrapper[4594]: I1129 05:48:19.457540 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1ff736d8-8719-402e-95c9-1d790c1dff5e-pod-info\") pod \"1ff736d8-8719-402e-95c9-1d790c1dff5e\" (UID: \"1ff736d8-8719-402e-95c9-1d790c1dff5e\") " Nov 29 05:48:19 crc kubenswrapper[4594]: I1129 05:48:19.457594 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1ff736d8-8719-402e-95c9-1d790c1dff5e-server-conf\") pod \"1ff736d8-8719-402e-95c9-1d790c1dff5e\" (UID: \"1ff736d8-8719-402e-95c9-1d790c1dff5e\") " Nov 29 05:48:19 crc kubenswrapper[4594]: I1129 05:48:19.457613 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1ff736d8-8719-402e-95c9-1d790c1dff5e-config-data\") pod \"1ff736d8-8719-402e-95c9-1d790c1dff5e\" (UID: \"1ff736d8-8719-402e-95c9-1d790c1dff5e\") " 
Nov 29 05:48:19 crc kubenswrapper[4594]: I1129 05:48:19.457635 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1ff736d8-8719-402e-95c9-1d790c1dff5e-erlang-cookie-secret\") pod \"1ff736d8-8719-402e-95c9-1d790c1dff5e\" (UID: \"1ff736d8-8719-402e-95c9-1d790c1dff5e\") " Nov 29 05:48:19 crc kubenswrapper[4594]: I1129 05:48:19.457653 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wbbsd\" (UniqueName: \"kubernetes.io/projected/1ff736d8-8719-402e-95c9-1d790c1dff5e-kube-api-access-wbbsd\") pod \"1ff736d8-8719-402e-95c9-1d790c1dff5e\" (UID: \"1ff736d8-8719-402e-95c9-1d790c1dff5e\") " Nov 29 05:48:19 crc kubenswrapper[4594]: I1129 05:48:19.457670 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"1ff736d8-8719-402e-95c9-1d790c1dff5e\" (UID: \"1ff736d8-8719-402e-95c9-1d790c1dff5e\") " Nov 29 05:48:19 crc kubenswrapper[4594]: I1129 05:48:19.457693 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1ff736d8-8719-402e-95c9-1d790c1dff5e-plugins-conf\") pod \"1ff736d8-8719-402e-95c9-1d790c1dff5e\" (UID: \"1ff736d8-8719-402e-95c9-1d790c1dff5e\") " Nov 29 05:48:19 crc kubenswrapper[4594]: I1129 05:48:19.457711 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1ff736d8-8719-402e-95c9-1d790c1dff5e-rabbitmq-tls\") pod \"1ff736d8-8719-402e-95c9-1d790c1dff5e\" (UID: \"1ff736d8-8719-402e-95c9-1d790c1dff5e\") " Nov 29 05:48:19 crc kubenswrapper[4594]: I1129 05:48:19.457730 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/1ff736d8-8719-402e-95c9-1d790c1dff5e-rabbitmq-plugins\") pod \"1ff736d8-8719-402e-95c9-1d790c1dff5e\" (UID: \"1ff736d8-8719-402e-95c9-1d790c1dff5e\") " Nov 29 05:48:19 crc kubenswrapper[4594]: I1129 05:48:19.457776 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1ff736d8-8719-402e-95c9-1d790c1dff5e-rabbitmq-erlang-cookie\") pod \"1ff736d8-8719-402e-95c9-1d790c1dff5e\" (UID: \"1ff736d8-8719-402e-95c9-1d790c1dff5e\") " Nov 29 05:48:19 crc kubenswrapper[4594]: I1129 05:48:19.459280 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ff736d8-8719-402e-95c9-1d790c1dff5e-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "1ff736d8-8719-402e-95c9-1d790c1dff5e" (UID: "1ff736d8-8719-402e-95c9-1d790c1dff5e"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 05:48:19 crc kubenswrapper[4594]: I1129 05:48:19.459776 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ff736d8-8719-402e-95c9-1d790c1dff5e-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "1ff736d8-8719-402e-95c9-1d790c1dff5e" (UID: "1ff736d8-8719-402e-95c9-1d790c1dff5e"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 05:48:19 crc kubenswrapper[4594]: I1129 05:48:19.460332 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ff736d8-8719-402e-95c9-1d790c1dff5e-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "1ff736d8-8719-402e-95c9-1d790c1dff5e" (UID: "1ff736d8-8719-402e-95c9-1d790c1dff5e"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:48:19 crc kubenswrapper[4594]: I1129 05:48:19.464572 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/1ff736d8-8719-402e-95c9-1d790c1dff5e-pod-info" (OuterVolumeSpecName: "pod-info") pod "1ff736d8-8719-402e-95c9-1d790c1dff5e" (UID: "1ff736d8-8719-402e-95c9-1d790c1dff5e"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Nov 29 05:48:19 crc kubenswrapper[4594]: I1129 05:48:19.477456 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ff736d8-8719-402e-95c9-1d790c1dff5e-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "1ff736d8-8719-402e-95c9-1d790c1dff5e" (UID: "1ff736d8-8719-402e-95c9-1d790c1dff5e"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:48:19 crc kubenswrapper[4594]: I1129 05:48:19.477891 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ff736d8-8719-402e-95c9-1d790c1dff5e-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "1ff736d8-8719-402e-95c9-1d790c1dff5e" (UID: "1ff736d8-8719-402e-95c9-1d790c1dff5e"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:48:19 crc kubenswrapper[4594]: I1129 05:48:19.478028 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ff736d8-8719-402e-95c9-1d790c1dff5e-kube-api-access-wbbsd" (OuterVolumeSpecName: "kube-api-access-wbbsd") pod "1ff736d8-8719-402e-95c9-1d790c1dff5e" (UID: "1ff736d8-8719-402e-95c9-1d790c1dff5e"). InnerVolumeSpecName "kube-api-access-wbbsd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:48:19 crc kubenswrapper[4594]: I1129 05:48:19.491786 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "persistence") pod "1ff736d8-8719-402e-95c9-1d790c1dff5e" (UID: "1ff736d8-8719-402e-95c9-1d790c1dff5e"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 29 05:48:19 crc kubenswrapper[4594]: I1129 05:48:19.511069 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ff736d8-8719-402e-95c9-1d790c1dff5e-config-data" (OuterVolumeSpecName: "config-data") pod "1ff736d8-8719-402e-95c9-1d790c1dff5e" (UID: "1ff736d8-8719-402e-95c9-1d790c1dff5e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:48:19 crc kubenswrapper[4594]: I1129 05:48:19.557272 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ff736d8-8719-402e-95c9-1d790c1dff5e-server-conf" (OuterVolumeSpecName: "server-conf") pod "1ff736d8-8719-402e-95c9-1d790c1dff5e" (UID: "1ff736d8-8719-402e-95c9-1d790c1dff5e"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:48:19 crc kubenswrapper[4594]: I1129 05:48:19.561222 4594 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1ff736d8-8719-402e-95c9-1d790c1dff5e-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Nov 29 05:48:19 crc kubenswrapper[4594]: I1129 05:48:19.561291 4594 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1ff736d8-8719-402e-95c9-1d790c1dff5e-pod-info\") on node \"crc\" DevicePath \"\"" Nov 29 05:48:19 crc kubenswrapper[4594]: I1129 05:48:19.561304 4594 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1ff736d8-8719-402e-95c9-1d790c1dff5e-server-conf\") on node \"crc\" DevicePath \"\"" Nov 29 05:48:19 crc kubenswrapper[4594]: I1129 05:48:19.561313 4594 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1ff736d8-8719-402e-95c9-1d790c1dff5e-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 05:48:19 crc kubenswrapper[4594]: I1129 05:48:19.561325 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wbbsd\" (UniqueName: \"kubernetes.io/projected/1ff736d8-8719-402e-95c9-1d790c1dff5e-kube-api-access-wbbsd\") on node \"crc\" DevicePath \"\"" Nov 29 05:48:19 crc kubenswrapper[4594]: I1129 05:48:19.561334 4594 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1ff736d8-8719-402e-95c9-1d790c1dff5e-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Nov 29 05:48:19 crc kubenswrapper[4594]: I1129 05:48:19.561402 4594 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Nov 29 05:48:19 crc kubenswrapper[4594]: I1129 05:48:19.561452 
4594 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1ff736d8-8719-402e-95c9-1d790c1dff5e-plugins-conf\") on node \"crc\" DevicePath \"\"" Nov 29 05:48:19 crc kubenswrapper[4594]: I1129 05:48:19.561461 4594 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1ff736d8-8719-402e-95c9-1d790c1dff5e-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Nov 29 05:48:19 crc kubenswrapper[4594]: I1129 05:48:19.561472 4594 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1ff736d8-8719-402e-95c9-1d790c1dff5e-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Nov 29 05:48:19 crc kubenswrapper[4594]: I1129 05:48:19.598447 4594 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Nov 29 05:48:19 crc kubenswrapper[4594]: I1129 05:48:19.602481 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ff736d8-8719-402e-95c9-1d790c1dff5e-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "1ff736d8-8719-402e-95c9-1d790c1dff5e" (UID: "1ff736d8-8719-402e-95c9-1d790c1dff5e"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:48:19 crc kubenswrapper[4594]: I1129 05:48:19.665521 4594 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1ff736d8-8719-402e-95c9-1d790c1dff5e-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Nov 29 05:48:19 crc kubenswrapper[4594]: I1129 05:48:19.665554 4594 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Nov 29 05:48:19 crc kubenswrapper[4594]: I1129 05:48:19.984347 4594 generic.go:334] "Generic (PLEG): container finished" podID="47b6950d-0d97-486d-aaec-2a9eaaf74027" containerID="8d71481afa04edc87a81fb7329f95a03dd0d6110ec5d8956af3e7a934aa4fb1c" exitCode=0 Nov 29 05:48:19 crc kubenswrapper[4594]: I1129 05:48:19.984466 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"47b6950d-0d97-486d-aaec-2a9eaaf74027","Type":"ContainerDied","Data":"8d71481afa04edc87a81fb7329f95a03dd0d6110ec5d8956af3e7a934aa4fb1c"} Nov 29 05:48:19 crc kubenswrapper[4594]: I1129 05:48:19.987475 4594 generic.go:334] "Generic (PLEG): container finished" podID="1ff736d8-8719-402e-95c9-1d790c1dff5e" containerID="3a62ad72db403bb720849a88b9b2000a5a92e83e4cbb2f26b9b5510384634882" exitCode=0 Nov 29 05:48:19 crc kubenswrapper[4594]: I1129 05:48:19.987510 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1ff736d8-8719-402e-95c9-1d790c1dff5e","Type":"ContainerDied","Data":"3a62ad72db403bb720849a88b9b2000a5a92e83e4cbb2f26b9b5510384634882"} Nov 29 05:48:19 crc kubenswrapper[4594]: I1129 05:48:19.987539 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1ff736d8-8719-402e-95c9-1d790c1dff5e","Type":"ContainerDied","Data":"8ebccee727cb9f1612da3d8b110e643f95d4ae3119cfd567247515f73536dc30"} Nov 29 
05:48:19 crc kubenswrapper[4594]: I1129 05:48:19.987562 4594 scope.go:117] "RemoveContainer" containerID="3a62ad72db403bb720849a88b9b2000a5a92e83e4cbb2f26b9b5510384634882" Nov 29 05:48:19 crc kubenswrapper[4594]: I1129 05:48:19.987719 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 29 05:48:20 crc kubenswrapper[4594]: I1129 05:48:20.058608 4594 scope.go:117] "RemoveContainer" containerID="3d435468534c16e70f94ac0b2adfa267064e52b83844f59cf3f55c599965f5f9" Nov 29 05:48:20 crc kubenswrapper[4594]: I1129 05:48:20.068008 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 29 05:48:20 crc kubenswrapper[4594]: I1129 05:48:20.084119 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 29 05:48:20 crc kubenswrapper[4594]: I1129 05:48:20.106248 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ff736d8-8719-402e-95c9-1d790c1dff5e" path="/var/lib/kubelet/pods/1ff736d8-8719-402e-95c9-1d790c1dff5e/volumes" Nov 29 05:48:20 crc kubenswrapper[4594]: I1129 05:48:20.107858 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Nov 29 05:48:20 crc kubenswrapper[4594]: E1129 05:48:20.108788 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ff736d8-8719-402e-95c9-1d790c1dff5e" containerName="setup-container" Nov 29 05:48:20 crc kubenswrapper[4594]: I1129 05:48:20.108812 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ff736d8-8719-402e-95c9-1d790c1dff5e" containerName="setup-container" Nov 29 05:48:20 crc kubenswrapper[4594]: E1129 05:48:20.108830 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ff736d8-8719-402e-95c9-1d790c1dff5e" containerName="rabbitmq" Nov 29 05:48:20 crc kubenswrapper[4594]: I1129 05:48:20.108837 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ff736d8-8719-402e-95c9-1d790c1dff5e" 
containerName="rabbitmq" Nov 29 05:48:20 crc kubenswrapper[4594]: I1129 05:48:20.109027 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ff736d8-8719-402e-95c9-1d790c1dff5e" containerName="rabbitmq" Nov 29 05:48:20 crc kubenswrapper[4594]: I1129 05:48:20.110115 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 29 05:48:20 crc kubenswrapper[4594]: I1129 05:48:20.111901 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Nov 29 05:48:20 crc kubenswrapper[4594]: I1129 05:48:20.113365 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Nov 29 05:48:20 crc kubenswrapper[4594]: I1129 05:48:20.113547 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Nov 29 05:48:20 crc kubenswrapper[4594]: I1129 05:48:20.113698 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Nov 29 05:48:20 crc kubenswrapper[4594]: I1129 05:48:20.113807 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Nov 29 05:48:20 crc kubenswrapper[4594]: I1129 05:48:20.113931 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Nov 29 05:48:20 crc kubenswrapper[4594]: I1129 05:48:20.117203 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-sxw2s" Nov 29 05:48:20 crc kubenswrapper[4594]: I1129 05:48:20.124477 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 29 05:48:20 crc kubenswrapper[4594]: I1129 05:48:20.155420 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 29 05:48:20 crc kubenswrapper[4594]: I1129 05:48:20.198594 4594 scope.go:117] "RemoveContainer" containerID="3a62ad72db403bb720849a88b9b2000a5a92e83e4cbb2f26b9b5510384634882" Nov 29 05:48:20 crc kubenswrapper[4594]: E1129 05:48:20.199154 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a62ad72db403bb720849a88b9b2000a5a92e83e4cbb2f26b9b5510384634882\": container with ID starting with 3a62ad72db403bb720849a88b9b2000a5a92e83e4cbb2f26b9b5510384634882 not found: ID does not exist" containerID="3a62ad72db403bb720849a88b9b2000a5a92e83e4cbb2f26b9b5510384634882" Nov 29 05:48:20 crc kubenswrapper[4594]: I1129 05:48:20.199292 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a62ad72db403bb720849a88b9b2000a5a92e83e4cbb2f26b9b5510384634882"} err="failed to get container status \"3a62ad72db403bb720849a88b9b2000a5a92e83e4cbb2f26b9b5510384634882\": rpc error: code = NotFound desc = could not find container \"3a62ad72db403bb720849a88b9b2000a5a92e83e4cbb2f26b9b5510384634882\": container with ID starting with 3a62ad72db403bb720849a88b9b2000a5a92e83e4cbb2f26b9b5510384634882 not found: ID does not exist" Nov 29 05:48:20 crc kubenswrapper[4594]: I1129 05:48:20.199385 4594 scope.go:117] "RemoveContainer" containerID="3d435468534c16e70f94ac0b2adfa267064e52b83844f59cf3f55c599965f5f9" Nov 29 05:48:20 crc kubenswrapper[4594]: E1129 05:48:20.199820 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d435468534c16e70f94ac0b2adfa267064e52b83844f59cf3f55c599965f5f9\": container with ID starting with 3d435468534c16e70f94ac0b2adfa267064e52b83844f59cf3f55c599965f5f9 not found: ID does not exist" containerID="3d435468534c16e70f94ac0b2adfa267064e52b83844f59cf3f55c599965f5f9" Nov 29 05:48:20 crc kubenswrapper[4594]: I1129 
05:48:20.199865 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d435468534c16e70f94ac0b2adfa267064e52b83844f59cf3f55c599965f5f9"} err="failed to get container status \"3d435468534c16e70f94ac0b2adfa267064e52b83844f59cf3f55c599965f5f9\": rpc error: code = NotFound desc = could not find container \"3d435468534c16e70f94ac0b2adfa267064e52b83844f59cf3f55c599965f5f9\": container with ID starting with 3d435468534c16e70f94ac0b2adfa267064e52b83844f59cf3f55c599965f5f9 not found: ID does not exist" Nov 29 05:48:20 crc kubenswrapper[4594]: I1129 05:48:20.281950 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/47b6950d-0d97-486d-aaec-2a9eaaf74027-plugins-conf\") pod \"47b6950d-0d97-486d-aaec-2a9eaaf74027\" (UID: \"47b6950d-0d97-486d-aaec-2a9eaaf74027\") " Nov 29 05:48:20 crc kubenswrapper[4594]: I1129 05:48:20.282016 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/47b6950d-0d97-486d-aaec-2a9eaaf74027-rabbitmq-tls\") pod \"47b6950d-0d97-486d-aaec-2a9eaaf74027\" (UID: \"47b6950d-0d97-486d-aaec-2a9eaaf74027\") " Nov 29 05:48:20 crc kubenswrapper[4594]: I1129 05:48:20.282098 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/47b6950d-0d97-486d-aaec-2a9eaaf74027-rabbitmq-plugins\") pod \"47b6950d-0d97-486d-aaec-2a9eaaf74027\" (UID: \"47b6950d-0d97-486d-aaec-2a9eaaf74027\") " Nov 29 05:48:20 crc kubenswrapper[4594]: I1129 05:48:20.282213 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/47b6950d-0d97-486d-aaec-2a9eaaf74027-rabbitmq-confd\") pod \"47b6950d-0d97-486d-aaec-2a9eaaf74027\" (UID: \"47b6950d-0d97-486d-aaec-2a9eaaf74027\") " Nov 29 05:48:20 crc kubenswrapper[4594]: 
I1129 05:48:20.282247 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/47b6950d-0d97-486d-aaec-2a9eaaf74027-server-conf\") pod \"47b6950d-0d97-486d-aaec-2a9eaaf74027\" (UID: \"47b6950d-0d97-486d-aaec-2a9eaaf74027\") " Nov 29 05:48:20 crc kubenswrapper[4594]: I1129 05:48:20.282333 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/47b6950d-0d97-486d-aaec-2a9eaaf74027-rabbitmq-erlang-cookie\") pod \"47b6950d-0d97-486d-aaec-2a9eaaf74027\" (UID: \"47b6950d-0d97-486d-aaec-2a9eaaf74027\") " Nov 29 05:48:20 crc kubenswrapper[4594]: I1129 05:48:20.282385 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/47b6950d-0d97-486d-aaec-2a9eaaf74027-erlang-cookie-secret\") pod \"47b6950d-0d97-486d-aaec-2a9eaaf74027\" (UID: \"47b6950d-0d97-486d-aaec-2a9eaaf74027\") " Nov 29 05:48:20 crc kubenswrapper[4594]: I1129 05:48:20.282485 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/47b6950d-0d97-486d-aaec-2a9eaaf74027-pod-info\") pod \"47b6950d-0d97-486d-aaec-2a9eaaf74027\" (UID: \"47b6950d-0d97-486d-aaec-2a9eaaf74027\") " Nov 29 05:48:20 crc kubenswrapper[4594]: I1129 05:48:20.282534 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ck52\" (UniqueName: \"kubernetes.io/projected/47b6950d-0d97-486d-aaec-2a9eaaf74027-kube-api-access-6ck52\") pod \"47b6950d-0d97-486d-aaec-2a9eaaf74027\" (UID: \"47b6950d-0d97-486d-aaec-2a9eaaf74027\") " Nov 29 05:48:20 crc kubenswrapper[4594]: I1129 05:48:20.282566 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod 
\"47b6950d-0d97-486d-aaec-2a9eaaf74027\" (UID: \"47b6950d-0d97-486d-aaec-2a9eaaf74027\") " Nov 29 05:48:20 crc kubenswrapper[4594]: I1129 05:48:20.282590 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/47b6950d-0d97-486d-aaec-2a9eaaf74027-config-data\") pod \"47b6950d-0d97-486d-aaec-2a9eaaf74027\" (UID: \"47b6950d-0d97-486d-aaec-2a9eaaf74027\") " Nov 29 05:48:20 crc kubenswrapper[4594]: I1129 05:48:20.282913 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"bb17ce90-d0e2-4a46-905b-e27bff2295fb\") " pod="openstack/rabbitmq-server-0" Nov 29 05:48:20 crc kubenswrapper[4594]: I1129 05:48:20.282989 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bb17ce90-d0e2-4a46-905b-e27bff2295fb-pod-info\") pod \"rabbitmq-server-0\" (UID: \"bb17ce90-d0e2-4a46-905b-e27bff2295fb\") " pod="openstack/rabbitmq-server-0" Nov 29 05:48:20 crc kubenswrapper[4594]: I1129 05:48:20.283014 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bb17ce90-d0e2-4a46-905b-e27bff2295fb-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"bb17ce90-d0e2-4a46-905b-e27bff2295fb\") " pod="openstack/rabbitmq-server-0" Nov 29 05:48:20 crc kubenswrapper[4594]: I1129 05:48:20.283103 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kftm8\" (UniqueName: \"kubernetes.io/projected/bb17ce90-d0e2-4a46-905b-e27bff2295fb-kube-api-access-kftm8\") pod \"rabbitmq-server-0\" (UID: \"bb17ce90-d0e2-4a46-905b-e27bff2295fb\") " pod="openstack/rabbitmq-server-0" Nov 29 05:48:20 crc 
kubenswrapper[4594]: I1129 05:48:20.283129 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bb17ce90-d0e2-4a46-905b-e27bff2295fb-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"bb17ce90-d0e2-4a46-905b-e27bff2295fb\") " pod="openstack/rabbitmq-server-0" Nov 29 05:48:20 crc kubenswrapper[4594]: I1129 05:48:20.283159 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bb17ce90-d0e2-4a46-905b-e27bff2295fb-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"bb17ce90-d0e2-4a46-905b-e27bff2295fb\") " pod="openstack/rabbitmq-server-0" Nov 29 05:48:20 crc kubenswrapper[4594]: I1129 05:48:20.283208 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bb17ce90-d0e2-4a46-905b-e27bff2295fb-server-conf\") pod \"rabbitmq-server-0\" (UID: \"bb17ce90-d0e2-4a46-905b-e27bff2295fb\") " pod="openstack/rabbitmq-server-0" Nov 29 05:48:20 crc kubenswrapper[4594]: I1129 05:48:20.283245 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bb17ce90-d0e2-4a46-905b-e27bff2295fb-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"bb17ce90-d0e2-4a46-905b-e27bff2295fb\") " pod="openstack/rabbitmq-server-0" Nov 29 05:48:20 crc kubenswrapper[4594]: I1129 05:48:20.283284 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bb17ce90-d0e2-4a46-905b-e27bff2295fb-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"bb17ce90-d0e2-4a46-905b-e27bff2295fb\") " pod="openstack/rabbitmq-server-0" Nov 29 05:48:20 crc kubenswrapper[4594]: I1129 05:48:20.283309 4594 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bb17ce90-d0e2-4a46-905b-e27bff2295fb-config-data\") pod \"rabbitmq-server-0\" (UID: \"bb17ce90-d0e2-4a46-905b-e27bff2295fb\") " pod="openstack/rabbitmq-server-0" Nov 29 05:48:20 crc kubenswrapper[4594]: I1129 05:48:20.283329 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bb17ce90-d0e2-4a46-905b-e27bff2295fb-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"bb17ce90-d0e2-4a46-905b-e27bff2295fb\") " pod="openstack/rabbitmq-server-0" Nov 29 05:48:20 crc kubenswrapper[4594]: I1129 05:48:20.285522 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47b6950d-0d97-486d-aaec-2a9eaaf74027-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "47b6950d-0d97-486d-aaec-2a9eaaf74027" (UID: "47b6950d-0d97-486d-aaec-2a9eaaf74027"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:48:20 crc kubenswrapper[4594]: I1129 05:48:20.286353 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47b6950d-0d97-486d-aaec-2a9eaaf74027-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "47b6950d-0d97-486d-aaec-2a9eaaf74027" (UID: "47b6950d-0d97-486d-aaec-2a9eaaf74027"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 05:48:20 crc kubenswrapper[4594]: I1129 05:48:20.286373 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47b6950d-0d97-486d-aaec-2a9eaaf74027-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "47b6950d-0d97-486d-aaec-2a9eaaf74027" (UID: "47b6950d-0d97-486d-aaec-2a9eaaf74027"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 05:48:20 crc kubenswrapper[4594]: I1129 05:48:20.288988 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/47b6950d-0d97-486d-aaec-2a9eaaf74027-pod-info" (OuterVolumeSpecName: "pod-info") pod "47b6950d-0d97-486d-aaec-2a9eaaf74027" (UID: "47b6950d-0d97-486d-aaec-2a9eaaf74027"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Nov 29 05:48:20 crc kubenswrapper[4594]: I1129 05:48:20.291073 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "persistence") pod "47b6950d-0d97-486d-aaec-2a9eaaf74027" (UID: "47b6950d-0d97-486d-aaec-2a9eaaf74027"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 29 05:48:20 crc kubenswrapper[4594]: I1129 05:48:20.291155 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47b6950d-0d97-486d-aaec-2a9eaaf74027-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "47b6950d-0d97-486d-aaec-2a9eaaf74027" (UID: "47b6950d-0d97-486d-aaec-2a9eaaf74027"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:48:20 crc kubenswrapper[4594]: I1129 05:48:20.291680 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47b6950d-0d97-486d-aaec-2a9eaaf74027-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "47b6950d-0d97-486d-aaec-2a9eaaf74027" (UID: "47b6950d-0d97-486d-aaec-2a9eaaf74027"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:48:20 crc kubenswrapper[4594]: I1129 05:48:20.296192 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47b6950d-0d97-486d-aaec-2a9eaaf74027-kube-api-access-6ck52" (OuterVolumeSpecName: "kube-api-access-6ck52") pod "47b6950d-0d97-486d-aaec-2a9eaaf74027" (UID: "47b6950d-0d97-486d-aaec-2a9eaaf74027"). InnerVolumeSpecName "kube-api-access-6ck52". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:48:20 crc kubenswrapper[4594]: I1129 05:48:20.313650 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47b6950d-0d97-486d-aaec-2a9eaaf74027-config-data" (OuterVolumeSpecName: "config-data") pod "47b6950d-0d97-486d-aaec-2a9eaaf74027" (UID: "47b6950d-0d97-486d-aaec-2a9eaaf74027"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:48:20 crc kubenswrapper[4594]: I1129 05:48:20.337512 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47b6950d-0d97-486d-aaec-2a9eaaf74027-server-conf" (OuterVolumeSpecName: "server-conf") pod "47b6950d-0d97-486d-aaec-2a9eaaf74027" (UID: "47b6950d-0d97-486d-aaec-2a9eaaf74027"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:48:20 crc kubenswrapper[4594]: I1129 05:48:20.385867 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bb17ce90-d0e2-4a46-905b-e27bff2295fb-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"bb17ce90-d0e2-4a46-905b-e27bff2295fb\") " pod="openstack/rabbitmq-server-0" Nov 29 05:48:20 crc kubenswrapper[4594]: I1129 05:48:20.385941 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bb17ce90-d0e2-4a46-905b-e27bff2295fb-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"bb17ce90-d0e2-4a46-905b-e27bff2295fb\") " pod="openstack/rabbitmq-server-0" Nov 29 05:48:20 crc kubenswrapper[4594]: I1129 05:48:20.385997 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bb17ce90-d0e2-4a46-905b-e27bff2295fb-config-data\") pod \"rabbitmq-server-0\" (UID: \"bb17ce90-d0e2-4a46-905b-e27bff2295fb\") " pod="openstack/rabbitmq-server-0" Nov 29 05:48:20 crc kubenswrapper[4594]: I1129 05:48:20.386034 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bb17ce90-d0e2-4a46-905b-e27bff2295fb-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"bb17ce90-d0e2-4a46-905b-e27bff2295fb\") " pod="openstack/rabbitmq-server-0" Nov 29 05:48:20 crc kubenswrapper[4594]: I1129 05:48:20.386132 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"bb17ce90-d0e2-4a46-905b-e27bff2295fb\") " pod="openstack/rabbitmq-server-0" Nov 29 05:48:20 crc kubenswrapper[4594]: I1129 05:48:20.386221 4594 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bb17ce90-d0e2-4a46-905b-e27bff2295fb-pod-info\") pod \"rabbitmq-server-0\" (UID: \"bb17ce90-d0e2-4a46-905b-e27bff2295fb\") " pod="openstack/rabbitmq-server-0" Nov 29 05:48:20 crc kubenswrapper[4594]: I1129 05:48:20.386270 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bb17ce90-d0e2-4a46-905b-e27bff2295fb-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"bb17ce90-d0e2-4a46-905b-e27bff2295fb\") " pod="openstack/rabbitmq-server-0" Nov 29 05:48:20 crc kubenswrapper[4594]: I1129 05:48:20.386315 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bb17ce90-d0e2-4a46-905b-e27bff2295fb-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"bb17ce90-d0e2-4a46-905b-e27bff2295fb\") " pod="openstack/rabbitmq-server-0" Nov 29 05:48:20 crc kubenswrapper[4594]: I1129 05:48:20.386646 4594 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"bb17ce90-d0e2-4a46-905b-e27bff2295fb\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-server-0" Nov 29 05:48:20 crc kubenswrapper[4594]: I1129 05:48:20.386709 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bb17ce90-d0e2-4a46-905b-e27bff2295fb-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"bb17ce90-d0e2-4a46-905b-e27bff2295fb\") " pod="openstack/rabbitmq-server-0" Nov 29 05:48:20 crc kubenswrapper[4594]: I1129 05:48:20.386852 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kftm8\" (UniqueName: 
\"kubernetes.io/projected/bb17ce90-d0e2-4a46-905b-e27bff2295fb-kube-api-access-kftm8\") pod \"rabbitmq-server-0\" (UID: \"bb17ce90-d0e2-4a46-905b-e27bff2295fb\") " pod="openstack/rabbitmq-server-0" Nov 29 05:48:20 crc kubenswrapper[4594]: I1129 05:48:20.386910 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bb17ce90-d0e2-4a46-905b-e27bff2295fb-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"bb17ce90-d0e2-4a46-905b-e27bff2295fb\") " pod="openstack/rabbitmq-server-0" Nov 29 05:48:20 crc kubenswrapper[4594]: I1129 05:48:20.387000 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bb17ce90-d0e2-4a46-905b-e27bff2295fb-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"bb17ce90-d0e2-4a46-905b-e27bff2295fb\") " pod="openstack/rabbitmq-server-0" Nov 29 05:48:20 crc kubenswrapper[4594]: I1129 05:48:20.387064 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bb17ce90-d0e2-4a46-905b-e27bff2295fb-server-conf\") pod \"rabbitmq-server-0\" (UID: \"bb17ce90-d0e2-4a46-905b-e27bff2295fb\") " pod="openstack/rabbitmq-server-0" Nov 29 05:48:20 crc kubenswrapper[4594]: I1129 05:48:20.387199 4594 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Nov 29 05:48:20 crc kubenswrapper[4594]: I1129 05:48:20.387217 4594 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/47b6950d-0d97-486d-aaec-2a9eaaf74027-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 05:48:20 crc kubenswrapper[4594]: I1129 05:48:20.387228 4594 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/47b6950d-0d97-486d-aaec-2a9eaaf74027-plugins-conf\") on node \"crc\" DevicePath \"\"" Nov 29 05:48:20 crc kubenswrapper[4594]: I1129 05:48:20.387241 4594 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/47b6950d-0d97-486d-aaec-2a9eaaf74027-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Nov 29 05:48:20 crc kubenswrapper[4594]: I1129 05:48:20.387268 4594 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/47b6950d-0d97-486d-aaec-2a9eaaf74027-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Nov 29 05:48:20 crc kubenswrapper[4594]: I1129 05:48:20.387281 4594 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/47b6950d-0d97-486d-aaec-2a9eaaf74027-server-conf\") on node \"crc\" DevicePath \"\"" Nov 29 05:48:20 crc kubenswrapper[4594]: I1129 05:48:20.387294 4594 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/47b6950d-0d97-486d-aaec-2a9eaaf74027-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Nov 29 05:48:20 crc kubenswrapper[4594]: I1129 05:48:20.387304 4594 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/47b6950d-0d97-486d-aaec-2a9eaaf74027-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Nov 29 05:48:20 crc kubenswrapper[4594]: I1129 05:48:20.387314 4594 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/47b6950d-0d97-486d-aaec-2a9eaaf74027-pod-info\") on node \"crc\" DevicePath \"\"" Nov 29 05:48:20 crc kubenswrapper[4594]: I1129 05:48:20.387323 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ck52\" (UniqueName: \"kubernetes.io/projected/47b6950d-0d97-486d-aaec-2a9eaaf74027-kube-api-access-6ck52\") on node \"crc\" 
DevicePath \"\"" Nov 29 05:48:20 crc kubenswrapper[4594]: I1129 05:48:20.387478 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bb17ce90-d0e2-4a46-905b-e27bff2295fb-config-data\") pod \"rabbitmq-server-0\" (UID: \"bb17ce90-d0e2-4a46-905b-e27bff2295fb\") " pod="openstack/rabbitmq-server-0" Nov 29 05:48:20 crc kubenswrapper[4594]: I1129 05:48:20.387896 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bb17ce90-d0e2-4a46-905b-e27bff2295fb-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"bb17ce90-d0e2-4a46-905b-e27bff2295fb\") " pod="openstack/rabbitmq-server-0" Nov 29 05:48:20 crc kubenswrapper[4594]: I1129 05:48:20.388387 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bb17ce90-d0e2-4a46-905b-e27bff2295fb-server-conf\") pod \"rabbitmq-server-0\" (UID: \"bb17ce90-d0e2-4a46-905b-e27bff2295fb\") " pod="openstack/rabbitmq-server-0" Nov 29 05:48:20 crc kubenswrapper[4594]: I1129 05:48:20.390969 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bb17ce90-d0e2-4a46-905b-e27bff2295fb-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"bb17ce90-d0e2-4a46-905b-e27bff2295fb\") " pod="openstack/rabbitmq-server-0" Nov 29 05:48:20 crc kubenswrapper[4594]: I1129 05:48:20.391296 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bb17ce90-d0e2-4a46-905b-e27bff2295fb-pod-info\") pod \"rabbitmq-server-0\" (UID: \"bb17ce90-d0e2-4a46-905b-e27bff2295fb\") " pod="openstack/rabbitmq-server-0" Nov 29 05:48:20 crc kubenswrapper[4594]: I1129 05:48:20.392445 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/bb17ce90-d0e2-4a46-905b-e27bff2295fb-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"bb17ce90-d0e2-4a46-905b-e27bff2295fb\") " pod="openstack/rabbitmq-server-0" Nov 29 05:48:20 crc kubenswrapper[4594]: I1129 05:48:20.393036 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bb17ce90-d0e2-4a46-905b-e27bff2295fb-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"bb17ce90-d0e2-4a46-905b-e27bff2295fb\") " pod="openstack/rabbitmq-server-0" Nov 29 05:48:20 crc kubenswrapper[4594]: I1129 05:48:20.405313 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kftm8\" (UniqueName: \"kubernetes.io/projected/bb17ce90-d0e2-4a46-905b-e27bff2295fb-kube-api-access-kftm8\") pod \"rabbitmq-server-0\" (UID: \"bb17ce90-d0e2-4a46-905b-e27bff2295fb\") " pod="openstack/rabbitmq-server-0" Nov 29 05:48:20 crc kubenswrapper[4594]: I1129 05:48:20.412678 4594 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Nov 29 05:48:20 crc kubenswrapper[4594]: I1129 05:48:20.415070 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47b6950d-0d97-486d-aaec-2a9eaaf74027-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "47b6950d-0d97-486d-aaec-2a9eaaf74027" (UID: "47b6950d-0d97-486d-aaec-2a9eaaf74027"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:48:20 crc kubenswrapper[4594]: I1129 05:48:20.429440 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"bb17ce90-d0e2-4a46-905b-e27bff2295fb\") " pod="openstack/rabbitmq-server-0" Nov 29 05:48:20 crc kubenswrapper[4594]: I1129 05:48:20.490165 4594 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Nov 29 05:48:20 crc kubenswrapper[4594]: I1129 05:48:20.490212 4594 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/47b6950d-0d97-486d-aaec-2a9eaaf74027-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Nov 29 05:48:20 crc kubenswrapper[4594]: I1129 05:48:20.509349 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 29 05:48:20 crc kubenswrapper[4594]: I1129 05:48:20.948448 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 29 05:48:20 crc kubenswrapper[4594]: I1129 05:48:20.996666 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"bb17ce90-d0e2-4a46-905b-e27bff2295fb","Type":"ContainerStarted","Data":"def51e5406f74e258cc682ae77ed31218217f895fc29df2d321e43f0078f7dab"} Nov 29 05:48:21 crc kubenswrapper[4594]: I1129 05:48:21.001378 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"47b6950d-0d97-486d-aaec-2a9eaaf74027","Type":"ContainerDied","Data":"f00322f1ba8cc7234e7076580dad5ececeb7e2af45de93f2a934554a112d1ef9"} Nov 29 05:48:21 crc kubenswrapper[4594]: I1129 05:48:21.001421 4594 scope.go:117] "RemoveContainer" containerID="8d71481afa04edc87a81fb7329f95a03dd0d6110ec5d8956af3e7a934aa4fb1c" Nov 29 05:48:21 crc kubenswrapper[4594]: I1129 05:48:21.001533 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 29 05:48:21 crc kubenswrapper[4594]: I1129 05:48:21.086958 4594 scope.go:117] "RemoveContainer" containerID="18588c32fb9b6698f37b98a5ca9f655c67c03d703b3e049a536f58ced9b1e325" Nov 29 05:48:21 crc kubenswrapper[4594]: I1129 05:48:21.110356 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 29 05:48:21 crc kubenswrapper[4594]: I1129 05:48:21.123672 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 29 05:48:21 crc kubenswrapper[4594]: I1129 05:48:21.137610 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 29 05:48:21 crc kubenswrapper[4594]: E1129 05:48:21.138114 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47b6950d-0d97-486d-aaec-2a9eaaf74027" containerName="rabbitmq" Nov 29 05:48:21 crc kubenswrapper[4594]: I1129 05:48:21.138137 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="47b6950d-0d97-486d-aaec-2a9eaaf74027" containerName="rabbitmq" Nov 29 05:48:21 crc kubenswrapper[4594]: E1129 05:48:21.138162 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47b6950d-0d97-486d-aaec-2a9eaaf74027" containerName="setup-container" Nov 29 05:48:21 crc kubenswrapper[4594]: I1129 05:48:21.138170 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="47b6950d-0d97-486d-aaec-2a9eaaf74027" containerName="setup-container" Nov 29 05:48:21 crc kubenswrapper[4594]: I1129 05:48:21.138431 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="47b6950d-0d97-486d-aaec-2a9eaaf74027" containerName="rabbitmq" Nov 29 05:48:21 crc kubenswrapper[4594]: I1129 05:48:21.139611 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 29 05:48:21 crc kubenswrapper[4594]: I1129 05:48:21.141311 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-nlhpq" Nov 29 05:48:21 crc kubenswrapper[4594]: I1129 05:48:21.141621 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Nov 29 05:48:21 crc kubenswrapper[4594]: I1129 05:48:21.141646 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Nov 29 05:48:21 crc kubenswrapper[4594]: I1129 05:48:21.141826 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Nov 29 05:48:21 crc kubenswrapper[4594]: I1129 05:48:21.141967 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Nov 29 05:48:21 crc kubenswrapper[4594]: I1129 05:48:21.141968 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Nov 29 05:48:21 crc kubenswrapper[4594]: I1129 05:48:21.147284 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Nov 29 05:48:21 crc kubenswrapper[4594]: I1129 05:48:21.149875 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 29 05:48:21 crc kubenswrapper[4594]: I1129 05:48:21.312197 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dcf9a3e-9869-4630-a695-c180db93aca7\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 05:48:21 crc kubenswrapper[4594]: I1129 05:48:21.312649 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" 
(UniqueName: \"kubernetes.io/projected/3dcf9a3e-9869-4630-a695-c180db93aca7-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dcf9a3e-9869-4630-a695-c180db93aca7\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 05:48:21 crc kubenswrapper[4594]: I1129 05:48:21.312726 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3dcf9a3e-9869-4630-a695-c180db93aca7-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dcf9a3e-9869-4630-a695-c180db93aca7\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 05:48:21 crc kubenswrapper[4594]: I1129 05:48:21.312799 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3dcf9a3e-9869-4630-a695-c180db93aca7-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dcf9a3e-9869-4630-a695-c180db93aca7\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 05:48:21 crc kubenswrapper[4594]: I1129 05:48:21.312856 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3dcf9a3e-9869-4630-a695-c180db93aca7-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dcf9a3e-9869-4630-a695-c180db93aca7\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 05:48:21 crc kubenswrapper[4594]: I1129 05:48:21.312915 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3dcf9a3e-9869-4630-a695-c180db93aca7-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dcf9a3e-9869-4630-a695-c180db93aca7\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 05:48:21 crc kubenswrapper[4594]: I1129 05:48:21.312936 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/3dcf9a3e-9869-4630-a695-c180db93aca7-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dcf9a3e-9869-4630-a695-c180db93aca7\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 05:48:21 crc kubenswrapper[4594]: I1129 05:48:21.312981 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkvj2\" (UniqueName: \"kubernetes.io/projected/3dcf9a3e-9869-4630-a695-c180db93aca7-kube-api-access-rkvj2\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dcf9a3e-9869-4630-a695-c180db93aca7\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 05:48:21 crc kubenswrapper[4594]: I1129 05:48:21.313040 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3dcf9a3e-9869-4630-a695-c180db93aca7-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dcf9a3e-9869-4630-a695-c180db93aca7\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 05:48:21 crc kubenswrapper[4594]: I1129 05:48:21.313088 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3dcf9a3e-9869-4630-a695-c180db93aca7-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dcf9a3e-9869-4630-a695-c180db93aca7\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 05:48:21 crc kubenswrapper[4594]: I1129 05:48:21.313151 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3dcf9a3e-9869-4630-a695-c180db93aca7-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dcf9a3e-9869-4630-a695-c180db93aca7\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 05:48:21 crc kubenswrapper[4594]: I1129 05:48:21.415425 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" 
(UniqueName: \"kubernetes.io/secret/3dcf9a3e-9869-4630-a695-c180db93aca7-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dcf9a3e-9869-4630-a695-c180db93aca7\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 05:48:21 crc kubenswrapper[4594]: I1129 05:48:21.415480 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3dcf9a3e-9869-4630-a695-c180db93aca7-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dcf9a3e-9869-4630-a695-c180db93aca7\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 05:48:21 crc kubenswrapper[4594]: I1129 05:48:21.415539 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3dcf9a3e-9869-4630-a695-c180db93aca7-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dcf9a3e-9869-4630-a695-c180db93aca7\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 05:48:21 crc kubenswrapper[4594]: I1129 05:48:21.415607 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dcf9a3e-9869-4630-a695-c180db93aca7\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 05:48:21 crc kubenswrapper[4594]: I1129 05:48:21.415644 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3dcf9a3e-9869-4630-a695-c180db93aca7-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dcf9a3e-9869-4630-a695-c180db93aca7\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 05:48:21 crc kubenswrapper[4594]: I1129 05:48:21.415690 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3dcf9a3e-9869-4630-a695-c180db93aca7-pod-info\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"3dcf9a3e-9869-4630-a695-c180db93aca7\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 05:48:21 crc kubenswrapper[4594]: I1129 05:48:21.415734 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3dcf9a3e-9869-4630-a695-c180db93aca7-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dcf9a3e-9869-4630-a695-c180db93aca7\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 05:48:21 crc kubenswrapper[4594]: I1129 05:48:21.415764 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3dcf9a3e-9869-4630-a695-c180db93aca7-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dcf9a3e-9869-4630-a695-c180db93aca7\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 05:48:21 crc kubenswrapper[4594]: I1129 05:48:21.415804 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3dcf9a3e-9869-4630-a695-c180db93aca7-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dcf9a3e-9869-4630-a695-c180db93aca7\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 05:48:21 crc kubenswrapper[4594]: I1129 05:48:21.415829 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3dcf9a3e-9869-4630-a695-c180db93aca7-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dcf9a3e-9869-4630-a695-c180db93aca7\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 05:48:21 crc kubenswrapper[4594]: I1129 05:48:21.415873 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkvj2\" (UniqueName: \"kubernetes.io/projected/3dcf9a3e-9869-4630-a695-c180db93aca7-kube-api-access-rkvj2\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dcf9a3e-9869-4630-a695-c180db93aca7\") " 
pod="openstack/rabbitmq-cell1-server-0" Nov 29 05:48:21 crc kubenswrapper[4594]: I1129 05:48:21.416802 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3dcf9a3e-9869-4630-a695-c180db93aca7-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dcf9a3e-9869-4630-a695-c180db93aca7\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 05:48:21 crc kubenswrapper[4594]: I1129 05:48:21.417021 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3dcf9a3e-9869-4630-a695-c180db93aca7-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dcf9a3e-9869-4630-a695-c180db93aca7\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 05:48:21 crc kubenswrapper[4594]: I1129 05:48:21.417112 4594 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dcf9a3e-9869-4630-a695-c180db93aca7\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/rabbitmq-cell1-server-0" Nov 29 05:48:21 crc kubenswrapper[4594]: I1129 05:48:21.417681 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3dcf9a3e-9869-4630-a695-c180db93aca7-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dcf9a3e-9869-4630-a695-c180db93aca7\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 05:48:21 crc kubenswrapper[4594]: I1129 05:48:21.417992 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3dcf9a3e-9869-4630-a695-c180db93aca7-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dcf9a3e-9869-4630-a695-c180db93aca7\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 05:48:21 crc kubenswrapper[4594]: I1129 05:48:21.418541 4594 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3dcf9a3e-9869-4630-a695-c180db93aca7-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dcf9a3e-9869-4630-a695-c180db93aca7\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 05:48:21 crc kubenswrapper[4594]: I1129 05:48:21.424073 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3dcf9a3e-9869-4630-a695-c180db93aca7-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dcf9a3e-9869-4630-a695-c180db93aca7\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 05:48:21 crc kubenswrapper[4594]: I1129 05:48:21.424092 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3dcf9a3e-9869-4630-a695-c180db93aca7-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dcf9a3e-9869-4630-a695-c180db93aca7\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 05:48:21 crc kubenswrapper[4594]: I1129 05:48:21.424398 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3dcf9a3e-9869-4630-a695-c180db93aca7-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dcf9a3e-9869-4630-a695-c180db93aca7\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 05:48:21 crc kubenswrapper[4594]: I1129 05:48:21.426660 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3dcf9a3e-9869-4630-a695-c180db93aca7-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dcf9a3e-9869-4630-a695-c180db93aca7\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 05:48:21 crc kubenswrapper[4594]: I1129 05:48:21.434521 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkvj2\" (UniqueName: 
\"kubernetes.io/projected/3dcf9a3e-9869-4630-a695-c180db93aca7-kube-api-access-rkvj2\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dcf9a3e-9869-4630-a695-c180db93aca7\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 05:48:21 crc kubenswrapper[4594]: I1129 05:48:21.452753 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dcf9a3e-9869-4630-a695-c180db93aca7\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 05:48:21 crc kubenswrapper[4594]: I1129 05:48:21.478667 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 29 05:48:21 crc kubenswrapper[4594]: W1129 05:48:21.934849 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcf9a3e_9869_4630_a695_c180db93aca7.slice/crio-2fe845fff078ebaaa9c0811248c95652867e30a950742ea1f4e2ec2522b692ad WatchSource:0}: Error finding container 2fe845fff078ebaaa9c0811248c95652867e30a950742ea1f4e2ec2522b692ad: Status 404 returned error can't find the container with id 2fe845fff078ebaaa9c0811248c95652867e30a950742ea1f4e2ec2522b692ad Nov 29 05:48:21 crc kubenswrapper[4594]: I1129 05:48:21.956304 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 29 05:48:22 crc kubenswrapper[4594]: I1129 05:48:22.020122 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3dcf9a3e-9869-4630-a695-c180db93aca7","Type":"ContainerStarted","Data":"2fe845fff078ebaaa9c0811248c95652867e30a950742ea1f4e2ec2522b692ad"} Nov 29 05:48:22 crc kubenswrapper[4594]: I1129 05:48:22.096783 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47b6950d-0d97-486d-aaec-2a9eaaf74027" path="/var/lib/kubelet/pods/47b6950d-0d97-486d-aaec-2a9eaaf74027/volumes" Nov 
29 05:48:23 crc kubenswrapper[4594]: I1129 05:48:23.037278 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"bb17ce90-d0e2-4a46-905b-e27bff2295fb","Type":"ContainerStarted","Data":"d78db6666b77f96b3f1243d6f817d1fc8043a645cf522a43215937c02dc0ea82"} Nov 29 05:48:24 crc kubenswrapper[4594]: I1129 05:48:24.050226 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3dcf9a3e-9869-4630-a695-c180db93aca7","Type":"ContainerStarted","Data":"f10906791f5c36801dc26a2e566d61dc18c4f9dcfd06160a673dcc8734e46642"} Nov 29 05:48:26 crc kubenswrapper[4594]: I1129 05:48:26.670582 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bf6c7df67-jbvk5"] Nov 29 05:48:26 crc kubenswrapper[4594]: I1129 05:48:26.681111 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bf6c7df67-jbvk5" Nov 29 05:48:26 crc kubenswrapper[4594]: I1129 05:48:26.689626 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Nov 29 05:48:26 crc kubenswrapper[4594]: I1129 05:48:26.721837 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bf6c7df67-jbvk5"] Nov 29 05:48:26 crc kubenswrapper[4594]: I1129 05:48:26.855664 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/deb33222-9014-4cd7-aa7f-ac2b4be21d6c-dns-swift-storage-0\") pod \"dnsmasq-dns-bf6c7df67-jbvk5\" (UID: \"deb33222-9014-4cd7-aa7f-ac2b4be21d6c\") " pod="openstack/dnsmasq-dns-bf6c7df67-jbvk5" Nov 29 05:48:26 crc kubenswrapper[4594]: I1129 05:48:26.855738 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/deb33222-9014-4cd7-aa7f-ac2b4be21d6c-config\") pod \"dnsmasq-dns-bf6c7df67-jbvk5\" 
(UID: \"deb33222-9014-4cd7-aa7f-ac2b4be21d6c\") " pod="openstack/dnsmasq-dns-bf6c7df67-jbvk5" Nov 29 05:48:26 crc kubenswrapper[4594]: I1129 05:48:26.855933 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/deb33222-9014-4cd7-aa7f-ac2b4be21d6c-ovsdbserver-nb\") pod \"dnsmasq-dns-bf6c7df67-jbvk5\" (UID: \"deb33222-9014-4cd7-aa7f-ac2b4be21d6c\") " pod="openstack/dnsmasq-dns-bf6c7df67-jbvk5" Nov 29 05:48:26 crc kubenswrapper[4594]: I1129 05:48:26.855991 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/deb33222-9014-4cd7-aa7f-ac2b4be21d6c-dns-svc\") pod \"dnsmasq-dns-bf6c7df67-jbvk5\" (UID: \"deb33222-9014-4cd7-aa7f-ac2b4be21d6c\") " pod="openstack/dnsmasq-dns-bf6c7df67-jbvk5" Nov 29 05:48:26 crc kubenswrapper[4594]: I1129 05:48:26.856044 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/deb33222-9014-4cd7-aa7f-ac2b4be21d6c-openstack-edpm-ipam\") pod \"dnsmasq-dns-bf6c7df67-jbvk5\" (UID: \"deb33222-9014-4cd7-aa7f-ac2b4be21d6c\") " pod="openstack/dnsmasq-dns-bf6c7df67-jbvk5" Nov 29 05:48:26 crc kubenswrapper[4594]: I1129 05:48:26.856436 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/deb33222-9014-4cd7-aa7f-ac2b4be21d6c-ovsdbserver-sb\") pod \"dnsmasq-dns-bf6c7df67-jbvk5\" (UID: \"deb33222-9014-4cd7-aa7f-ac2b4be21d6c\") " pod="openstack/dnsmasq-dns-bf6c7df67-jbvk5" Nov 29 05:48:26 crc kubenswrapper[4594]: I1129 05:48:26.856594 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmssx\" (UniqueName: 
\"kubernetes.io/projected/deb33222-9014-4cd7-aa7f-ac2b4be21d6c-kube-api-access-hmssx\") pod \"dnsmasq-dns-bf6c7df67-jbvk5\" (UID: \"deb33222-9014-4cd7-aa7f-ac2b4be21d6c\") " pod="openstack/dnsmasq-dns-bf6c7df67-jbvk5" Nov 29 05:48:26 crc kubenswrapper[4594]: I1129 05:48:26.960421 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/deb33222-9014-4cd7-aa7f-ac2b4be21d6c-dns-swift-storage-0\") pod \"dnsmasq-dns-bf6c7df67-jbvk5\" (UID: \"deb33222-9014-4cd7-aa7f-ac2b4be21d6c\") " pod="openstack/dnsmasq-dns-bf6c7df67-jbvk5" Nov 29 05:48:26 crc kubenswrapper[4594]: I1129 05:48:26.960496 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/deb33222-9014-4cd7-aa7f-ac2b4be21d6c-config\") pod \"dnsmasq-dns-bf6c7df67-jbvk5\" (UID: \"deb33222-9014-4cd7-aa7f-ac2b4be21d6c\") " pod="openstack/dnsmasq-dns-bf6c7df67-jbvk5" Nov 29 05:48:26 crc kubenswrapper[4594]: I1129 05:48:26.960612 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/deb33222-9014-4cd7-aa7f-ac2b4be21d6c-ovsdbserver-nb\") pod \"dnsmasq-dns-bf6c7df67-jbvk5\" (UID: \"deb33222-9014-4cd7-aa7f-ac2b4be21d6c\") " pod="openstack/dnsmasq-dns-bf6c7df67-jbvk5" Nov 29 05:48:26 crc kubenswrapper[4594]: I1129 05:48:26.960661 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/deb33222-9014-4cd7-aa7f-ac2b4be21d6c-dns-svc\") pod \"dnsmasq-dns-bf6c7df67-jbvk5\" (UID: \"deb33222-9014-4cd7-aa7f-ac2b4be21d6c\") " pod="openstack/dnsmasq-dns-bf6c7df67-jbvk5" Nov 29 05:48:26 crc kubenswrapper[4594]: I1129 05:48:26.960698 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/configmap/deb33222-9014-4cd7-aa7f-ac2b4be21d6c-openstack-edpm-ipam\") pod \"dnsmasq-dns-bf6c7df67-jbvk5\" (UID: \"deb33222-9014-4cd7-aa7f-ac2b4be21d6c\") " pod="openstack/dnsmasq-dns-bf6c7df67-jbvk5" Nov 29 05:48:26 crc kubenswrapper[4594]: I1129 05:48:26.960878 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/deb33222-9014-4cd7-aa7f-ac2b4be21d6c-ovsdbserver-sb\") pod \"dnsmasq-dns-bf6c7df67-jbvk5\" (UID: \"deb33222-9014-4cd7-aa7f-ac2b4be21d6c\") " pod="openstack/dnsmasq-dns-bf6c7df67-jbvk5" Nov 29 05:48:26 crc kubenswrapper[4594]: I1129 05:48:26.961011 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmssx\" (UniqueName: \"kubernetes.io/projected/deb33222-9014-4cd7-aa7f-ac2b4be21d6c-kube-api-access-hmssx\") pod \"dnsmasq-dns-bf6c7df67-jbvk5\" (UID: \"deb33222-9014-4cd7-aa7f-ac2b4be21d6c\") " pod="openstack/dnsmasq-dns-bf6c7df67-jbvk5" Nov 29 05:48:26 crc kubenswrapper[4594]: I1129 05:48:26.961934 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/deb33222-9014-4cd7-aa7f-ac2b4be21d6c-dns-swift-storage-0\") pod \"dnsmasq-dns-bf6c7df67-jbvk5\" (UID: \"deb33222-9014-4cd7-aa7f-ac2b4be21d6c\") " pod="openstack/dnsmasq-dns-bf6c7df67-jbvk5" Nov 29 05:48:26 crc kubenswrapper[4594]: I1129 05:48:26.961970 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/deb33222-9014-4cd7-aa7f-ac2b4be21d6c-ovsdbserver-nb\") pod \"dnsmasq-dns-bf6c7df67-jbvk5\" (UID: \"deb33222-9014-4cd7-aa7f-ac2b4be21d6c\") " pod="openstack/dnsmasq-dns-bf6c7df67-jbvk5" Nov 29 05:48:26 crc kubenswrapper[4594]: I1129 05:48:26.961984 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/deb33222-9014-4cd7-aa7f-ac2b4be21d6c-config\") pod \"dnsmasq-dns-bf6c7df67-jbvk5\" (UID: \"deb33222-9014-4cd7-aa7f-ac2b4be21d6c\") " pod="openstack/dnsmasq-dns-bf6c7df67-jbvk5" Nov 29 05:48:26 crc kubenswrapper[4594]: I1129 05:48:26.962093 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/deb33222-9014-4cd7-aa7f-ac2b4be21d6c-ovsdbserver-sb\") pod \"dnsmasq-dns-bf6c7df67-jbvk5\" (UID: \"deb33222-9014-4cd7-aa7f-ac2b4be21d6c\") " pod="openstack/dnsmasq-dns-bf6c7df67-jbvk5" Nov 29 05:48:26 crc kubenswrapper[4594]: I1129 05:48:26.962122 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/deb33222-9014-4cd7-aa7f-ac2b4be21d6c-openstack-edpm-ipam\") pod \"dnsmasq-dns-bf6c7df67-jbvk5\" (UID: \"deb33222-9014-4cd7-aa7f-ac2b4be21d6c\") " pod="openstack/dnsmasq-dns-bf6c7df67-jbvk5" Nov 29 05:48:26 crc kubenswrapper[4594]: I1129 05:48:26.962192 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/deb33222-9014-4cd7-aa7f-ac2b4be21d6c-dns-svc\") pod \"dnsmasq-dns-bf6c7df67-jbvk5\" (UID: \"deb33222-9014-4cd7-aa7f-ac2b4be21d6c\") " pod="openstack/dnsmasq-dns-bf6c7df67-jbvk5" Nov 29 05:48:26 crc kubenswrapper[4594]: I1129 05:48:26.981684 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmssx\" (UniqueName: \"kubernetes.io/projected/deb33222-9014-4cd7-aa7f-ac2b4be21d6c-kube-api-access-hmssx\") pod \"dnsmasq-dns-bf6c7df67-jbvk5\" (UID: \"deb33222-9014-4cd7-aa7f-ac2b4be21d6c\") " pod="openstack/dnsmasq-dns-bf6c7df67-jbvk5" Nov 29 05:48:27 crc kubenswrapper[4594]: I1129 05:48:27.007517 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bf6c7df67-jbvk5" Nov 29 05:48:27 crc kubenswrapper[4594]: I1129 05:48:27.442637 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bf6c7df67-jbvk5"] Nov 29 05:48:28 crc kubenswrapper[4594]: I1129 05:48:28.090955 4594 generic.go:334] "Generic (PLEG): container finished" podID="deb33222-9014-4cd7-aa7f-ac2b4be21d6c" containerID="e9a31cd2c3ce3ccfadccf5e49310a157b5ef36e830fae52db96453876817fc2b" exitCode=0 Nov 29 05:48:28 crc kubenswrapper[4594]: I1129 05:48:28.093700 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bf6c7df67-jbvk5" event={"ID":"deb33222-9014-4cd7-aa7f-ac2b4be21d6c","Type":"ContainerDied","Data":"e9a31cd2c3ce3ccfadccf5e49310a157b5ef36e830fae52db96453876817fc2b"} Nov 29 05:48:28 crc kubenswrapper[4594]: I1129 05:48:28.093729 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bf6c7df67-jbvk5" event={"ID":"deb33222-9014-4cd7-aa7f-ac2b4be21d6c","Type":"ContainerStarted","Data":"8880f3ba0f800b9a24e19361b3b100d39a91d7cd1465bb6993b985bf58ae4137"} Nov 29 05:48:29 crc kubenswrapper[4594]: I1129 05:48:29.102589 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bf6c7df67-jbvk5" event={"ID":"deb33222-9014-4cd7-aa7f-ac2b4be21d6c","Type":"ContainerStarted","Data":"338ed5f76192ad49a0234591f306f7fb99be238fbf32666ef580d1e939bc7b82"} Nov 29 05:48:29 crc kubenswrapper[4594]: I1129 05:48:29.103376 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-bf6c7df67-jbvk5" Nov 29 05:48:29 crc kubenswrapper[4594]: I1129 05:48:29.126379 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-bf6c7df67-jbvk5" podStartSLOduration=3.126355234 podStartE2EDuration="3.126355234s" podCreationTimestamp="2025-11-29 05:48:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2025-11-29 05:48:29.118294824 +0000 UTC m=+1233.358804044" watchObservedRunningTime="2025-11-29 05:48:29.126355234 +0000 UTC m=+1233.366864454" Nov 29 05:48:37 crc kubenswrapper[4594]: I1129 05:48:37.010409 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-bf6c7df67-jbvk5" Nov 29 05:48:37 crc kubenswrapper[4594]: I1129 05:48:37.103227 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54599d8f7-sp4l2"] Nov 29 05:48:37 crc kubenswrapper[4594]: I1129 05:48:37.103497 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-54599d8f7-sp4l2" podUID="0ad2733e-97bd-4b40-87e3-cd5b16a8479e" containerName="dnsmasq-dns" containerID="cri-o://a2df68ae0304176b1714506c9cb937a93d6aac56aa0ca3336147889e2aae241a" gracePeriod=10 Nov 29 05:48:37 crc kubenswrapper[4594]: I1129 05:48:37.255784 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-77b58f4b85-bgkdt"] Nov 29 05:48:37 crc kubenswrapper[4594]: I1129 05:48:37.262581 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77b58f4b85-bgkdt" Nov 29 05:48:37 crc kubenswrapper[4594]: I1129 05:48:37.273104 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77b58f4b85-bgkdt"] Nov 29 05:48:37 crc kubenswrapper[4594]: I1129 05:48:37.405386 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/23dcced9-156e-4d68-82c3-43b9b2a0d9be-ovsdbserver-nb\") pod \"dnsmasq-dns-77b58f4b85-bgkdt\" (UID: \"23dcced9-156e-4d68-82c3-43b9b2a0d9be\") " pod="openstack/dnsmasq-dns-77b58f4b85-bgkdt" Nov 29 05:48:37 crc kubenswrapper[4594]: I1129 05:48:37.405718 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/23dcced9-156e-4d68-82c3-43b9b2a0d9be-openstack-edpm-ipam\") pod \"dnsmasq-dns-77b58f4b85-bgkdt\" (UID: \"23dcced9-156e-4d68-82c3-43b9b2a0d9be\") " pod="openstack/dnsmasq-dns-77b58f4b85-bgkdt" Nov 29 05:48:37 crc kubenswrapper[4594]: I1129 05:48:37.405756 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23dcced9-156e-4d68-82c3-43b9b2a0d9be-config\") pod \"dnsmasq-dns-77b58f4b85-bgkdt\" (UID: \"23dcced9-156e-4d68-82c3-43b9b2a0d9be\") " pod="openstack/dnsmasq-dns-77b58f4b85-bgkdt" Nov 29 05:48:37 crc kubenswrapper[4594]: I1129 05:48:37.405832 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/23dcced9-156e-4d68-82c3-43b9b2a0d9be-dns-swift-storage-0\") pod \"dnsmasq-dns-77b58f4b85-bgkdt\" (UID: \"23dcced9-156e-4d68-82c3-43b9b2a0d9be\") " pod="openstack/dnsmasq-dns-77b58f4b85-bgkdt" Nov 29 05:48:37 crc kubenswrapper[4594]: I1129 05:48:37.405900 4594 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ncng\" (UniqueName: \"kubernetes.io/projected/23dcced9-156e-4d68-82c3-43b9b2a0d9be-kube-api-access-4ncng\") pod \"dnsmasq-dns-77b58f4b85-bgkdt\" (UID: \"23dcced9-156e-4d68-82c3-43b9b2a0d9be\") " pod="openstack/dnsmasq-dns-77b58f4b85-bgkdt" Nov 29 05:48:37 crc kubenswrapper[4594]: I1129 05:48:37.405961 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/23dcced9-156e-4d68-82c3-43b9b2a0d9be-dns-svc\") pod \"dnsmasq-dns-77b58f4b85-bgkdt\" (UID: \"23dcced9-156e-4d68-82c3-43b9b2a0d9be\") " pod="openstack/dnsmasq-dns-77b58f4b85-bgkdt" Nov 29 05:48:37 crc kubenswrapper[4594]: I1129 05:48:37.406088 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/23dcced9-156e-4d68-82c3-43b9b2a0d9be-ovsdbserver-sb\") pod \"dnsmasq-dns-77b58f4b85-bgkdt\" (UID: \"23dcced9-156e-4d68-82c3-43b9b2a0d9be\") " pod="openstack/dnsmasq-dns-77b58f4b85-bgkdt" Nov 29 05:48:37 crc kubenswrapper[4594]: I1129 05:48:37.511516 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/23dcced9-156e-4d68-82c3-43b9b2a0d9be-ovsdbserver-nb\") pod \"dnsmasq-dns-77b58f4b85-bgkdt\" (UID: \"23dcced9-156e-4d68-82c3-43b9b2a0d9be\") " pod="openstack/dnsmasq-dns-77b58f4b85-bgkdt" Nov 29 05:48:37 crc kubenswrapper[4594]: I1129 05:48:37.511614 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/23dcced9-156e-4d68-82c3-43b9b2a0d9be-openstack-edpm-ipam\") pod \"dnsmasq-dns-77b58f4b85-bgkdt\" (UID: \"23dcced9-156e-4d68-82c3-43b9b2a0d9be\") " pod="openstack/dnsmasq-dns-77b58f4b85-bgkdt" Nov 29 05:48:37 crc kubenswrapper[4594]: I1129 05:48:37.511683 4594 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23dcced9-156e-4d68-82c3-43b9b2a0d9be-config\") pod \"dnsmasq-dns-77b58f4b85-bgkdt\" (UID: \"23dcced9-156e-4d68-82c3-43b9b2a0d9be\") " pod="openstack/dnsmasq-dns-77b58f4b85-bgkdt" Nov 29 05:48:37 crc kubenswrapper[4594]: I1129 05:48:37.511834 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/23dcced9-156e-4d68-82c3-43b9b2a0d9be-dns-swift-storage-0\") pod \"dnsmasq-dns-77b58f4b85-bgkdt\" (UID: \"23dcced9-156e-4d68-82c3-43b9b2a0d9be\") " pod="openstack/dnsmasq-dns-77b58f4b85-bgkdt" Nov 29 05:48:37 crc kubenswrapper[4594]: I1129 05:48:37.511956 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ncng\" (UniqueName: \"kubernetes.io/projected/23dcced9-156e-4d68-82c3-43b9b2a0d9be-kube-api-access-4ncng\") pod \"dnsmasq-dns-77b58f4b85-bgkdt\" (UID: \"23dcced9-156e-4d68-82c3-43b9b2a0d9be\") " pod="openstack/dnsmasq-dns-77b58f4b85-bgkdt" Nov 29 05:48:37 crc kubenswrapper[4594]: I1129 05:48:37.512075 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/23dcced9-156e-4d68-82c3-43b9b2a0d9be-dns-svc\") pod \"dnsmasq-dns-77b58f4b85-bgkdt\" (UID: \"23dcced9-156e-4d68-82c3-43b9b2a0d9be\") " pod="openstack/dnsmasq-dns-77b58f4b85-bgkdt" Nov 29 05:48:37 crc kubenswrapper[4594]: I1129 05:48:37.512349 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/23dcced9-156e-4d68-82c3-43b9b2a0d9be-ovsdbserver-sb\") pod \"dnsmasq-dns-77b58f4b85-bgkdt\" (UID: \"23dcced9-156e-4d68-82c3-43b9b2a0d9be\") " pod="openstack/dnsmasq-dns-77b58f4b85-bgkdt" Nov 29 05:48:37 crc kubenswrapper[4594]: I1129 05:48:37.512486 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/23dcced9-156e-4d68-82c3-43b9b2a0d9be-ovsdbserver-nb\") pod \"dnsmasq-dns-77b58f4b85-bgkdt\" (UID: \"23dcced9-156e-4d68-82c3-43b9b2a0d9be\") " pod="openstack/dnsmasq-dns-77b58f4b85-bgkdt" Nov 29 05:48:37 crc kubenswrapper[4594]: I1129 05:48:37.512751 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23dcced9-156e-4d68-82c3-43b9b2a0d9be-config\") pod \"dnsmasq-dns-77b58f4b85-bgkdt\" (UID: \"23dcced9-156e-4d68-82c3-43b9b2a0d9be\") " pod="openstack/dnsmasq-dns-77b58f4b85-bgkdt" Nov 29 05:48:37 crc kubenswrapper[4594]: I1129 05:48:37.513285 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/23dcced9-156e-4d68-82c3-43b9b2a0d9be-openstack-edpm-ipam\") pod \"dnsmasq-dns-77b58f4b85-bgkdt\" (UID: \"23dcced9-156e-4d68-82c3-43b9b2a0d9be\") " pod="openstack/dnsmasq-dns-77b58f4b85-bgkdt" Nov 29 05:48:37 crc kubenswrapper[4594]: I1129 05:48:37.513386 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/23dcced9-156e-4d68-82c3-43b9b2a0d9be-dns-svc\") pod \"dnsmasq-dns-77b58f4b85-bgkdt\" (UID: \"23dcced9-156e-4d68-82c3-43b9b2a0d9be\") " pod="openstack/dnsmasq-dns-77b58f4b85-bgkdt" Nov 29 05:48:37 crc kubenswrapper[4594]: I1129 05:48:37.513905 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/23dcced9-156e-4d68-82c3-43b9b2a0d9be-ovsdbserver-sb\") pod \"dnsmasq-dns-77b58f4b85-bgkdt\" (UID: \"23dcced9-156e-4d68-82c3-43b9b2a0d9be\") " pod="openstack/dnsmasq-dns-77b58f4b85-bgkdt" Nov 29 05:48:37 crc kubenswrapper[4594]: I1129 05:48:37.513948 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/23dcced9-156e-4d68-82c3-43b9b2a0d9be-dns-swift-storage-0\") pod \"dnsmasq-dns-77b58f4b85-bgkdt\" (UID: \"23dcced9-156e-4d68-82c3-43b9b2a0d9be\") " pod="openstack/dnsmasq-dns-77b58f4b85-bgkdt" Nov 29 05:48:37 crc kubenswrapper[4594]: I1129 05:48:37.533071 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ncng\" (UniqueName: \"kubernetes.io/projected/23dcced9-156e-4d68-82c3-43b9b2a0d9be-kube-api-access-4ncng\") pod \"dnsmasq-dns-77b58f4b85-bgkdt\" (UID: \"23dcced9-156e-4d68-82c3-43b9b2a0d9be\") " pod="openstack/dnsmasq-dns-77b58f4b85-bgkdt" Nov 29 05:48:37 crc kubenswrapper[4594]: I1129 05:48:37.585449 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77b58f4b85-bgkdt" Nov 29 05:48:37 crc kubenswrapper[4594]: I1129 05:48:37.599859 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54599d8f7-sp4l2" Nov 29 05:48:37 crc kubenswrapper[4594]: I1129 05:48:37.716474 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ad2733e-97bd-4b40-87e3-cd5b16a8479e-dns-svc\") pod \"0ad2733e-97bd-4b40-87e3-cd5b16a8479e\" (UID: \"0ad2733e-97bd-4b40-87e3-cd5b16a8479e\") " Nov 29 05:48:37 crc kubenswrapper[4594]: I1129 05:48:37.716765 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ad2733e-97bd-4b40-87e3-cd5b16a8479e-config\") pod \"0ad2733e-97bd-4b40-87e3-cd5b16a8479e\" (UID: \"0ad2733e-97bd-4b40-87e3-cd5b16a8479e\") " Nov 29 05:48:37 crc kubenswrapper[4594]: I1129 05:48:37.716847 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0ad2733e-97bd-4b40-87e3-cd5b16a8479e-ovsdbserver-sb\") pod \"0ad2733e-97bd-4b40-87e3-cd5b16a8479e\" (UID: 
\"0ad2733e-97bd-4b40-87e3-cd5b16a8479e\") " Nov 29 05:48:37 crc kubenswrapper[4594]: I1129 05:48:37.717004 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0ad2733e-97bd-4b40-87e3-cd5b16a8479e-ovsdbserver-nb\") pod \"0ad2733e-97bd-4b40-87e3-cd5b16a8479e\" (UID: \"0ad2733e-97bd-4b40-87e3-cd5b16a8479e\") " Nov 29 05:48:37 crc kubenswrapper[4594]: I1129 05:48:37.717058 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0ad2733e-97bd-4b40-87e3-cd5b16a8479e-dns-swift-storage-0\") pod \"0ad2733e-97bd-4b40-87e3-cd5b16a8479e\" (UID: \"0ad2733e-97bd-4b40-87e3-cd5b16a8479e\") " Nov 29 05:48:37 crc kubenswrapper[4594]: I1129 05:48:37.717094 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bmpc\" (UniqueName: \"kubernetes.io/projected/0ad2733e-97bd-4b40-87e3-cd5b16a8479e-kube-api-access-8bmpc\") pod \"0ad2733e-97bd-4b40-87e3-cd5b16a8479e\" (UID: \"0ad2733e-97bd-4b40-87e3-cd5b16a8479e\") " Nov 29 05:48:37 crc kubenswrapper[4594]: I1129 05:48:37.725488 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ad2733e-97bd-4b40-87e3-cd5b16a8479e-kube-api-access-8bmpc" (OuterVolumeSpecName: "kube-api-access-8bmpc") pod "0ad2733e-97bd-4b40-87e3-cd5b16a8479e" (UID: "0ad2733e-97bd-4b40-87e3-cd5b16a8479e"). InnerVolumeSpecName "kube-api-access-8bmpc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:48:37 crc kubenswrapper[4594]: I1129 05:48:37.768533 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ad2733e-97bd-4b40-87e3-cd5b16a8479e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "0ad2733e-97bd-4b40-87e3-cd5b16a8479e" (UID: "0ad2733e-97bd-4b40-87e3-cd5b16a8479e"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:48:37 crc kubenswrapper[4594]: I1129 05:48:37.771871 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ad2733e-97bd-4b40-87e3-cd5b16a8479e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0ad2733e-97bd-4b40-87e3-cd5b16a8479e" (UID: "0ad2733e-97bd-4b40-87e3-cd5b16a8479e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:48:37 crc kubenswrapper[4594]: I1129 05:48:37.774493 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ad2733e-97bd-4b40-87e3-cd5b16a8479e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0ad2733e-97bd-4b40-87e3-cd5b16a8479e" (UID: "0ad2733e-97bd-4b40-87e3-cd5b16a8479e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:48:37 crc kubenswrapper[4594]: I1129 05:48:37.801578 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ad2733e-97bd-4b40-87e3-cd5b16a8479e-config" (OuterVolumeSpecName: "config") pod "0ad2733e-97bd-4b40-87e3-cd5b16a8479e" (UID: "0ad2733e-97bd-4b40-87e3-cd5b16a8479e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:48:37 crc kubenswrapper[4594]: I1129 05:48:37.803763 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ad2733e-97bd-4b40-87e3-cd5b16a8479e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0ad2733e-97bd-4b40-87e3-cd5b16a8479e" (UID: "0ad2733e-97bd-4b40-87e3-cd5b16a8479e"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:48:37 crc kubenswrapper[4594]: I1129 05:48:37.824862 4594 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0ad2733e-97bd-4b40-87e3-cd5b16a8479e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 29 05:48:37 crc kubenswrapper[4594]: I1129 05:48:37.824884 4594 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0ad2733e-97bd-4b40-87e3-cd5b16a8479e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 29 05:48:37 crc kubenswrapper[4594]: I1129 05:48:37.824896 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8bmpc\" (UniqueName: \"kubernetes.io/projected/0ad2733e-97bd-4b40-87e3-cd5b16a8479e-kube-api-access-8bmpc\") on node \"crc\" DevicePath \"\"" Nov 29 05:48:37 crc kubenswrapper[4594]: I1129 05:48:37.824908 4594 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ad2733e-97bd-4b40-87e3-cd5b16a8479e-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 29 05:48:37 crc kubenswrapper[4594]: I1129 05:48:37.824920 4594 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ad2733e-97bd-4b40-87e3-cd5b16a8479e-config\") on node \"crc\" DevicePath \"\"" Nov 29 05:48:37 crc kubenswrapper[4594]: I1129 05:48:37.824928 4594 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0ad2733e-97bd-4b40-87e3-cd5b16a8479e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 29 05:48:38 crc kubenswrapper[4594]: I1129 05:48:38.032504 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77b58f4b85-bgkdt"] Nov 29 05:48:38 crc kubenswrapper[4594]: W1129 05:48:38.036764 4594 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod23dcced9_156e_4d68_82c3_43b9b2a0d9be.slice/crio-e250e75bc1cafabfa8a6686a30cc188f61f5e71e9ae3b70da3f94a9cec8379d9 WatchSource:0}: Error finding container e250e75bc1cafabfa8a6686a30cc188f61f5e71e9ae3b70da3f94a9cec8379d9: Status 404 returned error can't find the container with id e250e75bc1cafabfa8a6686a30cc188f61f5e71e9ae3b70da3f94a9cec8379d9 Nov 29 05:48:38 crc kubenswrapper[4594]: I1129 05:48:38.210938 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77b58f4b85-bgkdt" event={"ID":"23dcced9-156e-4d68-82c3-43b9b2a0d9be","Type":"ContainerStarted","Data":"e250e75bc1cafabfa8a6686a30cc188f61f5e71e9ae3b70da3f94a9cec8379d9"} Nov 29 05:48:38 crc kubenswrapper[4594]: I1129 05:48:38.214605 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54599d8f7-sp4l2" Nov 29 05:48:38 crc kubenswrapper[4594]: I1129 05:48:38.215325 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54599d8f7-sp4l2" event={"ID":"0ad2733e-97bd-4b40-87e3-cd5b16a8479e","Type":"ContainerDied","Data":"a2df68ae0304176b1714506c9cb937a93d6aac56aa0ca3336147889e2aae241a"} Nov 29 05:48:38 crc kubenswrapper[4594]: I1129 05:48:38.215415 4594 scope.go:117] "RemoveContainer" containerID="a2df68ae0304176b1714506c9cb937a93d6aac56aa0ca3336147889e2aae241a" Nov 29 05:48:38 crc kubenswrapper[4594]: I1129 05:48:38.214337 4594 generic.go:334] "Generic (PLEG): container finished" podID="0ad2733e-97bd-4b40-87e3-cd5b16a8479e" containerID="a2df68ae0304176b1714506c9cb937a93d6aac56aa0ca3336147889e2aae241a" exitCode=0 Nov 29 05:48:38 crc kubenswrapper[4594]: I1129 05:48:38.216042 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54599d8f7-sp4l2" event={"ID":"0ad2733e-97bd-4b40-87e3-cd5b16a8479e","Type":"ContainerDied","Data":"d5ef8ba38824cbdb5e58ac8c0fba1f50f1a893a56b9d3d7bd7129a858b752a21"} Nov 29 05:48:38 crc 
kubenswrapper[4594]: I1129 05:48:38.355621 4594 scope.go:117] "RemoveContainer" containerID="5bd374b8d188d0e28ccb8782a8fa16b261c7e1ef10aef3406971f60dd6dedfab" Nov 29 05:48:38 crc kubenswrapper[4594]: I1129 05:48:38.359941 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54599d8f7-sp4l2"] Nov 29 05:48:38 crc kubenswrapper[4594]: I1129 05:48:38.370721 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-54599d8f7-sp4l2"] Nov 29 05:48:38 crc kubenswrapper[4594]: I1129 05:48:38.387575 4594 scope.go:117] "RemoveContainer" containerID="a2df68ae0304176b1714506c9cb937a93d6aac56aa0ca3336147889e2aae241a" Nov 29 05:48:38 crc kubenswrapper[4594]: E1129 05:48:38.388030 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2df68ae0304176b1714506c9cb937a93d6aac56aa0ca3336147889e2aae241a\": container with ID starting with a2df68ae0304176b1714506c9cb937a93d6aac56aa0ca3336147889e2aae241a not found: ID does not exist" containerID="a2df68ae0304176b1714506c9cb937a93d6aac56aa0ca3336147889e2aae241a" Nov 29 05:48:38 crc kubenswrapper[4594]: I1129 05:48:38.388067 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2df68ae0304176b1714506c9cb937a93d6aac56aa0ca3336147889e2aae241a"} err="failed to get container status \"a2df68ae0304176b1714506c9cb937a93d6aac56aa0ca3336147889e2aae241a\": rpc error: code = NotFound desc = could not find container \"a2df68ae0304176b1714506c9cb937a93d6aac56aa0ca3336147889e2aae241a\": container with ID starting with a2df68ae0304176b1714506c9cb937a93d6aac56aa0ca3336147889e2aae241a not found: ID does not exist" Nov 29 05:48:38 crc kubenswrapper[4594]: I1129 05:48:38.388090 4594 scope.go:117] "RemoveContainer" containerID="5bd374b8d188d0e28ccb8782a8fa16b261c7e1ef10aef3406971f60dd6dedfab" Nov 29 05:48:38 crc kubenswrapper[4594]: E1129 05:48:38.388397 4594 log.go:32] "ContainerStatus 
from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5bd374b8d188d0e28ccb8782a8fa16b261c7e1ef10aef3406971f60dd6dedfab\": container with ID starting with 5bd374b8d188d0e28ccb8782a8fa16b261c7e1ef10aef3406971f60dd6dedfab not found: ID does not exist" containerID="5bd374b8d188d0e28ccb8782a8fa16b261c7e1ef10aef3406971f60dd6dedfab" Nov 29 05:48:38 crc kubenswrapper[4594]: I1129 05:48:38.388424 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bd374b8d188d0e28ccb8782a8fa16b261c7e1ef10aef3406971f60dd6dedfab"} err="failed to get container status \"5bd374b8d188d0e28ccb8782a8fa16b261c7e1ef10aef3406971f60dd6dedfab\": rpc error: code = NotFound desc = could not find container \"5bd374b8d188d0e28ccb8782a8fa16b261c7e1ef10aef3406971f60dd6dedfab\": container with ID starting with 5bd374b8d188d0e28ccb8782a8fa16b261c7e1ef10aef3406971f60dd6dedfab not found: ID does not exist" Nov 29 05:48:39 crc kubenswrapper[4594]: I1129 05:48:39.227728 4594 generic.go:334] "Generic (PLEG): container finished" podID="23dcced9-156e-4d68-82c3-43b9b2a0d9be" containerID="42aa2efb2dcb4258a86cd2ee796ebfc10b0d5d37d63e95fff31114d4d1e7d9a9" exitCode=0 Nov 29 05:48:39 crc kubenswrapper[4594]: I1129 05:48:39.227827 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77b58f4b85-bgkdt" event={"ID":"23dcced9-156e-4d68-82c3-43b9b2a0d9be","Type":"ContainerDied","Data":"42aa2efb2dcb4258a86cd2ee796ebfc10b0d5d37d63e95fff31114d4d1e7d9a9"} Nov 29 05:48:40 crc kubenswrapper[4594]: I1129 05:48:40.095096 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ad2733e-97bd-4b40-87e3-cd5b16a8479e" path="/var/lib/kubelet/pods/0ad2733e-97bd-4b40-87e3-cd5b16a8479e/volumes" Nov 29 05:48:40 crc kubenswrapper[4594]: I1129 05:48:40.247757 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77b58f4b85-bgkdt" 
event={"ID":"23dcced9-156e-4d68-82c3-43b9b2a0d9be","Type":"ContainerStarted","Data":"8503fc2e0c8f148fb2821ea52bee773a44162db2b3ac5a14061eea182141f334"} Nov 29 05:48:40 crc kubenswrapper[4594]: I1129 05:48:40.248037 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-77b58f4b85-bgkdt" Nov 29 05:48:40 crc kubenswrapper[4594]: I1129 05:48:40.276057 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-77b58f4b85-bgkdt" podStartSLOduration=3.276026435 podStartE2EDuration="3.276026435s" podCreationTimestamp="2025-11-29 05:48:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:48:40.265220493 +0000 UTC m=+1244.505729713" watchObservedRunningTime="2025-11-29 05:48:40.276026435 +0000 UTC m=+1244.516535655" Nov 29 05:48:45 crc kubenswrapper[4594]: I1129 05:48:45.800582 4594 patch_prober.go:28] interesting pod/machine-config-daemon-ggz4n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 05:48:45 crc kubenswrapper[4594]: I1129 05:48:45.801435 4594 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 05:48:47 crc kubenswrapper[4594]: I1129 05:48:47.587436 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-77b58f4b85-bgkdt" Nov 29 05:48:47 crc kubenswrapper[4594]: I1129 05:48:47.641044 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bf6c7df67-jbvk5"] Nov 29 
05:48:47 crc kubenswrapper[4594]: I1129 05:48:47.641290 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-bf6c7df67-jbvk5" podUID="deb33222-9014-4cd7-aa7f-ac2b4be21d6c" containerName="dnsmasq-dns" containerID="cri-o://338ed5f76192ad49a0234591f306f7fb99be238fbf32666ef580d1e939bc7b82" gracePeriod=10 Nov 29 05:48:48 crc kubenswrapper[4594]: I1129 05:48:48.154706 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bf6c7df67-jbvk5" Nov 29 05:48:48 crc kubenswrapper[4594]: I1129 05:48:48.174417 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hmssx\" (UniqueName: \"kubernetes.io/projected/deb33222-9014-4cd7-aa7f-ac2b4be21d6c-kube-api-access-hmssx\") pod \"deb33222-9014-4cd7-aa7f-ac2b4be21d6c\" (UID: \"deb33222-9014-4cd7-aa7f-ac2b4be21d6c\") " Nov 29 05:48:48 crc kubenswrapper[4594]: I1129 05:48:48.174535 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/deb33222-9014-4cd7-aa7f-ac2b4be21d6c-config\") pod \"deb33222-9014-4cd7-aa7f-ac2b4be21d6c\" (UID: \"deb33222-9014-4cd7-aa7f-ac2b4be21d6c\") " Nov 29 05:48:48 crc kubenswrapper[4594]: I1129 05:48:48.182902 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/deb33222-9014-4cd7-aa7f-ac2b4be21d6c-kube-api-access-hmssx" (OuterVolumeSpecName: "kube-api-access-hmssx") pod "deb33222-9014-4cd7-aa7f-ac2b4be21d6c" (UID: "deb33222-9014-4cd7-aa7f-ac2b4be21d6c"). InnerVolumeSpecName "kube-api-access-hmssx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:48:48 crc kubenswrapper[4594]: I1129 05:48:48.240059 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/deb33222-9014-4cd7-aa7f-ac2b4be21d6c-config" (OuterVolumeSpecName: "config") pod "deb33222-9014-4cd7-aa7f-ac2b4be21d6c" (UID: "deb33222-9014-4cd7-aa7f-ac2b4be21d6c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:48:48 crc kubenswrapper[4594]: I1129 05:48:48.276129 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/deb33222-9014-4cd7-aa7f-ac2b4be21d6c-openstack-edpm-ipam\") pod \"deb33222-9014-4cd7-aa7f-ac2b4be21d6c\" (UID: \"deb33222-9014-4cd7-aa7f-ac2b4be21d6c\") " Nov 29 05:48:48 crc kubenswrapper[4594]: I1129 05:48:48.276208 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/deb33222-9014-4cd7-aa7f-ac2b4be21d6c-ovsdbserver-nb\") pod \"deb33222-9014-4cd7-aa7f-ac2b4be21d6c\" (UID: \"deb33222-9014-4cd7-aa7f-ac2b4be21d6c\") " Nov 29 05:48:48 crc kubenswrapper[4594]: I1129 05:48:48.276287 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/deb33222-9014-4cd7-aa7f-ac2b4be21d6c-ovsdbserver-sb\") pod \"deb33222-9014-4cd7-aa7f-ac2b4be21d6c\" (UID: \"deb33222-9014-4cd7-aa7f-ac2b4be21d6c\") " Nov 29 05:48:48 crc kubenswrapper[4594]: I1129 05:48:48.276399 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/deb33222-9014-4cd7-aa7f-ac2b4be21d6c-dns-svc\") pod \"deb33222-9014-4cd7-aa7f-ac2b4be21d6c\" (UID: \"deb33222-9014-4cd7-aa7f-ac2b4be21d6c\") " Nov 29 05:48:48 crc kubenswrapper[4594]: I1129 05:48:48.276501 4594 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/deb33222-9014-4cd7-aa7f-ac2b4be21d6c-dns-swift-storage-0\") pod \"deb33222-9014-4cd7-aa7f-ac2b4be21d6c\" (UID: \"deb33222-9014-4cd7-aa7f-ac2b4be21d6c\") " Nov 29 05:48:48 crc kubenswrapper[4594]: I1129 05:48:48.277223 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hmssx\" (UniqueName: \"kubernetes.io/projected/deb33222-9014-4cd7-aa7f-ac2b4be21d6c-kube-api-access-hmssx\") on node \"crc\" DevicePath \"\"" Nov 29 05:48:48 crc kubenswrapper[4594]: I1129 05:48:48.277240 4594 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/deb33222-9014-4cd7-aa7f-ac2b4be21d6c-config\") on node \"crc\" DevicePath \"\"" Nov 29 05:48:48 crc kubenswrapper[4594]: I1129 05:48:48.362165 4594 generic.go:334] "Generic (PLEG): container finished" podID="deb33222-9014-4cd7-aa7f-ac2b4be21d6c" containerID="338ed5f76192ad49a0234591f306f7fb99be238fbf32666ef580d1e939bc7b82" exitCode=0 Nov 29 05:48:48 crc kubenswrapper[4594]: I1129 05:48:48.362219 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bf6c7df67-jbvk5" event={"ID":"deb33222-9014-4cd7-aa7f-ac2b4be21d6c","Type":"ContainerDied","Data":"338ed5f76192ad49a0234591f306f7fb99be238fbf32666ef580d1e939bc7b82"} Nov 29 05:48:48 crc kubenswrapper[4594]: I1129 05:48:48.362267 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bf6c7df67-jbvk5" event={"ID":"deb33222-9014-4cd7-aa7f-ac2b4be21d6c","Type":"ContainerDied","Data":"8880f3ba0f800b9a24e19361b3b100d39a91d7cd1465bb6993b985bf58ae4137"} Nov 29 05:48:48 crc kubenswrapper[4594]: I1129 05:48:48.362294 4594 scope.go:117] "RemoveContainer" containerID="338ed5f76192ad49a0234591f306f7fb99be238fbf32666ef580d1e939bc7b82" Nov 29 05:48:48 crc kubenswrapper[4594]: I1129 05:48:48.362387 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bf6c7df67-jbvk5" Nov 29 05:48:48 crc kubenswrapper[4594]: I1129 05:48:48.369747 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/deb33222-9014-4cd7-aa7f-ac2b4be21d6c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "deb33222-9014-4cd7-aa7f-ac2b4be21d6c" (UID: "deb33222-9014-4cd7-aa7f-ac2b4be21d6c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:48:48 crc kubenswrapper[4594]: I1129 05:48:48.379118 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/deb33222-9014-4cd7-aa7f-ac2b4be21d6c-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "deb33222-9014-4cd7-aa7f-ac2b4be21d6c" (UID: "deb33222-9014-4cd7-aa7f-ac2b4be21d6c"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:48:48 crc kubenswrapper[4594]: I1129 05:48:48.391486 4594 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/deb33222-9014-4cd7-aa7f-ac2b4be21d6c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 29 05:48:48 crc kubenswrapper[4594]: I1129 05:48:48.395830 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/deb33222-9014-4cd7-aa7f-ac2b4be21d6c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "deb33222-9014-4cd7-aa7f-ac2b4be21d6c" (UID: "deb33222-9014-4cd7-aa7f-ac2b4be21d6c"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:48:48 crc kubenswrapper[4594]: I1129 05:48:48.409735 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/deb33222-9014-4cd7-aa7f-ac2b4be21d6c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "deb33222-9014-4cd7-aa7f-ac2b4be21d6c" (UID: "deb33222-9014-4cd7-aa7f-ac2b4be21d6c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:48:48 crc kubenswrapper[4594]: I1129 05:48:48.411484 4594 scope.go:117] "RemoveContainer" containerID="e9a31cd2c3ce3ccfadccf5e49310a157b5ef36e830fae52db96453876817fc2b" Nov 29 05:48:48 crc kubenswrapper[4594]: I1129 05:48:48.419293 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/deb33222-9014-4cd7-aa7f-ac2b4be21d6c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "deb33222-9014-4cd7-aa7f-ac2b4be21d6c" (UID: "deb33222-9014-4cd7-aa7f-ac2b4be21d6c"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:48:48 crc kubenswrapper[4594]: I1129 05:48:48.444356 4594 scope.go:117] "RemoveContainer" containerID="338ed5f76192ad49a0234591f306f7fb99be238fbf32666ef580d1e939bc7b82" Nov 29 05:48:48 crc kubenswrapper[4594]: E1129 05:48:48.446443 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"338ed5f76192ad49a0234591f306f7fb99be238fbf32666ef580d1e939bc7b82\": container with ID starting with 338ed5f76192ad49a0234591f306f7fb99be238fbf32666ef580d1e939bc7b82 not found: ID does not exist" containerID="338ed5f76192ad49a0234591f306f7fb99be238fbf32666ef580d1e939bc7b82" Nov 29 05:48:48 crc kubenswrapper[4594]: I1129 05:48:48.446480 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"338ed5f76192ad49a0234591f306f7fb99be238fbf32666ef580d1e939bc7b82"} err="failed to get container status \"338ed5f76192ad49a0234591f306f7fb99be238fbf32666ef580d1e939bc7b82\": rpc error: code = NotFound desc = could not find container \"338ed5f76192ad49a0234591f306f7fb99be238fbf32666ef580d1e939bc7b82\": container with ID starting with 338ed5f76192ad49a0234591f306f7fb99be238fbf32666ef580d1e939bc7b82 not found: ID does not exist" Nov 29 05:48:48 crc kubenswrapper[4594]: I1129 05:48:48.446502 4594 scope.go:117] "RemoveContainer" containerID="e9a31cd2c3ce3ccfadccf5e49310a157b5ef36e830fae52db96453876817fc2b" Nov 29 05:48:48 crc kubenswrapper[4594]: E1129 05:48:48.447345 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9a31cd2c3ce3ccfadccf5e49310a157b5ef36e830fae52db96453876817fc2b\": container with ID starting with e9a31cd2c3ce3ccfadccf5e49310a157b5ef36e830fae52db96453876817fc2b not found: ID does not exist" containerID="e9a31cd2c3ce3ccfadccf5e49310a157b5ef36e830fae52db96453876817fc2b" Nov 29 05:48:48 crc kubenswrapper[4594]: I1129 05:48:48.447370 
4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9a31cd2c3ce3ccfadccf5e49310a157b5ef36e830fae52db96453876817fc2b"} err="failed to get container status \"e9a31cd2c3ce3ccfadccf5e49310a157b5ef36e830fae52db96453876817fc2b\": rpc error: code = NotFound desc = could not find container \"e9a31cd2c3ce3ccfadccf5e49310a157b5ef36e830fae52db96453876817fc2b\": container with ID starting with e9a31cd2c3ce3ccfadccf5e49310a157b5ef36e830fae52db96453876817fc2b not found: ID does not exist" Nov 29 05:48:48 crc kubenswrapper[4594]: I1129 05:48:48.494029 4594 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/deb33222-9014-4cd7-aa7f-ac2b4be21d6c-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 29 05:48:48 crc kubenswrapper[4594]: I1129 05:48:48.494061 4594 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/deb33222-9014-4cd7-aa7f-ac2b4be21d6c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 29 05:48:48 crc kubenswrapper[4594]: I1129 05:48:48.494074 4594 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/deb33222-9014-4cd7-aa7f-ac2b4be21d6c-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Nov 29 05:48:48 crc kubenswrapper[4594]: I1129 05:48:48.494084 4594 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/deb33222-9014-4cd7-aa7f-ac2b4be21d6c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 29 05:48:48 crc kubenswrapper[4594]: I1129 05:48:48.696758 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bf6c7df67-jbvk5"] Nov 29 05:48:48 crc kubenswrapper[4594]: I1129 05:48:48.704869 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bf6c7df67-jbvk5"] Nov 29 05:48:50 crc kubenswrapper[4594]: I1129 05:48:50.105084 
4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="deb33222-9014-4cd7-aa7f-ac2b4be21d6c" path="/var/lib/kubelet/pods/deb33222-9014-4cd7-aa7f-ac2b4be21d6c/volumes" Nov 29 05:48:54 crc kubenswrapper[4594]: I1129 05:48:54.436005 4594 generic.go:334] "Generic (PLEG): container finished" podID="bb17ce90-d0e2-4a46-905b-e27bff2295fb" containerID="d78db6666b77f96b3f1243d6f817d1fc8043a645cf522a43215937c02dc0ea82" exitCode=0 Nov 29 05:48:54 crc kubenswrapper[4594]: I1129 05:48:54.436100 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"bb17ce90-d0e2-4a46-905b-e27bff2295fb","Type":"ContainerDied","Data":"d78db6666b77f96b3f1243d6f817d1fc8043a645cf522a43215937c02dc0ea82"} Nov 29 05:48:55 crc kubenswrapper[4594]: I1129 05:48:55.445104 4594 generic.go:334] "Generic (PLEG): container finished" podID="3dcf9a3e-9869-4630-a695-c180db93aca7" containerID="f10906791f5c36801dc26a2e566d61dc18c4f9dcfd06160a673dcc8734e46642" exitCode=0 Nov 29 05:48:55 crc kubenswrapper[4594]: I1129 05:48:55.445174 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3dcf9a3e-9869-4630-a695-c180db93aca7","Type":"ContainerDied","Data":"f10906791f5c36801dc26a2e566d61dc18c4f9dcfd06160a673dcc8734e46642"} Nov 29 05:48:55 crc kubenswrapper[4594]: I1129 05:48:55.448745 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"bb17ce90-d0e2-4a46-905b-e27bff2295fb","Type":"ContainerStarted","Data":"fa9bc0f0f8fd667940801dea9afbdfe569aca4859fceef94943ad6502c7d0726"} Nov 29 05:48:55 crc kubenswrapper[4594]: I1129 05:48:55.449072 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Nov 29 05:48:55 crc kubenswrapper[4594]: E1129 05:48:55.454187 4594 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcf9a3e_9869_4630_a695_c180db93aca7.slice/crio-f10906791f5c36801dc26a2e566d61dc18c4f9dcfd06160a673dcc8734e46642.scope\": RecentStats: unable to find data in memory cache]" Nov 29 05:48:55 crc kubenswrapper[4594]: I1129 05:48:55.497466 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=35.497447671 podStartE2EDuration="35.497447671s" podCreationTimestamp="2025-11-29 05:48:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:48:55.490162279 +0000 UTC m=+1259.730671499" watchObservedRunningTime="2025-11-29 05:48:55.497447671 +0000 UTC m=+1259.737956891" Nov 29 05:48:56 crc kubenswrapper[4594]: I1129 05:48:56.461439 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3dcf9a3e-9869-4630-a695-c180db93aca7","Type":"ContainerStarted","Data":"28813032938a820428e378fbe4fa7c317c35656937e2bfcc7282c4f83d738fc7"} Nov 29 05:48:56 crc kubenswrapper[4594]: I1129 05:48:56.462315 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Nov 29 05:48:56 crc kubenswrapper[4594]: I1129 05:48:56.493041 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=35.493017102 podStartE2EDuration="35.493017102s" podCreationTimestamp="2025-11-29 05:48:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 05:48:56.484094992 +0000 UTC m=+1260.724604212" watchObservedRunningTime="2025-11-29 05:48:56.493017102 +0000 UTC m=+1260.733526322" Nov 29 05:49:05 crc kubenswrapper[4594]: I1129 05:49:05.733784 4594 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s7gxb"] Nov 29 05:49:05 crc kubenswrapper[4594]: E1129 05:49:05.734964 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="deb33222-9014-4cd7-aa7f-ac2b4be21d6c" containerName="dnsmasq-dns" Nov 29 05:49:05 crc kubenswrapper[4594]: I1129 05:49:05.734982 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="deb33222-9014-4cd7-aa7f-ac2b4be21d6c" containerName="dnsmasq-dns" Nov 29 05:49:05 crc kubenswrapper[4594]: E1129 05:49:05.735003 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ad2733e-97bd-4b40-87e3-cd5b16a8479e" containerName="init" Nov 29 05:49:05 crc kubenswrapper[4594]: I1129 05:49:05.735010 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ad2733e-97bd-4b40-87e3-cd5b16a8479e" containerName="init" Nov 29 05:49:05 crc kubenswrapper[4594]: E1129 05:49:05.735045 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ad2733e-97bd-4b40-87e3-cd5b16a8479e" containerName="dnsmasq-dns" Nov 29 05:49:05 crc kubenswrapper[4594]: I1129 05:49:05.735053 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ad2733e-97bd-4b40-87e3-cd5b16a8479e" containerName="dnsmasq-dns" Nov 29 05:49:05 crc kubenswrapper[4594]: E1129 05:49:05.735073 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="deb33222-9014-4cd7-aa7f-ac2b4be21d6c" containerName="init" Nov 29 05:49:05 crc kubenswrapper[4594]: I1129 05:49:05.735078 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="deb33222-9014-4cd7-aa7f-ac2b4be21d6c" containerName="init" Nov 29 05:49:05 crc kubenswrapper[4594]: I1129 05:49:05.735350 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="deb33222-9014-4cd7-aa7f-ac2b4be21d6c" containerName="dnsmasq-dns" Nov 29 05:49:05 crc kubenswrapper[4594]: I1129 05:49:05.735392 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ad2733e-97bd-4b40-87e3-cd5b16a8479e" containerName="dnsmasq-dns" 
Nov 29 05:49:05 crc kubenswrapper[4594]: I1129 05:49:05.736335 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s7gxb" Nov 29 05:49:05 crc kubenswrapper[4594]: I1129 05:49:05.738248 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 29 05:49:05 crc kubenswrapper[4594]: I1129 05:49:05.738645 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 29 05:49:05 crc kubenswrapper[4594]: I1129 05:49:05.738727 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 29 05:49:05 crc kubenswrapper[4594]: I1129 05:49:05.738937 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b89h2" Nov 29 05:49:05 crc kubenswrapper[4594]: I1129 05:49:05.745125 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s7gxb"] Nov 29 05:49:05 crc kubenswrapper[4594]: I1129 05:49:05.837461 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b16b2100-7eea-43dd-8b1c-f2c337bdb3bd-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-s7gxb\" (UID: \"b16b2100-7eea-43dd-8b1c-f2c337bdb3bd\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s7gxb" Nov 29 05:49:05 crc kubenswrapper[4594]: I1129 05:49:05.837707 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-547jd\" (UniqueName: \"kubernetes.io/projected/b16b2100-7eea-43dd-8b1c-f2c337bdb3bd-kube-api-access-547jd\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-s7gxb\" (UID: \"b16b2100-7eea-43dd-8b1c-f2c337bdb3bd\") " 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s7gxb" Nov 29 05:49:05 crc kubenswrapper[4594]: I1129 05:49:05.837929 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b16b2100-7eea-43dd-8b1c-f2c337bdb3bd-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-s7gxb\" (UID: \"b16b2100-7eea-43dd-8b1c-f2c337bdb3bd\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s7gxb" Nov 29 05:49:05 crc kubenswrapper[4594]: I1129 05:49:05.838082 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b16b2100-7eea-43dd-8b1c-f2c337bdb3bd-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-s7gxb\" (UID: \"b16b2100-7eea-43dd-8b1c-f2c337bdb3bd\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s7gxb" Nov 29 05:49:05 crc kubenswrapper[4594]: I1129 05:49:05.939905 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b16b2100-7eea-43dd-8b1c-f2c337bdb3bd-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-s7gxb\" (UID: \"b16b2100-7eea-43dd-8b1c-f2c337bdb3bd\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s7gxb" Nov 29 05:49:05 crc kubenswrapper[4594]: I1129 05:49:05.940038 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b16b2100-7eea-43dd-8b1c-f2c337bdb3bd-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-s7gxb\" (UID: \"b16b2100-7eea-43dd-8b1c-f2c337bdb3bd\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s7gxb" Nov 29 05:49:05 crc kubenswrapper[4594]: I1129 05:49:05.940089 4594 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-547jd\" (UniqueName: \"kubernetes.io/projected/b16b2100-7eea-43dd-8b1c-f2c337bdb3bd-kube-api-access-547jd\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-s7gxb\" (UID: \"b16b2100-7eea-43dd-8b1c-f2c337bdb3bd\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s7gxb" Nov 29 05:49:05 crc kubenswrapper[4594]: I1129 05:49:05.940143 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b16b2100-7eea-43dd-8b1c-f2c337bdb3bd-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-s7gxb\" (UID: \"b16b2100-7eea-43dd-8b1c-f2c337bdb3bd\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s7gxb" Nov 29 05:49:05 crc kubenswrapper[4594]: I1129 05:49:05.946341 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b16b2100-7eea-43dd-8b1c-f2c337bdb3bd-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-s7gxb\" (UID: \"b16b2100-7eea-43dd-8b1c-f2c337bdb3bd\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s7gxb" Nov 29 05:49:05 crc kubenswrapper[4594]: I1129 05:49:05.946550 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b16b2100-7eea-43dd-8b1c-f2c337bdb3bd-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-s7gxb\" (UID: \"b16b2100-7eea-43dd-8b1c-f2c337bdb3bd\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s7gxb" Nov 29 05:49:05 crc kubenswrapper[4594]: I1129 05:49:05.946918 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b16b2100-7eea-43dd-8b1c-f2c337bdb3bd-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-s7gxb\" (UID: \"b16b2100-7eea-43dd-8b1c-f2c337bdb3bd\") " 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s7gxb" Nov 29 05:49:05 crc kubenswrapper[4594]: I1129 05:49:05.956516 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-547jd\" (UniqueName: \"kubernetes.io/projected/b16b2100-7eea-43dd-8b1c-f2c337bdb3bd-kube-api-access-547jd\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-s7gxb\" (UID: \"b16b2100-7eea-43dd-8b1c-f2c337bdb3bd\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s7gxb" Nov 29 05:49:06 crc kubenswrapper[4594]: I1129 05:49:06.059475 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s7gxb" Nov 29 05:49:06 crc kubenswrapper[4594]: I1129 05:49:06.592792 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s7gxb"] Nov 29 05:49:06 crc kubenswrapper[4594]: W1129 05:49:06.595936 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb16b2100_7eea_43dd_8b1c_f2c337bdb3bd.slice/crio-725b190f73af9b825045ea36f482e969efb7718042bc06302241f5c20cf5dd18 WatchSource:0}: Error finding container 725b190f73af9b825045ea36f482e969efb7718042bc06302241f5c20cf5dd18: Status 404 returned error can't find the container with id 725b190f73af9b825045ea36f482e969efb7718042bc06302241f5c20cf5dd18 Nov 29 05:49:07 crc kubenswrapper[4594]: I1129 05:49:07.587333 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s7gxb" event={"ID":"b16b2100-7eea-43dd-8b1c-f2c337bdb3bd","Type":"ContainerStarted","Data":"725b190f73af9b825045ea36f482e969efb7718042bc06302241f5c20cf5dd18"} Nov 29 05:49:07 crc kubenswrapper[4594]: I1129 05:49:07.798226 4594 scope.go:117] "RemoveContainer" containerID="1400cc0048042611c5f01b98608de26c8f1984ca87d5bbb19a72a0e7e63c594d" Nov 29 05:49:10 crc 
kubenswrapper[4594]: I1129 05:49:10.515518 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Nov 29 05:49:11 crc kubenswrapper[4594]: I1129 05:49:11.485322 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Nov 29 05:49:15 crc kubenswrapper[4594]: I1129 05:49:15.802233 4594 patch_prober.go:28] interesting pod/machine-config-daemon-ggz4n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 05:49:15 crc kubenswrapper[4594]: I1129 05:49:15.802857 4594 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 05:49:15 crc kubenswrapper[4594]: I1129 05:49:15.802944 4594 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" Nov 29 05:49:15 crc kubenswrapper[4594]: I1129 05:49:15.803893 4594 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8eaecd3b285f1b345905df03e8392cda698400ba526ee824406a745a137450c8"} pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 29 05:49:15 crc kubenswrapper[4594]: I1129 05:49:15.803953 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" 
containerName="machine-config-daemon" containerID="cri-o://8eaecd3b285f1b345905df03e8392cda698400ba526ee824406a745a137450c8" gracePeriod=600 Nov 29 05:49:16 crc kubenswrapper[4594]: I1129 05:49:16.695644 4594 generic.go:334] "Generic (PLEG): container finished" podID="509844d4-3ee1-4059-84bb-6e90200f50c5" containerID="8eaecd3b285f1b345905df03e8392cda698400ba526ee824406a745a137450c8" exitCode=0 Nov 29 05:49:16 crc kubenswrapper[4594]: I1129 05:49:16.695739 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" event={"ID":"509844d4-3ee1-4059-84bb-6e90200f50c5","Type":"ContainerDied","Data":"8eaecd3b285f1b345905df03e8392cda698400ba526ee824406a745a137450c8"} Nov 29 05:49:16 crc kubenswrapper[4594]: I1129 05:49:16.696108 4594 scope.go:117] "RemoveContainer" containerID="f65325fd45322d2eca6d68603b2fa4746238184f283f293edabcb7b19e64c595" Nov 29 05:49:20 crc kubenswrapper[4594]: I1129 05:49:20.752168 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" event={"ID":"509844d4-3ee1-4059-84bb-6e90200f50c5","Type":"ContainerStarted","Data":"e872a7864afdb97a4545c838292279114599a596d9e2444041e74943a73c1a43"} Nov 29 05:49:20 crc kubenswrapper[4594]: I1129 05:49:20.755215 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s7gxb" event={"ID":"b16b2100-7eea-43dd-8b1c-f2c337bdb3bd","Type":"ContainerStarted","Data":"f34ad0abd6a3745f9175b4e28f3a3a4e66ee48bd183ef7a9083b17f355ade30f"} Nov 29 05:49:20 crc kubenswrapper[4594]: I1129 05:49:20.791022 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s7gxb" podStartSLOduration=2.336355681 podStartE2EDuration="15.79099969s" podCreationTimestamp="2025-11-29 05:49:05 +0000 UTC" firstStartedPulling="2025-11-29 05:49:06.598599551 +0000 UTC m=+1270.839108771" 
lastFinishedPulling="2025-11-29 05:49:20.053243559 +0000 UTC m=+1284.293752780" observedRunningTime="2025-11-29 05:49:20.778687006 +0000 UTC m=+1285.019196226" watchObservedRunningTime="2025-11-29 05:49:20.79099969 +0000 UTC m=+1285.031508911" Nov 29 05:49:31 crc kubenswrapper[4594]: I1129 05:49:31.871036 4594 generic.go:334] "Generic (PLEG): container finished" podID="b16b2100-7eea-43dd-8b1c-f2c337bdb3bd" containerID="f34ad0abd6a3745f9175b4e28f3a3a4e66ee48bd183ef7a9083b17f355ade30f" exitCode=0 Nov 29 05:49:31 crc kubenswrapper[4594]: I1129 05:49:31.871121 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s7gxb" event={"ID":"b16b2100-7eea-43dd-8b1c-f2c337bdb3bd","Type":"ContainerDied","Data":"f34ad0abd6a3745f9175b4e28f3a3a4e66ee48bd183ef7a9083b17f355ade30f"} Nov 29 05:49:33 crc kubenswrapper[4594]: I1129 05:49:33.266938 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s7gxb" Nov 29 05:49:33 crc kubenswrapper[4594]: I1129 05:49:33.377199 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b16b2100-7eea-43dd-8b1c-f2c337bdb3bd-repo-setup-combined-ca-bundle\") pod \"b16b2100-7eea-43dd-8b1c-f2c337bdb3bd\" (UID: \"b16b2100-7eea-43dd-8b1c-f2c337bdb3bd\") " Nov 29 05:49:33 crc kubenswrapper[4594]: I1129 05:49:33.377425 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b16b2100-7eea-43dd-8b1c-f2c337bdb3bd-inventory\") pod \"b16b2100-7eea-43dd-8b1c-f2c337bdb3bd\" (UID: \"b16b2100-7eea-43dd-8b1c-f2c337bdb3bd\") " Nov 29 05:49:33 crc kubenswrapper[4594]: I1129 05:49:33.377600 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/b16b2100-7eea-43dd-8b1c-f2c337bdb3bd-ssh-key\") pod \"b16b2100-7eea-43dd-8b1c-f2c337bdb3bd\" (UID: \"b16b2100-7eea-43dd-8b1c-f2c337bdb3bd\") " Nov 29 05:49:33 crc kubenswrapper[4594]: I1129 05:49:33.378287 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-547jd\" (UniqueName: \"kubernetes.io/projected/b16b2100-7eea-43dd-8b1c-f2c337bdb3bd-kube-api-access-547jd\") pod \"b16b2100-7eea-43dd-8b1c-f2c337bdb3bd\" (UID: \"b16b2100-7eea-43dd-8b1c-f2c337bdb3bd\") " Nov 29 05:49:33 crc kubenswrapper[4594]: I1129 05:49:33.383340 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b16b2100-7eea-43dd-8b1c-f2c337bdb3bd-kube-api-access-547jd" (OuterVolumeSpecName: "kube-api-access-547jd") pod "b16b2100-7eea-43dd-8b1c-f2c337bdb3bd" (UID: "b16b2100-7eea-43dd-8b1c-f2c337bdb3bd"). InnerVolumeSpecName "kube-api-access-547jd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:49:33 crc kubenswrapper[4594]: I1129 05:49:33.384499 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b16b2100-7eea-43dd-8b1c-f2c337bdb3bd-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "b16b2100-7eea-43dd-8b1c-f2c337bdb3bd" (UID: "b16b2100-7eea-43dd-8b1c-f2c337bdb3bd"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:49:33 crc kubenswrapper[4594]: I1129 05:49:33.405677 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b16b2100-7eea-43dd-8b1c-f2c337bdb3bd-inventory" (OuterVolumeSpecName: "inventory") pod "b16b2100-7eea-43dd-8b1c-f2c337bdb3bd" (UID: "b16b2100-7eea-43dd-8b1c-f2c337bdb3bd"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:49:33 crc kubenswrapper[4594]: I1129 05:49:33.406884 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b16b2100-7eea-43dd-8b1c-f2c337bdb3bd-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "b16b2100-7eea-43dd-8b1c-f2c337bdb3bd" (UID: "b16b2100-7eea-43dd-8b1c-f2c337bdb3bd"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:49:33 crc kubenswrapper[4594]: I1129 05:49:33.481469 4594 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b16b2100-7eea-43dd-8b1c-f2c337bdb3bd-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 05:49:33 crc kubenswrapper[4594]: I1129 05:49:33.481498 4594 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b16b2100-7eea-43dd-8b1c-f2c337bdb3bd-inventory\") on node \"crc\" DevicePath \"\"" Nov 29 05:49:33 crc kubenswrapper[4594]: I1129 05:49:33.481511 4594 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b16b2100-7eea-43dd-8b1c-f2c337bdb3bd-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 29 05:49:33 crc kubenswrapper[4594]: I1129 05:49:33.481520 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-547jd\" (UniqueName: \"kubernetes.io/projected/b16b2100-7eea-43dd-8b1c-f2c337bdb3bd-kube-api-access-547jd\") on node \"crc\" DevicePath \"\"" Nov 29 05:49:33 crc kubenswrapper[4594]: I1129 05:49:33.893212 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s7gxb" event={"ID":"b16b2100-7eea-43dd-8b1c-f2c337bdb3bd","Type":"ContainerDied","Data":"725b190f73af9b825045ea36f482e969efb7718042bc06302241f5c20cf5dd18"} Nov 29 05:49:33 crc kubenswrapper[4594]: I1129 05:49:33.893502 4594 pod_container_deletor.go:80] "Container not 
found in pod's containers" containerID="725b190f73af9b825045ea36f482e969efb7718042bc06302241f5c20cf5dd18" Nov 29 05:49:33 crc kubenswrapper[4594]: I1129 05:49:33.893293 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s7gxb" Nov 29 05:49:33 crc kubenswrapper[4594]: I1129 05:49:33.967408 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-7cxvz"] Nov 29 05:49:33 crc kubenswrapper[4594]: E1129 05:49:33.967946 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b16b2100-7eea-43dd-8b1c-f2c337bdb3bd" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Nov 29 05:49:33 crc kubenswrapper[4594]: I1129 05:49:33.967966 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="b16b2100-7eea-43dd-8b1c-f2c337bdb3bd" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Nov 29 05:49:33 crc kubenswrapper[4594]: I1129 05:49:33.968187 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="b16b2100-7eea-43dd-8b1c-f2c337bdb3bd" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Nov 29 05:49:33 crc kubenswrapper[4594]: I1129 05:49:33.970081 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7cxvz" Nov 29 05:49:33 crc kubenswrapper[4594]: I1129 05:49:33.972599 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 29 05:49:33 crc kubenswrapper[4594]: I1129 05:49:33.972917 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 29 05:49:33 crc kubenswrapper[4594]: I1129 05:49:33.973063 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b89h2" Nov 29 05:49:33 crc kubenswrapper[4594]: I1129 05:49:33.973635 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 29 05:49:33 crc kubenswrapper[4594]: I1129 05:49:33.982853 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-7cxvz"] Nov 29 05:49:34 crc kubenswrapper[4594]: I1129 05:49:34.094549 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqn7g\" (UniqueName: \"kubernetes.io/projected/db8e580b-8fbe-4e91-bb94-023bf1b2903b-kube-api-access-qqn7g\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-7cxvz\" (UID: \"db8e580b-8fbe-4e91-bb94-023bf1b2903b\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7cxvz" Nov 29 05:49:34 crc kubenswrapper[4594]: I1129 05:49:34.095084 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/db8e580b-8fbe-4e91-bb94-023bf1b2903b-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-7cxvz\" (UID: \"db8e580b-8fbe-4e91-bb94-023bf1b2903b\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7cxvz" Nov 29 05:49:34 crc kubenswrapper[4594]: I1129 05:49:34.095181 4594 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/db8e580b-8fbe-4e91-bb94-023bf1b2903b-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-7cxvz\" (UID: \"db8e580b-8fbe-4e91-bb94-023bf1b2903b\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7cxvz" Nov 29 05:49:34 crc kubenswrapper[4594]: I1129 05:49:34.197610 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/db8e580b-8fbe-4e91-bb94-023bf1b2903b-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-7cxvz\" (UID: \"db8e580b-8fbe-4e91-bb94-023bf1b2903b\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7cxvz" Nov 29 05:49:34 crc kubenswrapper[4594]: I1129 05:49:34.197739 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/db8e580b-8fbe-4e91-bb94-023bf1b2903b-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-7cxvz\" (UID: \"db8e580b-8fbe-4e91-bb94-023bf1b2903b\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7cxvz" Nov 29 05:49:34 crc kubenswrapper[4594]: I1129 05:49:34.198062 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqn7g\" (UniqueName: \"kubernetes.io/projected/db8e580b-8fbe-4e91-bb94-023bf1b2903b-kube-api-access-qqn7g\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-7cxvz\" (UID: \"db8e580b-8fbe-4e91-bb94-023bf1b2903b\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7cxvz" Nov 29 05:49:34 crc kubenswrapper[4594]: I1129 05:49:34.203149 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/db8e580b-8fbe-4e91-bb94-023bf1b2903b-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-7cxvz\" (UID: \"db8e580b-8fbe-4e91-bb94-023bf1b2903b\") " 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7cxvz" Nov 29 05:49:34 crc kubenswrapper[4594]: I1129 05:49:34.203552 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/db8e580b-8fbe-4e91-bb94-023bf1b2903b-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-7cxvz\" (UID: \"db8e580b-8fbe-4e91-bb94-023bf1b2903b\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7cxvz" Nov 29 05:49:34 crc kubenswrapper[4594]: I1129 05:49:34.213070 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqn7g\" (UniqueName: \"kubernetes.io/projected/db8e580b-8fbe-4e91-bb94-023bf1b2903b-kube-api-access-qqn7g\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-7cxvz\" (UID: \"db8e580b-8fbe-4e91-bb94-023bf1b2903b\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7cxvz" Nov 29 05:49:34 crc kubenswrapper[4594]: I1129 05:49:34.291397 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7cxvz" Nov 29 05:49:34 crc kubenswrapper[4594]: I1129 05:49:34.780115 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-7cxvz"] Nov 29 05:49:34 crc kubenswrapper[4594]: W1129 05:49:34.782761 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddb8e580b_8fbe_4e91_bb94_023bf1b2903b.slice/crio-98229ed276f49eb4d4d64f84d4b19677120c4ba3ab1450cf34e740703c5a9ba1 WatchSource:0}: Error finding container 98229ed276f49eb4d4d64f84d4b19677120c4ba3ab1450cf34e740703c5a9ba1: Status 404 returned error can't find the container with id 98229ed276f49eb4d4d64f84d4b19677120c4ba3ab1450cf34e740703c5a9ba1 Nov 29 05:49:34 crc kubenswrapper[4594]: I1129 05:49:34.906675 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7cxvz" event={"ID":"db8e580b-8fbe-4e91-bb94-023bf1b2903b","Type":"ContainerStarted","Data":"98229ed276f49eb4d4d64f84d4b19677120c4ba3ab1450cf34e740703c5a9ba1"} Nov 29 05:49:35 crc kubenswrapper[4594]: I1129 05:49:35.916791 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7cxvz" event={"ID":"db8e580b-8fbe-4e91-bb94-023bf1b2903b","Type":"ContainerStarted","Data":"8617de6355d94bc3e85554bd99bf88b152e3aed46be8e64cbd1762fd71cb54fd"} Nov 29 05:49:37 crc kubenswrapper[4594]: I1129 05:49:37.944772 4594 generic.go:334] "Generic (PLEG): container finished" podID="db8e580b-8fbe-4e91-bb94-023bf1b2903b" containerID="8617de6355d94bc3e85554bd99bf88b152e3aed46be8e64cbd1762fd71cb54fd" exitCode=0 Nov 29 05:49:37 crc kubenswrapper[4594]: I1129 05:49:37.944868 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7cxvz" 
event={"ID":"db8e580b-8fbe-4e91-bb94-023bf1b2903b","Type":"ContainerDied","Data":"8617de6355d94bc3e85554bd99bf88b152e3aed46be8e64cbd1762fd71cb54fd"} Nov 29 05:49:39 crc kubenswrapper[4594]: I1129 05:49:39.300820 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7cxvz" Nov 29 05:49:39 crc kubenswrapper[4594]: I1129 05:49:39.315908 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/db8e580b-8fbe-4e91-bb94-023bf1b2903b-ssh-key\") pod \"db8e580b-8fbe-4e91-bb94-023bf1b2903b\" (UID: \"db8e580b-8fbe-4e91-bb94-023bf1b2903b\") " Nov 29 05:49:39 crc kubenswrapper[4594]: I1129 05:49:39.316172 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qqn7g\" (UniqueName: \"kubernetes.io/projected/db8e580b-8fbe-4e91-bb94-023bf1b2903b-kube-api-access-qqn7g\") pod \"db8e580b-8fbe-4e91-bb94-023bf1b2903b\" (UID: \"db8e580b-8fbe-4e91-bb94-023bf1b2903b\") " Nov 29 05:49:39 crc kubenswrapper[4594]: I1129 05:49:39.316269 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/db8e580b-8fbe-4e91-bb94-023bf1b2903b-inventory\") pod \"db8e580b-8fbe-4e91-bb94-023bf1b2903b\" (UID: \"db8e580b-8fbe-4e91-bb94-023bf1b2903b\") " Nov 29 05:49:39 crc kubenswrapper[4594]: I1129 05:49:39.323384 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db8e580b-8fbe-4e91-bb94-023bf1b2903b-kube-api-access-qqn7g" (OuterVolumeSpecName: "kube-api-access-qqn7g") pod "db8e580b-8fbe-4e91-bb94-023bf1b2903b" (UID: "db8e580b-8fbe-4e91-bb94-023bf1b2903b"). InnerVolumeSpecName "kube-api-access-qqn7g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:49:39 crc kubenswrapper[4594]: I1129 05:49:39.348044 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db8e580b-8fbe-4e91-bb94-023bf1b2903b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "db8e580b-8fbe-4e91-bb94-023bf1b2903b" (UID: "db8e580b-8fbe-4e91-bb94-023bf1b2903b"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:49:39 crc kubenswrapper[4594]: I1129 05:49:39.355450 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db8e580b-8fbe-4e91-bb94-023bf1b2903b-inventory" (OuterVolumeSpecName: "inventory") pod "db8e580b-8fbe-4e91-bb94-023bf1b2903b" (UID: "db8e580b-8fbe-4e91-bb94-023bf1b2903b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:49:39 crc kubenswrapper[4594]: I1129 05:49:39.418391 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qqn7g\" (UniqueName: \"kubernetes.io/projected/db8e580b-8fbe-4e91-bb94-023bf1b2903b-kube-api-access-qqn7g\") on node \"crc\" DevicePath \"\"" Nov 29 05:49:39 crc kubenswrapper[4594]: I1129 05:49:39.418419 4594 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/db8e580b-8fbe-4e91-bb94-023bf1b2903b-inventory\") on node \"crc\" DevicePath \"\"" Nov 29 05:49:39 crc kubenswrapper[4594]: I1129 05:49:39.418428 4594 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/db8e580b-8fbe-4e91-bb94-023bf1b2903b-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 29 05:49:39 crc kubenswrapper[4594]: I1129 05:49:39.971302 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7cxvz" 
event={"ID":"db8e580b-8fbe-4e91-bb94-023bf1b2903b","Type":"ContainerDied","Data":"98229ed276f49eb4d4d64f84d4b19677120c4ba3ab1450cf34e740703c5a9ba1"} Nov 29 05:49:39 crc kubenswrapper[4594]: I1129 05:49:39.971327 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7cxvz" Nov 29 05:49:39 crc kubenswrapper[4594]: I1129 05:49:39.971349 4594 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="98229ed276f49eb4d4d64f84d4b19677120c4ba3ab1450cf34e740703c5a9ba1" Nov 29 05:49:40 crc kubenswrapper[4594]: I1129 05:49:40.052631 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s5mf7"] Nov 29 05:49:40 crc kubenswrapper[4594]: E1129 05:49:40.053326 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db8e580b-8fbe-4e91-bb94-023bf1b2903b" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Nov 29 05:49:40 crc kubenswrapper[4594]: I1129 05:49:40.053352 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="db8e580b-8fbe-4e91-bb94-023bf1b2903b" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Nov 29 05:49:40 crc kubenswrapper[4594]: I1129 05:49:40.053628 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="db8e580b-8fbe-4e91-bb94-023bf1b2903b" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Nov 29 05:49:40 crc kubenswrapper[4594]: I1129 05:49:40.054606 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s5mf7" Nov 29 05:49:40 crc kubenswrapper[4594]: I1129 05:49:40.056168 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 29 05:49:40 crc kubenswrapper[4594]: I1129 05:49:40.056336 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b89h2" Nov 29 05:49:40 crc kubenswrapper[4594]: I1129 05:49:40.056988 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 29 05:49:40 crc kubenswrapper[4594]: I1129 05:49:40.057224 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 29 05:49:40 crc kubenswrapper[4594]: I1129 05:49:40.071805 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s5mf7"] Nov 29 05:49:40 crc kubenswrapper[4594]: I1129 05:49:40.130649 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkj4n\" (UniqueName: \"kubernetes.io/projected/2a01c522-360e-4b2a-8b7e-4e5618fe1541-kube-api-access-tkj4n\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-s5mf7\" (UID: \"2a01c522-360e-4b2a-8b7e-4e5618fe1541\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s5mf7" Nov 29 05:49:40 crc kubenswrapper[4594]: I1129 05:49:40.131313 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a01c522-360e-4b2a-8b7e-4e5618fe1541-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-s5mf7\" (UID: \"2a01c522-360e-4b2a-8b7e-4e5618fe1541\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s5mf7" Nov 29 05:49:40 crc kubenswrapper[4594]: I1129 
05:49:40.131352 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2a01c522-360e-4b2a-8b7e-4e5618fe1541-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-s5mf7\" (UID: \"2a01c522-360e-4b2a-8b7e-4e5618fe1541\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s5mf7" Nov 29 05:49:40 crc kubenswrapper[4594]: I1129 05:49:40.131386 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2a01c522-360e-4b2a-8b7e-4e5618fe1541-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-s5mf7\" (UID: \"2a01c522-360e-4b2a-8b7e-4e5618fe1541\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s5mf7" Nov 29 05:49:40 crc kubenswrapper[4594]: I1129 05:49:40.232677 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a01c522-360e-4b2a-8b7e-4e5618fe1541-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-s5mf7\" (UID: \"2a01c522-360e-4b2a-8b7e-4e5618fe1541\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s5mf7" Nov 29 05:49:40 crc kubenswrapper[4594]: I1129 05:49:40.232717 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2a01c522-360e-4b2a-8b7e-4e5618fe1541-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-s5mf7\" (UID: \"2a01c522-360e-4b2a-8b7e-4e5618fe1541\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s5mf7" Nov 29 05:49:40 crc kubenswrapper[4594]: I1129 05:49:40.232745 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2a01c522-360e-4b2a-8b7e-4e5618fe1541-ssh-key\") pod 
\"bootstrap-edpm-deployment-openstack-edpm-ipam-s5mf7\" (UID: \"2a01c522-360e-4b2a-8b7e-4e5618fe1541\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s5mf7" Nov 29 05:49:40 crc kubenswrapper[4594]: I1129 05:49:40.232826 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkj4n\" (UniqueName: \"kubernetes.io/projected/2a01c522-360e-4b2a-8b7e-4e5618fe1541-kube-api-access-tkj4n\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-s5mf7\" (UID: \"2a01c522-360e-4b2a-8b7e-4e5618fe1541\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s5mf7" Nov 29 05:49:40 crc kubenswrapper[4594]: I1129 05:49:40.236942 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2a01c522-360e-4b2a-8b7e-4e5618fe1541-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-s5mf7\" (UID: \"2a01c522-360e-4b2a-8b7e-4e5618fe1541\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s5mf7" Nov 29 05:49:40 crc kubenswrapper[4594]: I1129 05:49:40.237192 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a01c522-360e-4b2a-8b7e-4e5618fe1541-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-s5mf7\" (UID: \"2a01c522-360e-4b2a-8b7e-4e5618fe1541\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s5mf7" Nov 29 05:49:40 crc kubenswrapper[4594]: I1129 05:49:40.237285 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2a01c522-360e-4b2a-8b7e-4e5618fe1541-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-s5mf7\" (UID: \"2a01c522-360e-4b2a-8b7e-4e5618fe1541\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s5mf7" Nov 29 05:49:40 crc kubenswrapper[4594]: I1129 05:49:40.247782 4594 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-tkj4n\" (UniqueName: \"kubernetes.io/projected/2a01c522-360e-4b2a-8b7e-4e5618fe1541-kube-api-access-tkj4n\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-s5mf7\" (UID: \"2a01c522-360e-4b2a-8b7e-4e5618fe1541\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s5mf7" Nov 29 05:49:40 crc kubenswrapper[4594]: I1129 05:49:40.369668 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s5mf7" Nov 29 05:49:40 crc kubenswrapper[4594]: I1129 05:49:40.861184 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s5mf7"] Nov 29 05:49:40 crc kubenswrapper[4594]: I1129 05:49:40.983368 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s5mf7" event={"ID":"2a01c522-360e-4b2a-8b7e-4e5618fe1541","Type":"ContainerStarted","Data":"8683b7826344b866cf90fa9018de9795ea02d20dcdb1c9d15f643c13a37099d4"} Nov 29 05:49:42 crc kubenswrapper[4594]: I1129 05:49:42.002746 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s5mf7" event={"ID":"2a01c522-360e-4b2a-8b7e-4e5618fe1541","Type":"ContainerStarted","Data":"68ae68eb604ab8dfd9f14b91d57c6d8b880a8f7650a7749b42a518ddfc617017"} Nov 29 05:49:42 crc kubenswrapper[4594]: I1129 05:49:42.022731 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s5mf7" podStartSLOduration=1.517879293 podStartE2EDuration="2.022712063s" podCreationTimestamp="2025-11-29 05:49:40 +0000 UTC" firstStartedPulling="2025-11-29 05:49:40.863381215 +0000 UTC m=+1305.103890435" lastFinishedPulling="2025-11-29 05:49:41.368213985 +0000 UTC m=+1305.608723205" observedRunningTime="2025-11-29 05:49:42.021426255 +0000 UTC m=+1306.261935475" 
watchObservedRunningTime="2025-11-29 05:49:42.022712063 +0000 UTC m=+1306.263221283" Nov 29 05:50:07 crc kubenswrapper[4594]: I1129 05:50:07.920485 4594 scope.go:117] "RemoveContainer" containerID="80e2351fc77071ab3107c08b054317392fa7aec8869a923516c2e7fc3d0ce807" Nov 29 05:50:07 crc kubenswrapper[4594]: I1129 05:50:07.958191 4594 scope.go:117] "RemoveContainer" containerID="aa7c7258a9785a7d4ad95ccb14f108b0183be8324a355e16389bb1cee662f909" Nov 29 05:50:07 crc kubenswrapper[4594]: I1129 05:50:07.989398 4594 scope.go:117] "RemoveContainer" containerID="2bbc778abfb96d55c368c4fde502e876ef21fc8d0f2f1f2cb0b31f3e4f1d255d" Nov 29 05:51:08 crc kubenswrapper[4594]: I1129 05:51:08.063536 4594 scope.go:117] "RemoveContainer" containerID="5bef5fd16ac3e60fef2c472e9c97a102398cb34978c0ec31423c236aaa1eca2b" Nov 29 05:51:08 crc kubenswrapper[4594]: I1129 05:51:08.108322 4594 scope.go:117] "RemoveContainer" containerID="8ec6438c2adbca016dc01629766d1303cc0f931ddf942b6080e48699d5838efa" Nov 29 05:51:08 crc kubenswrapper[4594]: I1129 05:51:08.128992 4594 scope.go:117] "RemoveContainer" containerID="a864656a788effdf9b2da5c2af82abd08d8dfc00247e49a48b84b882074bec97" Nov 29 05:51:45 crc kubenswrapper[4594]: I1129 05:51:45.800416 4594 patch_prober.go:28] interesting pod/machine-config-daemon-ggz4n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 05:51:45 crc kubenswrapper[4594]: I1129 05:51:45.800791 4594 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 05:52:08 crc kubenswrapper[4594]: I1129 05:52:08.226401 4594 scope.go:117] 
"RemoveContainer" containerID="3ad7251e66ba279f9ba73e88cc0726bdeae4e9f177beced32cda601809e8f140" Nov 29 05:52:08 crc kubenswrapper[4594]: I1129 05:52:08.248628 4594 scope.go:117] "RemoveContainer" containerID="88fb822152d931be0cec289aa8482d7f3c67e5ce6aad97d8be04955120761479" Nov 29 05:52:15 crc kubenswrapper[4594]: I1129 05:52:15.800089 4594 patch_prober.go:28] interesting pod/machine-config-daemon-ggz4n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 05:52:15 crc kubenswrapper[4594]: I1129 05:52:15.800611 4594 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 05:52:38 crc kubenswrapper[4594]: I1129 05:52:38.729387 4594 generic.go:334] "Generic (PLEG): container finished" podID="2a01c522-360e-4b2a-8b7e-4e5618fe1541" containerID="68ae68eb604ab8dfd9f14b91d57c6d8b880a8f7650a7749b42a518ddfc617017" exitCode=0 Nov 29 05:52:38 crc kubenswrapper[4594]: I1129 05:52:38.730036 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s5mf7" event={"ID":"2a01c522-360e-4b2a-8b7e-4e5618fe1541","Type":"ContainerDied","Data":"68ae68eb604ab8dfd9f14b91d57c6d8b880a8f7650a7749b42a518ddfc617017"} Nov 29 05:52:40 crc kubenswrapper[4594]: I1129 05:52:40.126355 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s5mf7" Nov 29 05:52:40 crc kubenswrapper[4594]: I1129 05:52:40.134520 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a01c522-360e-4b2a-8b7e-4e5618fe1541-bootstrap-combined-ca-bundle\") pod \"2a01c522-360e-4b2a-8b7e-4e5618fe1541\" (UID: \"2a01c522-360e-4b2a-8b7e-4e5618fe1541\") " Nov 29 05:52:40 crc kubenswrapper[4594]: I1129 05:52:40.134889 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2a01c522-360e-4b2a-8b7e-4e5618fe1541-ssh-key\") pod \"2a01c522-360e-4b2a-8b7e-4e5618fe1541\" (UID: \"2a01c522-360e-4b2a-8b7e-4e5618fe1541\") " Nov 29 05:52:40 crc kubenswrapper[4594]: I1129 05:52:40.135105 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tkj4n\" (UniqueName: \"kubernetes.io/projected/2a01c522-360e-4b2a-8b7e-4e5618fe1541-kube-api-access-tkj4n\") pod \"2a01c522-360e-4b2a-8b7e-4e5618fe1541\" (UID: \"2a01c522-360e-4b2a-8b7e-4e5618fe1541\") " Nov 29 05:52:40 crc kubenswrapper[4594]: I1129 05:52:40.135155 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2a01c522-360e-4b2a-8b7e-4e5618fe1541-inventory\") pod \"2a01c522-360e-4b2a-8b7e-4e5618fe1541\" (UID: \"2a01c522-360e-4b2a-8b7e-4e5618fe1541\") " Nov 29 05:52:40 crc kubenswrapper[4594]: I1129 05:52:40.142480 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a01c522-360e-4b2a-8b7e-4e5618fe1541-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "2a01c522-360e-4b2a-8b7e-4e5618fe1541" (UID: "2a01c522-360e-4b2a-8b7e-4e5618fe1541"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:52:40 crc kubenswrapper[4594]: I1129 05:52:40.148473 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a01c522-360e-4b2a-8b7e-4e5618fe1541-kube-api-access-tkj4n" (OuterVolumeSpecName: "kube-api-access-tkj4n") pod "2a01c522-360e-4b2a-8b7e-4e5618fe1541" (UID: "2a01c522-360e-4b2a-8b7e-4e5618fe1541"). InnerVolumeSpecName "kube-api-access-tkj4n". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:52:40 crc kubenswrapper[4594]: I1129 05:52:40.165361 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a01c522-360e-4b2a-8b7e-4e5618fe1541-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "2a01c522-360e-4b2a-8b7e-4e5618fe1541" (UID: "2a01c522-360e-4b2a-8b7e-4e5618fe1541"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:52:40 crc kubenswrapper[4594]: I1129 05:52:40.166502 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a01c522-360e-4b2a-8b7e-4e5618fe1541-inventory" (OuterVolumeSpecName: "inventory") pod "2a01c522-360e-4b2a-8b7e-4e5618fe1541" (UID: "2a01c522-360e-4b2a-8b7e-4e5618fe1541"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:52:40 crc kubenswrapper[4594]: I1129 05:52:40.237934 4594 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2a01c522-360e-4b2a-8b7e-4e5618fe1541-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 29 05:52:40 crc kubenswrapper[4594]: I1129 05:52:40.238088 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tkj4n\" (UniqueName: \"kubernetes.io/projected/2a01c522-360e-4b2a-8b7e-4e5618fe1541-kube-api-access-tkj4n\") on node \"crc\" DevicePath \"\"" Nov 29 05:52:40 crc kubenswrapper[4594]: I1129 05:52:40.238152 4594 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2a01c522-360e-4b2a-8b7e-4e5618fe1541-inventory\") on node \"crc\" DevicePath \"\"" Nov 29 05:52:40 crc kubenswrapper[4594]: I1129 05:52:40.238209 4594 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a01c522-360e-4b2a-8b7e-4e5618fe1541-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 05:52:40 crc kubenswrapper[4594]: I1129 05:52:40.753510 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s5mf7" event={"ID":"2a01c522-360e-4b2a-8b7e-4e5618fe1541","Type":"ContainerDied","Data":"8683b7826344b866cf90fa9018de9795ea02d20dcdb1c9d15f643c13a37099d4"} Nov 29 05:52:40 crc kubenswrapper[4594]: I1129 05:52:40.753911 4594 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8683b7826344b866cf90fa9018de9795ea02d20dcdb1c9d15f643c13a37099d4" Nov 29 05:52:40 crc kubenswrapper[4594]: I1129 05:52:40.753602 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s5mf7" Nov 29 05:52:40 crc kubenswrapper[4594]: I1129 05:52:40.824879 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8fq52"] Nov 29 05:52:40 crc kubenswrapper[4594]: E1129 05:52:40.825428 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a01c522-360e-4b2a-8b7e-4e5618fe1541" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Nov 29 05:52:40 crc kubenswrapper[4594]: I1129 05:52:40.825451 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a01c522-360e-4b2a-8b7e-4e5618fe1541" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Nov 29 05:52:40 crc kubenswrapper[4594]: I1129 05:52:40.825701 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a01c522-360e-4b2a-8b7e-4e5618fe1541" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Nov 29 05:52:40 crc kubenswrapper[4594]: I1129 05:52:40.826475 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8fq52" Nov 29 05:52:40 crc kubenswrapper[4594]: I1129 05:52:40.828388 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 29 05:52:40 crc kubenswrapper[4594]: I1129 05:52:40.831874 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 29 05:52:40 crc kubenswrapper[4594]: I1129 05:52:40.832021 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 29 05:52:40 crc kubenswrapper[4594]: I1129 05:52:40.832025 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b89h2" Nov 29 05:52:40 crc kubenswrapper[4594]: I1129 05:52:40.833169 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8fq52"] Nov 29 05:52:40 crc kubenswrapper[4594]: I1129 05:52:40.858175 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6345164d-91bd-47df-b5a6-71f9940c0f15-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-8fq52\" (UID: \"6345164d-91bd-47df-b5a6-71f9940c0f15\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8fq52" Nov 29 05:52:40 crc kubenswrapper[4594]: I1129 05:52:40.858370 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6345164d-91bd-47df-b5a6-71f9940c0f15-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-8fq52\" (UID: \"6345164d-91bd-47df-b5a6-71f9940c0f15\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8fq52" Nov 29 05:52:40 crc kubenswrapper[4594]: I1129 05:52:40.858657 4594 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzqww\" (UniqueName: \"kubernetes.io/projected/6345164d-91bd-47df-b5a6-71f9940c0f15-kube-api-access-kzqww\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-8fq52\" (UID: \"6345164d-91bd-47df-b5a6-71f9940c0f15\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8fq52" Nov 29 05:52:40 crc kubenswrapper[4594]: I1129 05:52:40.961310 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzqww\" (UniqueName: \"kubernetes.io/projected/6345164d-91bd-47df-b5a6-71f9940c0f15-kube-api-access-kzqww\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-8fq52\" (UID: \"6345164d-91bd-47df-b5a6-71f9940c0f15\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8fq52" Nov 29 05:52:40 crc kubenswrapper[4594]: I1129 05:52:40.961468 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6345164d-91bd-47df-b5a6-71f9940c0f15-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-8fq52\" (UID: \"6345164d-91bd-47df-b5a6-71f9940c0f15\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8fq52" Nov 29 05:52:40 crc kubenswrapper[4594]: I1129 05:52:40.961513 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6345164d-91bd-47df-b5a6-71f9940c0f15-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-8fq52\" (UID: \"6345164d-91bd-47df-b5a6-71f9940c0f15\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8fq52" Nov 29 05:52:40 crc kubenswrapper[4594]: I1129 05:52:40.967908 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6345164d-91bd-47df-b5a6-71f9940c0f15-inventory\") pod 
\"download-cache-edpm-deployment-openstack-edpm-ipam-8fq52\" (UID: \"6345164d-91bd-47df-b5a6-71f9940c0f15\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8fq52" Nov 29 05:52:40 crc kubenswrapper[4594]: I1129 05:52:40.968888 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6345164d-91bd-47df-b5a6-71f9940c0f15-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-8fq52\" (UID: \"6345164d-91bd-47df-b5a6-71f9940c0f15\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8fq52" Nov 29 05:52:40 crc kubenswrapper[4594]: I1129 05:52:40.976954 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzqww\" (UniqueName: \"kubernetes.io/projected/6345164d-91bd-47df-b5a6-71f9940c0f15-kube-api-access-kzqww\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-8fq52\" (UID: \"6345164d-91bd-47df-b5a6-71f9940c0f15\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8fq52" Nov 29 05:52:41 crc kubenswrapper[4594]: I1129 05:52:41.148610 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8fq52" Nov 29 05:52:41 crc kubenswrapper[4594]: I1129 05:52:41.631342 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8fq52"] Nov 29 05:52:41 crc kubenswrapper[4594]: I1129 05:52:41.635435 4594 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 29 05:52:41 crc kubenswrapper[4594]: I1129 05:52:41.764647 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8fq52" event={"ID":"6345164d-91bd-47df-b5a6-71f9940c0f15","Type":"ContainerStarted","Data":"bd81791004a317c8ac33d0bd55cfb30d555c1a015b39889dc696a0bd4cf625a0"} Nov 29 05:52:42 crc kubenswrapper[4594]: I1129 05:52:42.777058 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8fq52" event={"ID":"6345164d-91bd-47df-b5a6-71f9940c0f15","Type":"ContainerStarted","Data":"3a011c2608ab1efd3b23f39ad827d637bebae30f1b235a933b45eb8d6e9f9273"} Nov 29 05:52:42 crc kubenswrapper[4594]: I1129 05:52:42.798556 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8fq52" podStartSLOduration=2.168934156 podStartE2EDuration="2.798536119s" podCreationTimestamp="2025-11-29 05:52:40 +0000 UTC" firstStartedPulling="2025-11-29 05:52:41.634856409 +0000 UTC m=+1485.875365630" lastFinishedPulling="2025-11-29 05:52:42.264458372 +0000 UTC m=+1486.504967593" observedRunningTime="2025-11-29 05:52:42.793661572 +0000 UTC m=+1487.034170793" watchObservedRunningTime="2025-11-29 05:52:42.798536119 +0000 UTC m=+1487.039045339" Nov 29 05:52:45 crc kubenswrapper[4594]: I1129 05:52:45.800037 4594 patch_prober.go:28] interesting pod/machine-config-daemon-ggz4n container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 05:52:45 crc kubenswrapper[4594]: I1129 05:52:45.800289 4594 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 05:52:45 crc kubenswrapper[4594]: I1129 05:52:45.800332 4594 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" Nov 29 05:52:45 crc kubenswrapper[4594]: I1129 05:52:45.800933 4594 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e872a7864afdb97a4545c838292279114599a596d9e2444041e74943a73c1a43"} pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 29 05:52:45 crc kubenswrapper[4594]: I1129 05:52:45.800988 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" containerName="machine-config-daemon" containerID="cri-o://e872a7864afdb97a4545c838292279114599a596d9e2444041e74943a73c1a43" gracePeriod=600 Nov 29 05:52:46 crc kubenswrapper[4594]: I1129 05:52:46.817770 4594 generic.go:334] "Generic (PLEG): container finished" podID="509844d4-3ee1-4059-84bb-6e90200f50c5" containerID="e872a7864afdb97a4545c838292279114599a596d9e2444041e74943a73c1a43" exitCode=0 Nov 29 05:52:46 crc kubenswrapper[4594]: I1129 05:52:46.817831 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" event={"ID":"509844d4-3ee1-4059-84bb-6e90200f50c5","Type":"ContainerDied","Data":"e872a7864afdb97a4545c838292279114599a596d9e2444041e74943a73c1a43"} Nov 29 05:52:46 crc kubenswrapper[4594]: I1129 05:52:46.818505 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" event={"ID":"509844d4-3ee1-4059-84bb-6e90200f50c5","Type":"ContainerStarted","Data":"112625db7a9e2602ec481072051add75f1091c3a06e6a9540b152d2e23c5722f"} Nov 29 05:52:46 crc kubenswrapper[4594]: I1129 05:52:46.818540 4594 scope.go:117] "RemoveContainer" containerID="8eaecd3b285f1b345905df03e8392cda698400ba526ee824406a745a137450c8" Nov 29 05:53:08 crc kubenswrapper[4594]: I1129 05:53:08.282048 4594 scope.go:117] "RemoveContainer" containerID="83d247213af92af5ac284728ceeab266aa90ddd1c9ceb9b0fa12cb37ed4c0650" Nov 29 05:53:08 crc kubenswrapper[4594]: I1129 05:53:08.305376 4594 scope.go:117] "RemoveContainer" containerID="7cc0989b780c16653bb2ac16babf1d27f470d54ac007ef271f44832f65429d83" Nov 29 05:53:26 crc kubenswrapper[4594]: I1129 05:53:26.037897 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-vcfr2"] Nov 29 05:53:26 crc kubenswrapper[4594]: I1129 05:53:26.046215 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-3b14-account-create-update-bk8sl"] Nov 29 05:53:26 crc kubenswrapper[4594]: I1129 05:53:26.054749 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-6fef-account-create-update-8cksk"] Nov 29 05:53:26 crc kubenswrapper[4594]: I1129 05:53:26.062488 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-vcfr2"] Nov 29 05:53:26 crc kubenswrapper[4594]: I1129 05:53:26.071508 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-6fef-account-create-update-8cksk"] Nov 29 05:53:26 crc kubenswrapper[4594]: I1129 
05:53:26.079034 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-3b14-account-create-update-bk8sl"] Nov 29 05:53:26 crc kubenswrapper[4594]: I1129 05:53:26.094516 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="652d562d-549c-45e1-ad93-325a607548c6" path="/var/lib/kubelet/pods/652d562d-549c-45e1-ad93-325a607548c6/volumes" Nov 29 05:53:26 crc kubenswrapper[4594]: I1129 05:53:26.095147 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72a76725-4bb7-4fa5-98e9-066434887870" path="/var/lib/kubelet/pods/72a76725-4bb7-4fa5-98e9-066434887870/volumes" Nov 29 05:53:26 crc kubenswrapper[4594]: I1129 05:53:26.095697 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd700633-7336-45dc-a346-4740e6605784" path="/var/lib/kubelet/pods/cd700633-7336-45dc-a346-4740e6605784/volumes" Nov 29 05:53:27 crc kubenswrapper[4594]: I1129 05:53:27.028166 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-8bpck"] Nov 29 05:53:27 crc kubenswrapper[4594]: I1129 05:53:27.036638 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-8bpck"] Nov 29 05:53:28 crc kubenswrapper[4594]: I1129 05:53:28.029585 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-55dd-account-create-update-5b9fv"] Nov 29 05:53:28 crc kubenswrapper[4594]: I1129 05:53:28.036451 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-55dd-account-create-update-5b9fv"] Nov 29 05:53:28 crc kubenswrapper[4594]: I1129 05:53:28.048225 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vvgxh"] Nov 29 05:53:28 crc kubenswrapper[4594]: I1129 05:53:28.050874 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vvgxh" Nov 29 05:53:28 crc kubenswrapper[4594]: I1129 05:53:28.060588 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vvgxh"] Nov 29 05:53:28 crc kubenswrapper[4594]: I1129 05:53:28.095843 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b6efdb5-647a-4e53-b1c7-0b48c0acc137" path="/var/lib/kubelet/pods/8b6efdb5-647a-4e53-b1c7-0b48c0acc137/volumes" Nov 29 05:53:28 crc kubenswrapper[4594]: I1129 05:53:28.096458 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b1647cf-8657-466a-8579-2ce1f5ac45a3" path="/var/lib/kubelet/pods/9b1647cf-8657-466a-8579-2ce1f5ac45a3/volumes" Nov 29 05:53:28 crc kubenswrapper[4594]: I1129 05:53:28.117808 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6e125a8-1a04-40b5-a626-ccf6aac788d1-catalog-content\") pod \"community-operators-vvgxh\" (UID: \"c6e125a8-1a04-40b5-a626-ccf6aac788d1\") " pod="openshift-marketplace/community-operators-vvgxh" Nov 29 05:53:28 crc kubenswrapper[4594]: I1129 05:53:28.117994 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxp7k\" (UniqueName: \"kubernetes.io/projected/c6e125a8-1a04-40b5-a626-ccf6aac788d1-kube-api-access-cxp7k\") pod \"community-operators-vvgxh\" (UID: \"c6e125a8-1a04-40b5-a626-ccf6aac788d1\") " pod="openshift-marketplace/community-operators-vvgxh" Nov 29 05:53:28 crc kubenswrapper[4594]: I1129 05:53:28.118473 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6e125a8-1a04-40b5-a626-ccf6aac788d1-utilities\") pod \"community-operators-vvgxh\" (UID: \"c6e125a8-1a04-40b5-a626-ccf6aac788d1\") " pod="openshift-marketplace/community-operators-vvgxh" 
Nov 29 05:53:28 crc kubenswrapper[4594]: I1129 05:53:28.220623 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6e125a8-1a04-40b5-a626-ccf6aac788d1-catalog-content\") pod \"community-operators-vvgxh\" (UID: \"c6e125a8-1a04-40b5-a626-ccf6aac788d1\") " pod="openshift-marketplace/community-operators-vvgxh" Nov 29 05:53:28 crc kubenswrapper[4594]: I1129 05:53:28.220679 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxp7k\" (UniqueName: \"kubernetes.io/projected/c6e125a8-1a04-40b5-a626-ccf6aac788d1-kube-api-access-cxp7k\") pod \"community-operators-vvgxh\" (UID: \"c6e125a8-1a04-40b5-a626-ccf6aac788d1\") " pod="openshift-marketplace/community-operators-vvgxh" Nov 29 05:53:28 crc kubenswrapper[4594]: I1129 05:53:28.220819 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6e125a8-1a04-40b5-a626-ccf6aac788d1-utilities\") pod \"community-operators-vvgxh\" (UID: \"c6e125a8-1a04-40b5-a626-ccf6aac788d1\") " pod="openshift-marketplace/community-operators-vvgxh" Nov 29 05:53:28 crc kubenswrapper[4594]: I1129 05:53:28.221077 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6e125a8-1a04-40b5-a626-ccf6aac788d1-catalog-content\") pod \"community-operators-vvgxh\" (UID: \"c6e125a8-1a04-40b5-a626-ccf6aac788d1\") " pod="openshift-marketplace/community-operators-vvgxh" Nov 29 05:53:28 crc kubenswrapper[4594]: I1129 05:53:28.221188 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6e125a8-1a04-40b5-a626-ccf6aac788d1-utilities\") pod \"community-operators-vvgxh\" (UID: \"c6e125a8-1a04-40b5-a626-ccf6aac788d1\") " pod="openshift-marketplace/community-operators-vvgxh" Nov 29 05:53:28 crc kubenswrapper[4594]: 
I1129 05:53:28.237302 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxp7k\" (UniqueName: \"kubernetes.io/projected/c6e125a8-1a04-40b5-a626-ccf6aac788d1-kube-api-access-cxp7k\") pod \"community-operators-vvgxh\" (UID: \"c6e125a8-1a04-40b5-a626-ccf6aac788d1\") " pod="openshift-marketplace/community-operators-vvgxh" Nov 29 05:53:28 crc kubenswrapper[4594]: I1129 05:53:28.368892 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vvgxh" Nov 29 05:53:28 crc kubenswrapper[4594]: I1129 05:53:28.805538 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vvgxh"] Nov 29 05:53:29 crc kubenswrapper[4594]: I1129 05:53:29.025197 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-db-create-8bxs9"] Nov 29 05:53:29 crc kubenswrapper[4594]: I1129 05:53:29.034888 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-db-create-8bxs9"] Nov 29 05:53:29 crc kubenswrapper[4594]: I1129 05:53:29.278785 4594 generic.go:334] "Generic (PLEG): container finished" podID="c6e125a8-1a04-40b5-a626-ccf6aac788d1" containerID="2fcac6d5b5aeb09d7fc9717bf5cf6070f24b9d4decd552824195a1aee5b02724" exitCode=0 Nov 29 05:53:29 crc kubenswrapper[4594]: I1129 05:53:29.278904 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vvgxh" event={"ID":"c6e125a8-1a04-40b5-a626-ccf6aac788d1","Type":"ContainerDied","Data":"2fcac6d5b5aeb09d7fc9717bf5cf6070f24b9d4decd552824195a1aee5b02724"} Nov 29 05:53:29 crc kubenswrapper[4594]: I1129 05:53:29.279197 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vvgxh" event={"ID":"c6e125a8-1a04-40b5-a626-ccf6aac788d1","Type":"ContainerStarted","Data":"a2ee9414dc7438bd205f22c5cfa75a40e62257dc9e0e2c49969a327307ca4a96"} Nov 29 05:53:30 crc kubenswrapper[4594]: I1129 
05:53:30.095595 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e37693f9-769d-409e-a90a-946d5fbe00d9" path="/var/lib/kubelet/pods/e37693f9-769d-409e-a90a-946d5fbe00d9/volumes" Nov 29 05:53:30 crc kubenswrapper[4594]: I1129 05:53:30.291284 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vvgxh" event={"ID":"c6e125a8-1a04-40b5-a626-ccf6aac788d1","Type":"ContainerStarted","Data":"93ace5679bd236158e386190f0ff0c7cc065a74c900e6fa3cc6d73cb194a0711"} Nov 29 05:53:31 crc kubenswrapper[4594]: I1129 05:53:31.301822 4594 generic.go:334] "Generic (PLEG): container finished" podID="c6e125a8-1a04-40b5-a626-ccf6aac788d1" containerID="93ace5679bd236158e386190f0ff0c7cc065a74c900e6fa3cc6d73cb194a0711" exitCode=0 Nov 29 05:53:31 crc kubenswrapper[4594]: I1129 05:53:31.301929 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vvgxh" event={"ID":"c6e125a8-1a04-40b5-a626-ccf6aac788d1","Type":"ContainerDied","Data":"93ace5679bd236158e386190f0ff0c7cc065a74c900e6fa3cc6d73cb194a0711"} Nov 29 05:53:32 crc kubenswrapper[4594]: I1129 05:53:32.314628 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vvgxh" event={"ID":"c6e125a8-1a04-40b5-a626-ccf6aac788d1","Type":"ContainerStarted","Data":"6b8b2d48413ad51c44d642090a9cdfaa62a8016bc7740ebfb3cf7d477c203167"} Nov 29 05:53:32 crc kubenswrapper[4594]: I1129 05:53:32.336274 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vvgxh" podStartSLOduration=1.803960632 podStartE2EDuration="4.33623347s" podCreationTimestamp="2025-11-29 05:53:28 +0000 UTC" firstStartedPulling="2025-11-29 05:53:29.280596383 +0000 UTC m=+1533.521105604" lastFinishedPulling="2025-11-29 05:53:31.812869221 +0000 UTC m=+1536.053378442" observedRunningTime="2025-11-29 05:53:32.329446236 +0000 UTC m=+1536.569955456" 
watchObservedRunningTime="2025-11-29 05:53:32.33623347 +0000 UTC m=+1536.576742689" Nov 29 05:53:34 crc kubenswrapper[4594]: I1129 05:53:34.627294 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-92jc9"] Nov 29 05:53:34 crc kubenswrapper[4594]: I1129 05:53:34.631025 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-92jc9" Nov 29 05:53:34 crc kubenswrapper[4594]: I1129 05:53:34.648969 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-92jc9"] Nov 29 05:53:34 crc kubenswrapper[4594]: I1129 05:53:34.773521 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf1ac69f-ea15-4f49-b007-d9bc12b00c9e-catalog-content\") pod \"redhat-operators-92jc9\" (UID: \"bf1ac69f-ea15-4f49-b007-d9bc12b00c9e\") " pod="openshift-marketplace/redhat-operators-92jc9" Nov 29 05:53:34 crc kubenswrapper[4594]: I1129 05:53:34.773598 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lj894\" (UniqueName: \"kubernetes.io/projected/bf1ac69f-ea15-4f49-b007-d9bc12b00c9e-kube-api-access-lj894\") pod \"redhat-operators-92jc9\" (UID: \"bf1ac69f-ea15-4f49-b007-d9bc12b00c9e\") " pod="openshift-marketplace/redhat-operators-92jc9" Nov 29 05:53:34 crc kubenswrapper[4594]: I1129 05:53:34.773620 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf1ac69f-ea15-4f49-b007-d9bc12b00c9e-utilities\") pod \"redhat-operators-92jc9\" (UID: \"bf1ac69f-ea15-4f49-b007-d9bc12b00c9e\") " pod="openshift-marketplace/redhat-operators-92jc9" Nov 29 05:53:34 crc kubenswrapper[4594]: I1129 05:53:34.876718 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf1ac69f-ea15-4f49-b007-d9bc12b00c9e-catalog-content\") pod \"redhat-operators-92jc9\" (UID: \"bf1ac69f-ea15-4f49-b007-d9bc12b00c9e\") " pod="openshift-marketplace/redhat-operators-92jc9" Nov 29 05:53:34 crc kubenswrapper[4594]: I1129 05:53:34.876802 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lj894\" (UniqueName: \"kubernetes.io/projected/bf1ac69f-ea15-4f49-b007-d9bc12b00c9e-kube-api-access-lj894\") pod \"redhat-operators-92jc9\" (UID: \"bf1ac69f-ea15-4f49-b007-d9bc12b00c9e\") " pod="openshift-marketplace/redhat-operators-92jc9" Nov 29 05:53:34 crc kubenswrapper[4594]: I1129 05:53:34.876860 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf1ac69f-ea15-4f49-b007-d9bc12b00c9e-utilities\") pod \"redhat-operators-92jc9\" (UID: \"bf1ac69f-ea15-4f49-b007-d9bc12b00c9e\") " pod="openshift-marketplace/redhat-operators-92jc9" Nov 29 05:53:34 crc kubenswrapper[4594]: I1129 05:53:34.877118 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf1ac69f-ea15-4f49-b007-d9bc12b00c9e-catalog-content\") pod \"redhat-operators-92jc9\" (UID: \"bf1ac69f-ea15-4f49-b007-d9bc12b00c9e\") " pod="openshift-marketplace/redhat-operators-92jc9" Nov 29 05:53:34 crc kubenswrapper[4594]: I1129 05:53:34.877368 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf1ac69f-ea15-4f49-b007-d9bc12b00c9e-utilities\") pod \"redhat-operators-92jc9\" (UID: \"bf1ac69f-ea15-4f49-b007-d9bc12b00c9e\") " pod="openshift-marketplace/redhat-operators-92jc9" Nov 29 05:53:34 crc kubenswrapper[4594]: I1129 05:53:34.894620 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lj894\" (UniqueName: 
\"kubernetes.io/projected/bf1ac69f-ea15-4f49-b007-d9bc12b00c9e-kube-api-access-lj894\") pod \"redhat-operators-92jc9\" (UID: \"bf1ac69f-ea15-4f49-b007-d9bc12b00c9e\") " pod="openshift-marketplace/redhat-operators-92jc9" Nov 29 05:53:34 crc kubenswrapper[4594]: I1129 05:53:34.954558 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-92jc9" Nov 29 05:53:35 crc kubenswrapper[4594]: I1129 05:53:35.414932 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-92jc9"] Nov 29 05:53:35 crc kubenswrapper[4594]: W1129 05:53:35.422665 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf1ac69f_ea15_4f49_b007_d9bc12b00c9e.slice/crio-8c678c8c1ebcaceec2a863b13a923433d9fda78af78fd965d293368bf8114c0d WatchSource:0}: Error finding container 8c678c8c1ebcaceec2a863b13a923433d9fda78af78fd965d293368bf8114c0d: Status 404 returned error can't find the container with id 8c678c8c1ebcaceec2a863b13a923433d9fda78af78fd965d293368bf8114c0d Nov 29 05:53:36 crc kubenswrapper[4594]: I1129 05:53:36.358836 4594 generic.go:334] "Generic (PLEG): container finished" podID="bf1ac69f-ea15-4f49-b007-d9bc12b00c9e" containerID="7934eae12b869216026289273d05a945a78b29e9db6b2c4c6278382b41feb41b" exitCode=0 Nov 29 05:53:36 crc kubenswrapper[4594]: I1129 05:53:36.358915 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-92jc9" event={"ID":"bf1ac69f-ea15-4f49-b007-d9bc12b00c9e","Type":"ContainerDied","Data":"7934eae12b869216026289273d05a945a78b29e9db6b2c4c6278382b41feb41b"} Nov 29 05:53:36 crc kubenswrapper[4594]: I1129 05:53:36.360204 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-92jc9" 
event={"ID":"bf1ac69f-ea15-4f49-b007-d9bc12b00c9e","Type":"ContainerStarted","Data":"8c678c8c1ebcaceec2a863b13a923433d9fda78af78fd965d293368bf8114c0d"} Nov 29 05:53:37 crc kubenswrapper[4594]: I1129 05:53:37.376516 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-92jc9" event={"ID":"bf1ac69f-ea15-4f49-b007-d9bc12b00c9e","Type":"ContainerStarted","Data":"82985b1da7d3e426989b10f41186d34010c3ec19597589fa88b86eab96601f91"} Nov 29 05:53:38 crc kubenswrapper[4594]: I1129 05:53:38.369733 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vvgxh" Nov 29 05:53:38 crc kubenswrapper[4594]: I1129 05:53:38.370209 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vvgxh" Nov 29 05:53:38 crc kubenswrapper[4594]: I1129 05:53:38.410523 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vvgxh" Nov 29 05:53:38 crc kubenswrapper[4594]: I1129 05:53:38.454528 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vvgxh" Nov 29 05:53:39 crc kubenswrapper[4594]: I1129 05:53:39.400345 4594 generic.go:334] "Generic (PLEG): container finished" podID="bf1ac69f-ea15-4f49-b007-d9bc12b00c9e" containerID="82985b1da7d3e426989b10f41186d34010c3ec19597589fa88b86eab96601f91" exitCode=0 Nov 29 05:53:39 crc kubenswrapper[4594]: I1129 05:53:39.400539 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-92jc9" event={"ID":"bf1ac69f-ea15-4f49-b007-d9bc12b00c9e","Type":"ContainerDied","Data":"82985b1da7d3e426989b10f41186d34010c3ec19597589fa88b86eab96601f91"} Nov 29 05:53:39 crc kubenswrapper[4594]: I1129 05:53:39.619859 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vvgxh"] Nov 29 05:53:40 
crc kubenswrapper[4594]: I1129 05:53:40.417173 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-92jc9" event={"ID":"bf1ac69f-ea15-4f49-b007-d9bc12b00c9e","Type":"ContainerStarted","Data":"0cf92b217167d22146c165b872a654d7a865b46ad8458f769c4aa09b9d108e5b"} Nov 29 05:53:40 crc kubenswrapper[4594]: I1129 05:53:40.417406 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vvgxh" podUID="c6e125a8-1a04-40b5-a626-ccf6aac788d1" containerName="registry-server" containerID="cri-o://6b8b2d48413ad51c44d642090a9cdfaa62a8016bc7740ebfb3cf7d477c203167" gracePeriod=2 Nov 29 05:53:40 crc kubenswrapper[4594]: I1129 05:53:40.438498 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-92jc9" podStartSLOduration=2.820721277 podStartE2EDuration="6.438483877s" podCreationTimestamp="2025-11-29 05:53:34 +0000 UTC" firstStartedPulling="2025-11-29 05:53:36.362314169 +0000 UTC m=+1540.602823389" lastFinishedPulling="2025-11-29 05:53:39.98007677 +0000 UTC m=+1544.220585989" observedRunningTime="2025-11-29 05:53:40.435812643 +0000 UTC m=+1544.676321863" watchObservedRunningTime="2025-11-29 05:53:40.438483877 +0000 UTC m=+1544.678993097" Nov 29 05:53:40 crc kubenswrapper[4594]: I1129 05:53:40.851303 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vvgxh" Nov 29 05:53:40 crc kubenswrapper[4594]: I1129 05:53:40.923810 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6e125a8-1a04-40b5-a626-ccf6aac788d1-utilities\") pod \"c6e125a8-1a04-40b5-a626-ccf6aac788d1\" (UID: \"c6e125a8-1a04-40b5-a626-ccf6aac788d1\") " Nov 29 05:53:40 crc kubenswrapper[4594]: I1129 05:53:40.923997 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxp7k\" (UniqueName: \"kubernetes.io/projected/c6e125a8-1a04-40b5-a626-ccf6aac788d1-kube-api-access-cxp7k\") pod \"c6e125a8-1a04-40b5-a626-ccf6aac788d1\" (UID: \"c6e125a8-1a04-40b5-a626-ccf6aac788d1\") " Nov 29 05:53:40 crc kubenswrapper[4594]: I1129 05:53:40.924152 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6e125a8-1a04-40b5-a626-ccf6aac788d1-catalog-content\") pod \"c6e125a8-1a04-40b5-a626-ccf6aac788d1\" (UID: \"c6e125a8-1a04-40b5-a626-ccf6aac788d1\") " Nov 29 05:53:40 crc kubenswrapper[4594]: I1129 05:53:40.924479 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6e125a8-1a04-40b5-a626-ccf6aac788d1-utilities" (OuterVolumeSpecName: "utilities") pod "c6e125a8-1a04-40b5-a626-ccf6aac788d1" (UID: "c6e125a8-1a04-40b5-a626-ccf6aac788d1"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 05:53:40 crc kubenswrapper[4594]: I1129 05:53:40.924947 4594 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6e125a8-1a04-40b5-a626-ccf6aac788d1-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 05:53:40 crc kubenswrapper[4594]: I1129 05:53:40.930516 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6e125a8-1a04-40b5-a626-ccf6aac788d1-kube-api-access-cxp7k" (OuterVolumeSpecName: "kube-api-access-cxp7k") pod "c6e125a8-1a04-40b5-a626-ccf6aac788d1" (UID: "c6e125a8-1a04-40b5-a626-ccf6aac788d1"). InnerVolumeSpecName "kube-api-access-cxp7k". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:53:40 crc kubenswrapper[4594]: I1129 05:53:40.961930 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6e125a8-1a04-40b5-a626-ccf6aac788d1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c6e125a8-1a04-40b5-a626-ccf6aac788d1" (UID: "c6e125a8-1a04-40b5-a626-ccf6aac788d1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 05:53:41 crc kubenswrapper[4594]: I1129 05:53:41.027135 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxp7k\" (UniqueName: \"kubernetes.io/projected/c6e125a8-1a04-40b5-a626-ccf6aac788d1-kube-api-access-cxp7k\") on node \"crc\" DevicePath \"\"" Nov 29 05:53:41 crc kubenswrapper[4594]: I1129 05:53:41.027170 4594 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6e125a8-1a04-40b5-a626-ccf6aac788d1-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 05:53:41 crc kubenswrapper[4594]: I1129 05:53:41.431093 4594 generic.go:334] "Generic (PLEG): container finished" podID="c6e125a8-1a04-40b5-a626-ccf6aac788d1" containerID="6b8b2d48413ad51c44d642090a9cdfaa62a8016bc7740ebfb3cf7d477c203167" exitCode=0 Nov 29 05:53:41 crc kubenswrapper[4594]: I1129 05:53:41.431156 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vvgxh" event={"ID":"c6e125a8-1a04-40b5-a626-ccf6aac788d1","Type":"ContainerDied","Data":"6b8b2d48413ad51c44d642090a9cdfaa62a8016bc7740ebfb3cf7d477c203167"} Nov 29 05:53:41 crc kubenswrapper[4594]: I1129 05:53:41.431168 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vvgxh" Nov 29 05:53:41 crc kubenswrapper[4594]: I1129 05:53:41.431197 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vvgxh" event={"ID":"c6e125a8-1a04-40b5-a626-ccf6aac788d1","Type":"ContainerDied","Data":"a2ee9414dc7438bd205f22c5cfa75a40e62257dc9e0e2c49969a327307ca4a96"} Nov 29 05:53:41 crc kubenswrapper[4594]: I1129 05:53:41.431218 4594 scope.go:117] "RemoveContainer" containerID="6b8b2d48413ad51c44d642090a9cdfaa62a8016bc7740ebfb3cf7d477c203167" Nov 29 05:53:41 crc kubenswrapper[4594]: I1129 05:53:41.457072 4594 scope.go:117] "RemoveContainer" containerID="93ace5679bd236158e386190f0ff0c7cc065a74c900e6fa3cc6d73cb194a0711" Nov 29 05:53:41 crc kubenswrapper[4594]: I1129 05:53:41.464316 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vvgxh"] Nov 29 05:53:41 crc kubenswrapper[4594]: I1129 05:53:41.474013 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vvgxh"] Nov 29 05:53:41 crc kubenswrapper[4594]: I1129 05:53:41.483690 4594 scope.go:117] "RemoveContainer" containerID="2fcac6d5b5aeb09d7fc9717bf5cf6070f24b9d4decd552824195a1aee5b02724" Nov 29 05:53:41 crc kubenswrapper[4594]: I1129 05:53:41.530620 4594 scope.go:117] "RemoveContainer" containerID="6b8b2d48413ad51c44d642090a9cdfaa62a8016bc7740ebfb3cf7d477c203167" Nov 29 05:53:41 crc kubenswrapper[4594]: E1129 05:53:41.531003 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b8b2d48413ad51c44d642090a9cdfaa62a8016bc7740ebfb3cf7d477c203167\": container with ID starting with 6b8b2d48413ad51c44d642090a9cdfaa62a8016bc7740ebfb3cf7d477c203167 not found: ID does not exist" containerID="6b8b2d48413ad51c44d642090a9cdfaa62a8016bc7740ebfb3cf7d477c203167" Nov 29 05:53:41 crc kubenswrapper[4594]: I1129 05:53:41.531039 4594 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b8b2d48413ad51c44d642090a9cdfaa62a8016bc7740ebfb3cf7d477c203167"} err="failed to get container status \"6b8b2d48413ad51c44d642090a9cdfaa62a8016bc7740ebfb3cf7d477c203167\": rpc error: code = NotFound desc = could not find container \"6b8b2d48413ad51c44d642090a9cdfaa62a8016bc7740ebfb3cf7d477c203167\": container with ID starting with 6b8b2d48413ad51c44d642090a9cdfaa62a8016bc7740ebfb3cf7d477c203167 not found: ID does not exist" Nov 29 05:53:41 crc kubenswrapper[4594]: I1129 05:53:41.531068 4594 scope.go:117] "RemoveContainer" containerID="93ace5679bd236158e386190f0ff0c7cc065a74c900e6fa3cc6d73cb194a0711" Nov 29 05:53:41 crc kubenswrapper[4594]: E1129 05:53:41.531330 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93ace5679bd236158e386190f0ff0c7cc065a74c900e6fa3cc6d73cb194a0711\": container with ID starting with 93ace5679bd236158e386190f0ff0c7cc065a74c900e6fa3cc6d73cb194a0711 not found: ID does not exist" containerID="93ace5679bd236158e386190f0ff0c7cc065a74c900e6fa3cc6d73cb194a0711" Nov 29 05:53:41 crc kubenswrapper[4594]: I1129 05:53:41.531361 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93ace5679bd236158e386190f0ff0c7cc065a74c900e6fa3cc6d73cb194a0711"} err="failed to get container status \"93ace5679bd236158e386190f0ff0c7cc065a74c900e6fa3cc6d73cb194a0711\": rpc error: code = NotFound desc = could not find container \"93ace5679bd236158e386190f0ff0c7cc065a74c900e6fa3cc6d73cb194a0711\": container with ID starting with 93ace5679bd236158e386190f0ff0c7cc065a74c900e6fa3cc6d73cb194a0711 not found: ID does not exist" Nov 29 05:53:41 crc kubenswrapper[4594]: I1129 05:53:41.531379 4594 scope.go:117] "RemoveContainer" containerID="2fcac6d5b5aeb09d7fc9717bf5cf6070f24b9d4decd552824195a1aee5b02724" Nov 29 05:53:41 crc kubenswrapper[4594]: E1129 
05:53:41.531621 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2fcac6d5b5aeb09d7fc9717bf5cf6070f24b9d4decd552824195a1aee5b02724\": container with ID starting with 2fcac6d5b5aeb09d7fc9717bf5cf6070f24b9d4decd552824195a1aee5b02724 not found: ID does not exist" containerID="2fcac6d5b5aeb09d7fc9717bf5cf6070f24b9d4decd552824195a1aee5b02724" Nov 29 05:53:41 crc kubenswrapper[4594]: I1129 05:53:41.531647 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fcac6d5b5aeb09d7fc9717bf5cf6070f24b9d4decd552824195a1aee5b02724"} err="failed to get container status \"2fcac6d5b5aeb09d7fc9717bf5cf6070f24b9d4decd552824195a1aee5b02724\": rpc error: code = NotFound desc = could not find container \"2fcac6d5b5aeb09d7fc9717bf5cf6070f24b9d4decd552824195a1aee5b02724\": container with ID starting with 2fcac6d5b5aeb09d7fc9717bf5cf6070f24b9d4decd552824195a1aee5b02724 not found: ID does not exist" Nov 29 05:53:42 crc kubenswrapper[4594]: I1129 05:53:42.096334 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6e125a8-1a04-40b5-a626-ccf6aac788d1" path="/var/lib/kubelet/pods/c6e125a8-1a04-40b5-a626-ccf6aac788d1/volumes" Nov 29 05:53:44 crc kubenswrapper[4594]: I1129 05:53:44.954851 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-92jc9" Nov 29 05:53:44 crc kubenswrapper[4594]: I1129 05:53:44.956477 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-92jc9" Nov 29 05:53:45 crc kubenswrapper[4594]: I1129 05:53:45.000159 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-92jc9" Nov 29 05:53:45 crc kubenswrapper[4594]: I1129 05:53:45.509524 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-operators-92jc9" Nov 29 05:53:45 crc kubenswrapper[4594]: I1129 05:53:45.818424 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-92jc9"] Nov 29 05:53:47 crc kubenswrapper[4594]: I1129 05:53:47.490092 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-92jc9" podUID="bf1ac69f-ea15-4f49-b007-d9bc12b00c9e" containerName="registry-server" containerID="cri-o://0cf92b217167d22146c165b872a654d7a865b46ad8458f769c4aa09b9d108e5b" gracePeriod=2 Nov 29 05:53:47 crc kubenswrapper[4594]: I1129 05:53:47.955648 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-92jc9" Nov 29 05:53:47 crc kubenswrapper[4594]: I1129 05:53:47.976291 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lj894\" (UniqueName: \"kubernetes.io/projected/bf1ac69f-ea15-4f49-b007-d9bc12b00c9e-kube-api-access-lj894\") pod \"bf1ac69f-ea15-4f49-b007-d9bc12b00c9e\" (UID: \"bf1ac69f-ea15-4f49-b007-d9bc12b00c9e\") " Nov 29 05:53:47 crc kubenswrapper[4594]: I1129 05:53:47.976373 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf1ac69f-ea15-4f49-b007-d9bc12b00c9e-utilities\") pod \"bf1ac69f-ea15-4f49-b007-d9bc12b00c9e\" (UID: \"bf1ac69f-ea15-4f49-b007-d9bc12b00c9e\") " Nov 29 05:53:47 crc kubenswrapper[4594]: I1129 05:53:47.976439 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf1ac69f-ea15-4f49-b007-d9bc12b00c9e-catalog-content\") pod \"bf1ac69f-ea15-4f49-b007-d9bc12b00c9e\" (UID: \"bf1ac69f-ea15-4f49-b007-d9bc12b00c9e\") " Nov 29 05:53:47 crc kubenswrapper[4594]: I1129 05:53:47.977294 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/bf1ac69f-ea15-4f49-b007-d9bc12b00c9e-utilities" (OuterVolumeSpecName: "utilities") pod "bf1ac69f-ea15-4f49-b007-d9bc12b00c9e" (UID: "bf1ac69f-ea15-4f49-b007-d9bc12b00c9e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 05:53:47 crc kubenswrapper[4594]: I1129 05:53:47.984362 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf1ac69f-ea15-4f49-b007-d9bc12b00c9e-kube-api-access-lj894" (OuterVolumeSpecName: "kube-api-access-lj894") pod "bf1ac69f-ea15-4f49-b007-d9bc12b00c9e" (UID: "bf1ac69f-ea15-4f49-b007-d9bc12b00c9e"). InnerVolumeSpecName "kube-api-access-lj894". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:53:48 crc kubenswrapper[4594]: I1129 05:53:48.076023 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf1ac69f-ea15-4f49-b007-d9bc12b00c9e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bf1ac69f-ea15-4f49-b007-d9bc12b00c9e" (UID: "bf1ac69f-ea15-4f49-b007-d9bc12b00c9e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 05:53:48 crc kubenswrapper[4594]: I1129 05:53:48.079008 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lj894\" (UniqueName: \"kubernetes.io/projected/bf1ac69f-ea15-4f49-b007-d9bc12b00c9e-kube-api-access-lj894\") on node \"crc\" DevicePath \"\"" Nov 29 05:53:48 crc kubenswrapper[4594]: I1129 05:53:48.079042 4594 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf1ac69f-ea15-4f49-b007-d9bc12b00c9e-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 05:53:48 crc kubenswrapper[4594]: I1129 05:53:48.079053 4594 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf1ac69f-ea15-4f49-b007-d9bc12b00c9e-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 05:53:48 crc kubenswrapper[4594]: I1129 05:53:48.503139 4594 generic.go:334] "Generic (PLEG): container finished" podID="bf1ac69f-ea15-4f49-b007-d9bc12b00c9e" containerID="0cf92b217167d22146c165b872a654d7a865b46ad8458f769c4aa09b9d108e5b" exitCode=0 Nov 29 05:53:48 crc kubenswrapper[4594]: I1129 05:53:48.503218 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-92jc9" event={"ID":"bf1ac69f-ea15-4f49-b007-d9bc12b00c9e","Type":"ContainerDied","Data":"0cf92b217167d22146c165b872a654d7a865b46ad8458f769c4aa09b9d108e5b"} Nov 29 05:53:48 crc kubenswrapper[4594]: I1129 05:53:48.503240 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-92jc9" Nov 29 05:53:48 crc kubenswrapper[4594]: I1129 05:53:48.503291 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-92jc9" event={"ID":"bf1ac69f-ea15-4f49-b007-d9bc12b00c9e","Type":"ContainerDied","Data":"8c678c8c1ebcaceec2a863b13a923433d9fda78af78fd965d293368bf8114c0d"} Nov 29 05:53:48 crc kubenswrapper[4594]: I1129 05:53:48.503321 4594 scope.go:117] "RemoveContainer" containerID="0cf92b217167d22146c165b872a654d7a865b46ad8458f769c4aa09b9d108e5b" Nov 29 05:53:48 crc kubenswrapper[4594]: I1129 05:53:48.536598 4594 scope.go:117] "RemoveContainer" containerID="82985b1da7d3e426989b10f41186d34010c3ec19597589fa88b86eab96601f91" Nov 29 05:53:48 crc kubenswrapper[4594]: I1129 05:53:48.557392 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-92jc9"] Nov 29 05:53:48 crc kubenswrapper[4594]: I1129 05:53:48.559180 4594 scope.go:117] "RemoveContainer" containerID="7934eae12b869216026289273d05a945a78b29e9db6b2c4c6278382b41feb41b" Nov 29 05:53:48 crc kubenswrapper[4594]: I1129 05:53:48.568864 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-92jc9"] Nov 29 05:53:48 crc kubenswrapper[4594]: I1129 05:53:48.592133 4594 scope.go:117] "RemoveContainer" containerID="0cf92b217167d22146c165b872a654d7a865b46ad8458f769c4aa09b9d108e5b" Nov 29 05:53:48 crc kubenswrapper[4594]: E1129 05:53:48.592492 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0cf92b217167d22146c165b872a654d7a865b46ad8458f769c4aa09b9d108e5b\": container with ID starting with 0cf92b217167d22146c165b872a654d7a865b46ad8458f769c4aa09b9d108e5b not found: ID does not exist" containerID="0cf92b217167d22146c165b872a654d7a865b46ad8458f769c4aa09b9d108e5b" Nov 29 05:53:48 crc kubenswrapper[4594]: I1129 05:53:48.592596 4594 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cf92b217167d22146c165b872a654d7a865b46ad8458f769c4aa09b9d108e5b"} err="failed to get container status \"0cf92b217167d22146c165b872a654d7a865b46ad8458f769c4aa09b9d108e5b\": rpc error: code = NotFound desc = could not find container \"0cf92b217167d22146c165b872a654d7a865b46ad8458f769c4aa09b9d108e5b\": container with ID starting with 0cf92b217167d22146c165b872a654d7a865b46ad8458f769c4aa09b9d108e5b not found: ID does not exist" Nov 29 05:53:48 crc kubenswrapper[4594]: I1129 05:53:48.592680 4594 scope.go:117] "RemoveContainer" containerID="82985b1da7d3e426989b10f41186d34010c3ec19597589fa88b86eab96601f91" Nov 29 05:53:48 crc kubenswrapper[4594]: E1129 05:53:48.593066 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82985b1da7d3e426989b10f41186d34010c3ec19597589fa88b86eab96601f91\": container with ID starting with 82985b1da7d3e426989b10f41186d34010c3ec19597589fa88b86eab96601f91 not found: ID does not exist" containerID="82985b1da7d3e426989b10f41186d34010c3ec19597589fa88b86eab96601f91" Nov 29 05:53:48 crc kubenswrapper[4594]: I1129 05:53:48.593103 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82985b1da7d3e426989b10f41186d34010c3ec19597589fa88b86eab96601f91"} err="failed to get container status \"82985b1da7d3e426989b10f41186d34010c3ec19597589fa88b86eab96601f91\": rpc error: code = NotFound desc = could not find container \"82985b1da7d3e426989b10f41186d34010c3ec19597589fa88b86eab96601f91\": container with ID starting with 82985b1da7d3e426989b10f41186d34010c3ec19597589fa88b86eab96601f91 not found: ID does not exist" Nov 29 05:53:48 crc kubenswrapper[4594]: I1129 05:53:48.593131 4594 scope.go:117] "RemoveContainer" containerID="7934eae12b869216026289273d05a945a78b29e9db6b2c4c6278382b41feb41b" Nov 29 05:53:48 crc kubenswrapper[4594]: E1129 
05:53:48.593598 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7934eae12b869216026289273d05a945a78b29e9db6b2c4c6278382b41feb41b\": container with ID starting with 7934eae12b869216026289273d05a945a78b29e9db6b2c4c6278382b41feb41b not found: ID does not exist" containerID="7934eae12b869216026289273d05a945a78b29e9db6b2c4c6278382b41feb41b" Nov 29 05:53:48 crc kubenswrapper[4594]: I1129 05:53:48.593670 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7934eae12b869216026289273d05a945a78b29e9db6b2c4c6278382b41feb41b"} err="failed to get container status \"7934eae12b869216026289273d05a945a78b29e9db6b2c4c6278382b41feb41b\": rpc error: code = NotFound desc = could not find container \"7934eae12b869216026289273d05a945a78b29e9db6b2c4c6278382b41feb41b\": container with ID starting with 7934eae12b869216026289273d05a945a78b29e9db6b2c4c6278382b41feb41b not found: ID does not exist" Nov 29 05:53:50 crc kubenswrapper[4594]: I1129 05:53:50.094409 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf1ac69f-ea15-4f49-b007-d9bc12b00c9e" path="/var/lib/kubelet/pods/bf1ac69f-ea15-4f49-b007-d9bc12b00c9e/volumes" Nov 29 05:53:57 crc kubenswrapper[4594]: I1129 05:53:57.959728 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ssk47"] Nov 29 05:53:57 crc kubenswrapper[4594]: E1129 05:53:57.960731 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6e125a8-1a04-40b5-a626-ccf6aac788d1" containerName="extract-utilities" Nov 29 05:53:57 crc kubenswrapper[4594]: I1129 05:53:57.960747 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6e125a8-1a04-40b5-a626-ccf6aac788d1" containerName="extract-utilities" Nov 29 05:53:57 crc kubenswrapper[4594]: E1129 05:53:57.960756 4594 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="bf1ac69f-ea15-4f49-b007-d9bc12b00c9e" containerName="extract-utilities" Nov 29 05:53:57 crc kubenswrapper[4594]: I1129 05:53:57.960762 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf1ac69f-ea15-4f49-b007-d9bc12b00c9e" containerName="extract-utilities" Nov 29 05:53:57 crc kubenswrapper[4594]: E1129 05:53:57.960790 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6e125a8-1a04-40b5-a626-ccf6aac788d1" containerName="extract-content" Nov 29 05:53:57 crc kubenswrapper[4594]: I1129 05:53:57.960796 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6e125a8-1a04-40b5-a626-ccf6aac788d1" containerName="extract-content" Nov 29 05:53:57 crc kubenswrapper[4594]: E1129 05:53:57.960808 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6e125a8-1a04-40b5-a626-ccf6aac788d1" containerName="registry-server" Nov 29 05:53:57 crc kubenswrapper[4594]: I1129 05:53:57.960813 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6e125a8-1a04-40b5-a626-ccf6aac788d1" containerName="registry-server" Nov 29 05:53:57 crc kubenswrapper[4594]: E1129 05:53:57.960823 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf1ac69f-ea15-4f49-b007-d9bc12b00c9e" containerName="registry-server" Nov 29 05:53:57 crc kubenswrapper[4594]: I1129 05:53:57.960830 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf1ac69f-ea15-4f49-b007-d9bc12b00c9e" containerName="registry-server" Nov 29 05:53:57 crc kubenswrapper[4594]: E1129 05:53:57.960848 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf1ac69f-ea15-4f49-b007-d9bc12b00c9e" containerName="extract-content" Nov 29 05:53:57 crc kubenswrapper[4594]: I1129 05:53:57.960854 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf1ac69f-ea15-4f49-b007-d9bc12b00c9e" containerName="extract-content" Nov 29 05:53:57 crc kubenswrapper[4594]: I1129 05:53:57.961062 4594 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="bf1ac69f-ea15-4f49-b007-d9bc12b00c9e" containerName="registry-server" Nov 29 05:53:57 crc kubenswrapper[4594]: I1129 05:53:57.961076 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6e125a8-1a04-40b5-a626-ccf6aac788d1" containerName="registry-server" Nov 29 05:53:57 crc kubenswrapper[4594]: I1129 05:53:57.962556 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ssk47" Nov 29 05:53:57 crc kubenswrapper[4594]: I1129 05:53:57.969607 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ssk47"] Nov 29 05:53:58 crc kubenswrapper[4594]: I1129 05:53:58.006783 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/551c161f-3e49-44e1-a572-5c5a33fa37f1-utilities\") pod \"certified-operators-ssk47\" (UID: \"551c161f-3e49-44e1-a572-5c5a33fa37f1\") " pod="openshift-marketplace/certified-operators-ssk47" Nov 29 05:53:58 crc kubenswrapper[4594]: I1129 05:53:58.006875 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/551c161f-3e49-44e1-a572-5c5a33fa37f1-catalog-content\") pod \"certified-operators-ssk47\" (UID: \"551c161f-3e49-44e1-a572-5c5a33fa37f1\") " pod="openshift-marketplace/certified-operators-ssk47" Nov 29 05:53:58 crc kubenswrapper[4594]: I1129 05:53:58.007110 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqjtw\" (UniqueName: \"kubernetes.io/projected/551c161f-3e49-44e1-a572-5c5a33fa37f1-kube-api-access-lqjtw\") pod \"certified-operators-ssk47\" (UID: \"551c161f-3e49-44e1-a572-5c5a33fa37f1\") " pod="openshift-marketplace/certified-operators-ssk47" Nov 29 05:53:58 crc kubenswrapper[4594]: I1129 05:53:58.108654 4594 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/551c161f-3e49-44e1-a572-5c5a33fa37f1-utilities\") pod \"certified-operators-ssk47\" (UID: \"551c161f-3e49-44e1-a572-5c5a33fa37f1\") " pod="openshift-marketplace/certified-operators-ssk47" Nov 29 05:53:58 crc kubenswrapper[4594]: I1129 05:53:58.108711 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/551c161f-3e49-44e1-a572-5c5a33fa37f1-catalog-content\") pod \"certified-operators-ssk47\" (UID: \"551c161f-3e49-44e1-a572-5c5a33fa37f1\") " pod="openshift-marketplace/certified-operators-ssk47" Nov 29 05:53:58 crc kubenswrapper[4594]: I1129 05:53:58.108814 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqjtw\" (UniqueName: \"kubernetes.io/projected/551c161f-3e49-44e1-a572-5c5a33fa37f1-kube-api-access-lqjtw\") pod \"certified-operators-ssk47\" (UID: \"551c161f-3e49-44e1-a572-5c5a33fa37f1\") " pod="openshift-marketplace/certified-operators-ssk47" Nov 29 05:53:58 crc kubenswrapper[4594]: I1129 05:53:58.109232 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/551c161f-3e49-44e1-a572-5c5a33fa37f1-utilities\") pod \"certified-operators-ssk47\" (UID: \"551c161f-3e49-44e1-a572-5c5a33fa37f1\") " pod="openshift-marketplace/certified-operators-ssk47" Nov 29 05:53:58 crc kubenswrapper[4594]: I1129 05:53:58.109636 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/551c161f-3e49-44e1-a572-5c5a33fa37f1-catalog-content\") pod \"certified-operators-ssk47\" (UID: \"551c161f-3e49-44e1-a572-5c5a33fa37f1\") " pod="openshift-marketplace/certified-operators-ssk47" Nov 29 05:53:58 crc kubenswrapper[4594]: I1129 05:53:58.129108 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-lqjtw\" (UniqueName: \"kubernetes.io/projected/551c161f-3e49-44e1-a572-5c5a33fa37f1-kube-api-access-lqjtw\") pod \"certified-operators-ssk47\" (UID: \"551c161f-3e49-44e1-a572-5c5a33fa37f1\") " pod="openshift-marketplace/certified-operators-ssk47" Nov 29 05:53:58 crc kubenswrapper[4594]: I1129 05:53:58.291568 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ssk47" Nov 29 05:53:58 crc kubenswrapper[4594]: I1129 05:53:58.769133 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ssk47"] Nov 29 05:53:59 crc kubenswrapper[4594]: I1129 05:53:59.616505 4594 generic.go:334] "Generic (PLEG): container finished" podID="551c161f-3e49-44e1-a572-5c5a33fa37f1" containerID="a8239548ad836c3cb9d51f52c67cb2030d2498b913dc835167b03a27d877bcd8" exitCode=0 Nov 29 05:53:59 crc kubenswrapper[4594]: I1129 05:53:59.616597 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ssk47" event={"ID":"551c161f-3e49-44e1-a572-5c5a33fa37f1","Type":"ContainerDied","Data":"a8239548ad836c3cb9d51f52c67cb2030d2498b913dc835167b03a27d877bcd8"} Nov 29 05:53:59 crc kubenswrapper[4594]: I1129 05:53:59.616895 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ssk47" event={"ID":"551c161f-3e49-44e1-a572-5c5a33fa37f1","Type":"ContainerStarted","Data":"1ac42def4cb0081bd8a6044ab9cef3c883e5ce3a29687e932f152b3025319f43"} Nov 29 05:54:00 crc kubenswrapper[4594]: I1129 05:54:00.631971 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ssk47" event={"ID":"551c161f-3e49-44e1-a572-5c5a33fa37f1","Type":"ContainerStarted","Data":"5de3c648138c409938f971ff5675782d8890be1dd2c61a2bb7658366f73be4c3"} Nov 29 05:54:01 crc kubenswrapper[4594]: I1129 05:54:01.646249 4594 generic.go:334] "Generic (PLEG): container finished" 
podID="551c161f-3e49-44e1-a572-5c5a33fa37f1" containerID="5de3c648138c409938f971ff5675782d8890be1dd2c61a2bb7658366f73be4c3" exitCode=0 Nov 29 05:54:01 crc kubenswrapper[4594]: I1129 05:54:01.646315 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ssk47" event={"ID":"551c161f-3e49-44e1-a572-5c5a33fa37f1","Type":"ContainerDied","Data":"5de3c648138c409938f971ff5675782d8890be1dd2c61a2bb7658366f73be4c3"} Nov 29 05:54:02 crc kubenswrapper[4594]: I1129 05:54:02.658857 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ssk47" event={"ID":"551c161f-3e49-44e1-a572-5c5a33fa37f1","Type":"ContainerStarted","Data":"0123aff788e51fa1a321a5b4387e61ff9db7b879e13e0cecb686df07ad0b7a1e"} Nov 29 05:54:02 crc kubenswrapper[4594]: I1129 05:54:02.679741 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ssk47" podStartSLOduration=3.17642388 podStartE2EDuration="5.679724953s" podCreationTimestamp="2025-11-29 05:53:57 +0000 UTC" firstStartedPulling="2025-11-29 05:53:59.619041126 +0000 UTC m=+1563.859550346" lastFinishedPulling="2025-11-29 05:54:02.122342199 +0000 UTC m=+1566.362851419" observedRunningTime="2025-11-29 05:54:02.673139107 +0000 UTC m=+1566.913648327" watchObservedRunningTime="2025-11-29 05:54:02.679724953 +0000 UTC m=+1566.920234173" Nov 29 05:54:03 crc kubenswrapper[4594]: I1129 05:54:03.038430 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-6m48g"] Nov 29 05:54:03 crc kubenswrapper[4594]: I1129 05:54:03.045233 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-6m48g"] Nov 29 05:54:04 crc kubenswrapper[4594]: I1129 05:54:04.024755 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-3c5d-account-create-update-kw8mm"] Nov 29 05:54:04 crc kubenswrapper[4594]: I1129 05:54:04.035806 4594 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openstack/glance-3c5d-account-create-update-kw8mm"] Nov 29 05:54:04 crc kubenswrapper[4594]: I1129 05:54:04.104205 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c601f48-3079-4e66-8298-65df2d8b111d" path="/var/lib/kubelet/pods/5c601f48-3079-4e66-8298-65df2d8b111d/volumes" Nov 29 05:54:04 crc kubenswrapper[4594]: I1129 05:54:04.105811 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a14827c-5222-4670-bd36-2d0274d1e93e" path="/var/lib/kubelet/pods/7a14827c-5222-4670-bd36-2d0274d1e93e/volumes" Nov 29 05:54:08 crc kubenswrapper[4594]: I1129 05:54:08.292069 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ssk47" Nov 29 05:54:08 crc kubenswrapper[4594]: I1129 05:54:08.292677 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ssk47" Nov 29 05:54:08 crc kubenswrapper[4594]: I1129 05:54:08.330595 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ssk47" Nov 29 05:54:08 crc kubenswrapper[4594]: I1129 05:54:08.362982 4594 scope.go:117] "RemoveContainer" containerID="96f85f3f02d900868dd9211f3d35d1c977c7a18a11beb67169815d70a5d29302" Nov 29 05:54:08 crc kubenswrapper[4594]: I1129 05:54:08.404492 4594 scope.go:117] "RemoveContainer" containerID="eb33a743456d3042e4c9203629fa25282cf037e46fb47fbac94737b25b77d678" Nov 29 05:54:08 crc kubenswrapper[4594]: I1129 05:54:08.430904 4594 scope.go:117] "RemoveContainer" containerID="b6af7a25841c353e99beb65fa3333280ff18afcee460c1d7fda0fb4e69b7bc12" Nov 29 05:54:08 crc kubenswrapper[4594]: I1129 05:54:08.456622 4594 scope.go:117] "RemoveContainer" containerID="6f1a81abf856d1f61c9191f30fa0254a7b7fe26265d4c275d8b539d11a915435" Nov 29 05:54:08 crc kubenswrapper[4594]: I1129 05:54:08.485667 4594 scope.go:117] "RemoveContainer" 
containerID="a4b2825001eb7b9000e1433443f55cfa355772df65f6d82a1930b3b8c85779ee" Nov 29 05:54:08 crc kubenswrapper[4594]: I1129 05:54:08.516400 4594 scope.go:117] "RemoveContainer" containerID="0baf7d4a765c1131d79936e097efbb8d4f9b563431f42950c46c7b5e6cf3c452" Nov 29 05:54:08 crc kubenswrapper[4594]: I1129 05:54:08.579602 4594 scope.go:117] "RemoveContainer" containerID="285a1205f8b1dcbd2ae81b0654c0e36a710c713a1f7696b81fe9bba05eae03d2" Nov 29 05:54:08 crc kubenswrapper[4594]: I1129 05:54:08.597809 4594 scope.go:117] "RemoveContainer" containerID="e169ba37e5cb7287590583b06c50678378f4fda5f488bf446171457fc5381dde" Nov 29 05:54:08 crc kubenswrapper[4594]: I1129 05:54:08.770351 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ssk47" Nov 29 05:54:08 crc kubenswrapper[4594]: I1129 05:54:08.816150 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ssk47"] Nov 29 05:54:10 crc kubenswrapper[4594]: I1129 05:54:10.745220 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-ssk47" podUID="551c161f-3e49-44e1-a572-5c5a33fa37f1" containerName="registry-server" containerID="cri-o://0123aff788e51fa1a321a5b4387e61ff9db7b879e13e0cecb686df07ad0b7a1e" gracePeriod=2 Nov 29 05:54:11 crc kubenswrapper[4594]: I1129 05:54:11.136956 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ssk47" Nov 29 05:54:11 crc kubenswrapper[4594]: I1129 05:54:11.301150 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/551c161f-3e49-44e1-a572-5c5a33fa37f1-catalog-content\") pod \"551c161f-3e49-44e1-a572-5c5a33fa37f1\" (UID: \"551c161f-3e49-44e1-a572-5c5a33fa37f1\") " Nov 29 05:54:11 crc kubenswrapper[4594]: I1129 05:54:11.301418 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqjtw\" (UniqueName: \"kubernetes.io/projected/551c161f-3e49-44e1-a572-5c5a33fa37f1-kube-api-access-lqjtw\") pod \"551c161f-3e49-44e1-a572-5c5a33fa37f1\" (UID: \"551c161f-3e49-44e1-a572-5c5a33fa37f1\") " Nov 29 05:54:11 crc kubenswrapper[4594]: I1129 05:54:11.301445 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/551c161f-3e49-44e1-a572-5c5a33fa37f1-utilities\") pod \"551c161f-3e49-44e1-a572-5c5a33fa37f1\" (UID: \"551c161f-3e49-44e1-a572-5c5a33fa37f1\") " Nov 29 05:54:11 crc kubenswrapper[4594]: I1129 05:54:11.302195 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/551c161f-3e49-44e1-a572-5c5a33fa37f1-utilities" (OuterVolumeSpecName: "utilities") pod "551c161f-3e49-44e1-a572-5c5a33fa37f1" (UID: "551c161f-3e49-44e1-a572-5c5a33fa37f1"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 05:54:11 crc kubenswrapper[4594]: I1129 05:54:11.302855 4594 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/551c161f-3e49-44e1-a572-5c5a33fa37f1-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 05:54:11 crc kubenswrapper[4594]: I1129 05:54:11.306338 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/551c161f-3e49-44e1-a572-5c5a33fa37f1-kube-api-access-lqjtw" (OuterVolumeSpecName: "kube-api-access-lqjtw") pod "551c161f-3e49-44e1-a572-5c5a33fa37f1" (UID: "551c161f-3e49-44e1-a572-5c5a33fa37f1"). InnerVolumeSpecName "kube-api-access-lqjtw". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:54:11 crc kubenswrapper[4594]: I1129 05:54:11.339421 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/551c161f-3e49-44e1-a572-5c5a33fa37f1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "551c161f-3e49-44e1-a572-5c5a33fa37f1" (UID: "551c161f-3e49-44e1-a572-5c5a33fa37f1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 05:54:11 crc kubenswrapper[4594]: I1129 05:54:11.404720 4594 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/551c161f-3e49-44e1-a572-5c5a33fa37f1-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 05:54:11 crc kubenswrapper[4594]: I1129 05:54:11.404753 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqjtw\" (UniqueName: \"kubernetes.io/projected/551c161f-3e49-44e1-a572-5c5a33fa37f1-kube-api-access-lqjtw\") on node \"crc\" DevicePath \"\"" Nov 29 05:54:11 crc kubenswrapper[4594]: I1129 05:54:11.757274 4594 generic.go:334] "Generic (PLEG): container finished" podID="551c161f-3e49-44e1-a572-5c5a33fa37f1" containerID="0123aff788e51fa1a321a5b4387e61ff9db7b879e13e0cecb686df07ad0b7a1e" exitCode=0 Nov 29 05:54:11 crc kubenswrapper[4594]: I1129 05:54:11.757328 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ssk47" event={"ID":"551c161f-3e49-44e1-a572-5c5a33fa37f1","Type":"ContainerDied","Data":"0123aff788e51fa1a321a5b4387e61ff9db7b879e13e0cecb686df07ad0b7a1e"} Nov 29 05:54:11 crc kubenswrapper[4594]: I1129 05:54:11.757367 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ssk47" event={"ID":"551c161f-3e49-44e1-a572-5c5a33fa37f1","Type":"ContainerDied","Data":"1ac42def4cb0081bd8a6044ab9cef3c883e5ce3a29687e932f152b3025319f43"} Nov 29 05:54:11 crc kubenswrapper[4594]: I1129 05:54:11.757355 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ssk47" Nov 29 05:54:11 crc kubenswrapper[4594]: I1129 05:54:11.757385 4594 scope.go:117] "RemoveContainer" containerID="0123aff788e51fa1a321a5b4387e61ff9db7b879e13e0cecb686df07ad0b7a1e" Nov 29 05:54:11 crc kubenswrapper[4594]: I1129 05:54:11.782698 4594 scope.go:117] "RemoveContainer" containerID="5de3c648138c409938f971ff5675782d8890be1dd2c61a2bb7658366f73be4c3" Nov 29 05:54:11 crc kubenswrapper[4594]: I1129 05:54:11.790333 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ssk47"] Nov 29 05:54:11 crc kubenswrapper[4594]: I1129 05:54:11.799318 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ssk47"] Nov 29 05:54:11 crc kubenswrapper[4594]: I1129 05:54:11.814562 4594 scope.go:117] "RemoveContainer" containerID="a8239548ad836c3cb9d51f52c67cb2030d2498b913dc835167b03a27d877bcd8" Nov 29 05:54:11 crc kubenswrapper[4594]: I1129 05:54:11.836467 4594 scope.go:117] "RemoveContainer" containerID="0123aff788e51fa1a321a5b4387e61ff9db7b879e13e0cecb686df07ad0b7a1e" Nov 29 05:54:11 crc kubenswrapper[4594]: E1129 05:54:11.836799 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0123aff788e51fa1a321a5b4387e61ff9db7b879e13e0cecb686df07ad0b7a1e\": container with ID starting with 0123aff788e51fa1a321a5b4387e61ff9db7b879e13e0cecb686df07ad0b7a1e not found: ID does not exist" containerID="0123aff788e51fa1a321a5b4387e61ff9db7b879e13e0cecb686df07ad0b7a1e" Nov 29 05:54:11 crc kubenswrapper[4594]: I1129 05:54:11.836840 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0123aff788e51fa1a321a5b4387e61ff9db7b879e13e0cecb686df07ad0b7a1e"} err="failed to get container status \"0123aff788e51fa1a321a5b4387e61ff9db7b879e13e0cecb686df07ad0b7a1e\": rpc error: code = NotFound desc = could not find 
container \"0123aff788e51fa1a321a5b4387e61ff9db7b879e13e0cecb686df07ad0b7a1e\": container with ID starting with 0123aff788e51fa1a321a5b4387e61ff9db7b879e13e0cecb686df07ad0b7a1e not found: ID does not exist" Nov 29 05:54:11 crc kubenswrapper[4594]: I1129 05:54:11.836875 4594 scope.go:117] "RemoveContainer" containerID="5de3c648138c409938f971ff5675782d8890be1dd2c61a2bb7658366f73be4c3" Nov 29 05:54:11 crc kubenswrapper[4594]: E1129 05:54:11.837245 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5de3c648138c409938f971ff5675782d8890be1dd2c61a2bb7658366f73be4c3\": container with ID starting with 5de3c648138c409938f971ff5675782d8890be1dd2c61a2bb7658366f73be4c3 not found: ID does not exist" containerID="5de3c648138c409938f971ff5675782d8890be1dd2c61a2bb7658366f73be4c3" Nov 29 05:54:11 crc kubenswrapper[4594]: I1129 05:54:11.837287 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5de3c648138c409938f971ff5675782d8890be1dd2c61a2bb7658366f73be4c3"} err="failed to get container status \"5de3c648138c409938f971ff5675782d8890be1dd2c61a2bb7658366f73be4c3\": rpc error: code = NotFound desc = could not find container \"5de3c648138c409938f971ff5675782d8890be1dd2c61a2bb7658366f73be4c3\": container with ID starting with 5de3c648138c409938f971ff5675782d8890be1dd2c61a2bb7658366f73be4c3 not found: ID does not exist" Nov 29 05:54:11 crc kubenswrapper[4594]: I1129 05:54:11.837300 4594 scope.go:117] "RemoveContainer" containerID="a8239548ad836c3cb9d51f52c67cb2030d2498b913dc835167b03a27d877bcd8" Nov 29 05:54:11 crc kubenswrapper[4594]: E1129 05:54:11.837569 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8239548ad836c3cb9d51f52c67cb2030d2498b913dc835167b03a27d877bcd8\": container with ID starting with a8239548ad836c3cb9d51f52c67cb2030d2498b913dc835167b03a27d877bcd8 not found: ID does 
not exist" containerID="a8239548ad836c3cb9d51f52c67cb2030d2498b913dc835167b03a27d877bcd8" Nov 29 05:54:11 crc kubenswrapper[4594]: I1129 05:54:11.837605 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8239548ad836c3cb9d51f52c67cb2030d2498b913dc835167b03a27d877bcd8"} err="failed to get container status \"a8239548ad836c3cb9d51f52c67cb2030d2498b913dc835167b03a27d877bcd8\": rpc error: code = NotFound desc = could not find container \"a8239548ad836c3cb9d51f52c67cb2030d2498b913dc835167b03a27d877bcd8\": container with ID starting with a8239548ad836c3cb9d51f52c67cb2030d2498b913dc835167b03a27d877bcd8 not found: ID does not exist" Nov 29 05:54:12 crc kubenswrapper[4594]: I1129 05:54:12.023291 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-msskz"] Nov 29 05:54:12 crc kubenswrapper[4594]: I1129 05:54:12.029658 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-msskz"] Nov 29 05:54:12 crc kubenswrapper[4594]: I1129 05:54:12.094112 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="111c210f-a9a4-46ea-93ce-e91bae7ef7a9" path="/var/lib/kubelet/pods/111c210f-a9a4-46ea-93ce-e91bae7ef7a9/volumes" Nov 29 05:54:12 crc kubenswrapper[4594]: I1129 05:54:12.094969 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="551c161f-3e49-44e1-a572-5c5a33fa37f1" path="/var/lib/kubelet/pods/551c161f-3e49-44e1-a572-5c5a33fa37f1/volumes" Nov 29 05:54:13 crc kubenswrapper[4594]: I1129 05:54:13.030093 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-d3a2-account-create-update-qmq8q"] Nov 29 05:54:13 crc kubenswrapper[4594]: I1129 05:54:13.040990 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-b5e5-account-create-update-tqj9s"] Nov 29 05:54:13 crc kubenswrapper[4594]: I1129 05:54:13.050373 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/neutron-db-create-2r9dc"] Nov 29 05:54:13 crc kubenswrapper[4594]: I1129 05:54:13.056384 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-d3a2-account-create-update-qmq8q"] Nov 29 05:54:13 crc kubenswrapper[4594]: I1129 05:54:13.062493 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-5lm4l"] Nov 29 05:54:13 crc kubenswrapper[4594]: I1129 05:54:13.068571 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-90ef-account-create-update-lmtdj"] Nov 29 05:54:13 crc kubenswrapper[4594]: I1129 05:54:13.074251 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-2r9dc"] Nov 29 05:54:13 crc kubenswrapper[4594]: I1129 05:54:13.079880 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-b5e5-account-create-update-tqj9s"] Nov 29 05:54:13 crc kubenswrapper[4594]: I1129 05:54:13.085796 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-5lm4l"] Nov 29 05:54:13 crc kubenswrapper[4594]: I1129 05:54:13.093928 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-90ef-account-create-update-lmtdj"] Nov 29 05:54:14 crc kubenswrapper[4594]: I1129 05:54:14.094333 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11684277-b949-4e98-8e76-e0780cfb4d32" path="/var/lib/kubelet/pods/11684277-b949-4e98-8e76-e0780cfb4d32/volumes" Nov 29 05:54:14 crc kubenswrapper[4594]: I1129 05:54:14.094920 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e1365bf-4433-4641-910f-0d7a92be4fc1" path="/var/lib/kubelet/pods/2e1365bf-4433-4641-910f-0d7a92be4fc1/volumes" Nov 29 05:54:14 crc kubenswrapper[4594]: I1129 05:54:14.095705 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a4a0787-c2fa-43c0-9c7e-98cbdb79134b" path="/var/lib/kubelet/pods/8a4a0787-c2fa-43c0-9c7e-98cbdb79134b/volumes" Nov 29 05:54:14 crc 
kubenswrapper[4594]: I1129 05:54:14.096483 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="deb711bc-c899-4a09-b3f3-c1d256da7188" path="/var/lib/kubelet/pods/deb711bc-c899-4a09-b3f3-c1d256da7188/volumes" Nov 29 05:54:14 crc kubenswrapper[4594]: I1129 05:54:14.097496 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee64b887-a184-43e5-85ad-df3e76919044" path="/var/lib/kubelet/pods/ee64b887-a184-43e5-85ad-df3e76919044/volumes" Nov 29 05:54:32 crc kubenswrapper[4594]: I1129 05:54:32.034999 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-67zkn"] Nov 29 05:54:32 crc kubenswrapper[4594]: I1129 05:54:32.045090 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-67zkn"] Nov 29 05:54:32 crc kubenswrapper[4594]: I1129 05:54:32.092722 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2382e225-37e5-4fa9-97df-0d581b97dd01" path="/var/lib/kubelet/pods/2382e225-37e5-4fa9-97df-0d581b97dd01/volumes" Nov 29 05:54:34 crc kubenswrapper[4594]: I1129 05:54:34.025139 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-ww7dv"] Nov 29 05:54:34 crc kubenswrapper[4594]: I1129 05:54:34.032509 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-ww7dv"] Nov 29 05:54:34 crc kubenswrapper[4594]: I1129 05:54:34.091649 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98e4a7be-f230-4732-9a53-d258bf31954b" path="/var/lib/kubelet/pods/98e4a7be-f230-4732-9a53-d258bf31954b/volumes" Nov 29 05:54:50 crc kubenswrapper[4594]: I1129 05:54:50.081370 4594 generic.go:334] "Generic (PLEG): container finished" podID="6345164d-91bd-47df-b5a6-71f9940c0f15" containerID="3a011c2608ab1efd3b23f39ad827d637bebae30f1b235a933b45eb8d6e9f9273" exitCode=0 Nov 29 05:54:50 crc kubenswrapper[4594]: I1129 05:54:50.081464 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8fq52" event={"ID":"6345164d-91bd-47df-b5a6-71f9940c0f15","Type":"ContainerDied","Data":"3a011c2608ab1efd3b23f39ad827d637bebae30f1b235a933b45eb8d6e9f9273"} Nov 29 05:54:51 crc kubenswrapper[4594]: I1129 05:54:51.446697 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8fq52" Nov 29 05:54:51 crc kubenswrapper[4594]: I1129 05:54:51.532664 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kzqww\" (UniqueName: \"kubernetes.io/projected/6345164d-91bd-47df-b5a6-71f9940c0f15-kube-api-access-kzqww\") pod \"6345164d-91bd-47df-b5a6-71f9940c0f15\" (UID: \"6345164d-91bd-47df-b5a6-71f9940c0f15\") " Nov 29 05:54:51 crc kubenswrapper[4594]: I1129 05:54:51.532891 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6345164d-91bd-47df-b5a6-71f9940c0f15-inventory\") pod \"6345164d-91bd-47df-b5a6-71f9940c0f15\" (UID: \"6345164d-91bd-47df-b5a6-71f9940c0f15\") " Nov 29 05:54:51 crc kubenswrapper[4594]: I1129 05:54:51.532976 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6345164d-91bd-47df-b5a6-71f9940c0f15-ssh-key\") pod \"6345164d-91bd-47df-b5a6-71f9940c0f15\" (UID: \"6345164d-91bd-47df-b5a6-71f9940c0f15\") " Nov 29 05:54:51 crc kubenswrapper[4594]: I1129 05:54:51.537486 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6345164d-91bd-47df-b5a6-71f9940c0f15-kube-api-access-kzqww" (OuterVolumeSpecName: "kube-api-access-kzqww") pod "6345164d-91bd-47df-b5a6-71f9940c0f15" (UID: "6345164d-91bd-47df-b5a6-71f9940c0f15"). InnerVolumeSpecName "kube-api-access-kzqww". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:54:51 crc kubenswrapper[4594]: I1129 05:54:51.555414 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6345164d-91bd-47df-b5a6-71f9940c0f15-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "6345164d-91bd-47df-b5a6-71f9940c0f15" (UID: "6345164d-91bd-47df-b5a6-71f9940c0f15"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:54:51 crc kubenswrapper[4594]: I1129 05:54:51.556477 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6345164d-91bd-47df-b5a6-71f9940c0f15-inventory" (OuterVolumeSpecName: "inventory") pod "6345164d-91bd-47df-b5a6-71f9940c0f15" (UID: "6345164d-91bd-47df-b5a6-71f9940c0f15"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:54:51 crc kubenswrapper[4594]: I1129 05:54:51.634604 4594 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6345164d-91bd-47df-b5a6-71f9940c0f15-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 29 05:54:51 crc kubenswrapper[4594]: I1129 05:54:51.634633 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kzqww\" (UniqueName: \"kubernetes.io/projected/6345164d-91bd-47df-b5a6-71f9940c0f15-kube-api-access-kzqww\") on node \"crc\" DevicePath \"\"" Nov 29 05:54:51 crc kubenswrapper[4594]: I1129 05:54:51.634644 4594 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6345164d-91bd-47df-b5a6-71f9940c0f15-inventory\") on node \"crc\" DevicePath \"\"" Nov 29 05:54:52 crc kubenswrapper[4594]: I1129 05:54:52.024183 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-7b4h2"] Nov 29 05:54:52 crc kubenswrapper[4594]: I1129 05:54:52.030874 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-7b4h2"] Nov 
29 05:54:52 crc kubenswrapper[4594]: I1129 05:54:52.107688 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="968317f6-c4d9-4647-a166-0cadc0fa57f2" path="/var/lib/kubelet/pods/968317f6-c4d9-4647-a166-0cadc0fa57f2/volumes" Nov 29 05:54:52 crc kubenswrapper[4594]: I1129 05:54:52.115635 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8fq52" event={"ID":"6345164d-91bd-47df-b5a6-71f9940c0f15","Type":"ContainerDied","Data":"bd81791004a317c8ac33d0bd55cfb30d555c1a015b39889dc696a0bd4cf625a0"} Nov 29 05:54:52 crc kubenswrapper[4594]: I1129 05:54:52.115681 4594 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd81791004a317c8ac33d0bd55cfb30d555c1a015b39889dc696a0bd4cf625a0" Nov 29 05:54:52 crc kubenswrapper[4594]: I1129 05:54:52.115792 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8fq52" Nov 29 05:54:52 crc kubenswrapper[4594]: I1129 05:54:52.169979 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-rfc6c"] Nov 29 05:54:52 crc kubenswrapper[4594]: E1129 05:54:52.170737 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="551c161f-3e49-44e1-a572-5c5a33fa37f1" containerName="extract-content" Nov 29 05:54:52 crc kubenswrapper[4594]: I1129 05:54:52.170760 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="551c161f-3e49-44e1-a572-5c5a33fa37f1" containerName="extract-content" Nov 29 05:54:52 crc kubenswrapper[4594]: E1129 05:54:52.170772 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="551c161f-3e49-44e1-a572-5c5a33fa37f1" containerName="extract-utilities" Nov 29 05:54:52 crc kubenswrapper[4594]: I1129 05:54:52.170779 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="551c161f-3e49-44e1-a572-5c5a33fa37f1" 
containerName="extract-utilities" Nov 29 05:54:52 crc kubenswrapper[4594]: E1129 05:54:52.170790 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="551c161f-3e49-44e1-a572-5c5a33fa37f1" containerName="registry-server" Nov 29 05:54:52 crc kubenswrapper[4594]: I1129 05:54:52.170795 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="551c161f-3e49-44e1-a572-5c5a33fa37f1" containerName="registry-server" Nov 29 05:54:52 crc kubenswrapper[4594]: E1129 05:54:52.170817 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6345164d-91bd-47df-b5a6-71f9940c0f15" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Nov 29 05:54:52 crc kubenswrapper[4594]: I1129 05:54:52.170823 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="6345164d-91bd-47df-b5a6-71f9940c0f15" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Nov 29 05:54:52 crc kubenswrapper[4594]: I1129 05:54:52.171068 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="551c161f-3e49-44e1-a572-5c5a33fa37f1" containerName="registry-server" Nov 29 05:54:52 crc kubenswrapper[4594]: I1129 05:54:52.171085 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="6345164d-91bd-47df-b5a6-71f9940c0f15" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Nov 29 05:54:52 crc kubenswrapper[4594]: I1129 05:54:52.172005 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-rfc6c" Nov 29 05:54:52 crc kubenswrapper[4594]: I1129 05:54:52.177506 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b89h2" Nov 29 05:54:52 crc kubenswrapper[4594]: I1129 05:54:52.177568 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 29 05:54:52 crc kubenswrapper[4594]: I1129 05:54:52.177734 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 29 05:54:52 crc kubenswrapper[4594]: I1129 05:54:52.179471 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 29 05:54:52 crc kubenswrapper[4594]: I1129 05:54:52.196333 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-rfc6c"] Nov 29 05:54:52 crc kubenswrapper[4594]: I1129 05:54:52.245217 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e4ac87df-2d62-4571-a38a-a9cd25537685-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-rfc6c\" (UID: \"e4ac87df-2d62-4571-a38a-a9cd25537685\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-rfc6c" Nov 29 05:54:52 crc kubenswrapper[4594]: I1129 05:54:52.245426 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sflcp\" (UniqueName: \"kubernetes.io/projected/e4ac87df-2d62-4571-a38a-a9cd25537685-kube-api-access-sflcp\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-rfc6c\" (UID: \"e4ac87df-2d62-4571-a38a-a9cd25537685\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-rfc6c" Nov 29 05:54:52 crc kubenswrapper[4594]: I1129 
05:54:52.245612 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e4ac87df-2d62-4571-a38a-a9cd25537685-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-rfc6c\" (UID: \"e4ac87df-2d62-4571-a38a-a9cd25537685\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-rfc6c" Nov 29 05:54:52 crc kubenswrapper[4594]: I1129 05:54:52.347602 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e4ac87df-2d62-4571-a38a-a9cd25537685-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-rfc6c\" (UID: \"e4ac87df-2d62-4571-a38a-a9cd25537685\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-rfc6c" Nov 29 05:54:52 crc kubenswrapper[4594]: I1129 05:54:52.347767 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sflcp\" (UniqueName: \"kubernetes.io/projected/e4ac87df-2d62-4571-a38a-a9cd25537685-kube-api-access-sflcp\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-rfc6c\" (UID: \"e4ac87df-2d62-4571-a38a-a9cd25537685\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-rfc6c" Nov 29 05:54:52 crc kubenswrapper[4594]: I1129 05:54:52.347975 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e4ac87df-2d62-4571-a38a-a9cd25537685-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-rfc6c\" (UID: \"e4ac87df-2d62-4571-a38a-a9cd25537685\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-rfc6c" Nov 29 05:54:52 crc kubenswrapper[4594]: I1129 05:54:52.352107 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e4ac87df-2d62-4571-a38a-a9cd25537685-ssh-key\") pod 
\"configure-network-edpm-deployment-openstack-edpm-ipam-rfc6c\" (UID: \"e4ac87df-2d62-4571-a38a-a9cd25537685\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-rfc6c" Nov 29 05:54:52 crc kubenswrapper[4594]: I1129 05:54:52.352875 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e4ac87df-2d62-4571-a38a-a9cd25537685-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-rfc6c\" (UID: \"e4ac87df-2d62-4571-a38a-a9cd25537685\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-rfc6c" Nov 29 05:54:52 crc kubenswrapper[4594]: I1129 05:54:52.362363 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sflcp\" (UniqueName: \"kubernetes.io/projected/e4ac87df-2d62-4571-a38a-a9cd25537685-kube-api-access-sflcp\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-rfc6c\" (UID: \"e4ac87df-2d62-4571-a38a-a9cd25537685\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-rfc6c" Nov 29 05:54:52 crc kubenswrapper[4594]: I1129 05:54:52.489542 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-rfc6c" Nov 29 05:54:52 crc kubenswrapper[4594]: I1129 05:54:52.965137 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-rfc6c"] Nov 29 05:54:53 crc kubenswrapper[4594]: I1129 05:54:53.125226 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-rfc6c" event={"ID":"e4ac87df-2d62-4571-a38a-a9cd25537685","Type":"ContainerStarted","Data":"f91af724f4370209d40cb14d02b1878482c933b4cb6015d9eb59181168813e95"} Nov 29 05:54:54 crc kubenswrapper[4594]: I1129 05:54:54.135045 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-rfc6c" event={"ID":"e4ac87df-2d62-4571-a38a-a9cd25537685","Type":"ContainerStarted","Data":"efd5b9d3fbb27bba6afa3d6625eb4410b97a8486003cadeca3870feffd1f5eca"} Nov 29 05:54:54 crc kubenswrapper[4594]: I1129 05:54:54.152921 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-rfc6c" podStartSLOduration=1.4856254230000001 podStartE2EDuration="2.152901127s" podCreationTimestamp="2025-11-29 05:54:52 +0000 UTC" firstStartedPulling="2025-11-29 05:54:52.96582631 +0000 UTC m=+1617.206335530" lastFinishedPulling="2025-11-29 05:54:53.633102014 +0000 UTC m=+1617.873611234" observedRunningTime="2025-11-29 05:54:54.148764599 +0000 UTC m=+1618.389273818" watchObservedRunningTime="2025-11-29 05:54:54.152901127 +0000 UTC m=+1618.393410347" Nov 29 05:55:03 crc kubenswrapper[4594]: I1129 05:55:03.033888 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-db-sync-bg92f"] Nov 29 05:55:03 crc kubenswrapper[4594]: I1129 05:55:03.042362 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-db-sync-bg92f"] Nov 29 05:55:04 crc 
kubenswrapper[4594]: I1129 05:55:04.092469 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7474ba72-1534-4193-9590-e46dfc840403" path="/var/lib/kubelet/pods/7474ba72-1534-4193-9590-e46dfc840403/volumes" Nov 29 05:55:08 crc kubenswrapper[4594]: I1129 05:55:08.753281 4594 scope.go:117] "RemoveContainer" containerID="bfeedd11925954273a192d98a96f9d06d46fddf7be21d81d23c71f509dca88ee" Nov 29 05:55:08 crc kubenswrapper[4594]: I1129 05:55:08.786435 4594 scope.go:117] "RemoveContainer" containerID="fa4d2c563b6a5c3ba1aea016672f64a1203e1046e169fdafa873d6718fac82e8" Nov 29 05:55:08 crc kubenswrapper[4594]: I1129 05:55:08.841944 4594 scope.go:117] "RemoveContainer" containerID="d3464520115e406ceb8d1ec77ff748787aba518899526e3b001a33e014afc141" Nov 29 05:55:08 crc kubenswrapper[4594]: I1129 05:55:08.862653 4594 scope.go:117] "RemoveContainer" containerID="785d96da8704d2a5416ed7f48a6dd28009ebc072530965f5458eec8b998ca2ce" Nov 29 05:55:08 crc kubenswrapper[4594]: I1129 05:55:08.895297 4594 scope.go:117] "RemoveContainer" containerID="420c1c005e74da6ed5d7566f41328c8fed77e3abf21045a59554138235cc373e" Nov 29 05:55:08 crc kubenswrapper[4594]: I1129 05:55:08.944312 4594 scope.go:117] "RemoveContainer" containerID="118b5f8d8df52bab944cb59fc5a9529b4e4bb11e7b7fbd286616aaf7d6fa5c2b" Nov 29 05:55:08 crc kubenswrapper[4594]: I1129 05:55:08.964092 4594 scope.go:117] "RemoveContainer" containerID="07b381f8ed848bd7bd94f032034b8bfeafd332fe758a3e846ed44b735092d57c" Nov 29 05:55:08 crc kubenswrapper[4594]: I1129 05:55:08.982824 4594 scope.go:117] "RemoveContainer" containerID="d30fbe70e618e541993ac133419e99711234f48de67a5c7d4638e5870e351808" Nov 29 05:55:09 crc kubenswrapper[4594]: I1129 05:55:09.010088 4594 scope.go:117] "RemoveContainer" containerID="8f5ad23e2667ade8b9fce594fbe7e024ea939d2657932f8a218eb38cb301ee46" Nov 29 05:55:09 crc kubenswrapper[4594]: I1129 05:55:09.031525 4594 scope.go:117] "RemoveContainer" 
containerID="71e9d608f105a521b8ad0af0cd233ee11eaf2e5e790481f8a6cb2dda0b9c95a3" Nov 29 05:55:15 crc kubenswrapper[4594]: I1129 05:55:15.027021 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-mvvfl"] Nov 29 05:55:15 crc kubenswrapper[4594]: I1129 05:55:15.035922 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-mvvfl"] Nov 29 05:55:15 crc kubenswrapper[4594]: I1129 05:55:15.800320 4594 patch_prober.go:28] interesting pod/machine-config-daemon-ggz4n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 05:55:15 crc kubenswrapper[4594]: I1129 05:55:15.800404 4594 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 05:55:16 crc kubenswrapper[4594]: I1129 05:55:16.093969 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72edd67c-f301-4e33-8d71-74f9bc7b99c5" path="/var/lib/kubelet/pods/72edd67c-f301-4e33-8d71-74f9bc7b99c5/volumes" Nov 29 05:55:19 crc kubenswrapper[4594]: I1129 05:55:19.032836 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-8jkv9"] Nov 29 05:55:19 crc kubenswrapper[4594]: I1129 05:55:19.042090 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-gkrjf"] Nov 29 05:55:19 crc kubenswrapper[4594]: I1129 05:55:19.052167 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-8jkv9"] Nov 29 05:55:19 crc kubenswrapper[4594]: I1129 05:55:19.060030 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/keystone-bootstrap-gkrjf"] Nov 29 05:55:20 crc kubenswrapper[4594]: I1129 05:55:20.094874 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69498abf-8b80-4b7f-901a-4c6a4bdede2f" path="/var/lib/kubelet/pods/69498abf-8b80-4b7f-901a-4c6a4bdede2f/volumes" Nov 29 05:55:20 crc kubenswrapper[4594]: I1129 05:55:20.095516 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7cc6b30-a981-4486-9d8a-e926167f001b" path="/var/lib/kubelet/pods/d7cc6b30-a981-4486-9d8a-e926167f001b/volumes" Nov 29 05:55:31 crc kubenswrapper[4594]: I1129 05:55:31.027857 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-9rddc"] Nov 29 05:55:31 crc kubenswrapper[4594]: I1129 05:55:31.039353 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-9rddc"] Nov 29 05:55:32 crc kubenswrapper[4594]: I1129 05:55:32.093071 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b46224c-7874-4c4a-abb6-f1cbef3a8462" path="/var/lib/kubelet/pods/4b46224c-7874-4c4a-abb6-f1cbef3a8462/volumes" Nov 29 05:55:45 crc kubenswrapper[4594]: I1129 05:55:45.799980 4594 patch_prober.go:28] interesting pod/machine-config-daemon-ggz4n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 05:55:45 crc kubenswrapper[4594]: I1129 05:55:45.800615 4594 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 05:55:47 crc kubenswrapper[4594]: I1129 05:55:47.647736 4594 generic.go:334] "Generic (PLEG): container finished" 
podID="e4ac87df-2d62-4571-a38a-a9cd25537685" containerID="efd5b9d3fbb27bba6afa3d6625eb4410b97a8486003cadeca3870feffd1f5eca" exitCode=0 Nov 29 05:55:47 crc kubenswrapper[4594]: I1129 05:55:47.647822 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-rfc6c" event={"ID":"e4ac87df-2d62-4571-a38a-a9cd25537685","Type":"ContainerDied","Data":"efd5b9d3fbb27bba6afa3d6625eb4410b97a8486003cadeca3870feffd1f5eca"} Nov 29 05:55:49 crc kubenswrapper[4594]: I1129 05:55:49.003817 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-rfc6c" Nov 29 05:55:49 crc kubenswrapper[4594]: I1129 05:55:49.137971 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e4ac87df-2d62-4571-a38a-a9cd25537685-inventory\") pod \"e4ac87df-2d62-4571-a38a-a9cd25537685\" (UID: \"e4ac87df-2d62-4571-a38a-a9cd25537685\") " Nov 29 05:55:49 crc kubenswrapper[4594]: I1129 05:55:49.138188 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e4ac87df-2d62-4571-a38a-a9cd25537685-ssh-key\") pod \"e4ac87df-2d62-4571-a38a-a9cd25537685\" (UID: \"e4ac87df-2d62-4571-a38a-a9cd25537685\") " Nov 29 05:55:49 crc kubenswrapper[4594]: I1129 05:55:49.138374 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sflcp\" (UniqueName: \"kubernetes.io/projected/e4ac87df-2d62-4571-a38a-a9cd25537685-kube-api-access-sflcp\") pod \"e4ac87df-2d62-4571-a38a-a9cd25537685\" (UID: \"e4ac87df-2d62-4571-a38a-a9cd25537685\") " Nov 29 05:55:49 crc kubenswrapper[4594]: I1129 05:55:49.145655 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4ac87df-2d62-4571-a38a-a9cd25537685-kube-api-access-sflcp" (OuterVolumeSpecName: 
"kube-api-access-sflcp") pod "e4ac87df-2d62-4571-a38a-a9cd25537685" (UID: "e4ac87df-2d62-4571-a38a-a9cd25537685"). InnerVolumeSpecName "kube-api-access-sflcp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:55:49 crc kubenswrapper[4594]: I1129 05:55:49.162573 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4ac87df-2d62-4571-a38a-a9cd25537685-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e4ac87df-2d62-4571-a38a-a9cd25537685" (UID: "e4ac87df-2d62-4571-a38a-a9cd25537685"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:55:49 crc kubenswrapper[4594]: I1129 05:55:49.167735 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4ac87df-2d62-4571-a38a-a9cd25537685-inventory" (OuterVolumeSpecName: "inventory") pod "e4ac87df-2d62-4571-a38a-a9cd25537685" (UID: "e4ac87df-2d62-4571-a38a-a9cd25537685"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:55:49 crc kubenswrapper[4594]: I1129 05:55:49.240813 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sflcp\" (UniqueName: \"kubernetes.io/projected/e4ac87df-2d62-4571-a38a-a9cd25537685-kube-api-access-sflcp\") on node \"crc\" DevicePath \"\"" Nov 29 05:55:49 crc kubenswrapper[4594]: I1129 05:55:49.240847 4594 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e4ac87df-2d62-4571-a38a-a9cd25537685-inventory\") on node \"crc\" DevicePath \"\"" Nov 29 05:55:49 crc kubenswrapper[4594]: I1129 05:55:49.240857 4594 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e4ac87df-2d62-4571-a38a-a9cd25537685-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 29 05:55:49 crc kubenswrapper[4594]: I1129 05:55:49.684338 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-rfc6c" event={"ID":"e4ac87df-2d62-4571-a38a-a9cd25537685","Type":"ContainerDied","Data":"f91af724f4370209d40cb14d02b1878482c933b4cb6015d9eb59181168813e95"} Nov 29 05:55:49 crc kubenswrapper[4594]: I1129 05:55:49.684387 4594 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f91af724f4370209d40cb14d02b1878482c933b4cb6015d9eb59181168813e95" Nov 29 05:55:49 crc kubenswrapper[4594]: I1129 05:55:49.684475 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-rfc6c" Nov 29 05:55:49 crc kubenswrapper[4594]: I1129 05:55:49.740353 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ntkjh"] Nov 29 05:55:49 crc kubenswrapper[4594]: E1129 05:55:49.740874 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4ac87df-2d62-4571-a38a-a9cd25537685" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Nov 29 05:55:49 crc kubenswrapper[4594]: I1129 05:55:49.740889 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4ac87df-2d62-4571-a38a-a9cd25537685" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Nov 29 05:55:49 crc kubenswrapper[4594]: I1129 05:55:49.741164 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4ac87df-2d62-4571-a38a-a9cd25537685" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Nov 29 05:55:49 crc kubenswrapper[4594]: I1129 05:55:49.742123 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ntkjh" Nov 29 05:55:49 crc kubenswrapper[4594]: I1129 05:55:49.745022 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b89h2" Nov 29 05:55:49 crc kubenswrapper[4594]: I1129 05:55:49.745287 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 29 05:55:49 crc kubenswrapper[4594]: I1129 05:55:49.745879 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 29 05:55:49 crc kubenswrapper[4594]: I1129 05:55:49.746005 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 29 05:55:49 crc kubenswrapper[4594]: I1129 05:55:49.749120 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ntkjh"] Nov 29 05:55:49 crc kubenswrapper[4594]: I1129 05:55:49.750725 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/627ba28b-319a-4072-bd92-d9b1a9e77283-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-ntkjh\" (UID: \"627ba28b-319a-4072-bd92-d9b1a9e77283\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ntkjh" Nov 29 05:55:49 crc kubenswrapper[4594]: I1129 05:55:49.750979 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/627ba28b-319a-4072-bd92-d9b1a9e77283-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-ntkjh\" (UID: \"627ba28b-319a-4072-bd92-d9b1a9e77283\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ntkjh" Nov 29 05:55:49 crc kubenswrapper[4594]: I1129 05:55:49.751045 4594 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42c58\" (UniqueName: \"kubernetes.io/projected/627ba28b-319a-4072-bd92-d9b1a9e77283-kube-api-access-42c58\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-ntkjh\" (UID: \"627ba28b-319a-4072-bd92-d9b1a9e77283\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ntkjh" Nov 29 05:55:49 crc kubenswrapper[4594]: I1129 05:55:49.852517 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/627ba28b-319a-4072-bd92-d9b1a9e77283-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-ntkjh\" (UID: \"627ba28b-319a-4072-bd92-d9b1a9e77283\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ntkjh" Nov 29 05:55:49 crc kubenswrapper[4594]: I1129 05:55:49.852681 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/627ba28b-319a-4072-bd92-d9b1a9e77283-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-ntkjh\" (UID: \"627ba28b-319a-4072-bd92-d9b1a9e77283\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ntkjh" Nov 29 05:55:49 crc kubenswrapper[4594]: I1129 05:55:49.852725 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42c58\" (UniqueName: \"kubernetes.io/projected/627ba28b-319a-4072-bd92-d9b1a9e77283-kube-api-access-42c58\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-ntkjh\" (UID: \"627ba28b-319a-4072-bd92-d9b1a9e77283\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ntkjh" Nov 29 05:55:49 crc kubenswrapper[4594]: I1129 05:55:49.857433 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/627ba28b-319a-4072-bd92-d9b1a9e77283-ssh-key\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-ntkjh\" (UID: \"627ba28b-319a-4072-bd92-d9b1a9e77283\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ntkjh" Nov 29 05:55:49 crc kubenswrapper[4594]: I1129 05:55:49.857720 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/627ba28b-319a-4072-bd92-d9b1a9e77283-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-ntkjh\" (UID: \"627ba28b-319a-4072-bd92-d9b1a9e77283\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ntkjh" Nov 29 05:55:49 crc kubenswrapper[4594]: I1129 05:55:49.866900 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42c58\" (UniqueName: \"kubernetes.io/projected/627ba28b-319a-4072-bd92-d9b1a9e77283-kube-api-access-42c58\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-ntkjh\" (UID: \"627ba28b-319a-4072-bd92-d9b1a9e77283\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ntkjh" Nov 29 05:55:50 crc kubenswrapper[4594]: I1129 05:55:50.062627 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ntkjh" Nov 29 05:55:50 crc kubenswrapper[4594]: I1129 05:55:50.531945 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ntkjh"] Nov 29 05:55:50 crc kubenswrapper[4594]: W1129 05:55:50.535934 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod627ba28b_319a_4072_bd92_d9b1a9e77283.slice/crio-2996e839b4ae90cad76dedfaabd8b4278f771e0c8b545217cc11e5f479acdbac WatchSource:0}: Error finding container 2996e839b4ae90cad76dedfaabd8b4278f771e0c8b545217cc11e5f479acdbac: Status 404 returned error can't find the container with id 2996e839b4ae90cad76dedfaabd8b4278f771e0c8b545217cc11e5f479acdbac Nov 29 05:55:50 crc kubenswrapper[4594]: I1129 05:55:50.698760 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ntkjh" event={"ID":"627ba28b-319a-4072-bd92-d9b1a9e77283","Type":"ContainerStarted","Data":"2996e839b4ae90cad76dedfaabd8b4278f771e0c8b545217cc11e5f479acdbac"} Nov 29 05:55:51 crc kubenswrapper[4594]: I1129 05:55:51.709266 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ntkjh" event={"ID":"627ba28b-319a-4072-bd92-d9b1a9e77283","Type":"ContainerStarted","Data":"7e4234be6430fead41664e021542f64f51aa99ed8b5eb912c142526d6e94351d"} Nov 29 05:55:51 crc kubenswrapper[4594]: I1129 05:55:51.730226 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ntkjh" podStartSLOduration=2.21410623 podStartE2EDuration="2.730205278s" podCreationTimestamp="2025-11-29 05:55:49 +0000 UTC" firstStartedPulling="2025-11-29 05:55:50.538618399 +0000 UTC m=+1674.779127620" lastFinishedPulling="2025-11-29 05:55:51.054717447 +0000 UTC 
m=+1675.295226668" observedRunningTime="2025-11-29 05:55:51.726465815 +0000 UTC m=+1675.966975036" watchObservedRunningTime="2025-11-29 05:55:51.730205278 +0000 UTC m=+1675.970714498" Nov 29 05:55:55 crc kubenswrapper[4594]: E1129 05:55:55.159044 4594 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod627ba28b_319a_4072_bd92_d9b1a9e77283.slice/crio-conmon-7e4234be6430fead41664e021542f64f51aa99ed8b5eb912c142526d6e94351d.scope\": RecentStats: unable to find data in memory cache]" Nov 29 05:55:55 crc kubenswrapper[4594]: I1129 05:55:55.760623 4594 generic.go:334] "Generic (PLEG): container finished" podID="627ba28b-319a-4072-bd92-d9b1a9e77283" containerID="7e4234be6430fead41664e021542f64f51aa99ed8b5eb912c142526d6e94351d" exitCode=0 Nov 29 05:55:55 crc kubenswrapper[4594]: I1129 05:55:55.760703 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ntkjh" event={"ID":"627ba28b-319a-4072-bd92-d9b1a9e77283","Type":"ContainerDied","Data":"7e4234be6430fead41664e021542f64f51aa99ed8b5eb912c142526d6e94351d"} Nov 29 05:55:57 crc kubenswrapper[4594]: I1129 05:55:57.121265 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ntkjh" Nov 29 05:55:57 crc kubenswrapper[4594]: I1129 05:55:57.227617 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/627ba28b-319a-4072-bd92-d9b1a9e77283-ssh-key\") pod \"627ba28b-319a-4072-bd92-d9b1a9e77283\" (UID: \"627ba28b-319a-4072-bd92-d9b1a9e77283\") " Nov 29 05:55:57 crc kubenswrapper[4594]: I1129 05:55:57.227795 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/627ba28b-319a-4072-bd92-d9b1a9e77283-inventory\") pod \"627ba28b-319a-4072-bd92-d9b1a9e77283\" (UID: \"627ba28b-319a-4072-bd92-d9b1a9e77283\") " Nov 29 05:55:57 crc kubenswrapper[4594]: I1129 05:55:57.227960 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42c58\" (UniqueName: \"kubernetes.io/projected/627ba28b-319a-4072-bd92-d9b1a9e77283-kube-api-access-42c58\") pod \"627ba28b-319a-4072-bd92-d9b1a9e77283\" (UID: \"627ba28b-319a-4072-bd92-d9b1a9e77283\") " Nov 29 05:55:57 crc kubenswrapper[4594]: I1129 05:55:57.233377 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/627ba28b-319a-4072-bd92-d9b1a9e77283-kube-api-access-42c58" (OuterVolumeSpecName: "kube-api-access-42c58") pod "627ba28b-319a-4072-bd92-d9b1a9e77283" (UID: "627ba28b-319a-4072-bd92-d9b1a9e77283"). InnerVolumeSpecName "kube-api-access-42c58". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:55:57 crc kubenswrapper[4594]: I1129 05:55:57.254560 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/627ba28b-319a-4072-bd92-d9b1a9e77283-inventory" (OuterVolumeSpecName: "inventory") pod "627ba28b-319a-4072-bd92-d9b1a9e77283" (UID: "627ba28b-319a-4072-bd92-d9b1a9e77283"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:55:57 crc kubenswrapper[4594]: I1129 05:55:57.258093 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/627ba28b-319a-4072-bd92-d9b1a9e77283-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "627ba28b-319a-4072-bd92-d9b1a9e77283" (UID: "627ba28b-319a-4072-bd92-d9b1a9e77283"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:55:57 crc kubenswrapper[4594]: I1129 05:55:57.333346 4594 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/627ba28b-319a-4072-bd92-d9b1a9e77283-inventory\") on node \"crc\" DevicePath \"\"" Nov 29 05:55:57 crc kubenswrapper[4594]: I1129 05:55:57.333387 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42c58\" (UniqueName: \"kubernetes.io/projected/627ba28b-319a-4072-bd92-d9b1a9e77283-kube-api-access-42c58\") on node \"crc\" DevicePath \"\"" Nov 29 05:55:57 crc kubenswrapper[4594]: I1129 05:55:57.333401 4594 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/627ba28b-319a-4072-bd92-d9b1a9e77283-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 29 05:55:57 crc kubenswrapper[4594]: I1129 05:55:57.786183 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ntkjh" event={"ID":"627ba28b-319a-4072-bd92-d9b1a9e77283","Type":"ContainerDied","Data":"2996e839b4ae90cad76dedfaabd8b4278f771e0c8b545217cc11e5f479acdbac"} Nov 29 05:55:57 crc kubenswrapper[4594]: I1129 05:55:57.786595 4594 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2996e839b4ae90cad76dedfaabd8b4278f771e0c8b545217cc11e5f479acdbac" Nov 29 05:55:57 crc kubenswrapper[4594]: I1129 05:55:57.786322 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ntkjh" Nov 29 05:55:57 crc kubenswrapper[4594]: I1129 05:55:57.862319 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-qthxb"] Nov 29 05:55:57 crc kubenswrapper[4594]: E1129 05:55:57.862784 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="627ba28b-319a-4072-bd92-d9b1a9e77283" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Nov 29 05:55:57 crc kubenswrapper[4594]: I1129 05:55:57.862807 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="627ba28b-319a-4072-bd92-d9b1a9e77283" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Nov 29 05:55:57 crc kubenswrapper[4594]: I1129 05:55:57.863054 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="627ba28b-319a-4072-bd92-d9b1a9e77283" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Nov 29 05:55:57 crc kubenswrapper[4594]: I1129 05:55:57.863825 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qthxb" Nov 29 05:55:57 crc kubenswrapper[4594]: I1129 05:55:57.881484 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b89h2" Nov 29 05:55:57 crc kubenswrapper[4594]: I1129 05:55:57.881633 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 29 05:55:57 crc kubenswrapper[4594]: I1129 05:55:57.881756 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 29 05:55:57 crc kubenswrapper[4594]: I1129 05:55:57.882152 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 29 05:55:57 crc kubenswrapper[4594]: I1129 05:55:57.885650 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-qthxb"] Nov 29 05:55:57 crc kubenswrapper[4594]: I1129 05:55:57.949223 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wnmx\" (UniqueName: \"kubernetes.io/projected/692e39b7-fc9f-4770-847e-ff968ddf1ad8-kube-api-access-5wnmx\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-qthxb\" (UID: \"692e39b7-fc9f-4770-847e-ff968ddf1ad8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qthxb" Nov 29 05:55:57 crc kubenswrapper[4594]: I1129 05:55:57.949792 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/692e39b7-fc9f-4770-847e-ff968ddf1ad8-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-qthxb\" (UID: \"692e39b7-fc9f-4770-847e-ff968ddf1ad8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qthxb" Nov 29 05:55:57 crc kubenswrapper[4594]: I1129 05:55:57.950398 4594 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/692e39b7-fc9f-4770-847e-ff968ddf1ad8-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-qthxb\" (UID: \"692e39b7-fc9f-4770-847e-ff968ddf1ad8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qthxb" Nov 29 05:55:58 crc kubenswrapper[4594]: I1129 05:55:58.052033 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wnmx\" (UniqueName: \"kubernetes.io/projected/692e39b7-fc9f-4770-847e-ff968ddf1ad8-kube-api-access-5wnmx\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-qthxb\" (UID: \"692e39b7-fc9f-4770-847e-ff968ddf1ad8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qthxb" Nov 29 05:55:58 crc kubenswrapper[4594]: I1129 05:55:58.052157 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/692e39b7-fc9f-4770-847e-ff968ddf1ad8-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-qthxb\" (UID: \"692e39b7-fc9f-4770-847e-ff968ddf1ad8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qthxb" Nov 29 05:55:58 crc kubenswrapper[4594]: I1129 05:55:58.052560 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/692e39b7-fc9f-4770-847e-ff968ddf1ad8-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-qthxb\" (UID: \"692e39b7-fc9f-4770-847e-ff968ddf1ad8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qthxb" Nov 29 05:55:58 crc kubenswrapper[4594]: I1129 05:55:58.056949 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/692e39b7-fc9f-4770-847e-ff968ddf1ad8-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-qthxb\" (UID: 
\"692e39b7-fc9f-4770-847e-ff968ddf1ad8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qthxb" Nov 29 05:55:58 crc kubenswrapper[4594]: I1129 05:55:58.058496 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/692e39b7-fc9f-4770-847e-ff968ddf1ad8-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-qthxb\" (UID: \"692e39b7-fc9f-4770-847e-ff968ddf1ad8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qthxb" Nov 29 05:55:58 crc kubenswrapper[4594]: I1129 05:55:58.075827 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wnmx\" (UniqueName: \"kubernetes.io/projected/692e39b7-fc9f-4770-847e-ff968ddf1ad8-kube-api-access-5wnmx\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-qthxb\" (UID: \"692e39b7-fc9f-4770-847e-ff968ddf1ad8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qthxb" Nov 29 05:55:58 crc kubenswrapper[4594]: I1129 05:55:58.199678 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qthxb" Nov 29 05:55:58 crc kubenswrapper[4594]: I1129 05:55:58.785691 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-qthxb"] Nov 29 05:55:58 crc kubenswrapper[4594]: W1129 05:55:58.797159 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod692e39b7_fc9f_4770_847e_ff968ddf1ad8.slice/crio-016731525a1d30381b9e1da518450eb36724ae0f4d4a3e8de283c7d267130a80 WatchSource:0}: Error finding container 016731525a1d30381b9e1da518450eb36724ae0f4d4a3e8de283c7d267130a80: Status 404 returned error can't find the container with id 016731525a1d30381b9e1da518450eb36724ae0f4d4a3e8de283c7d267130a80 Nov 29 05:55:59 crc kubenswrapper[4594]: I1129 05:55:59.807423 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qthxb" event={"ID":"692e39b7-fc9f-4770-847e-ff968ddf1ad8","Type":"ContainerStarted","Data":"5698abd70a326c485956b9327684c5fbcc3a60fd76c354f299b9046fb872d4d5"} Nov 29 05:55:59 crc kubenswrapper[4594]: I1129 05:55:59.808132 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qthxb" event={"ID":"692e39b7-fc9f-4770-847e-ff968ddf1ad8","Type":"ContainerStarted","Data":"016731525a1d30381b9e1da518450eb36724ae0f4d4a3e8de283c7d267130a80"} Nov 29 05:55:59 crc kubenswrapper[4594]: I1129 05:55:59.830871 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qthxb" podStartSLOduration=2.212053359 podStartE2EDuration="2.830853814s" podCreationTimestamp="2025-11-29 05:55:57 +0000 UTC" firstStartedPulling="2025-11-29 05:55:58.799754384 +0000 UTC m=+1683.040263603" lastFinishedPulling="2025-11-29 05:55:59.418554838 +0000 UTC m=+1683.659064058" 
observedRunningTime="2025-11-29 05:55:59.821912628 +0000 UTC m=+1684.062421848" watchObservedRunningTime="2025-11-29 05:55:59.830853814 +0000 UTC m=+1684.071363034" Nov 29 05:56:09 crc kubenswrapper[4594]: I1129 05:56:09.202918 4594 scope.go:117] "RemoveContainer" containerID="4f378e96290122a024a00bc092e3a9d370ef7e8823da038de4d436c3e6442832" Nov 29 05:56:09 crc kubenswrapper[4594]: I1129 05:56:09.234083 4594 scope.go:117] "RemoveContainer" containerID="1d568eb8477a4f4096f144cd479a748b8bc8e9654f83a8c2a157e58825d2cdfe" Nov 29 05:56:09 crc kubenswrapper[4594]: I1129 05:56:09.287027 4594 scope.go:117] "RemoveContainer" containerID="e8e19a6129ae5d861b384fa3d50896c39d975243eddae5b7537018fd968c04f2" Nov 29 05:56:09 crc kubenswrapper[4594]: I1129 05:56:09.344829 4594 scope.go:117] "RemoveContainer" containerID="ff5b256588941b410fcff611cfa2fc2fb38f9bc1742bbecf88bf769aa253ad0b" Nov 29 05:56:11 crc kubenswrapper[4594]: I1129 05:56:11.046038 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-6990-account-create-update-w9h2r"] Nov 29 05:56:11 crc kubenswrapper[4594]: I1129 05:56:11.057357 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-pj4xh"] Nov 29 05:56:11 crc kubenswrapper[4594]: I1129 05:56:11.064692 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-dmbbc"] Nov 29 05:56:11 crc kubenswrapper[4594]: I1129 05:56:11.072398 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-6990-account-create-update-w9h2r"] Nov 29 05:56:11 crc kubenswrapper[4594]: I1129 05:56:11.079115 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-pj4xh"] Nov 29 05:56:11 crc kubenswrapper[4594]: I1129 05:56:11.084684 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-7b8a-account-create-update-r8lrn"] Nov 29 05:56:11 crc kubenswrapper[4594]: I1129 05:56:11.089895 4594 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/nova-cell0-db-create-2btnd"] Nov 29 05:56:11 crc kubenswrapper[4594]: I1129 05:56:11.095112 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-b707-account-create-update-x94hq"] Nov 29 05:56:11 crc kubenswrapper[4594]: I1129 05:56:11.100239 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-7b8a-account-create-update-r8lrn"] Nov 29 05:56:11 crc kubenswrapper[4594]: I1129 05:56:11.105313 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-dmbbc"] Nov 29 05:56:11 crc kubenswrapper[4594]: I1129 05:56:11.110527 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-2btnd"] Nov 29 05:56:11 crc kubenswrapper[4594]: I1129 05:56:11.115884 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-b707-account-create-update-x94hq"] Nov 29 05:56:12 crc kubenswrapper[4594]: I1129 05:56:12.094751 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2abc53c5-05f4-457c-be49-c34e962a9522" path="/var/lib/kubelet/pods/2abc53c5-05f4-457c-be49-c34e962a9522/volumes" Nov 29 05:56:12 crc kubenswrapper[4594]: I1129 05:56:12.095358 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3371a852-7321-4c19-896c-a31265ebf283" path="/var/lib/kubelet/pods/3371a852-7321-4c19-896c-a31265ebf283/volumes" Nov 29 05:56:12 crc kubenswrapper[4594]: I1129 05:56:12.096436 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f3d7959-7013-4279-8173-077ed6cefbda" path="/var/lib/kubelet/pods/5f3d7959-7013-4279-8173-077ed6cefbda/volumes" Nov 29 05:56:12 crc kubenswrapper[4594]: I1129 05:56:12.096946 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92b184e0-f52c-47d3-8488-b9e8e8552b1c" path="/var/lib/kubelet/pods/92b184e0-f52c-47d3-8488-b9e8e8552b1c/volumes" Nov 29 05:56:12 crc kubenswrapper[4594]: I1129 05:56:12.097903 4594 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ada5c910-bb44-43e9-9589-649f2226ebab" path="/var/lib/kubelet/pods/ada5c910-bb44-43e9-9589-649f2226ebab/volumes" Nov 29 05:56:12 crc kubenswrapper[4594]: I1129 05:56:12.098412 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bbfcad00-10e0-4501-b728-449a2a2f03b0" path="/var/lib/kubelet/pods/bbfcad00-10e0-4501-b728-449a2a2f03b0/volumes" Nov 29 05:56:15 crc kubenswrapper[4594]: I1129 05:56:15.800050 4594 patch_prober.go:28] interesting pod/machine-config-daemon-ggz4n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 05:56:15 crc kubenswrapper[4594]: I1129 05:56:15.800749 4594 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 05:56:15 crc kubenswrapper[4594]: I1129 05:56:15.800806 4594 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" Nov 29 05:56:15 crc kubenswrapper[4594]: I1129 05:56:15.801970 4594 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"112625db7a9e2602ec481072051add75f1091c3a06e6a9540b152d2e23c5722f"} pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 29 05:56:15 crc kubenswrapper[4594]: I1129 05:56:15.802036 4594 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" containerName="machine-config-daemon" containerID="cri-o://112625db7a9e2602ec481072051add75f1091c3a06e6a9540b152d2e23c5722f" gracePeriod=600 Nov 29 05:56:15 crc kubenswrapper[4594]: E1129 05:56:15.925798 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 05:56:16 crc kubenswrapper[4594]: I1129 05:56:16.002189 4594 generic.go:334] "Generic (PLEG): container finished" podID="509844d4-3ee1-4059-84bb-6e90200f50c5" containerID="112625db7a9e2602ec481072051add75f1091c3a06e6a9540b152d2e23c5722f" exitCode=0 Nov 29 05:56:16 crc kubenswrapper[4594]: I1129 05:56:16.002232 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" event={"ID":"509844d4-3ee1-4059-84bb-6e90200f50c5","Type":"ContainerDied","Data":"112625db7a9e2602ec481072051add75f1091c3a06e6a9540b152d2e23c5722f"} Nov 29 05:56:16 crc kubenswrapper[4594]: I1129 05:56:16.002290 4594 scope.go:117] "RemoveContainer" containerID="e872a7864afdb97a4545c838292279114599a596d9e2444041e74943a73c1a43" Nov 29 05:56:16 crc kubenswrapper[4594]: I1129 05:56:16.002967 4594 scope.go:117] "RemoveContainer" containerID="112625db7a9e2602ec481072051add75f1091c3a06e6a9540b152d2e23c5722f" Nov 29 05:56:16 crc kubenswrapper[4594]: E1129 05:56:16.003378 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 05:56:29 crc kubenswrapper[4594]: I1129 05:56:29.083530 4594 scope.go:117] "RemoveContainer" containerID="112625db7a9e2602ec481072051add75f1091c3a06e6a9540b152d2e23c5722f" Nov 29 05:56:29 crc kubenswrapper[4594]: E1129 05:56:29.084448 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 05:56:29 crc kubenswrapper[4594]: I1129 05:56:29.153448 4594 generic.go:334] "Generic (PLEG): container finished" podID="692e39b7-fc9f-4770-847e-ff968ddf1ad8" containerID="5698abd70a326c485956b9327684c5fbcc3a60fd76c354f299b9046fb872d4d5" exitCode=0 Nov 29 05:56:29 crc kubenswrapper[4594]: I1129 05:56:29.153528 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qthxb" event={"ID":"692e39b7-fc9f-4770-847e-ff968ddf1ad8","Type":"ContainerDied","Data":"5698abd70a326c485956b9327684c5fbcc3a60fd76c354f299b9046fb872d4d5"} Nov 29 05:56:30 crc kubenswrapper[4594]: I1129 05:56:30.604735 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qthxb" Nov 29 05:56:30 crc kubenswrapper[4594]: I1129 05:56:30.615597 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5wnmx\" (UniqueName: \"kubernetes.io/projected/692e39b7-fc9f-4770-847e-ff968ddf1ad8-kube-api-access-5wnmx\") pod \"692e39b7-fc9f-4770-847e-ff968ddf1ad8\" (UID: \"692e39b7-fc9f-4770-847e-ff968ddf1ad8\") " Nov 29 05:56:30 crc kubenswrapper[4594]: I1129 05:56:30.615748 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/692e39b7-fc9f-4770-847e-ff968ddf1ad8-inventory\") pod \"692e39b7-fc9f-4770-847e-ff968ddf1ad8\" (UID: \"692e39b7-fc9f-4770-847e-ff968ddf1ad8\") " Nov 29 05:56:30 crc kubenswrapper[4594]: I1129 05:56:30.615928 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/692e39b7-fc9f-4770-847e-ff968ddf1ad8-ssh-key\") pod \"692e39b7-fc9f-4770-847e-ff968ddf1ad8\" (UID: \"692e39b7-fc9f-4770-847e-ff968ddf1ad8\") " Nov 29 05:56:30 crc kubenswrapper[4594]: I1129 05:56:30.629459 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/692e39b7-fc9f-4770-847e-ff968ddf1ad8-kube-api-access-5wnmx" (OuterVolumeSpecName: "kube-api-access-5wnmx") pod "692e39b7-fc9f-4770-847e-ff968ddf1ad8" (UID: "692e39b7-fc9f-4770-847e-ff968ddf1ad8"). InnerVolumeSpecName "kube-api-access-5wnmx". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:56:30 crc kubenswrapper[4594]: I1129 05:56:30.643151 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/692e39b7-fc9f-4770-847e-ff968ddf1ad8-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "692e39b7-fc9f-4770-847e-ff968ddf1ad8" (UID: "692e39b7-fc9f-4770-847e-ff968ddf1ad8"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:56:30 crc kubenswrapper[4594]: I1129 05:56:30.645676 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/692e39b7-fc9f-4770-847e-ff968ddf1ad8-inventory" (OuterVolumeSpecName: "inventory") pod "692e39b7-fc9f-4770-847e-ff968ddf1ad8" (UID: "692e39b7-fc9f-4770-847e-ff968ddf1ad8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:56:30 crc kubenswrapper[4594]: I1129 05:56:30.719366 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5wnmx\" (UniqueName: \"kubernetes.io/projected/692e39b7-fc9f-4770-847e-ff968ddf1ad8-kube-api-access-5wnmx\") on node \"crc\" DevicePath \"\"" Nov 29 05:56:30 crc kubenswrapper[4594]: I1129 05:56:30.719399 4594 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/692e39b7-fc9f-4770-847e-ff968ddf1ad8-inventory\") on node \"crc\" DevicePath \"\"" Nov 29 05:56:30 crc kubenswrapper[4594]: I1129 05:56:30.719408 4594 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/692e39b7-fc9f-4770-847e-ff968ddf1ad8-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 29 05:56:31 crc kubenswrapper[4594]: I1129 05:56:31.174477 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qthxb" event={"ID":"692e39b7-fc9f-4770-847e-ff968ddf1ad8","Type":"ContainerDied","Data":"016731525a1d30381b9e1da518450eb36724ae0f4d4a3e8de283c7d267130a80"} Nov 29 05:56:31 crc kubenswrapper[4594]: I1129 05:56:31.174776 4594 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="016731525a1d30381b9e1da518450eb36724ae0f4d4a3e8de283c7d267130a80" Nov 29 05:56:31 crc kubenswrapper[4594]: I1129 05:56:31.174587 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qthxb" Nov 29 05:56:31 crc kubenswrapper[4594]: I1129 05:56:31.263344 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-srf6x"] Nov 29 05:56:31 crc kubenswrapper[4594]: E1129 05:56:31.264022 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="692e39b7-fc9f-4770-847e-ff968ddf1ad8" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Nov 29 05:56:31 crc kubenswrapper[4594]: I1129 05:56:31.264049 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="692e39b7-fc9f-4770-847e-ff968ddf1ad8" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Nov 29 05:56:31 crc kubenswrapper[4594]: I1129 05:56:31.264426 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="692e39b7-fc9f-4770-847e-ff968ddf1ad8" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Nov 29 05:56:31 crc kubenswrapper[4594]: I1129 05:56:31.265546 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-srf6x" Nov 29 05:56:31 crc kubenswrapper[4594]: I1129 05:56:31.268032 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 29 05:56:31 crc kubenswrapper[4594]: I1129 05:56:31.268725 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 29 05:56:31 crc kubenswrapper[4594]: I1129 05:56:31.270110 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 29 05:56:31 crc kubenswrapper[4594]: I1129 05:56:31.277333 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-srf6x"] Nov 29 05:56:31 crc kubenswrapper[4594]: I1129 05:56:31.279435 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b89h2" Nov 29 05:56:31 crc kubenswrapper[4594]: I1129 05:56:31.330875 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2dfr\" (UniqueName: \"kubernetes.io/projected/12960501-d688-4937-b0b3-048b780072d3-kube-api-access-f2dfr\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-srf6x\" (UID: \"12960501-d688-4937-b0b3-048b780072d3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-srf6x" Nov 29 05:56:31 crc kubenswrapper[4594]: I1129 05:56:31.330996 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/12960501-d688-4937-b0b3-048b780072d3-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-srf6x\" (UID: \"12960501-d688-4937-b0b3-048b780072d3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-srf6x" Nov 29 05:56:31 crc kubenswrapper[4594]: I1129 05:56:31.331123 4594 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/12960501-d688-4937-b0b3-048b780072d3-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-srf6x\" (UID: \"12960501-d688-4937-b0b3-048b780072d3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-srf6x" Nov 29 05:56:31 crc kubenswrapper[4594]: I1129 05:56:31.433783 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/12960501-d688-4937-b0b3-048b780072d3-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-srf6x\" (UID: \"12960501-d688-4937-b0b3-048b780072d3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-srf6x" Nov 29 05:56:31 crc kubenswrapper[4594]: I1129 05:56:31.434056 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/12960501-d688-4937-b0b3-048b780072d3-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-srf6x\" (UID: \"12960501-d688-4937-b0b3-048b780072d3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-srf6x" Nov 29 05:56:31 crc kubenswrapper[4594]: I1129 05:56:31.434133 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2dfr\" (UniqueName: \"kubernetes.io/projected/12960501-d688-4937-b0b3-048b780072d3-kube-api-access-f2dfr\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-srf6x\" (UID: \"12960501-d688-4937-b0b3-048b780072d3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-srf6x" Nov 29 05:56:31 crc kubenswrapper[4594]: I1129 05:56:31.440557 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/12960501-d688-4937-b0b3-048b780072d3-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-srf6x\" (UID: 
\"12960501-d688-4937-b0b3-048b780072d3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-srf6x" Nov 29 05:56:31 crc kubenswrapper[4594]: I1129 05:56:31.440644 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/12960501-d688-4937-b0b3-048b780072d3-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-srf6x\" (UID: \"12960501-d688-4937-b0b3-048b780072d3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-srf6x" Nov 29 05:56:31 crc kubenswrapper[4594]: I1129 05:56:31.451834 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2dfr\" (UniqueName: \"kubernetes.io/projected/12960501-d688-4937-b0b3-048b780072d3-kube-api-access-f2dfr\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-srf6x\" (UID: \"12960501-d688-4937-b0b3-048b780072d3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-srf6x" Nov 29 05:56:31 crc kubenswrapper[4594]: I1129 05:56:31.587397 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-srf6x" Nov 29 05:56:32 crc kubenswrapper[4594]: I1129 05:56:32.096164 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-srf6x"] Nov 29 05:56:32 crc kubenswrapper[4594]: I1129 05:56:32.185737 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-srf6x" event={"ID":"12960501-d688-4937-b0b3-048b780072d3","Type":"ContainerStarted","Data":"0fc0b947ab7891827e8fe3fa6ada3686a7424ceb9e2daecc76a51a8ba9e80eb7"} Nov 29 05:56:33 crc kubenswrapper[4594]: I1129 05:56:33.197499 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-srf6x" event={"ID":"12960501-d688-4937-b0b3-048b780072d3","Type":"ContainerStarted","Data":"bbd50dcfc4e6ab79caad91a17355ae464eba533c94c18052b825c34ec04bac8b"} Nov 29 05:56:33 crc kubenswrapper[4594]: I1129 05:56:33.218528 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-srf6x" podStartSLOduration=1.6044764599999999 podStartE2EDuration="2.218509458s" podCreationTimestamp="2025-11-29 05:56:31 +0000 UTC" firstStartedPulling="2025-11-29 05:56:32.093235739 +0000 UTC m=+1716.333744959" lastFinishedPulling="2025-11-29 05:56:32.707268737 +0000 UTC m=+1716.947777957" observedRunningTime="2025-11-29 05:56:33.214614474 +0000 UTC m=+1717.455123693" watchObservedRunningTime="2025-11-29 05:56:33.218509458 +0000 UTC m=+1717.459018669" Nov 29 05:56:38 crc kubenswrapper[4594]: I1129 05:56:38.038871 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-p6cqb"] Nov 29 05:56:38 crc kubenswrapper[4594]: I1129 05:56:38.048532 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-p6cqb"] Nov 29 05:56:38 crc 
kubenswrapper[4594]: I1129 05:56:38.093179 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de4685b4-8bf9-4a38-ba5e-7062994790c8" path="/var/lib/kubelet/pods/de4685b4-8bf9-4a38-ba5e-7062994790c8/volumes" Nov 29 05:56:41 crc kubenswrapper[4594]: I1129 05:56:41.084037 4594 scope.go:117] "RemoveContainer" containerID="112625db7a9e2602ec481072051add75f1091c3a06e6a9540b152d2e23c5722f" Nov 29 05:56:41 crc kubenswrapper[4594]: E1129 05:56:41.084860 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 05:56:56 crc kubenswrapper[4594]: I1129 05:56:56.090081 4594 scope.go:117] "RemoveContainer" containerID="112625db7a9e2602ec481072051add75f1091c3a06e6a9540b152d2e23c5722f" Nov 29 05:56:56 crc kubenswrapper[4594]: E1129 05:56:56.093763 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 05:56:58 crc kubenswrapper[4594]: I1129 05:56:58.033914 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-7mb5g"] Nov 29 05:56:58 crc kubenswrapper[4594]: I1129 05:56:58.041107 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-7mb5g"] Nov 29 05:56:58 crc kubenswrapper[4594]: I1129 05:56:58.092504 4594 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19e2ee6f-48e6-470c-84c9-6b626fdefef7" path="/var/lib/kubelet/pods/19e2ee6f-48e6-470c-84c9-6b626fdefef7/volumes" Nov 29 05:57:00 crc kubenswrapper[4594]: I1129 05:57:00.042246 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-cxpv7"] Nov 29 05:57:00 crc kubenswrapper[4594]: I1129 05:57:00.051385 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-cxpv7"] Nov 29 05:57:00 crc kubenswrapper[4594]: I1129 05:57:00.095188 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3e5dae6-ceb8-4878-a37d-56f97d59d103" path="/var/lib/kubelet/pods/b3e5dae6-ceb8-4878-a37d-56f97d59d103/volumes" Nov 29 05:57:08 crc kubenswrapper[4594]: I1129 05:57:08.084230 4594 scope.go:117] "RemoveContainer" containerID="112625db7a9e2602ec481072051add75f1091c3a06e6a9540b152d2e23c5722f" Nov 29 05:57:08 crc kubenswrapper[4594]: E1129 05:57:08.084885 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 05:57:09 crc kubenswrapper[4594]: I1129 05:57:09.458248 4594 scope.go:117] "RemoveContainer" containerID="8d717d82a2740ec4060ca75c87c2a9ea54822b82b1ca4a0b1e192b27dbd9d9a1" Nov 29 05:57:09 crc kubenswrapper[4594]: I1129 05:57:09.482991 4594 scope.go:117] "RemoveContainer" containerID="15a7df3933c5684260924dc998b66227db0e746b52712f9fd0112ed9bceff343" Nov 29 05:57:09 crc kubenswrapper[4594]: I1129 05:57:09.534848 4594 scope.go:117] "RemoveContainer" containerID="edac931bbed8944e474eacc46259cf62ee3a21806a5578f51b5bdcc0546efb57" Nov 29 05:57:09 crc 
kubenswrapper[4594]: I1129 05:57:09.555098 4594 scope.go:117] "RemoveContainer" containerID="589add5bafc50b7a89058d95a6bae5bc2c81dfd4d598b6a855d23afdc0c03ff1" Nov 29 05:57:09 crc kubenswrapper[4594]: I1129 05:57:09.608762 4594 scope.go:117] "RemoveContainer" containerID="0fbb4b9dd828c88480c72368e61ac2b3e7f03355bfdebce7ee3d3f05d453f2d1" Nov 29 05:57:09 crc kubenswrapper[4594]: I1129 05:57:09.647481 4594 scope.go:117] "RemoveContainer" containerID="276a56701b377bfceb696b0d8e95eaa28edb6a9e217c8abd01583090c35129c8" Nov 29 05:57:09 crc kubenswrapper[4594]: I1129 05:57:09.668414 4594 scope.go:117] "RemoveContainer" containerID="123ee440b039bf270d01cc85c87b7aeed13611fcf6da1c0ca3070508a75c318c" Nov 29 05:57:09 crc kubenswrapper[4594]: I1129 05:57:09.688421 4594 scope.go:117] "RemoveContainer" containerID="256afc9e6233f981bc8a70f9cb1c17c099179d185e2c26760218524be5c1591d" Nov 29 05:57:09 crc kubenswrapper[4594]: I1129 05:57:09.732699 4594 scope.go:117] "RemoveContainer" containerID="f27d8c0edefcd2ff26a6c5af4dd9f23d94369e78e08cd9af2e8f886dfd38bad7" Nov 29 05:57:11 crc kubenswrapper[4594]: I1129 05:57:11.586321 4594 generic.go:334] "Generic (PLEG): container finished" podID="12960501-d688-4937-b0b3-048b780072d3" containerID="bbd50dcfc4e6ab79caad91a17355ae464eba533c94c18052b825c34ec04bac8b" exitCode=0 Nov 29 05:57:11 crc kubenswrapper[4594]: I1129 05:57:11.586631 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-srf6x" event={"ID":"12960501-d688-4937-b0b3-048b780072d3","Type":"ContainerDied","Data":"bbd50dcfc4e6ab79caad91a17355ae464eba533c94c18052b825c34ec04bac8b"} Nov 29 05:57:12 crc kubenswrapper[4594]: I1129 05:57:12.949463 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-srf6x" Nov 29 05:57:13 crc kubenswrapper[4594]: I1129 05:57:13.089243 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f2dfr\" (UniqueName: \"kubernetes.io/projected/12960501-d688-4937-b0b3-048b780072d3-kube-api-access-f2dfr\") pod \"12960501-d688-4937-b0b3-048b780072d3\" (UID: \"12960501-d688-4937-b0b3-048b780072d3\") " Nov 29 05:57:13 crc kubenswrapper[4594]: I1129 05:57:13.089404 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/12960501-d688-4937-b0b3-048b780072d3-ssh-key\") pod \"12960501-d688-4937-b0b3-048b780072d3\" (UID: \"12960501-d688-4937-b0b3-048b780072d3\") " Nov 29 05:57:13 crc kubenswrapper[4594]: I1129 05:57:13.089552 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/12960501-d688-4937-b0b3-048b780072d3-inventory\") pod \"12960501-d688-4937-b0b3-048b780072d3\" (UID: \"12960501-d688-4937-b0b3-048b780072d3\") " Nov 29 05:57:13 crc kubenswrapper[4594]: I1129 05:57:13.095877 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12960501-d688-4937-b0b3-048b780072d3-kube-api-access-f2dfr" (OuterVolumeSpecName: "kube-api-access-f2dfr") pod "12960501-d688-4937-b0b3-048b780072d3" (UID: "12960501-d688-4937-b0b3-048b780072d3"). InnerVolumeSpecName "kube-api-access-f2dfr". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:57:13 crc kubenswrapper[4594]: I1129 05:57:13.118458 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12960501-d688-4937-b0b3-048b780072d3-inventory" (OuterVolumeSpecName: "inventory") pod "12960501-d688-4937-b0b3-048b780072d3" (UID: "12960501-d688-4937-b0b3-048b780072d3"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:57:13 crc kubenswrapper[4594]: I1129 05:57:13.122681 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12960501-d688-4937-b0b3-048b780072d3-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "12960501-d688-4937-b0b3-048b780072d3" (UID: "12960501-d688-4937-b0b3-048b780072d3"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:57:13 crc kubenswrapper[4594]: I1129 05:57:13.193645 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f2dfr\" (UniqueName: \"kubernetes.io/projected/12960501-d688-4937-b0b3-048b780072d3-kube-api-access-f2dfr\") on node \"crc\" DevicePath \"\"" Nov 29 05:57:13 crc kubenswrapper[4594]: I1129 05:57:13.193696 4594 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/12960501-d688-4937-b0b3-048b780072d3-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 29 05:57:13 crc kubenswrapper[4594]: I1129 05:57:13.193709 4594 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/12960501-d688-4937-b0b3-048b780072d3-inventory\") on node \"crc\" DevicePath \"\"" Nov 29 05:57:13 crc kubenswrapper[4594]: I1129 05:57:13.610165 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-srf6x" event={"ID":"12960501-d688-4937-b0b3-048b780072d3","Type":"ContainerDied","Data":"0fc0b947ab7891827e8fe3fa6ada3686a7424ceb9e2daecc76a51a8ba9e80eb7"} Nov 29 05:57:13 crc kubenswrapper[4594]: I1129 05:57:13.610220 4594 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0fc0b947ab7891827e8fe3fa6ada3686a7424ceb9e2daecc76a51a8ba9e80eb7" Nov 29 05:57:13 crc kubenswrapper[4594]: I1129 05:57:13.610269 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-srf6x" Nov 29 05:57:13 crc kubenswrapper[4594]: I1129 05:57:13.675086 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-6gzlk"] Nov 29 05:57:13 crc kubenswrapper[4594]: E1129 05:57:13.675531 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12960501-d688-4937-b0b3-048b780072d3" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Nov 29 05:57:13 crc kubenswrapper[4594]: I1129 05:57:13.675573 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="12960501-d688-4937-b0b3-048b780072d3" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Nov 29 05:57:13 crc kubenswrapper[4594]: I1129 05:57:13.675788 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="12960501-d688-4937-b0b3-048b780072d3" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Nov 29 05:57:13 crc kubenswrapper[4594]: I1129 05:57:13.676491 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-6gzlk" Nov 29 05:57:13 crc kubenswrapper[4594]: I1129 05:57:13.678196 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 29 05:57:13 crc kubenswrapper[4594]: I1129 05:57:13.678233 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 29 05:57:13 crc kubenswrapper[4594]: I1129 05:57:13.678520 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b89h2" Nov 29 05:57:13 crc kubenswrapper[4594]: I1129 05:57:13.685616 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 29 05:57:13 crc kubenswrapper[4594]: I1129 05:57:13.690599 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-6gzlk"] Nov 29 05:57:13 crc kubenswrapper[4594]: I1129 05:57:13.703934 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fad852f8-f0f3-4fa6-9196-58c24259a3a6-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-6gzlk\" (UID: \"fad852f8-f0f3-4fa6-9196-58c24259a3a6\") " pod="openstack/ssh-known-hosts-edpm-deployment-6gzlk" Nov 29 05:57:13 crc kubenswrapper[4594]: I1129 05:57:13.703981 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/fad852f8-f0f3-4fa6-9196-58c24259a3a6-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-6gzlk\" (UID: \"fad852f8-f0f3-4fa6-9196-58c24259a3a6\") " pod="openstack/ssh-known-hosts-edpm-deployment-6gzlk" Nov 29 05:57:13 crc kubenswrapper[4594]: I1129 05:57:13.704015 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-rt57f\" (UniqueName: \"kubernetes.io/projected/fad852f8-f0f3-4fa6-9196-58c24259a3a6-kube-api-access-rt57f\") pod \"ssh-known-hosts-edpm-deployment-6gzlk\" (UID: \"fad852f8-f0f3-4fa6-9196-58c24259a3a6\") " pod="openstack/ssh-known-hosts-edpm-deployment-6gzlk" Nov 29 05:57:13 crc kubenswrapper[4594]: I1129 05:57:13.806945 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fad852f8-f0f3-4fa6-9196-58c24259a3a6-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-6gzlk\" (UID: \"fad852f8-f0f3-4fa6-9196-58c24259a3a6\") " pod="openstack/ssh-known-hosts-edpm-deployment-6gzlk" Nov 29 05:57:13 crc kubenswrapper[4594]: I1129 05:57:13.807013 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/fad852f8-f0f3-4fa6-9196-58c24259a3a6-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-6gzlk\" (UID: \"fad852f8-f0f3-4fa6-9196-58c24259a3a6\") " pod="openstack/ssh-known-hosts-edpm-deployment-6gzlk" Nov 29 05:57:13 crc kubenswrapper[4594]: I1129 05:57:13.807060 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rt57f\" (UniqueName: \"kubernetes.io/projected/fad852f8-f0f3-4fa6-9196-58c24259a3a6-kube-api-access-rt57f\") pod \"ssh-known-hosts-edpm-deployment-6gzlk\" (UID: \"fad852f8-f0f3-4fa6-9196-58c24259a3a6\") " pod="openstack/ssh-known-hosts-edpm-deployment-6gzlk" Nov 29 05:57:13 crc kubenswrapper[4594]: I1129 05:57:13.810843 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/fad852f8-f0f3-4fa6-9196-58c24259a3a6-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-6gzlk\" (UID: \"fad852f8-f0f3-4fa6-9196-58c24259a3a6\") " pod="openstack/ssh-known-hosts-edpm-deployment-6gzlk" Nov 29 05:57:13 crc kubenswrapper[4594]: I1129 05:57:13.811400 4594 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fad852f8-f0f3-4fa6-9196-58c24259a3a6-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-6gzlk\" (UID: \"fad852f8-f0f3-4fa6-9196-58c24259a3a6\") " pod="openstack/ssh-known-hosts-edpm-deployment-6gzlk" Nov 29 05:57:13 crc kubenswrapper[4594]: I1129 05:57:13.822973 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rt57f\" (UniqueName: \"kubernetes.io/projected/fad852f8-f0f3-4fa6-9196-58c24259a3a6-kube-api-access-rt57f\") pod \"ssh-known-hosts-edpm-deployment-6gzlk\" (UID: \"fad852f8-f0f3-4fa6-9196-58c24259a3a6\") " pod="openstack/ssh-known-hosts-edpm-deployment-6gzlk" Nov 29 05:57:13 crc kubenswrapper[4594]: I1129 05:57:13.990615 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-6gzlk" Nov 29 05:57:14 crc kubenswrapper[4594]: I1129 05:57:14.473536 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-6gzlk"] Nov 29 05:57:14 crc kubenswrapper[4594]: I1129 05:57:14.622578 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-6gzlk" event={"ID":"fad852f8-f0f3-4fa6-9196-58c24259a3a6","Type":"ContainerStarted","Data":"dfc75a2b7fd67b9921a041493527fbc2ae7597cee7263073db19d6124633a936"} Nov 29 05:57:15 crc kubenswrapper[4594]: I1129 05:57:15.635134 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-6gzlk" event={"ID":"fad852f8-f0f3-4fa6-9196-58c24259a3a6","Type":"ContainerStarted","Data":"1887d0bbfdee3d248aea151090f41a46911301a3e107fefdd9c4103ddc172527"} Nov 29 05:57:15 crc kubenswrapper[4594]: I1129 05:57:15.656614 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-6gzlk" 
podStartSLOduration=1.8737857930000001 podStartE2EDuration="2.656591964s" podCreationTimestamp="2025-11-29 05:57:13 +0000 UTC" firstStartedPulling="2025-11-29 05:57:14.479723317 +0000 UTC m=+1758.720232536" lastFinishedPulling="2025-11-29 05:57:15.262529487 +0000 UTC m=+1759.503038707" observedRunningTime="2025-11-29 05:57:15.648774682 +0000 UTC m=+1759.889283902" watchObservedRunningTime="2025-11-29 05:57:15.656591964 +0000 UTC m=+1759.897101184" Nov 29 05:57:20 crc kubenswrapper[4594]: I1129 05:57:20.690920 4594 generic.go:334] "Generic (PLEG): container finished" podID="fad852f8-f0f3-4fa6-9196-58c24259a3a6" containerID="1887d0bbfdee3d248aea151090f41a46911301a3e107fefdd9c4103ddc172527" exitCode=0 Nov 29 05:57:20 crc kubenswrapper[4594]: I1129 05:57:20.691011 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-6gzlk" event={"ID":"fad852f8-f0f3-4fa6-9196-58c24259a3a6","Type":"ContainerDied","Data":"1887d0bbfdee3d248aea151090f41a46911301a3e107fefdd9c4103ddc172527"} Nov 29 05:57:21 crc kubenswrapper[4594]: I1129 05:57:21.083542 4594 scope.go:117] "RemoveContainer" containerID="112625db7a9e2602ec481072051add75f1091c3a06e6a9540b152d2e23c5722f" Nov 29 05:57:21 crc kubenswrapper[4594]: E1129 05:57:21.083883 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 05:57:22 crc kubenswrapper[4594]: I1129 05:57:22.039032 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-6gzlk" Nov 29 05:57:22 crc kubenswrapper[4594]: I1129 05:57:22.197966 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fad852f8-f0f3-4fa6-9196-58c24259a3a6-ssh-key-openstack-edpm-ipam\") pod \"fad852f8-f0f3-4fa6-9196-58c24259a3a6\" (UID: \"fad852f8-f0f3-4fa6-9196-58c24259a3a6\") " Nov 29 05:57:22 crc kubenswrapper[4594]: I1129 05:57:22.198562 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/fad852f8-f0f3-4fa6-9196-58c24259a3a6-inventory-0\") pod \"fad852f8-f0f3-4fa6-9196-58c24259a3a6\" (UID: \"fad852f8-f0f3-4fa6-9196-58c24259a3a6\") " Nov 29 05:57:22 crc kubenswrapper[4594]: I1129 05:57:22.198823 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rt57f\" (UniqueName: \"kubernetes.io/projected/fad852f8-f0f3-4fa6-9196-58c24259a3a6-kube-api-access-rt57f\") pod \"fad852f8-f0f3-4fa6-9196-58c24259a3a6\" (UID: \"fad852f8-f0f3-4fa6-9196-58c24259a3a6\") " Nov 29 05:57:22 crc kubenswrapper[4594]: I1129 05:57:22.205066 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fad852f8-f0f3-4fa6-9196-58c24259a3a6-kube-api-access-rt57f" (OuterVolumeSpecName: "kube-api-access-rt57f") pod "fad852f8-f0f3-4fa6-9196-58c24259a3a6" (UID: "fad852f8-f0f3-4fa6-9196-58c24259a3a6"). InnerVolumeSpecName "kube-api-access-rt57f". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:57:22 crc kubenswrapper[4594]: I1129 05:57:22.232562 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fad852f8-f0f3-4fa6-9196-58c24259a3a6-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "fad852f8-f0f3-4fa6-9196-58c24259a3a6" (UID: "fad852f8-f0f3-4fa6-9196-58c24259a3a6"). 
InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:57:22 crc kubenswrapper[4594]: I1129 05:57:22.234490 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fad852f8-f0f3-4fa6-9196-58c24259a3a6-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "fad852f8-f0f3-4fa6-9196-58c24259a3a6" (UID: "fad852f8-f0f3-4fa6-9196-58c24259a3a6"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:57:22 crc kubenswrapper[4594]: I1129 05:57:22.301872 4594 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fad852f8-f0f3-4fa6-9196-58c24259a3a6-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Nov 29 05:57:22 crc kubenswrapper[4594]: I1129 05:57:22.301905 4594 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/fad852f8-f0f3-4fa6-9196-58c24259a3a6-inventory-0\") on node \"crc\" DevicePath \"\"" Nov 29 05:57:22 crc kubenswrapper[4594]: I1129 05:57:22.301917 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rt57f\" (UniqueName: \"kubernetes.io/projected/fad852f8-f0f3-4fa6-9196-58c24259a3a6-kube-api-access-rt57f\") on node \"crc\" DevicePath \"\"" Nov 29 05:57:22 crc kubenswrapper[4594]: I1129 05:57:22.712416 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-6gzlk" event={"ID":"fad852f8-f0f3-4fa6-9196-58c24259a3a6","Type":"ContainerDied","Data":"dfc75a2b7fd67b9921a041493527fbc2ae7597cee7263073db19d6124633a936"} Nov 29 05:57:22 crc kubenswrapper[4594]: I1129 05:57:22.712708 4594 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dfc75a2b7fd67b9921a041493527fbc2ae7597cee7263073db19d6124633a936" Nov 29 05:57:22 crc kubenswrapper[4594]: I1129 05:57:22.712487 
4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-6gzlk" Nov 29 05:57:22 crc kubenswrapper[4594]: I1129 05:57:22.770196 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-kxwqm"] Nov 29 05:57:22 crc kubenswrapper[4594]: E1129 05:57:22.770821 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fad852f8-f0f3-4fa6-9196-58c24259a3a6" containerName="ssh-known-hosts-edpm-deployment" Nov 29 05:57:22 crc kubenswrapper[4594]: I1129 05:57:22.770856 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="fad852f8-f0f3-4fa6-9196-58c24259a3a6" containerName="ssh-known-hosts-edpm-deployment" Nov 29 05:57:22 crc kubenswrapper[4594]: I1129 05:57:22.771099 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="fad852f8-f0f3-4fa6-9196-58c24259a3a6" containerName="ssh-known-hosts-edpm-deployment" Nov 29 05:57:22 crc kubenswrapper[4594]: I1129 05:57:22.771977 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kxwqm" Nov 29 05:57:22 crc kubenswrapper[4594]: I1129 05:57:22.775665 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 29 05:57:22 crc kubenswrapper[4594]: I1129 05:57:22.775675 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 29 05:57:22 crc kubenswrapper[4594]: I1129 05:57:22.776197 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b89h2" Nov 29 05:57:22 crc kubenswrapper[4594]: I1129 05:57:22.776281 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 29 05:57:22 crc kubenswrapper[4594]: I1129 05:57:22.777589 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-kxwqm"] Nov 29 05:57:22 crc kubenswrapper[4594]: I1129 05:57:22.816916 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8n2z\" (UniqueName: \"kubernetes.io/projected/795787e2-6c07-4e76-98ae-38a13aae294a-kube-api-access-j8n2z\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-kxwqm\" (UID: \"795787e2-6c07-4e76-98ae-38a13aae294a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kxwqm" Nov 29 05:57:22 crc kubenswrapper[4594]: I1129 05:57:22.817014 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/795787e2-6c07-4e76-98ae-38a13aae294a-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-kxwqm\" (UID: \"795787e2-6c07-4e76-98ae-38a13aae294a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kxwqm" Nov 29 05:57:22 crc kubenswrapper[4594]: I1129 05:57:22.817156 4594 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/795787e2-6c07-4e76-98ae-38a13aae294a-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-kxwqm\" (UID: \"795787e2-6c07-4e76-98ae-38a13aae294a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kxwqm" Nov 29 05:57:22 crc kubenswrapper[4594]: I1129 05:57:22.919465 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8n2z\" (UniqueName: \"kubernetes.io/projected/795787e2-6c07-4e76-98ae-38a13aae294a-kube-api-access-j8n2z\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-kxwqm\" (UID: \"795787e2-6c07-4e76-98ae-38a13aae294a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kxwqm" Nov 29 05:57:22 crc kubenswrapper[4594]: I1129 05:57:22.919553 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/795787e2-6c07-4e76-98ae-38a13aae294a-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-kxwqm\" (UID: \"795787e2-6c07-4e76-98ae-38a13aae294a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kxwqm" Nov 29 05:57:22 crc kubenswrapper[4594]: I1129 05:57:22.919656 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/795787e2-6c07-4e76-98ae-38a13aae294a-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-kxwqm\" (UID: \"795787e2-6c07-4e76-98ae-38a13aae294a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kxwqm" Nov 29 05:57:22 crc kubenswrapper[4594]: I1129 05:57:22.923911 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/795787e2-6c07-4e76-98ae-38a13aae294a-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-kxwqm\" (UID: \"795787e2-6c07-4e76-98ae-38a13aae294a\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kxwqm" Nov 29 05:57:22 crc kubenswrapper[4594]: I1129 05:57:22.923940 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/795787e2-6c07-4e76-98ae-38a13aae294a-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-kxwqm\" (UID: \"795787e2-6c07-4e76-98ae-38a13aae294a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kxwqm" Nov 29 05:57:22 crc kubenswrapper[4594]: I1129 05:57:22.935710 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8n2z\" (UniqueName: \"kubernetes.io/projected/795787e2-6c07-4e76-98ae-38a13aae294a-kube-api-access-j8n2z\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-kxwqm\" (UID: \"795787e2-6c07-4e76-98ae-38a13aae294a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kxwqm" Nov 29 05:57:23 crc kubenswrapper[4594]: I1129 05:57:23.090757 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kxwqm" Nov 29 05:57:23 crc kubenswrapper[4594]: I1129 05:57:23.618486 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-kxwqm"] Nov 29 05:57:23 crc kubenswrapper[4594]: I1129 05:57:23.720720 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kxwqm" event={"ID":"795787e2-6c07-4e76-98ae-38a13aae294a","Type":"ContainerStarted","Data":"5488ffe1b9f54c9aa457dab0e6d01e6a636b8355e8ad0301b008a14195134e80"} Nov 29 05:57:24 crc kubenswrapper[4594]: I1129 05:57:24.743806 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kxwqm" event={"ID":"795787e2-6c07-4e76-98ae-38a13aae294a","Type":"ContainerStarted","Data":"eded6b93807cb0a2896f650d348c832d79f55cb5e0dc6acffa6a69ed174edea2"} Nov 29 05:57:24 crc kubenswrapper[4594]: I1129 05:57:24.768714 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kxwqm" podStartSLOduration=2.258622509 podStartE2EDuration="2.76869326s" podCreationTimestamp="2025-11-29 05:57:22 +0000 UTC" firstStartedPulling="2025-11-29 05:57:23.63279142 +0000 UTC m=+1767.873300641" lastFinishedPulling="2025-11-29 05:57:24.142862171 +0000 UTC m=+1768.383371392" observedRunningTime="2025-11-29 05:57:24.759016811 +0000 UTC m=+1768.999526031" watchObservedRunningTime="2025-11-29 05:57:24.76869326 +0000 UTC m=+1769.009202480" Nov 29 05:57:30 crc kubenswrapper[4594]: I1129 05:57:30.809591 4594 generic.go:334] "Generic (PLEG): container finished" podID="795787e2-6c07-4e76-98ae-38a13aae294a" containerID="eded6b93807cb0a2896f650d348c832d79f55cb5e0dc6acffa6a69ed174edea2" exitCode=0 Nov 29 05:57:30 crc kubenswrapper[4594]: I1129 05:57:30.809813 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kxwqm" event={"ID":"795787e2-6c07-4e76-98ae-38a13aae294a","Type":"ContainerDied","Data":"eded6b93807cb0a2896f650d348c832d79f55cb5e0dc6acffa6a69ed174edea2"} Nov 29 05:57:32 crc kubenswrapper[4594]: I1129 05:57:32.168841 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kxwqm" Nov 29 05:57:32 crc kubenswrapper[4594]: I1129 05:57:32.251247 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j8n2z\" (UniqueName: \"kubernetes.io/projected/795787e2-6c07-4e76-98ae-38a13aae294a-kube-api-access-j8n2z\") pod \"795787e2-6c07-4e76-98ae-38a13aae294a\" (UID: \"795787e2-6c07-4e76-98ae-38a13aae294a\") " Nov 29 05:57:32 crc kubenswrapper[4594]: I1129 05:57:32.251351 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/795787e2-6c07-4e76-98ae-38a13aae294a-ssh-key\") pod \"795787e2-6c07-4e76-98ae-38a13aae294a\" (UID: \"795787e2-6c07-4e76-98ae-38a13aae294a\") " Nov 29 05:57:32 crc kubenswrapper[4594]: I1129 05:57:32.251429 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/795787e2-6c07-4e76-98ae-38a13aae294a-inventory\") pod \"795787e2-6c07-4e76-98ae-38a13aae294a\" (UID: \"795787e2-6c07-4e76-98ae-38a13aae294a\") " Nov 29 05:57:32 crc kubenswrapper[4594]: I1129 05:57:32.256563 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/795787e2-6c07-4e76-98ae-38a13aae294a-kube-api-access-j8n2z" (OuterVolumeSpecName: "kube-api-access-j8n2z") pod "795787e2-6c07-4e76-98ae-38a13aae294a" (UID: "795787e2-6c07-4e76-98ae-38a13aae294a"). InnerVolumeSpecName "kube-api-access-j8n2z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:57:32 crc kubenswrapper[4594]: I1129 05:57:32.274473 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/795787e2-6c07-4e76-98ae-38a13aae294a-inventory" (OuterVolumeSpecName: "inventory") pod "795787e2-6c07-4e76-98ae-38a13aae294a" (UID: "795787e2-6c07-4e76-98ae-38a13aae294a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:57:32 crc kubenswrapper[4594]: I1129 05:57:32.275199 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/795787e2-6c07-4e76-98ae-38a13aae294a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "795787e2-6c07-4e76-98ae-38a13aae294a" (UID: "795787e2-6c07-4e76-98ae-38a13aae294a"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:57:32 crc kubenswrapper[4594]: I1129 05:57:32.354384 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j8n2z\" (UniqueName: \"kubernetes.io/projected/795787e2-6c07-4e76-98ae-38a13aae294a-kube-api-access-j8n2z\") on node \"crc\" DevicePath \"\"" Nov 29 05:57:32 crc kubenswrapper[4594]: I1129 05:57:32.354417 4594 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/795787e2-6c07-4e76-98ae-38a13aae294a-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 29 05:57:32 crc kubenswrapper[4594]: I1129 05:57:32.354428 4594 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/795787e2-6c07-4e76-98ae-38a13aae294a-inventory\") on node \"crc\" DevicePath \"\"" Nov 29 05:57:32 crc kubenswrapper[4594]: I1129 05:57:32.834275 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kxwqm" 
event={"ID":"795787e2-6c07-4e76-98ae-38a13aae294a","Type":"ContainerDied","Data":"5488ffe1b9f54c9aa457dab0e6d01e6a636b8355e8ad0301b008a14195134e80"} Nov 29 05:57:32 crc kubenswrapper[4594]: I1129 05:57:32.834326 4594 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5488ffe1b9f54c9aa457dab0e6d01e6a636b8355e8ad0301b008a14195134e80" Nov 29 05:57:32 crc kubenswrapper[4594]: I1129 05:57:32.834327 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kxwqm" Nov 29 05:57:32 crc kubenswrapper[4594]: I1129 05:57:32.895150 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dknvp"] Nov 29 05:57:32 crc kubenswrapper[4594]: E1129 05:57:32.895695 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="795787e2-6c07-4e76-98ae-38a13aae294a" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Nov 29 05:57:32 crc kubenswrapper[4594]: I1129 05:57:32.895717 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="795787e2-6c07-4e76-98ae-38a13aae294a" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Nov 29 05:57:32 crc kubenswrapper[4594]: I1129 05:57:32.895968 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="795787e2-6c07-4e76-98ae-38a13aae294a" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Nov 29 05:57:32 crc kubenswrapper[4594]: I1129 05:57:32.896736 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dknvp" Nov 29 05:57:32 crc kubenswrapper[4594]: I1129 05:57:32.898483 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 29 05:57:32 crc kubenswrapper[4594]: I1129 05:57:32.898922 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b89h2" Nov 29 05:57:32 crc kubenswrapper[4594]: I1129 05:57:32.898919 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 29 05:57:32 crc kubenswrapper[4594]: I1129 05:57:32.899912 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 29 05:57:32 crc kubenswrapper[4594]: I1129 05:57:32.904620 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dknvp"] Nov 29 05:57:32 crc kubenswrapper[4594]: I1129 05:57:32.965083 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vwh6\" (UniqueName: \"kubernetes.io/projected/41f4ce3b-5711-4b51-a22d-5fcbda6153ac-kube-api-access-4vwh6\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-dknvp\" (UID: \"41f4ce3b-5711-4b51-a22d-5fcbda6153ac\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dknvp" Nov 29 05:57:32 crc kubenswrapper[4594]: I1129 05:57:32.965206 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/41f4ce3b-5711-4b51-a22d-5fcbda6153ac-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-dknvp\" (UID: \"41f4ce3b-5711-4b51-a22d-5fcbda6153ac\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dknvp" Nov 29 05:57:32 crc kubenswrapper[4594]: I1129 05:57:32.965331 4594 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/41f4ce3b-5711-4b51-a22d-5fcbda6153ac-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-dknvp\" (UID: \"41f4ce3b-5711-4b51-a22d-5fcbda6153ac\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dknvp" Nov 29 05:57:33 crc kubenswrapper[4594]: I1129 05:57:33.067029 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/41f4ce3b-5711-4b51-a22d-5fcbda6153ac-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-dknvp\" (UID: \"41f4ce3b-5711-4b51-a22d-5fcbda6153ac\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dknvp" Nov 29 05:57:33 crc kubenswrapper[4594]: I1129 05:57:33.067115 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/41f4ce3b-5711-4b51-a22d-5fcbda6153ac-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-dknvp\" (UID: \"41f4ce3b-5711-4b51-a22d-5fcbda6153ac\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dknvp" Nov 29 05:57:33 crc kubenswrapper[4594]: I1129 05:57:33.067283 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vwh6\" (UniqueName: \"kubernetes.io/projected/41f4ce3b-5711-4b51-a22d-5fcbda6153ac-kube-api-access-4vwh6\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-dknvp\" (UID: \"41f4ce3b-5711-4b51-a22d-5fcbda6153ac\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dknvp" Nov 29 05:57:33 crc kubenswrapper[4594]: I1129 05:57:33.072221 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/41f4ce3b-5711-4b51-a22d-5fcbda6153ac-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-dknvp\" (UID: \"41f4ce3b-5711-4b51-a22d-5fcbda6153ac\") " 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dknvp" Nov 29 05:57:33 crc kubenswrapper[4594]: I1129 05:57:33.073140 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/41f4ce3b-5711-4b51-a22d-5fcbda6153ac-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-dknvp\" (UID: \"41f4ce3b-5711-4b51-a22d-5fcbda6153ac\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dknvp" Nov 29 05:57:33 crc kubenswrapper[4594]: I1129 05:57:33.081245 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vwh6\" (UniqueName: \"kubernetes.io/projected/41f4ce3b-5711-4b51-a22d-5fcbda6153ac-kube-api-access-4vwh6\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-dknvp\" (UID: \"41f4ce3b-5711-4b51-a22d-5fcbda6153ac\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dknvp" Nov 29 05:57:33 crc kubenswrapper[4594]: I1129 05:57:33.237514 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dknvp" Nov 29 05:57:33 crc kubenswrapper[4594]: I1129 05:57:33.716029 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dknvp"] Nov 29 05:57:33 crc kubenswrapper[4594]: I1129 05:57:33.845428 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dknvp" event={"ID":"41f4ce3b-5711-4b51-a22d-5fcbda6153ac","Type":"ContainerStarted","Data":"423d9db8c2226f9d3ecac0ed8c8ccd1a59ef3d7401786e70bceda669b3329ded"} Nov 29 05:57:34 crc kubenswrapper[4594]: I1129 05:57:34.084506 4594 scope.go:117] "RemoveContainer" containerID="112625db7a9e2602ec481072051add75f1091c3a06e6a9540b152d2e23c5722f" Nov 29 05:57:34 crc kubenswrapper[4594]: E1129 05:57:34.084799 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 05:57:34 crc kubenswrapper[4594]: I1129 05:57:34.857474 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dknvp" event={"ID":"41f4ce3b-5711-4b51-a22d-5fcbda6153ac","Type":"ContainerStarted","Data":"b228ce95757ed8fa9ff46574d7883952e17a051be548a2f561755a89ecd9651b"} Nov 29 05:57:34 crc kubenswrapper[4594]: I1129 05:57:34.873480 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dknvp" podStartSLOduration=2.358515132 podStartE2EDuration="2.87344971s" podCreationTimestamp="2025-11-29 05:57:32 +0000 UTC" firstStartedPulling="2025-11-29 
05:57:33.719764726 +0000 UTC m=+1777.960273946" lastFinishedPulling="2025-11-29 05:57:34.234699304 +0000 UTC m=+1778.475208524" observedRunningTime="2025-11-29 05:57:34.870619086 +0000 UTC m=+1779.111128306" watchObservedRunningTime="2025-11-29 05:57:34.87344971 +0000 UTC m=+1779.113958931" Nov 29 05:57:41 crc kubenswrapper[4594]: I1129 05:57:41.926983 4594 generic.go:334] "Generic (PLEG): container finished" podID="41f4ce3b-5711-4b51-a22d-5fcbda6153ac" containerID="b228ce95757ed8fa9ff46574d7883952e17a051be548a2f561755a89ecd9651b" exitCode=0 Nov 29 05:57:41 crc kubenswrapper[4594]: I1129 05:57:41.927080 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dknvp" event={"ID":"41f4ce3b-5711-4b51-a22d-5fcbda6153ac","Type":"ContainerDied","Data":"b228ce95757ed8fa9ff46574d7883952e17a051be548a2f561755a89ecd9651b"} Nov 29 05:57:43 crc kubenswrapper[4594]: I1129 05:57:43.283856 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dknvp" Nov 29 05:57:43 crc kubenswrapper[4594]: I1129 05:57:43.400736 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4vwh6\" (UniqueName: \"kubernetes.io/projected/41f4ce3b-5711-4b51-a22d-5fcbda6153ac-kube-api-access-4vwh6\") pod \"41f4ce3b-5711-4b51-a22d-5fcbda6153ac\" (UID: \"41f4ce3b-5711-4b51-a22d-5fcbda6153ac\") " Nov 29 05:57:43 crc kubenswrapper[4594]: I1129 05:57:43.401051 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/41f4ce3b-5711-4b51-a22d-5fcbda6153ac-inventory\") pod \"41f4ce3b-5711-4b51-a22d-5fcbda6153ac\" (UID: \"41f4ce3b-5711-4b51-a22d-5fcbda6153ac\") " Nov 29 05:57:43 crc kubenswrapper[4594]: I1129 05:57:43.401330 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/41f4ce3b-5711-4b51-a22d-5fcbda6153ac-ssh-key\") pod \"41f4ce3b-5711-4b51-a22d-5fcbda6153ac\" (UID: \"41f4ce3b-5711-4b51-a22d-5fcbda6153ac\") " Nov 29 05:57:43 crc kubenswrapper[4594]: I1129 05:57:43.407123 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41f4ce3b-5711-4b51-a22d-5fcbda6153ac-kube-api-access-4vwh6" (OuterVolumeSpecName: "kube-api-access-4vwh6") pod "41f4ce3b-5711-4b51-a22d-5fcbda6153ac" (UID: "41f4ce3b-5711-4b51-a22d-5fcbda6153ac"). InnerVolumeSpecName "kube-api-access-4vwh6". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:57:43 crc kubenswrapper[4594]: I1129 05:57:43.426179 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41f4ce3b-5711-4b51-a22d-5fcbda6153ac-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "41f4ce3b-5711-4b51-a22d-5fcbda6153ac" (UID: "41f4ce3b-5711-4b51-a22d-5fcbda6153ac"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:57:43 crc kubenswrapper[4594]: I1129 05:57:43.427794 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41f4ce3b-5711-4b51-a22d-5fcbda6153ac-inventory" (OuterVolumeSpecName: "inventory") pod "41f4ce3b-5711-4b51-a22d-5fcbda6153ac" (UID: "41f4ce3b-5711-4b51-a22d-5fcbda6153ac"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:57:43 crc kubenswrapper[4594]: I1129 05:57:43.504571 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4vwh6\" (UniqueName: \"kubernetes.io/projected/41f4ce3b-5711-4b51-a22d-5fcbda6153ac-kube-api-access-4vwh6\") on node \"crc\" DevicePath \"\"" Nov 29 05:57:43 crc kubenswrapper[4594]: I1129 05:57:43.504596 4594 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/41f4ce3b-5711-4b51-a22d-5fcbda6153ac-inventory\") on node \"crc\" DevicePath \"\"" Nov 29 05:57:43 crc kubenswrapper[4594]: I1129 05:57:43.504607 4594 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/41f4ce3b-5711-4b51-a22d-5fcbda6153ac-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 29 05:57:43 crc kubenswrapper[4594]: I1129 05:57:43.946587 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dknvp" event={"ID":"41f4ce3b-5711-4b51-a22d-5fcbda6153ac","Type":"ContainerDied","Data":"423d9db8c2226f9d3ecac0ed8c8ccd1a59ef3d7401786e70bceda669b3329ded"} Nov 29 05:57:43 crc kubenswrapper[4594]: I1129 05:57:43.946635 4594 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="423d9db8c2226f9d3ecac0ed8c8ccd1a59ef3d7401786e70bceda669b3329ded" Nov 29 05:57:43 crc kubenswrapper[4594]: I1129 05:57:43.946675 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dknvp" Nov 29 05:57:44 crc kubenswrapper[4594]: I1129 05:57:44.018003 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tclxx"] Nov 29 05:57:44 crc kubenswrapper[4594]: E1129 05:57:44.018502 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41f4ce3b-5711-4b51-a22d-5fcbda6153ac" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Nov 29 05:57:44 crc kubenswrapper[4594]: I1129 05:57:44.018523 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="41f4ce3b-5711-4b51-a22d-5fcbda6153ac" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Nov 29 05:57:44 crc kubenswrapper[4594]: I1129 05:57:44.018747 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="41f4ce3b-5711-4b51-a22d-5fcbda6153ac" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Nov 29 05:57:44 crc kubenswrapper[4594]: I1129 05:57:44.019484 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tclxx" Nov 29 05:57:44 crc kubenswrapper[4594]: I1129 05:57:44.022488 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Nov 29 05:57:44 crc kubenswrapper[4594]: I1129 05:57:44.022633 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 29 05:57:44 crc kubenswrapper[4594]: I1129 05:57:44.022652 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b89h2" Nov 29 05:57:44 crc kubenswrapper[4594]: I1129 05:57:44.023280 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 29 05:57:44 crc kubenswrapper[4594]: I1129 05:57:44.023450 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Nov 29 05:57:44 crc kubenswrapper[4594]: I1129 05:57:44.023565 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Nov 29 05:57:44 crc kubenswrapper[4594]: I1129 05:57:44.024528 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Nov 29 05:57:44 crc kubenswrapper[4594]: I1129 05:57:44.032073 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 29 05:57:44 crc kubenswrapper[4594]: I1129 05:57:44.032651 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tclxx"] Nov 29 05:57:44 crc kubenswrapper[4594]: I1129 05:57:44.051439 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-6vm99"] Nov 29 05:57:44 crc kubenswrapper[4594]: I1129 05:57:44.057636 4594 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-6vm99"] Nov 29 05:57:44 crc kubenswrapper[4594]: I1129 05:57:44.098559 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a289c272-abe9-4098-951e-d6a10ce647ab" path="/var/lib/kubelet/pods/a289c272-abe9-4098-951e-d6a10ce647ab/volumes" Nov 29 05:57:44 crc kubenswrapper[4594]: I1129 05:57:44.115354 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4spxm\" (UniqueName: \"kubernetes.io/projected/9dafdf20-2acb-46ad-adb3-d1421087ca5e-kube-api-access-4spxm\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tclxx\" (UID: \"9dafdf20-2acb-46ad-adb3-d1421087ca5e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tclxx" Nov 29 05:57:44 crc kubenswrapper[4594]: I1129 05:57:44.115434 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dafdf20-2acb-46ad-adb3-d1421087ca5e-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tclxx\" (UID: \"9dafdf20-2acb-46ad-adb3-d1421087ca5e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tclxx" Nov 29 05:57:44 crc kubenswrapper[4594]: I1129 05:57:44.115474 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dafdf20-2acb-46ad-adb3-d1421087ca5e-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tclxx\" (UID: \"9dafdf20-2acb-46ad-adb3-d1421087ca5e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tclxx" Nov 29 05:57:44 crc kubenswrapper[4594]: I1129 05:57:44.115573 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/9dafdf20-2acb-46ad-adb3-d1421087ca5e-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tclxx\" (UID: \"9dafdf20-2acb-46ad-adb3-d1421087ca5e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tclxx" Nov 29 05:57:44 crc kubenswrapper[4594]: I1129 05:57:44.115650 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9dafdf20-2acb-46ad-adb3-d1421087ca5e-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tclxx\" (UID: \"9dafdf20-2acb-46ad-adb3-d1421087ca5e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tclxx" Nov 29 05:57:44 crc kubenswrapper[4594]: I1129 05:57:44.115797 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dafdf20-2acb-46ad-adb3-d1421087ca5e-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tclxx\" (UID: \"9dafdf20-2acb-46ad-adb3-d1421087ca5e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tclxx" Nov 29 05:57:44 crc kubenswrapper[4594]: I1129 05:57:44.115853 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9dafdf20-2acb-46ad-adb3-d1421087ca5e-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tclxx\" (UID: \"9dafdf20-2acb-46ad-adb3-d1421087ca5e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tclxx" Nov 29 05:57:44 crc kubenswrapper[4594]: I1129 05:57:44.115916 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9dafdf20-2acb-46ad-adb3-d1421087ca5e-openstack-edpm-ipam-ovn-default-certs-0\") 
pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tclxx\" (UID: \"9dafdf20-2acb-46ad-adb3-d1421087ca5e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tclxx" Nov 29 05:57:44 crc kubenswrapper[4594]: I1129 05:57:44.115995 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dafdf20-2acb-46ad-adb3-d1421087ca5e-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tclxx\" (UID: \"9dafdf20-2acb-46ad-adb3-d1421087ca5e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tclxx" Nov 29 05:57:44 crc kubenswrapper[4594]: I1129 05:57:44.116097 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9dafdf20-2acb-46ad-adb3-d1421087ca5e-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tclxx\" (UID: \"9dafdf20-2acb-46ad-adb3-d1421087ca5e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tclxx" Nov 29 05:57:44 crc kubenswrapper[4594]: I1129 05:57:44.116184 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dafdf20-2acb-46ad-adb3-d1421087ca5e-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tclxx\" (UID: \"9dafdf20-2acb-46ad-adb3-d1421087ca5e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tclxx" Nov 29 05:57:44 crc kubenswrapper[4594]: I1129 05:57:44.116216 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/9dafdf20-2acb-46ad-adb3-d1421087ca5e-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tclxx\" (UID: \"9dafdf20-2acb-46ad-adb3-d1421087ca5e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tclxx" Nov 29 05:57:44 crc kubenswrapper[4594]: I1129 05:57:44.116280 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9dafdf20-2acb-46ad-adb3-d1421087ca5e-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tclxx\" (UID: \"9dafdf20-2acb-46ad-adb3-d1421087ca5e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tclxx" Nov 29 05:57:44 crc kubenswrapper[4594]: I1129 05:57:44.116442 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dafdf20-2acb-46ad-adb3-d1421087ca5e-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tclxx\" (UID: \"9dafdf20-2acb-46ad-adb3-d1421087ca5e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tclxx" Nov 29 05:57:44 crc kubenswrapper[4594]: I1129 05:57:44.218066 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dafdf20-2acb-46ad-adb3-d1421087ca5e-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tclxx\" (UID: \"9dafdf20-2acb-46ad-adb3-d1421087ca5e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tclxx" Nov 29 05:57:44 crc kubenswrapper[4594]: I1129 05:57:44.218116 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9dafdf20-2acb-46ad-adb3-d1421087ca5e-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tclxx\" (UID: \"9dafdf20-2acb-46ad-adb3-d1421087ca5e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tclxx" Nov 29 05:57:44 crc kubenswrapper[4594]: I1129 05:57:44.218179 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dafdf20-2acb-46ad-adb3-d1421087ca5e-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tclxx\" (UID: \"9dafdf20-2acb-46ad-adb3-d1421087ca5e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tclxx" Nov 29 05:57:44 crc kubenswrapper[4594]: I1129 05:57:44.218210 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9dafdf20-2acb-46ad-adb3-d1421087ca5e-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tclxx\" (UID: \"9dafdf20-2acb-46ad-adb3-d1421087ca5e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tclxx" Nov 29 05:57:44 crc kubenswrapper[4594]: I1129 05:57:44.218275 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dafdf20-2acb-46ad-adb3-d1421087ca5e-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tclxx\" (UID: \"9dafdf20-2acb-46ad-adb3-d1421087ca5e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tclxx" Nov 29 05:57:44 crc kubenswrapper[4594]: I1129 05:57:44.218298 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9dafdf20-2acb-46ad-adb3-d1421087ca5e-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tclxx\" (UID: 
\"9dafdf20-2acb-46ad-adb3-d1421087ca5e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tclxx" Nov 29 05:57:44 crc kubenswrapper[4594]: I1129 05:57:44.218328 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9dafdf20-2acb-46ad-adb3-d1421087ca5e-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tclxx\" (UID: \"9dafdf20-2acb-46ad-adb3-d1421087ca5e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tclxx" Nov 29 05:57:44 crc kubenswrapper[4594]: I1129 05:57:44.218363 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dafdf20-2acb-46ad-adb3-d1421087ca5e-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tclxx\" (UID: \"9dafdf20-2acb-46ad-adb3-d1421087ca5e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tclxx" Nov 29 05:57:44 crc kubenswrapper[4594]: I1129 05:57:44.218403 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9dafdf20-2acb-46ad-adb3-d1421087ca5e-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tclxx\" (UID: \"9dafdf20-2acb-46ad-adb3-d1421087ca5e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tclxx" Nov 29 05:57:44 crc kubenswrapper[4594]: I1129 05:57:44.218448 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dafdf20-2acb-46ad-adb3-d1421087ca5e-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tclxx\" (UID: \"9dafdf20-2acb-46ad-adb3-d1421087ca5e\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tclxx" Nov 29 05:57:44 crc kubenswrapper[4594]: I1129 05:57:44.218472 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9dafdf20-2acb-46ad-adb3-d1421087ca5e-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tclxx\" (UID: \"9dafdf20-2acb-46ad-adb3-d1421087ca5e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tclxx" Nov 29 05:57:44 crc kubenswrapper[4594]: I1129 05:57:44.218501 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9dafdf20-2acb-46ad-adb3-d1421087ca5e-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tclxx\" (UID: \"9dafdf20-2acb-46ad-adb3-d1421087ca5e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tclxx" Nov 29 05:57:44 crc kubenswrapper[4594]: I1129 05:57:44.218545 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dafdf20-2acb-46ad-adb3-d1421087ca5e-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tclxx\" (UID: \"9dafdf20-2acb-46ad-adb3-d1421087ca5e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tclxx" Nov 29 05:57:44 crc kubenswrapper[4594]: I1129 05:57:44.218582 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4spxm\" (UniqueName: \"kubernetes.io/projected/9dafdf20-2acb-46ad-adb3-d1421087ca5e-kube-api-access-4spxm\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tclxx\" (UID: \"9dafdf20-2acb-46ad-adb3-d1421087ca5e\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tclxx" Nov 29 05:57:44 crc kubenswrapper[4594]: I1129 05:57:44.223559 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9dafdf20-2acb-46ad-adb3-d1421087ca5e-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tclxx\" (UID: \"9dafdf20-2acb-46ad-adb3-d1421087ca5e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tclxx" Nov 29 05:57:44 crc kubenswrapper[4594]: I1129 05:57:44.224478 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9dafdf20-2acb-46ad-adb3-d1421087ca5e-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tclxx\" (UID: \"9dafdf20-2acb-46ad-adb3-d1421087ca5e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tclxx" Nov 29 05:57:44 crc kubenswrapper[4594]: I1129 05:57:44.224675 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dafdf20-2acb-46ad-adb3-d1421087ca5e-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tclxx\" (UID: \"9dafdf20-2acb-46ad-adb3-d1421087ca5e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tclxx" Nov 29 05:57:44 crc kubenswrapper[4594]: I1129 05:57:44.225544 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dafdf20-2acb-46ad-adb3-d1421087ca5e-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tclxx\" (UID: \"9dafdf20-2acb-46ad-adb3-d1421087ca5e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tclxx" Nov 29 05:57:44 crc kubenswrapper[4594]: I1129 05:57:44.225755 4594 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dafdf20-2acb-46ad-adb3-d1421087ca5e-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tclxx\" (UID: \"9dafdf20-2acb-46ad-adb3-d1421087ca5e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tclxx" Nov 29 05:57:44 crc kubenswrapper[4594]: I1129 05:57:44.225924 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9dafdf20-2acb-46ad-adb3-d1421087ca5e-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tclxx\" (UID: \"9dafdf20-2acb-46ad-adb3-d1421087ca5e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tclxx" Nov 29 05:57:44 crc kubenswrapper[4594]: I1129 05:57:44.225941 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9dafdf20-2acb-46ad-adb3-d1421087ca5e-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tclxx\" (UID: \"9dafdf20-2acb-46ad-adb3-d1421087ca5e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tclxx" Nov 29 05:57:44 crc kubenswrapper[4594]: I1129 05:57:44.226851 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9dafdf20-2acb-46ad-adb3-d1421087ca5e-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tclxx\" (UID: \"9dafdf20-2acb-46ad-adb3-d1421087ca5e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tclxx" Nov 29 05:57:44 crc kubenswrapper[4594]: I1129 05:57:44.227385 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/9dafdf20-2acb-46ad-adb3-d1421087ca5e-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tclxx\" (UID: \"9dafdf20-2acb-46ad-adb3-d1421087ca5e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tclxx" Nov 29 05:57:44 crc kubenswrapper[4594]: I1129 05:57:44.227555 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dafdf20-2acb-46ad-adb3-d1421087ca5e-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tclxx\" (UID: \"9dafdf20-2acb-46ad-adb3-d1421087ca5e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tclxx" Nov 29 05:57:44 crc kubenswrapper[4594]: I1129 05:57:44.228477 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dafdf20-2acb-46ad-adb3-d1421087ca5e-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tclxx\" (UID: \"9dafdf20-2acb-46ad-adb3-d1421087ca5e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tclxx" Nov 29 05:57:44 crc kubenswrapper[4594]: I1129 05:57:44.228835 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dafdf20-2acb-46ad-adb3-d1421087ca5e-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tclxx\" (UID: \"9dafdf20-2acb-46ad-adb3-d1421087ca5e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tclxx" Nov 29 05:57:44 crc kubenswrapper[4594]: I1129 05:57:44.229086 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dafdf20-2acb-46ad-adb3-d1421087ca5e-libvirt-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-tclxx\" (UID: \"9dafdf20-2acb-46ad-adb3-d1421087ca5e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tclxx" Nov 29 05:57:44 crc kubenswrapper[4594]: I1129 05:57:44.233976 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4spxm\" (UniqueName: \"kubernetes.io/projected/9dafdf20-2acb-46ad-adb3-d1421087ca5e-kube-api-access-4spxm\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tclxx\" (UID: \"9dafdf20-2acb-46ad-adb3-d1421087ca5e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tclxx" Nov 29 05:57:44 crc kubenswrapper[4594]: I1129 05:57:44.334462 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tclxx" Nov 29 05:57:44 crc kubenswrapper[4594]: I1129 05:57:44.803695 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tclxx"] Nov 29 05:57:44 crc kubenswrapper[4594]: I1129 05:57:44.809074 4594 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 29 05:57:44 crc kubenswrapper[4594]: I1129 05:57:44.957508 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tclxx" event={"ID":"9dafdf20-2acb-46ad-adb3-d1421087ca5e","Type":"ContainerStarted","Data":"d2361c4c316c4b215455e27ffaf69e9d5832f215720ae360e50bd1699e4a4e64"} Nov 29 05:57:46 crc kubenswrapper[4594]: I1129 05:57:46.004634 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tclxx" event={"ID":"9dafdf20-2acb-46ad-adb3-d1421087ca5e","Type":"ContainerStarted","Data":"df3b91a3d5309630778a4411976cb30706e1b0b43e63eabcf3431aed426781ec"} Nov 29 05:57:46 crc kubenswrapper[4594]: I1129 05:57:46.034316 4594 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tclxx" podStartSLOduration=2.498459041 podStartE2EDuration="3.034298958s" podCreationTimestamp="2025-11-29 05:57:43 +0000 UTC" firstStartedPulling="2025-11-29 05:57:44.808855374 +0000 UTC m=+1789.049364593" lastFinishedPulling="2025-11-29 05:57:45.344695289 +0000 UTC m=+1789.585204510" observedRunningTime="2025-11-29 05:57:46.024717699 +0000 UTC m=+1790.265226919" watchObservedRunningTime="2025-11-29 05:57:46.034298958 +0000 UTC m=+1790.274808178" Nov 29 05:57:48 crc kubenswrapper[4594]: I1129 05:57:48.083802 4594 scope.go:117] "RemoveContainer" containerID="112625db7a9e2602ec481072051add75f1091c3a06e6a9540b152d2e23c5722f" Nov 29 05:57:48 crc kubenswrapper[4594]: E1129 05:57:48.084708 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 05:58:01 crc kubenswrapper[4594]: I1129 05:58:01.083894 4594 scope.go:117] "RemoveContainer" containerID="112625db7a9e2602ec481072051add75f1091c3a06e6a9540b152d2e23c5722f" Nov 29 05:58:01 crc kubenswrapper[4594]: E1129 05:58:01.085073 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 05:58:09 crc kubenswrapper[4594]: I1129 05:58:09.881202 4594 scope.go:117] "RemoveContainer" 
containerID="3b020809d6a5dc264b4813f8f63cbb564324adb8cb346eaff676dea96997bb5a" Nov 29 05:58:13 crc kubenswrapper[4594]: I1129 05:58:13.083984 4594 scope.go:117] "RemoveContainer" containerID="112625db7a9e2602ec481072051add75f1091c3a06e6a9540b152d2e23c5722f" Nov 29 05:58:13 crc kubenswrapper[4594]: E1129 05:58:13.084926 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 05:58:13 crc kubenswrapper[4594]: I1129 05:58:13.284559 4594 generic.go:334] "Generic (PLEG): container finished" podID="9dafdf20-2acb-46ad-adb3-d1421087ca5e" containerID="df3b91a3d5309630778a4411976cb30706e1b0b43e63eabcf3431aed426781ec" exitCode=0 Nov 29 05:58:13 crc kubenswrapper[4594]: I1129 05:58:13.284628 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tclxx" event={"ID":"9dafdf20-2acb-46ad-adb3-d1421087ca5e","Type":"ContainerDied","Data":"df3b91a3d5309630778a4411976cb30706e1b0b43e63eabcf3431aed426781ec"} Nov 29 05:58:14 crc kubenswrapper[4594]: I1129 05:58:14.640644 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tclxx" Nov 29 05:58:14 crc kubenswrapper[4594]: I1129 05:58:14.793368 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9dafdf20-2acb-46ad-adb3-d1421087ca5e-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"9dafdf20-2acb-46ad-adb3-d1421087ca5e\" (UID: \"9dafdf20-2acb-46ad-adb3-d1421087ca5e\") " Nov 29 05:58:14 crc kubenswrapper[4594]: I1129 05:58:14.793544 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4spxm\" (UniqueName: \"kubernetes.io/projected/9dafdf20-2acb-46ad-adb3-d1421087ca5e-kube-api-access-4spxm\") pod \"9dafdf20-2acb-46ad-adb3-d1421087ca5e\" (UID: \"9dafdf20-2acb-46ad-adb3-d1421087ca5e\") " Nov 29 05:58:14 crc kubenswrapper[4594]: I1129 05:58:14.793572 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dafdf20-2acb-46ad-adb3-d1421087ca5e-nova-combined-ca-bundle\") pod \"9dafdf20-2acb-46ad-adb3-d1421087ca5e\" (UID: \"9dafdf20-2acb-46ad-adb3-d1421087ca5e\") " Nov 29 05:58:14 crc kubenswrapper[4594]: I1129 05:58:14.793631 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9dafdf20-2acb-46ad-adb3-d1421087ca5e-inventory\") pod \"9dafdf20-2acb-46ad-adb3-d1421087ca5e\" (UID: \"9dafdf20-2acb-46ad-adb3-d1421087ca5e\") " Nov 29 05:58:14 crc kubenswrapper[4594]: I1129 05:58:14.793708 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dafdf20-2acb-46ad-adb3-d1421087ca5e-neutron-metadata-combined-ca-bundle\") pod \"9dafdf20-2acb-46ad-adb3-d1421087ca5e\" (UID: \"9dafdf20-2acb-46ad-adb3-d1421087ca5e\") " 
Nov 29 05:58:14 crc kubenswrapper[4594]: I1129 05:58:14.793735 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dafdf20-2acb-46ad-adb3-d1421087ca5e-libvirt-combined-ca-bundle\") pod \"9dafdf20-2acb-46ad-adb3-d1421087ca5e\" (UID: \"9dafdf20-2acb-46ad-adb3-d1421087ca5e\") " Nov 29 05:58:14 crc kubenswrapper[4594]: I1129 05:58:14.793847 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9dafdf20-2acb-46ad-adb3-d1421087ca5e-ssh-key\") pod \"9dafdf20-2acb-46ad-adb3-d1421087ca5e\" (UID: \"9dafdf20-2acb-46ad-adb3-d1421087ca5e\") " Nov 29 05:58:14 crc kubenswrapper[4594]: I1129 05:58:14.793965 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dafdf20-2acb-46ad-adb3-d1421087ca5e-bootstrap-combined-ca-bundle\") pod \"9dafdf20-2acb-46ad-adb3-d1421087ca5e\" (UID: \"9dafdf20-2acb-46ad-adb3-d1421087ca5e\") " Nov 29 05:58:14 crc kubenswrapper[4594]: I1129 05:58:14.794021 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9dafdf20-2acb-46ad-adb3-d1421087ca5e-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"9dafdf20-2acb-46ad-adb3-d1421087ca5e\" (UID: \"9dafdf20-2acb-46ad-adb3-d1421087ca5e\") " Nov 29 05:58:14 crc kubenswrapper[4594]: I1129 05:58:14.794067 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dafdf20-2acb-46ad-adb3-d1421087ca5e-repo-setup-combined-ca-bundle\") pod \"9dafdf20-2acb-46ad-adb3-d1421087ca5e\" (UID: \"9dafdf20-2acb-46ad-adb3-d1421087ca5e\") " Nov 29 05:58:14 crc kubenswrapper[4594]: I1129 05:58:14.794125 4594 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9dafdf20-2acb-46ad-adb3-d1421087ca5e-openstack-edpm-ipam-ovn-default-certs-0\") pod \"9dafdf20-2acb-46ad-adb3-d1421087ca5e\" (UID: \"9dafdf20-2acb-46ad-adb3-d1421087ca5e\") " Nov 29 05:58:14 crc kubenswrapper[4594]: I1129 05:58:14.794284 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dafdf20-2acb-46ad-adb3-d1421087ca5e-ovn-combined-ca-bundle\") pod \"9dafdf20-2acb-46ad-adb3-d1421087ca5e\" (UID: \"9dafdf20-2acb-46ad-adb3-d1421087ca5e\") " Nov 29 05:58:14 crc kubenswrapper[4594]: I1129 05:58:14.794330 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9dafdf20-2acb-46ad-adb3-d1421087ca5e-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"9dafdf20-2acb-46ad-adb3-d1421087ca5e\" (UID: \"9dafdf20-2acb-46ad-adb3-d1421087ca5e\") " Nov 29 05:58:14 crc kubenswrapper[4594]: I1129 05:58:14.794394 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dafdf20-2acb-46ad-adb3-d1421087ca5e-telemetry-combined-ca-bundle\") pod \"9dafdf20-2acb-46ad-adb3-d1421087ca5e\" (UID: \"9dafdf20-2acb-46ad-adb3-d1421087ca5e\") " Nov 29 05:58:14 crc kubenswrapper[4594]: I1129 05:58:14.800705 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9dafdf20-2acb-46ad-adb3-d1421087ca5e-kube-api-access-4spxm" (OuterVolumeSpecName: "kube-api-access-4spxm") pod "9dafdf20-2acb-46ad-adb3-d1421087ca5e" (UID: "9dafdf20-2acb-46ad-adb3-d1421087ca5e"). InnerVolumeSpecName "kube-api-access-4spxm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:58:14 crc kubenswrapper[4594]: I1129 05:58:14.801422 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9dafdf20-2acb-46ad-adb3-d1421087ca5e-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "9dafdf20-2acb-46ad-adb3-d1421087ca5e" (UID: "9dafdf20-2acb-46ad-adb3-d1421087ca5e"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:58:14 crc kubenswrapper[4594]: I1129 05:58:14.802431 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9dafdf20-2acb-46ad-adb3-d1421087ca5e-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "9dafdf20-2acb-46ad-adb3-d1421087ca5e" (UID: "9dafdf20-2acb-46ad-adb3-d1421087ca5e"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:58:14 crc kubenswrapper[4594]: I1129 05:58:14.802543 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9dafdf20-2acb-46ad-adb3-d1421087ca5e-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "9dafdf20-2acb-46ad-adb3-d1421087ca5e" (UID: "9dafdf20-2acb-46ad-adb3-d1421087ca5e"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:58:14 crc kubenswrapper[4594]: I1129 05:58:14.802555 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9dafdf20-2acb-46ad-adb3-d1421087ca5e-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "9dafdf20-2acb-46ad-adb3-d1421087ca5e" (UID: "9dafdf20-2acb-46ad-adb3-d1421087ca5e"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:58:14 crc kubenswrapper[4594]: I1129 05:58:14.802586 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9dafdf20-2acb-46ad-adb3-d1421087ca5e-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "9dafdf20-2acb-46ad-adb3-d1421087ca5e" (UID: "9dafdf20-2acb-46ad-adb3-d1421087ca5e"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:58:14 crc kubenswrapper[4594]: I1129 05:58:14.803094 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9dafdf20-2acb-46ad-adb3-d1421087ca5e-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "9dafdf20-2acb-46ad-adb3-d1421087ca5e" (UID: "9dafdf20-2acb-46ad-adb3-d1421087ca5e"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:58:14 crc kubenswrapper[4594]: I1129 05:58:14.803194 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9dafdf20-2acb-46ad-adb3-d1421087ca5e-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "9dafdf20-2acb-46ad-adb3-d1421087ca5e" (UID: "9dafdf20-2acb-46ad-adb3-d1421087ca5e"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:58:14 crc kubenswrapper[4594]: I1129 05:58:14.803398 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9dafdf20-2acb-46ad-adb3-d1421087ca5e-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "9dafdf20-2acb-46ad-adb3-d1421087ca5e" (UID: "9dafdf20-2acb-46ad-adb3-d1421087ca5e"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:58:14 crc kubenswrapper[4594]: I1129 05:58:14.804412 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9dafdf20-2acb-46ad-adb3-d1421087ca5e-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "9dafdf20-2acb-46ad-adb3-d1421087ca5e" (UID: "9dafdf20-2acb-46ad-adb3-d1421087ca5e"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:58:14 crc kubenswrapper[4594]: I1129 05:58:14.805124 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9dafdf20-2acb-46ad-adb3-d1421087ca5e-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "9dafdf20-2acb-46ad-adb3-d1421087ca5e" (UID: "9dafdf20-2acb-46ad-adb3-d1421087ca5e"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:58:14 crc kubenswrapper[4594]: I1129 05:58:14.805657 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9dafdf20-2acb-46ad-adb3-d1421087ca5e-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "9dafdf20-2acb-46ad-adb3-d1421087ca5e" (UID: "9dafdf20-2acb-46ad-adb3-d1421087ca5e"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:58:14 crc kubenswrapper[4594]: I1129 05:58:14.826567 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9dafdf20-2acb-46ad-adb3-d1421087ca5e-inventory" (OuterVolumeSpecName: "inventory") pod "9dafdf20-2acb-46ad-adb3-d1421087ca5e" (UID: "9dafdf20-2acb-46ad-adb3-d1421087ca5e"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:58:14 crc kubenswrapper[4594]: I1129 05:58:14.826586 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9dafdf20-2acb-46ad-adb3-d1421087ca5e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9dafdf20-2acb-46ad-adb3-d1421087ca5e" (UID: "9dafdf20-2acb-46ad-adb3-d1421087ca5e"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:58:14 crc kubenswrapper[4594]: I1129 05:58:14.897958 4594 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9dafdf20-2acb-46ad-adb3-d1421087ca5e-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 29 05:58:14 crc kubenswrapper[4594]: I1129 05:58:14.897986 4594 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dafdf20-2acb-46ad-adb3-d1421087ca5e-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 05:58:14 crc kubenswrapper[4594]: I1129 05:58:14.898003 4594 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9dafdf20-2acb-46ad-adb3-d1421087ca5e-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Nov 29 05:58:14 crc kubenswrapper[4594]: I1129 05:58:14.898018 4594 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dafdf20-2acb-46ad-adb3-d1421087ca5e-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 05:58:14 crc kubenswrapper[4594]: I1129 05:58:14.898031 4594 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9dafdf20-2acb-46ad-adb3-d1421087ca5e-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Nov 29 05:58:14 crc kubenswrapper[4594]: 
I1129 05:58:14.898042 4594 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dafdf20-2acb-46ad-adb3-d1421087ca5e-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 05:58:14 crc kubenswrapper[4594]: I1129 05:58:14.898058 4594 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9dafdf20-2acb-46ad-adb3-d1421087ca5e-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Nov 29 05:58:14 crc kubenswrapper[4594]: I1129 05:58:14.898068 4594 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dafdf20-2acb-46ad-adb3-d1421087ca5e-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 05:58:14 crc kubenswrapper[4594]: I1129 05:58:14.898079 4594 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9dafdf20-2acb-46ad-adb3-d1421087ca5e-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Nov 29 05:58:14 crc kubenswrapper[4594]: I1129 05:58:14.898091 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4spxm\" (UniqueName: \"kubernetes.io/projected/9dafdf20-2acb-46ad-adb3-d1421087ca5e-kube-api-access-4spxm\") on node \"crc\" DevicePath \"\"" Nov 29 05:58:14 crc kubenswrapper[4594]: I1129 05:58:14.898104 4594 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dafdf20-2acb-46ad-adb3-d1421087ca5e-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 05:58:14 crc kubenswrapper[4594]: I1129 05:58:14.898113 4594 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9dafdf20-2acb-46ad-adb3-d1421087ca5e-inventory\") on node 
\"crc\" DevicePath \"\"" Nov 29 05:58:14 crc kubenswrapper[4594]: I1129 05:58:14.898123 4594 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dafdf20-2acb-46ad-adb3-d1421087ca5e-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 05:58:14 crc kubenswrapper[4594]: I1129 05:58:14.898135 4594 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dafdf20-2acb-46ad-adb3-d1421087ca5e-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 05:58:15 crc kubenswrapper[4594]: I1129 05:58:15.302811 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tclxx" event={"ID":"9dafdf20-2acb-46ad-adb3-d1421087ca5e","Type":"ContainerDied","Data":"d2361c4c316c4b215455e27ffaf69e9d5832f215720ae360e50bd1699e4a4e64"} Nov 29 05:58:15 crc kubenswrapper[4594]: I1129 05:58:15.302865 4594 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d2361c4c316c4b215455e27ffaf69e9d5832f215720ae360e50bd1699e4a4e64" Nov 29 05:58:15 crc kubenswrapper[4594]: I1129 05:58:15.302880 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tclxx" Nov 29 05:58:15 crc kubenswrapper[4594]: I1129 05:58:15.404210 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-rgrwm"] Nov 29 05:58:15 crc kubenswrapper[4594]: E1129 05:58:15.404990 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9dafdf20-2acb-46ad-adb3-d1421087ca5e" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Nov 29 05:58:15 crc kubenswrapper[4594]: I1129 05:58:15.405016 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="9dafdf20-2acb-46ad-adb3-d1421087ca5e" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Nov 29 05:58:15 crc kubenswrapper[4594]: I1129 05:58:15.405541 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="9dafdf20-2acb-46ad-adb3-d1421087ca5e" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Nov 29 05:58:15 crc kubenswrapper[4594]: I1129 05:58:15.406560 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rgrwm" Nov 29 05:58:15 crc kubenswrapper[4594]: I1129 05:58:15.410000 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 29 05:58:15 crc kubenswrapper[4594]: I1129 05:58:15.410372 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 29 05:58:15 crc kubenswrapper[4594]: I1129 05:58:15.410527 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b89h2" Nov 29 05:58:15 crc kubenswrapper[4594]: I1129 05:58:15.410576 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 29 05:58:15 crc kubenswrapper[4594]: I1129 05:58:15.410669 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Nov 29 05:58:15 crc kubenswrapper[4594]: I1129 05:58:15.420120 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-rgrwm"] Nov 29 05:58:15 crc kubenswrapper[4594]: I1129 05:58:15.519141 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3cd5d33f-c80d-49d9-97b7-26dc98be7fa7-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-rgrwm\" (UID: \"3cd5d33f-c80d-49d9-97b7-26dc98be7fa7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rgrwm" Nov 29 05:58:15 crc kubenswrapper[4594]: I1129 05:58:15.519616 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cd5d33f-c80d-49d9-97b7-26dc98be7fa7-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-rgrwm\" (UID: \"3cd5d33f-c80d-49d9-97b7-26dc98be7fa7\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rgrwm" Nov 29 05:58:15 crc kubenswrapper[4594]: I1129 05:58:15.519778 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqk9d\" (UniqueName: \"kubernetes.io/projected/3cd5d33f-c80d-49d9-97b7-26dc98be7fa7-kube-api-access-pqk9d\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-rgrwm\" (UID: \"3cd5d33f-c80d-49d9-97b7-26dc98be7fa7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rgrwm" Nov 29 05:58:15 crc kubenswrapper[4594]: I1129 05:58:15.520192 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3cd5d33f-c80d-49d9-97b7-26dc98be7fa7-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-rgrwm\" (UID: \"3cd5d33f-c80d-49d9-97b7-26dc98be7fa7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rgrwm" Nov 29 05:58:15 crc kubenswrapper[4594]: I1129 05:58:15.520313 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/3cd5d33f-c80d-49d9-97b7-26dc98be7fa7-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-rgrwm\" (UID: \"3cd5d33f-c80d-49d9-97b7-26dc98be7fa7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rgrwm" Nov 29 05:58:15 crc kubenswrapper[4594]: I1129 05:58:15.622304 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqk9d\" (UniqueName: \"kubernetes.io/projected/3cd5d33f-c80d-49d9-97b7-26dc98be7fa7-kube-api-access-pqk9d\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-rgrwm\" (UID: \"3cd5d33f-c80d-49d9-97b7-26dc98be7fa7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rgrwm" Nov 29 05:58:15 crc kubenswrapper[4594]: I1129 05:58:15.622431 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" 
(UniqueName: \"kubernetes.io/secret/3cd5d33f-c80d-49d9-97b7-26dc98be7fa7-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-rgrwm\" (UID: \"3cd5d33f-c80d-49d9-97b7-26dc98be7fa7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rgrwm" Nov 29 05:58:15 crc kubenswrapper[4594]: I1129 05:58:15.622488 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/3cd5d33f-c80d-49d9-97b7-26dc98be7fa7-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-rgrwm\" (UID: \"3cd5d33f-c80d-49d9-97b7-26dc98be7fa7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rgrwm" Nov 29 05:58:15 crc kubenswrapper[4594]: I1129 05:58:15.622584 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3cd5d33f-c80d-49d9-97b7-26dc98be7fa7-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-rgrwm\" (UID: \"3cd5d33f-c80d-49d9-97b7-26dc98be7fa7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rgrwm" Nov 29 05:58:15 crc kubenswrapper[4594]: I1129 05:58:15.622629 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cd5d33f-c80d-49d9-97b7-26dc98be7fa7-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-rgrwm\" (UID: \"3cd5d33f-c80d-49d9-97b7-26dc98be7fa7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rgrwm" Nov 29 05:58:15 crc kubenswrapper[4594]: I1129 05:58:15.623635 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/3cd5d33f-c80d-49d9-97b7-26dc98be7fa7-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-rgrwm\" (UID: \"3cd5d33f-c80d-49d9-97b7-26dc98be7fa7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rgrwm" Nov 29 05:58:15 crc 
kubenswrapper[4594]: I1129 05:58:15.627467 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3cd5d33f-c80d-49d9-97b7-26dc98be7fa7-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-rgrwm\" (UID: \"3cd5d33f-c80d-49d9-97b7-26dc98be7fa7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rgrwm" Nov 29 05:58:15 crc kubenswrapper[4594]: I1129 05:58:15.627695 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3cd5d33f-c80d-49d9-97b7-26dc98be7fa7-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-rgrwm\" (UID: \"3cd5d33f-c80d-49d9-97b7-26dc98be7fa7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rgrwm" Nov 29 05:58:15 crc kubenswrapper[4594]: I1129 05:58:15.629896 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cd5d33f-c80d-49d9-97b7-26dc98be7fa7-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-rgrwm\" (UID: \"3cd5d33f-c80d-49d9-97b7-26dc98be7fa7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rgrwm" Nov 29 05:58:15 crc kubenswrapper[4594]: I1129 05:58:15.644968 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqk9d\" (UniqueName: \"kubernetes.io/projected/3cd5d33f-c80d-49d9-97b7-26dc98be7fa7-kube-api-access-pqk9d\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-rgrwm\" (UID: \"3cd5d33f-c80d-49d9-97b7-26dc98be7fa7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rgrwm" Nov 29 05:58:15 crc kubenswrapper[4594]: I1129 05:58:15.730510 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rgrwm" Nov 29 05:58:16 crc kubenswrapper[4594]: I1129 05:58:16.210081 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-rgrwm"] Nov 29 05:58:16 crc kubenswrapper[4594]: I1129 05:58:16.316574 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rgrwm" event={"ID":"3cd5d33f-c80d-49d9-97b7-26dc98be7fa7","Type":"ContainerStarted","Data":"f44f7e7e52246f0faf0729ba9c75907fbcecfe88173df8d7a5b37d98e2912d81"} Nov 29 05:58:17 crc kubenswrapper[4594]: I1129 05:58:17.340112 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rgrwm" event={"ID":"3cd5d33f-c80d-49d9-97b7-26dc98be7fa7","Type":"ContainerStarted","Data":"6da9b17a26c4eb4e7064086b2d6797eeb636286f3d242e9c33d9a4637dd0cd74"} Nov 29 05:58:17 crc kubenswrapper[4594]: I1129 05:58:17.363912 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rgrwm" podStartSLOduration=1.739193163 podStartE2EDuration="2.363890649s" podCreationTimestamp="2025-11-29 05:58:15 +0000 UTC" firstStartedPulling="2025-11-29 05:58:16.216418629 +0000 UTC m=+1820.456927850" lastFinishedPulling="2025-11-29 05:58:16.841116115 +0000 UTC m=+1821.081625336" observedRunningTime="2025-11-29 05:58:17.354606118 +0000 UTC m=+1821.595115337" watchObservedRunningTime="2025-11-29 05:58:17.363890649 +0000 UTC m=+1821.604399869" Nov 29 05:58:24 crc kubenswrapper[4594]: I1129 05:58:24.084512 4594 scope.go:117] "RemoveContainer" containerID="112625db7a9e2602ec481072051add75f1091c3a06e6a9540b152d2e23c5722f" Nov 29 05:58:24 crc kubenswrapper[4594]: E1129 05:58:24.085594 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 05:58:36 crc kubenswrapper[4594]: I1129 05:58:36.092103 4594 scope.go:117] "RemoveContainer" containerID="112625db7a9e2602ec481072051add75f1091c3a06e6a9540b152d2e23c5722f" Nov 29 05:58:36 crc kubenswrapper[4594]: E1129 05:58:36.093234 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 05:58:47 crc kubenswrapper[4594]: I1129 05:58:47.083774 4594 scope.go:117] "RemoveContainer" containerID="112625db7a9e2602ec481072051add75f1091c3a06e6a9540b152d2e23c5722f" Nov 29 05:58:47 crc kubenswrapper[4594]: E1129 05:58:47.085193 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 05:58:58 crc kubenswrapper[4594]: I1129 05:58:58.084397 4594 scope.go:117] "RemoveContainer" containerID="112625db7a9e2602ec481072051add75f1091c3a06e6a9540b152d2e23c5722f" Nov 29 05:58:58 crc kubenswrapper[4594]: E1129 05:58:58.085369 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 05:59:01 crc kubenswrapper[4594]: I1129 05:59:01.813089 4594 generic.go:334] "Generic (PLEG): container finished" podID="3cd5d33f-c80d-49d9-97b7-26dc98be7fa7" containerID="6da9b17a26c4eb4e7064086b2d6797eeb636286f3d242e9c33d9a4637dd0cd74" exitCode=0 Nov 29 05:59:01 crc kubenswrapper[4594]: I1129 05:59:01.813183 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rgrwm" event={"ID":"3cd5d33f-c80d-49d9-97b7-26dc98be7fa7","Type":"ContainerDied","Data":"6da9b17a26c4eb4e7064086b2d6797eeb636286f3d242e9c33d9a4637dd0cd74"} Nov 29 05:59:03 crc kubenswrapper[4594]: I1129 05:59:03.160820 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rgrwm" Nov 29 05:59:03 crc kubenswrapper[4594]: I1129 05:59:03.202900 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/3cd5d33f-c80d-49d9-97b7-26dc98be7fa7-ovncontroller-config-0\") pod \"3cd5d33f-c80d-49d9-97b7-26dc98be7fa7\" (UID: \"3cd5d33f-c80d-49d9-97b7-26dc98be7fa7\") " Nov 29 05:59:03 crc kubenswrapper[4594]: I1129 05:59:03.203171 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pqk9d\" (UniqueName: \"kubernetes.io/projected/3cd5d33f-c80d-49d9-97b7-26dc98be7fa7-kube-api-access-pqk9d\") pod \"3cd5d33f-c80d-49d9-97b7-26dc98be7fa7\" (UID: \"3cd5d33f-c80d-49d9-97b7-26dc98be7fa7\") " Nov 29 05:59:03 crc kubenswrapper[4594]: I1129 05:59:03.203214 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/3cd5d33f-c80d-49d9-97b7-26dc98be7fa7-ssh-key\") pod \"3cd5d33f-c80d-49d9-97b7-26dc98be7fa7\" (UID: \"3cd5d33f-c80d-49d9-97b7-26dc98be7fa7\") " Nov 29 05:59:03 crc kubenswrapper[4594]: I1129 05:59:03.203287 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3cd5d33f-c80d-49d9-97b7-26dc98be7fa7-inventory\") pod \"3cd5d33f-c80d-49d9-97b7-26dc98be7fa7\" (UID: \"3cd5d33f-c80d-49d9-97b7-26dc98be7fa7\") " Nov 29 05:59:03 crc kubenswrapper[4594]: I1129 05:59:03.203515 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cd5d33f-c80d-49d9-97b7-26dc98be7fa7-ovn-combined-ca-bundle\") pod \"3cd5d33f-c80d-49d9-97b7-26dc98be7fa7\" (UID: \"3cd5d33f-c80d-49d9-97b7-26dc98be7fa7\") " Nov 29 05:59:03 crc kubenswrapper[4594]: I1129 05:59:03.210394 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3cd5d33f-c80d-49d9-97b7-26dc98be7fa7-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "3cd5d33f-c80d-49d9-97b7-26dc98be7fa7" (UID: "3cd5d33f-c80d-49d9-97b7-26dc98be7fa7"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:59:03 crc kubenswrapper[4594]: I1129 05:59:03.211025 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cd5d33f-c80d-49d9-97b7-26dc98be7fa7-kube-api-access-pqk9d" (OuterVolumeSpecName: "kube-api-access-pqk9d") pod "3cd5d33f-c80d-49d9-97b7-26dc98be7fa7" (UID: "3cd5d33f-c80d-49d9-97b7-26dc98be7fa7"). InnerVolumeSpecName "kube-api-access-pqk9d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:59:03 crc kubenswrapper[4594]: I1129 05:59:03.230112 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cd5d33f-c80d-49d9-97b7-26dc98be7fa7-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "3cd5d33f-c80d-49d9-97b7-26dc98be7fa7" (UID: "3cd5d33f-c80d-49d9-97b7-26dc98be7fa7"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 05:59:03 crc kubenswrapper[4594]: I1129 05:59:03.230932 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3cd5d33f-c80d-49d9-97b7-26dc98be7fa7-inventory" (OuterVolumeSpecName: "inventory") pod "3cd5d33f-c80d-49d9-97b7-26dc98be7fa7" (UID: "3cd5d33f-c80d-49d9-97b7-26dc98be7fa7"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:59:03 crc kubenswrapper[4594]: I1129 05:59:03.230988 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3cd5d33f-c80d-49d9-97b7-26dc98be7fa7-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "3cd5d33f-c80d-49d9-97b7-26dc98be7fa7" (UID: "3cd5d33f-c80d-49d9-97b7-26dc98be7fa7"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:59:03 crc kubenswrapper[4594]: I1129 05:59:03.305555 4594 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cd5d33f-c80d-49d9-97b7-26dc98be7fa7-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 05:59:03 crc kubenswrapper[4594]: I1129 05:59:03.305646 4594 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/3cd5d33f-c80d-49d9-97b7-26dc98be7fa7-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Nov 29 05:59:03 crc kubenswrapper[4594]: I1129 05:59:03.305716 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pqk9d\" (UniqueName: \"kubernetes.io/projected/3cd5d33f-c80d-49d9-97b7-26dc98be7fa7-kube-api-access-pqk9d\") on node \"crc\" DevicePath \"\"" Nov 29 05:59:03 crc kubenswrapper[4594]: I1129 05:59:03.305771 4594 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3cd5d33f-c80d-49d9-97b7-26dc98be7fa7-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 29 05:59:03 crc kubenswrapper[4594]: I1129 05:59:03.305820 4594 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3cd5d33f-c80d-49d9-97b7-26dc98be7fa7-inventory\") on node \"crc\" DevicePath \"\"" Nov 29 05:59:03 crc kubenswrapper[4594]: I1129 05:59:03.839523 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rgrwm" event={"ID":"3cd5d33f-c80d-49d9-97b7-26dc98be7fa7","Type":"ContainerDied","Data":"f44f7e7e52246f0faf0729ba9c75907fbcecfe88173df8d7a5b37d98e2912d81"} Nov 29 05:59:03 crc kubenswrapper[4594]: I1129 05:59:03.840214 4594 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f44f7e7e52246f0faf0729ba9c75907fbcecfe88173df8d7a5b37d98e2912d81" Nov 29 05:59:03 crc kubenswrapper[4594]: I1129 
05:59:03.839582 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rgrwm" Nov 29 05:59:03 crc kubenswrapper[4594]: I1129 05:59:03.954210 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-m7rxx"] Nov 29 05:59:03 crc kubenswrapper[4594]: E1129 05:59:03.954831 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cd5d33f-c80d-49d9-97b7-26dc98be7fa7" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Nov 29 05:59:03 crc kubenswrapper[4594]: I1129 05:59:03.954908 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cd5d33f-c80d-49d9-97b7-26dc98be7fa7" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Nov 29 05:59:03 crc kubenswrapper[4594]: I1129 05:59:03.955186 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="3cd5d33f-c80d-49d9-97b7-26dc98be7fa7" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Nov 29 05:59:03 crc kubenswrapper[4594]: I1129 05:59:03.955994 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-m7rxx" Nov 29 05:59:03 crc kubenswrapper[4594]: I1129 05:59:03.959834 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Nov 29 05:59:03 crc kubenswrapper[4594]: I1129 05:59:03.959929 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 29 05:59:03 crc kubenswrapper[4594]: I1129 05:59:03.960648 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b89h2" Nov 29 05:59:03 crc kubenswrapper[4594]: I1129 05:59:03.960794 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 29 05:59:03 crc kubenswrapper[4594]: I1129 05:59:03.960571 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 29 05:59:03 crc kubenswrapper[4594]: I1129 05:59:03.964091 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Nov 29 05:59:03 crc kubenswrapper[4594]: I1129 05:59:03.966468 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-m7rxx"] Nov 29 05:59:04 crc kubenswrapper[4594]: I1129 05:59:04.018381 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7f3ca11-a3f9-4bd2-a31f-dba6fec2138b-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-m7rxx\" (UID: \"a7f3ca11-a3f9-4bd2-a31f-dba6fec2138b\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-m7rxx" Nov 29 05:59:04 crc kubenswrapper[4594]: I1129 05:59:04.019533 4594 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a7f3ca11-a3f9-4bd2-a31f-dba6fec2138b-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-m7rxx\" (UID: \"a7f3ca11-a3f9-4bd2-a31f-dba6fec2138b\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-m7rxx" Nov 29 05:59:04 crc kubenswrapper[4594]: I1129 05:59:04.019653 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a7f3ca11-a3f9-4bd2-a31f-dba6fec2138b-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-m7rxx\" (UID: \"a7f3ca11-a3f9-4bd2-a31f-dba6fec2138b\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-m7rxx" Nov 29 05:59:04 crc kubenswrapper[4594]: I1129 05:59:04.019709 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67665\" (UniqueName: \"kubernetes.io/projected/a7f3ca11-a3f9-4bd2-a31f-dba6fec2138b-kube-api-access-67665\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-m7rxx\" (UID: \"a7f3ca11-a3f9-4bd2-a31f-dba6fec2138b\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-m7rxx" Nov 29 05:59:04 crc kubenswrapper[4594]: I1129 05:59:04.020063 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a7f3ca11-a3f9-4bd2-a31f-dba6fec2138b-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-m7rxx\" (UID: \"a7f3ca11-a3f9-4bd2-a31f-dba6fec2138b\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-m7rxx" Nov 29 05:59:04 crc kubenswrapper[4594]: I1129 05:59:04.020106 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a7f3ca11-a3f9-4bd2-a31f-dba6fec2138b-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-m7rxx\" (UID: \"a7f3ca11-a3f9-4bd2-a31f-dba6fec2138b\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-m7rxx" Nov 29 05:59:04 crc kubenswrapper[4594]: I1129 05:59:04.121275 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a7f3ca11-a3f9-4bd2-a31f-dba6fec2138b-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-m7rxx\" (UID: \"a7f3ca11-a3f9-4bd2-a31f-dba6fec2138b\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-m7rxx" Nov 29 05:59:04 crc kubenswrapper[4594]: I1129 05:59:04.121314 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a7f3ca11-a3f9-4bd2-a31f-dba6fec2138b-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-m7rxx\" (UID: \"a7f3ca11-a3f9-4bd2-a31f-dba6fec2138b\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-m7rxx" Nov 29 05:59:04 crc kubenswrapper[4594]: I1129 05:59:04.121368 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7f3ca11-a3f9-4bd2-a31f-dba6fec2138b-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-m7rxx\" (UID: \"a7f3ca11-a3f9-4bd2-a31f-dba6fec2138b\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-m7rxx" Nov 29 05:59:04 crc kubenswrapper[4594]: I1129 05:59:04.123195 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a7f3ca11-a3f9-4bd2-a31f-dba6fec2138b-ssh-key\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-m7rxx\" (UID: \"a7f3ca11-a3f9-4bd2-a31f-dba6fec2138b\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-m7rxx" Nov 29 05:59:04 crc kubenswrapper[4594]: I1129 05:59:04.123269 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a7f3ca11-a3f9-4bd2-a31f-dba6fec2138b-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-m7rxx\" (UID: \"a7f3ca11-a3f9-4bd2-a31f-dba6fec2138b\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-m7rxx" Nov 29 05:59:04 crc kubenswrapper[4594]: I1129 05:59:04.123311 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67665\" (UniqueName: \"kubernetes.io/projected/a7f3ca11-a3f9-4bd2-a31f-dba6fec2138b-kube-api-access-67665\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-m7rxx\" (UID: \"a7f3ca11-a3f9-4bd2-a31f-dba6fec2138b\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-m7rxx" Nov 29 05:59:04 crc kubenswrapper[4594]: I1129 05:59:04.129018 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a7f3ca11-a3f9-4bd2-a31f-dba6fec2138b-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-m7rxx\" (UID: \"a7f3ca11-a3f9-4bd2-a31f-dba6fec2138b\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-m7rxx" Nov 29 05:59:04 crc kubenswrapper[4594]: I1129 05:59:04.129496 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a7f3ca11-a3f9-4bd2-a31f-dba6fec2138b-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-m7rxx\" (UID: \"a7f3ca11-a3f9-4bd2-a31f-dba6fec2138b\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-m7rxx" Nov 29 05:59:04 crc kubenswrapper[4594]: I1129 05:59:04.130075 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a7f3ca11-a3f9-4bd2-a31f-dba6fec2138b-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-m7rxx\" (UID: \"a7f3ca11-a3f9-4bd2-a31f-dba6fec2138b\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-m7rxx" Nov 29 05:59:04 crc kubenswrapper[4594]: I1129 05:59:04.130556 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a7f3ca11-a3f9-4bd2-a31f-dba6fec2138b-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-m7rxx\" (UID: \"a7f3ca11-a3f9-4bd2-a31f-dba6fec2138b\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-m7rxx" Nov 29 05:59:04 crc kubenswrapper[4594]: I1129 05:59:04.130993 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7f3ca11-a3f9-4bd2-a31f-dba6fec2138b-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-m7rxx\" (UID: \"a7f3ca11-a3f9-4bd2-a31f-dba6fec2138b\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-m7rxx" Nov 29 05:59:04 crc kubenswrapper[4594]: I1129 05:59:04.139569 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67665\" (UniqueName: \"kubernetes.io/projected/a7f3ca11-a3f9-4bd2-a31f-dba6fec2138b-kube-api-access-67665\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-m7rxx\" (UID: \"a7f3ca11-a3f9-4bd2-a31f-dba6fec2138b\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-m7rxx" Nov 29 05:59:04 crc 
kubenswrapper[4594]: I1129 05:59:04.271810 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-m7rxx" Nov 29 05:59:04 crc kubenswrapper[4594]: I1129 05:59:04.753913 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-m7rxx"] Nov 29 05:59:04 crc kubenswrapper[4594]: I1129 05:59:04.852497 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-m7rxx" event={"ID":"a7f3ca11-a3f9-4bd2-a31f-dba6fec2138b","Type":"ContainerStarted","Data":"bec001917f247e7efa920867d72eb19b30d6110244a8397b243ae7ac0b4fd6ff"} Nov 29 05:59:05 crc kubenswrapper[4594]: I1129 05:59:05.864821 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-m7rxx" event={"ID":"a7f3ca11-a3f9-4bd2-a31f-dba6fec2138b","Type":"ContainerStarted","Data":"722d62b3a72d6259700c88414caf9515f07ea5c603ebe68f3cc211b297d32588"} Nov 29 05:59:05 crc kubenswrapper[4594]: I1129 05:59:05.890787 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-m7rxx" podStartSLOduration=2.245280733 podStartE2EDuration="2.890761698s" podCreationTimestamp="2025-11-29 05:59:03 +0000 UTC" firstStartedPulling="2025-11-29 05:59:04.755847266 +0000 UTC m=+1868.996356486" lastFinishedPulling="2025-11-29 05:59:05.401328241 +0000 UTC m=+1869.641837451" observedRunningTime="2025-11-29 05:59:05.877772068 +0000 UTC m=+1870.118281278" watchObservedRunningTime="2025-11-29 05:59:05.890761698 +0000 UTC m=+1870.131270908" Nov 29 05:59:08 crc kubenswrapper[4594]: I1129 05:59:08.992915 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8mjv6"] Nov 29 05:59:08 crc kubenswrapper[4594]: I1129 05:59:08.995577 4594 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8mjv6" Nov 29 05:59:09 crc kubenswrapper[4594]: I1129 05:59:09.006371 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8mjv6"] Nov 29 05:59:09 crc kubenswrapper[4594]: I1129 05:59:09.033132 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjzxt\" (UniqueName: \"kubernetes.io/projected/56294fbe-f3a2-4359-a77c-a21f100aa063-kube-api-access-pjzxt\") pod \"redhat-marketplace-8mjv6\" (UID: \"56294fbe-f3a2-4359-a77c-a21f100aa063\") " pod="openshift-marketplace/redhat-marketplace-8mjv6" Nov 29 05:59:09 crc kubenswrapper[4594]: I1129 05:59:09.033210 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56294fbe-f3a2-4359-a77c-a21f100aa063-catalog-content\") pod \"redhat-marketplace-8mjv6\" (UID: \"56294fbe-f3a2-4359-a77c-a21f100aa063\") " pod="openshift-marketplace/redhat-marketplace-8mjv6" Nov 29 05:59:09 crc kubenswrapper[4594]: I1129 05:59:09.033576 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56294fbe-f3a2-4359-a77c-a21f100aa063-utilities\") pod \"redhat-marketplace-8mjv6\" (UID: \"56294fbe-f3a2-4359-a77c-a21f100aa063\") " pod="openshift-marketplace/redhat-marketplace-8mjv6" Nov 29 05:59:09 crc kubenswrapper[4594]: I1129 05:59:09.135694 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjzxt\" (UniqueName: \"kubernetes.io/projected/56294fbe-f3a2-4359-a77c-a21f100aa063-kube-api-access-pjzxt\") pod \"redhat-marketplace-8mjv6\" (UID: \"56294fbe-f3a2-4359-a77c-a21f100aa063\") " pod="openshift-marketplace/redhat-marketplace-8mjv6" Nov 29 05:59:09 crc kubenswrapper[4594]: I1129 05:59:09.135772 4594 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56294fbe-f3a2-4359-a77c-a21f100aa063-catalog-content\") pod \"redhat-marketplace-8mjv6\" (UID: \"56294fbe-f3a2-4359-a77c-a21f100aa063\") " pod="openshift-marketplace/redhat-marketplace-8mjv6" Nov 29 05:59:09 crc kubenswrapper[4594]: I1129 05:59:09.135937 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56294fbe-f3a2-4359-a77c-a21f100aa063-utilities\") pod \"redhat-marketplace-8mjv6\" (UID: \"56294fbe-f3a2-4359-a77c-a21f100aa063\") " pod="openshift-marketplace/redhat-marketplace-8mjv6" Nov 29 05:59:09 crc kubenswrapper[4594]: I1129 05:59:09.136382 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56294fbe-f3a2-4359-a77c-a21f100aa063-catalog-content\") pod \"redhat-marketplace-8mjv6\" (UID: \"56294fbe-f3a2-4359-a77c-a21f100aa063\") " pod="openshift-marketplace/redhat-marketplace-8mjv6" Nov 29 05:59:09 crc kubenswrapper[4594]: I1129 05:59:09.136551 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56294fbe-f3a2-4359-a77c-a21f100aa063-utilities\") pod \"redhat-marketplace-8mjv6\" (UID: \"56294fbe-f3a2-4359-a77c-a21f100aa063\") " pod="openshift-marketplace/redhat-marketplace-8mjv6" Nov 29 05:59:09 crc kubenswrapper[4594]: I1129 05:59:09.156885 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjzxt\" (UniqueName: \"kubernetes.io/projected/56294fbe-f3a2-4359-a77c-a21f100aa063-kube-api-access-pjzxt\") pod \"redhat-marketplace-8mjv6\" (UID: \"56294fbe-f3a2-4359-a77c-a21f100aa063\") " pod="openshift-marketplace/redhat-marketplace-8mjv6" Nov 29 05:59:09 crc kubenswrapper[4594]: I1129 05:59:09.315585 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8mjv6" Nov 29 05:59:09 crc kubenswrapper[4594]: I1129 05:59:09.718372 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8mjv6"] Nov 29 05:59:09 crc kubenswrapper[4594]: W1129 05:59:09.720538 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56294fbe_f3a2_4359_a77c_a21f100aa063.slice/crio-0e18e07fffaf558f37a328a53c054a29703c35b4e539cdce03b1d525a9e582ad WatchSource:0}: Error finding container 0e18e07fffaf558f37a328a53c054a29703c35b4e539cdce03b1d525a9e582ad: Status 404 returned error can't find the container with id 0e18e07fffaf558f37a328a53c054a29703c35b4e539cdce03b1d525a9e582ad Nov 29 05:59:09 crc kubenswrapper[4594]: I1129 05:59:09.916573 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8mjv6" event={"ID":"56294fbe-f3a2-4359-a77c-a21f100aa063","Type":"ContainerStarted","Data":"65af065e553ff27292e5e22bf9ce77f7438d6924766950be39763085401469c6"} Nov 29 05:59:09 crc kubenswrapper[4594]: I1129 05:59:09.917000 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8mjv6" event={"ID":"56294fbe-f3a2-4359-a77c-a21f100aa063","Type":"ContainerStarted","Data":"0e18e07fffaf558f37a328a53c054a29703c35b4e539cdce03b1d525a9e582ad"} Nov 29 05:59:10 crc kubenswrapper[4594]: I1129 05:59:10.084604 4594 scope.go:117] "RemoveContainer" containerID="112625db7a9e2602ec481072051add75f1091c3a06e6a9540b152d2e23c5722f" Nov 29 05:59:10 crc kubenswrapper[4594]: E1129 05:59:10.085021 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 05:59:10 crc kubenswrapper[4594]: I1129 05:59:10.929974 4594 generic.go:334] "Generic (PLEG): container finished" podID="56294fbe-f3a2-4359-a77c-a21f100aa063" containerID="65af065e553ff27292e5e22bf9ce77f7438d6924766950be39763085401469c6" exitCode=0 Nov 29 05:59:10 crc kubenswrapper[4594]: I1129 05:59:10.930151 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8mjv6" event={"ID":"56294fbe-f3a2-4359-a77c-a21f100aa063","Type":"ContainerDied","Data":"65af065e553ff27292e5e22bf9ce77f7438d6924766950be39763085401469c6"} Nov 29 05:59:11 crc kubenswrapper[4594]: I1129 05:59:11.947470 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8mjv6" event={"ID":"56294fbe-f3a2-4359-a77c-a21f100aa063","Type":"ContainerStarted","Data":"ccf0e65fc5f66b87e736b7865a9012347fcfd3eefe15905df67a5561f4e4f73e"} Nov 29 05:59:12 crc kubenswrapper[4594]: I1129 05:59:12.960538 4594 generic.go:334] "Generic (PLEG): container finished" podID="56294fbe-f3a2-4359-a77c-a21f100aa063" containerID="ccf0e65fc5f66b87e736b7865a9012347fcfd3eefe15905df67a5561f4e4f73e" exitCode=0 Nov 29 05:59:12 crc kubenswrapper[4594]: I1129 05:59:12.960587 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8mjv6" event={"ID":"56294fbe-f3a2-4359-a77c-a21f100aa063","Type":"ContainerDied","Data":"ccf0e65fc5f66b87e736b7865a9012347fcfd3eefe15905df67a5561f4e4f73e"} Nov 29 05:59:13 crc kubenswrapper[4594]: I1129 05:59:13.976489 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8mjv6" event={"ID":"56294fbe-f3a2-4359-a77c-a21f100aa063","Type":"ContainerStarted","Data":"ef8f4fd1d91993b90d7412a5d2fe26fc06bde1755394721a606550fdaefa5785"} Nov 29 05:59:13 crc kubenswrapper[4594]: I1129 05:59:13.999480 4594 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8mjv6" podStartSLOduration=3.423572948 podStartE2EDuration="5.999463981s" podCreationTimestamp="2025-11-29 05:59:08 +0000 UTC" firstStartedPulling="2025-11-29 05:59:10.932638026 +0000 UTC m=+1875.173147247" lastFinishedPulling="2025-11-29 05:59:13.50852906 +0000 UTC m=+1877.749038280" observedRunningTime="2025-11-29 05:59:13.995351125 +0000 UTC m=+1878.235860334" watchObservedRunningTime="2025-11-29 05:59:13.999463981 +0000 UTC m=+1878.239973200" Nov 29 05:59:19 crc kubenswrapper[4594]: I1129 05:59:19.316550 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8mjv6" Nov 29 05:59:19 crc kubenswrapper[4594]: I1129 05:59:19.317213 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8mjv6" Nov 29 05:59:19 crc kubenswrapper[4594]: I1129 05:59:19.356994 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8mjv6" Nov 29 05:59:20 crc kubenswrapper[4594]: I1129 05:59:20.097720 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8mjv6" Nov 29 05:59:20 crc kubenswrapper[4594]: I1129 05:59:20.158049 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8mjv6"] Nov 29 05:59:21 crc kubenswrapper[4594]: I1129 05:59:21.083396 4594 scope.go:117] "RemoveContainer" containerID="112625db7a9e2602ec481072051add75f1091c3a06e6a9540b152d2e23c5722f" Nov 29 05:59:21 crc kubenswrapper[4594]: E1129 05:59:21.084001 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 05:59:22 crc kubenswrapper[4594]: I1129 05:59:22.065327 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8mjv6" podUID="56294fbe-f3a2-4359-a77c-a21f100aa063" containerName="registry-server" containerID="cri-o://ef8f4fd1d91993b90d7412a5d2fe26fc06bde1755394721a606550fdaefa5785" gracePeriod=2 Nov 29 05:59:22 crc kubenswrapper[4594]: I1129 05:59:22.487931 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8mjv6" Nov 29 05:59:22 crc kubenswrapper[4594]: I1129 05:59:22.626810 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjzxt\" (UniqueName: \"kubernetes.io/projected/56294fbe-f3a2-4359-a77c-a21f100aa063-kube-api-access-pjzxt\") pod \"56294fbe-f3a2-4359-a77c-a21f100aa063\" (UID: \"56294fbe-f3a2-4359-a77c-a21f100aa063\") " Nov 29 05:59:22 crc kubenswrapper[4594]: I1129 05:59:22.626913 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56294fbe-f3a2-4359-a77c-a21f100aa063-catalog-content\") pod \"56294fbe-f3a2-4359-a77c-a21f100aa063\" (UID: \"56294fbe-f3a2-4359-a77c-a21f100aa063\") " Nov 29 05:59:22 crc kubenswrapper[4594]: I1129 05:59:22.626948 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56294fbe-f3a2-4359-a77c-a21f100aa063-utilities\") pod \"56294fbe-f3a2-4359-a77c-a21f100aa063\" (UID: \"56294fbe-f3a2-4359-a77c-a21f100aa063\") " Nov 29 05:59:22 crc kubenswrapper[4594]: I1129 05:59:22.627825 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/56294fbe-f3a2-4359-a77c-a21f100aa063-utilities" (OuterVolumeSpecName: "utilities") pod "56294fbe-f3a2-4359-a77c-a21f100aa063" (UID: "56294fbe-f3a2-4359-a77c-a21f100aa063"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 05:59:22 crc kubenswrapper[4594]: I1129 05:59:22.634002 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56294fbe-f3a2-4359-a77c-a21f100aa063-kube-api-access-pjzxt" (OuterVolumeSpecName: "kube-api-access-pjzxt") pod "56294fbe-f3a2-4359-a77c-a21f100aa063" (UID: "56294fbe-f3a2-4359-a77c-a21f100aa063"). InnerVolumeSpecName "kube-api-access-pjzxt". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:59:22 crc kubenswrapper[4594]: I1129 05:59:22.643955 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56294fbe-f3a2-4359-a77c-a21f100aa063-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "56294fbe-f3a2-4359-a77c-a21f100aa063" (UID: "56294fbe-f3a2-4359-a77c-a21f100aa063"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 05:59:22 crc kubenswrapper[4594]: I1129 05:59:22.729222 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjzxt\" (UniqueName: \"kubernetes.io/projected/56294fbe-f3a2-4359-a77c-a21f100aa063-kube-api-access-pjzxt\") on node \"crc\" DevicePath \"\"" Nov 29 05:59:22 crc kubenswrapper[4594]: I1129 05:59:22.729266 4594 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56294fbe-f3a2-4359-a77c-a21f100aa063-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 05:59:22 crc kubenswrapper[4594]: I1129 05:59:22.729279 4594 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56294fbe-f3a2-4359-a77c-a21f100aa063-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 05:59:23 crc kubenswrapper[4594]: I1129 05:59:23.078903 4594 generic.go:334] "Generic (PLEG): container finished" podID="56294fbe-f3a2-4359-a77c-a21f100aa063" containerID="ef8f4fd1d91993b90d7412a5d2fe26fc06bde1755394721a606550fdaefa5785" exitCode=0 Nov 29 05:59:23 crc kubenswrapper[4594]: I1129 05:59:23.078963 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8mjv6" event={"ID":"56294fbe-f3a2-4359-a77c-a21f100aa063","Type":"ContainerDied","Data":"ef8f4fd1d91993b90d7412a5d2fe26fc06bde1755394721a606550fdaefa5785"} Nov 29 05:59:23 crc kubenswrapper[4594]: I1129 05:59:23.078972 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8mjv6" Nov 29 05:59:23 crc kubenswrapper[4594]: I1129 05:59:23.079013 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8mjv6" event={"ID":"56294fbe-f3a2-4359-a77c-a21f100aa063","Type":"ContainerDied","Data":"0e18e07fffaf558f37a328a53c054a29703c35b4e539cdce03b1d525a9e582ad"} Nov 29 05:59:23 crc kubenswrapper[4594]: I1129 05:59:23.079037 4594 scope.go:117] "RemoveContainer" containerID="ef8f4fd1d91993b90d7412a5d2fe26fc06bde1755394721a606550fdaefa5785" Nov 29 05:59:23 crc kubenswrapper[4594]: I1129 05:59:23.108924 4594 scope.go:117] "RemoveContainer" containerID="ccf0e65fc5f66b87e736b7865a9012347fcfd3eefe15905df67a5561f4e4f73e" Nov 29 05:59:23 crc kubenswrapper[4594]: I1129 05:59:23.115273 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8mjv6"] Nov 29 05:59:23 crc kubenswrapper[4594]: I1129 05:59:23.131181 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8mjv6"] Nov 29 05:59:23 crc kubenswrapper[4594]: I1129 05:59:23.138538 4594 scope.go:117] "RemoveContainer" containerID="65af065e553ff27292e5e22bf9ce77f7438d6924766950be39763085401469c6" Nov 29 05:59:23 crc kubenswrapper[4594]: I1129 05:59:23.173078 4594 scope.go:117] "RemoveContainer" containerID="ef8f4fd1d91993b90d7412a5d2fe26fc06bde1755394721a606550fdaefa5785" Nov 29 05:59:23 crc kubenswrapper[4594]: E1129 05:59:23.173667 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef8f4fd1d91993b90d7412a5d2fe26fc06bde1755394721a606550fdaefa5785\": container with ID starting with ef8f4fd1d91993b90d7412a5d2fe26fc06bde1755394721a606550fdaefa5785 not found: ID does not exist" containerID="ef8f4fd1d91993b90d7412a5d2fe26fc06bde1755394721a606550fdaefa5785" Nov 29 05:59:23 crc kubenswrapper[4594]: I1129 05:59:23.173705 4594 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef8f4fd1d91993b90d7412a5d2fe26fc06bde1755394721a606550fdaefa5785"} err="failed to get container status \"ef8f4fd1d91993b90d7412a5d2fe26fc06bde1755394721a606550fdaefa5785\": rpc error: code = NotFound desc = could not find container \"ef8f4fd1d91993b90d7412a5d2fe26fc06bde1755394721a606550fdaefa5785\": container with ID starting with ef8f4fd1d91993b90d7412a5d2fe26fc06bde1755394721a606550fdaefa5785 not found: ID does not exist" Nov 29 05:59:23 crc kubenswrapper[4594]: I1129 05:59:23.173762 4594 scope.go:117] "RemoveContainer" containerID="ccf0e65fc5f66b87e736b7865a9012347fcfd3eefe15905df67a5561f4e4f73e" Nov 29 05:59:23 crc kubenswrapper[4594]: E1129 05:59:23.174326 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ccf0e65fc5f66b87e736b7865a9012347fcfd3eefe15905df67a5561f4e4f73e\": container with ID starting with ccf0e65fc5f66b87e736b7865a9012347fcfd3eefe15905df67a5561f4e4f73e not found: ID does not exist" containerID="ccf0e65fc5f66b87e736b7865a9012347fcfd3eefe15905df67a5561f4e4f73e" Nov 29 05:59:23 crc kubenswrapper[4594]: I1129 05:59:23.174354 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ccf0e65fc5f66b87e736b7865a9012347fcfd3eefe15905df67a5561f4e4f73e"} err="failed to get container status \"ccf0e65fc5f66b87e736b7865a9012347fcfd3eefe15905df67a5561f4e4f73e\": rpc error: code = NotFound desc = could not find container \"ccf0e65fc5f66b87e736b7865a9012347fcfd3eefe15905df67a5561f4e4f73e\": container with ID starting with ccf0e65fc5f66b87e736b7865a9012347fcfd3eefe15905df67a5561f4e4f73e not found: ID does not exist" Nov 29 05:59:23 crc kubenswrapper[4594]: I1129 05:59:23.174370 4594 scope.go:117] "RemoveContainer" containerID="65af065e553ff27292e5e22bf9ce77f7438d6924766950be39763085401469c6" Nov 29 05:59:23 crc kubenswrapper[4594]: E1129 
05:59:23.174716 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65af065e553ff27292e5e22bf9ce77f7438d6924766950be39763085401469c6\": container with ID starting with 65af065e553ff27292e5e22bf9ce77f7438d6924766950be39763085401469c6 not found: ID does not exist" containerID="65af065e553ff27292e5e22bf9ce77f7438d6924766950be39763085401469c6" Nov 29 05:59:23 crc kubenswrapper[4594]: I1129 05:59:23.174760 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65af065e553ff27292e5e22bf9ce77f7438d6924766950be39763085401469c6"} err="failed to get container status \"65af065e553ff27292e5e22bf9ce77f7438d6924766950be39763085401469c6\": rpc error: code = NotFound desc = could not find container \"65af065e553ff27292e5e22bf9ce77f7438d6924766950be39763085401469c6\": container with ID starting with 65af065e553ff27292e5e22bf9ce77f7438d6924766950be39763085401469c6 not found: ID does not exist" Nov 29 05:59:24 crc kubenswrapper[4594]: I1129 05:59:24.093782 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56294fbe-f3a2-4359-a77c-a21f100aa063" path="/var/lib/kubelet/pods/56294fbe-f3a2-4359-a77c-a21f100aa063/volumes" Nov 29 05:59:35 crc kubenswrapper[4594]: I1129 05:59:35.083924 4594 scope.go:117] "RemoveContainer" containerID="112625db7a9e2602ec481072051add75f1091c3a06e6a9540b152d2e23c5722f" Nov 29 05:59:35 crc kubenswrapper[4594]: E1129 05:59:35.084878 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 05:59:40 crc kubenswrapper[4594]: I1129 05:59:40.262956 
4594 generic.go:334] "Generic (PLEG): container finished" podID="a7f3ca11-a3f9-4bd2-a31f-dba6fec2138b" containerID="722d62b3a72d6259700c88414caf9515f07ea5c603ebe68f3cc211b297d32588" exitCode=0 Nov 29 05:59:40 crc kubenswrapper[4594]: I1129 05:59:40.263041 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-m7rxx" event={"ID":"a7f3ca11-a3f9-4bd2-a31f-dba6fec2138b","Type":"ContainerDied","Data":"722d62b3a72d6259700c88414caf9515f07ea5c603ebe68f3cc211b297d32588"} Nov 29 05:59:41 crc kubenswrapper[4594]: I1129 05:59:41.602041 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-m7rxx" Nov 29 05:59:41 crc kubenswrapper[4594]: I1129 05:59:41.773875 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a7f3ca11-a3f9-4bd2-a31f-dba6fec2138b-inventory\") pod \"a7f3ca11-a3f9-4bd2-a31f-dba6fec2138b\" (UID: \"a7f3ca11-a3f9-4bd2-a31f-dba6fec2138b\") " Nov 29 05:59:41 crc kubenswrapper[4594]: I1129 05:59:41.774079 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a7f3ca11-a3f9-4bd2-a31f-dba6fec2138b-neutron-ovn-metadata-agent-neutron-config-0\") pod \"a7f3ca11-a3f9-4bd2-a31f-dba6fec2138b\" (UID: \"a7f3ca11-a3f9-4bd2-a31f-dba6fec2138b\") " Nov 29 05:59:41 crc kubenswrapper[4594]: I1129 05:59:41.774171 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-67665\" (UniqueName: \"kubernetes.io/projected/a7f3ca11-a3f9-4bd2-a31f-dba6fec2138b-kube-api-access-67665\") pod \"a7f3ca11-a3f9-4bd2-a31f-dba6fec2138b\" (UID: \"a7f3ca11-a3f9-4bd2-a31f-dba6fec2138b\") " Nov 29 05:59:41 crc kubenswrapper[4594]: I1129 05:59:41.774215 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a7f3ca11-a3f9-4bd2-a31f-dba6fec2138b-ssh-key\") pod \"a7f3ca11-a3f9-4bd2-a31f-dba6fec2138b\" (UID: \"a7f3ca11-a3f9-4bd2-a31f-dba6fec2138b\") " Nov 29 05:59:41 crc kubenswrapper[4594]: I1129 05:59:41.774292 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a7f3ca11-a3f9-4bd2-a31f-dba6fec2138b-nova-metadata-neutron-config-0\") pod \"a7f3ca11-a3f9-4bd2-a31f-dba6fec2138b\" (UID: \"a7f3ca11-a3f9-4bd2-a31f-dba6fec2138b\") " Nov 29 05:59:41 crc kubenswrapper[4594]: I1129 05:59:41.774328 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7f3ca11-a3f9-4bd2-a31f-dba6fec2138b-neutron-metadata-combined-ca-bundle\") pod \"a7f3ca11-a3f9-4bd2-a31f-dba6fec2138b\" (UID: \"a7f3ca11-a3f9-4bd2-a31f-dba6fec2138b\") " Nov 29 05:59:41 crc kubenswrapper[4594]: I1129 05:59:41.780811 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7f3ca11-a3f9-4bd2-a31f-dba6fec2138b-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "a7f3ca11-a3f9-4bd2-a31f-dba6fec2138b" (UID: "a7f3ca11-a3f9-4bd2-a31f-dba6fec2138b"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:59:41 crc kubenswrapper[4594]: I1129 05:59:41.780885 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7f3ca11-a3f9-4bd2-a31f-dba6fec2138b-kube-api-access-67665" (OuterVolumeSpecName: "kube-api-access-67665") pod "a7f3ca11-a3f9-4bd2-a31f-dba6fec2138b" (UID: "a7f3ca11-a3f9-4bd2-a31f-dba6fec2138b"). InnerVolumeSpecName "kube-api-access-67665". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 05:59:41 crc kubenswrapper[4594]: I1129 05:59:41.799020 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7f3ca11-a3f9-4bd2-a31f-dba6fec2138b-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "a7f3ca11-a3f9-4bd2-a31f-dba6fec2138b" (UID: "a7f3ca11-a3f9-4bd2-a31f-dba6fec2138b"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:59:41 crc kubenswrapper[4594]: I1129 05:59:41.803005 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7f3ca11-a3f9-4bd2-a31f-dba6fec2138b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a7f3ca11-a3f9-4bd2-a31f-dba6fec2138b" (UID: "a7f3ca11-a3f9-4bd2-a31f-dba6fec2138b"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:59:41 crc kubenswrapper[4594]: I1129 05:59:41.804849 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7f3ca11-a3f9-4bd2-a31f-dba6fec2138b-inventory" (OuterVolumeSpecName: "inventory") pod "a7f3ca11-a3f9-4bd2-a31f-dba6fec2138b" (UID: "a7f3ca11-a3f9-4bd2-a31f-dba6fec2138b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:59:41 crc kubenswrapper[4594]: I1129 05:59:41.804990 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7f3ca11-a3f9-4bd2-a31f-dba6fec2138b-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "a7f3ca11-a3f9-4bd2-a31f-dba6fec2138b" (UID: "a7f3ca11-a3f9-4bd2-a31f-dba6fec2138b"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 05:59:41 crc kubenswrapper[4594]: I1129 05:59:41.877625 4594 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a7f3ca11-a3f9-4bd2-a31f-dba6fec2138b-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Nov 29 05:59:41 crc kubenswrapper[4594]: I1129 05:59:41.877916 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-67665\" (UniqueName: \"kubernetes.io/projected/a7f3ca11-a3f9-4bd2-a31f-dba6fec2138b-kube-api-access-67665\") on node \"crc\" DevicePath \"\"" Nov 29 05:59:41 crc kubenswrapper[4594]: I1129 05:59:41.877927 4594 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a7f3ca11-a3f9-4bd2-a31f-dba6fec2138b-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 29 05:59:41 crc kubenswrapper[4594]: I1129 05:59:41.877945 4594 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a7f3ca11-a3f9-4bd2-a31f-dba6fec2138b-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Nov 29 05:59:41 crc kubenswrapper[4594]: I1129 05:59:41.877961 4594 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7f3ca11-a3f9-4bd2-a31f-dba6fec2138b-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 05:59:41 crc kubenswrapper[4594]: I1129 05:59:41.877972 4594 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a7f3ca11-a3f9-4bd2-a31f-dba6fec2138b-inventory\") on node \"crc\" DevicePath \"\"" Nov 29 05:59:42 crc kubenswrapper[4594]: I1129 05:59:42.285247 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-m7rxx" 
event={"ID":"a7f3ca11-a3f9-4bd2-a31f-dba6fec2138b","Type":"ContainerDied","Data":"bec001917f247e7efa920867d72eb19b30d6110244a8397b243ae7ac0b4fd6ff"} Nov 29 05:59:42 crc kubenswrapper[4594]: I1129 05:59:42.285310 4594 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bec001917f247e7efa920867d72eb19b30d6110244a8397b243ae7ac0b4fd6ff" Nov 29 05:59:42 crc kubenswrapper[4594]: I1129 05:59:42.285321 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-m7rxx" Nov 29 05:59:42 crc kubenswrapper[4594]: I1129 05:59:42.367165 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wnzvc"] Nov 29 05:59:42 crc kubenswrapper[4594]: E1129 05:59:42.367788 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56294fbe-f3a2-4359-a77c-a21f100aa063" containerName="registry-server" Nov 29 05:59:42 crc kubenswrapper[4594]: I1129 05:59:42.367811 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="56294fbe-f3a2-4359-a77c-a21f100aa063" containerName="registry-server" Nov 29 05:59:42 crc kubenswrapper[4594]: E1129 05:59:42.367825 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56294fbe-f3a2-4359-a77c-a21f100aa063" containerName="extract-content" Nov 29 05:59:42 crc kubenswrapper[4594]: I1129 05:59:42.367831 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="56294fbe-f3a2-4359-a77c-a21f100aa063" containerName="extract-content" Nov 29 05:59:42 crc kubenswrapper[4594]: E1129 05:59:42.367846 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56294fbe-f3a2-4359-a77c-a21f100aa063" containerName="extract-utilities" Nov 29 05:59:42 crc kubenswrapper[4594]: I1129 05:59:42.367852 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="56294fbe-f3a2-4359-a77c-a21f100aa063" containerName="extract-utilities" Nov 29 05:59:42 crc 
kubenswrapper[4594]: E1129 05:59:42.367861 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7f3ca11-a3f9-4bd2-a31f-dba6fec2138b" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Nov 29 05:59:42 crc kubenswrapper[4594]: I1129 05:59:42.367867 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7f3ca11-a3f9-4bd2-a31f-dba6fec2138b" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Nov 29 05:59:42 crc kubenswrapper[4594]: I1129 05:59:42.368120 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="56294fbe-f3a2-4359-a77c-a21f100aa063" containerName="registry-server" Nov 29 05:59:42 crc kubenswrapper[4594]: I1129 05:59:42.368156 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7f3ca11-a3f9-4bd2-a31f-dba6fec2138b" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Nov 29 05:59:42 crc kubenswrapper[4594]: I1129 05:59:42.369075 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wnzvc" Nov 29 05:59:42 crc kubenswrapper[4594]: I1129 05:59:42.370802 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 29 05:59:42 crc kubenswrapper[4594]: I1129 05:59:42.370948 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b89h2" Nov 29 05:59:42 crc kubenswrapper[4594]: I1129 05:59:42.371048 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 29 05:59:42 crc kubenswrapper[4594]: I1129 05:59:42.372863 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 29 05:59:42 crc kubenswrapper[4594]: I1129 05:59:42.373163 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Nov 29 05:59:42 crc kubenswrapper[4594]: I1129 05:59:42.377935 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wnzvc"] Nov 29 05:59:42 crc kubenswrapper[4594]: I1129 05:59:42.415398 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ec93e11c-8754-4e8c-8e75-d563fb7cef1f-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wnzvc\" (UID: \"ec93e11c-8754-4e8c-8e75-d563fb7cef1f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wnzvc" Nov 29 05:59:42 crc kubenswrapper[4594]: I1129 05:59:42.415440 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wwcr\" (UniqueName: \"kubernetes.io/projected/ec93e11c-8754-4e8c-8e75-d563fb7cef1f-kube-api-access-8wwcr\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wnzvc\" (UID: \"ec93e11c-8754-4e8c-8e75-d563fb7cef1f\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wnzvc" Nov 29 05:59:42 crc kubenswrapper[4594]: I1129 05:59:42.415670 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec93e11c-8754-4e8c-8e75-d563fb7cef1f-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wnzvc\" (UID: \"ec93e11c-8754-4e8c-8e75-d563fb7cef1f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wnzvc" Nov 29 05:59:42 crc kubenswrapper[4594]: I1129 05:59:42.416097 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/ec93e11c-8754-4e8c-8e75-d563fb7cef1f-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wnzvc\" (UID: \"ec93e11c-8754-4e8c-8e75-d563fb7cef1f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wnzvc" Nov 29 05:59:42 crc kubenswrapper[4594]: I1129 05:59:42.416130 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ec93e11c-8754-4e8c-8e75-d563fb7cef1f-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wnzvc\" (UID: \"ec93e11c-8754-4e8c-8e75-d563fb7cef1f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wnzvc" Nov 29 05:59:42 crc kubenswrapper[4594]: I1129 05:59:42.517297 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/ec93e11c-8754-4e8c-8e75-d563fb7cef1f-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wnzvc\" (UID: \"ec93e11c-8754-4e8c-8e75-d563fb7cef1f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wnzvc" Nov 29 05:59:42 crc kubenswrapper[4594]: I1129 05:59:42.517339 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ssh-key\" (UniqueName: \"kubernetes.io/secret/ec93e11c-8754-4e8c-8e75-d563fb7cef1f-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wnzvc\" (UID: \"ec93e11c-8754-4e8c-8e75-d563fb7cef1f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wnzvc" Nov 29 05:59:42 crc kubenswrapper[4594]: I1129 05:59:42.517385 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ec93e11c-8754-4e8c-8e75-d563fb7cef1f-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wnzvc\" (UID: \"ec93e11c-8754-4e8c-8e75-d563fb7cef1f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wnzvc" Nov 29 05:59:42 crc kubenswrapper[4594]: I1129 05:59:42.517412 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wwcr\" (UniqueName: \"kubernetes.io/projected/ec93e11c-8754-4e8c-8e75-d563fb7cef1f-kube-api-access-8wwcr\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wnzvc\" (UID: \"ec93e11c-8754-4e8c-8e75-d563fb7cef1f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wnzvc" Nov 29 05:59:42 crc kubenswrapper[4594]: I1129 05:59:42.517481 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec93e11c-8754-4e8c-8e75-d563fb7cef1f-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wnzvc\" (UID: \"ec93e11c-8754-4e8c-8e75-d563fb7cef1f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wnzvc" Nov 29 05:59:42 crc kubenswrapper[4594]: I1129 05:59:42.523915 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ec93e11c-8754-4e8c-8e75-d563fb7cef1f-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wnzvc\" (UID: \"ec93e11c-8754-4e8c-8e75-d563fb7cef1f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wnzvc" Nov 
29 05:59:42 crc kubenswrapper[4594]: I1129 05:59:42.523962 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/ec93e11c-8754-4e8c-8e75-d563fb7cef1f-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wnzvc\" (UID: \"ec93e11c-8754-4e8c-8e75-d563fb7cef1f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wnzvc" Nov 29 05:59:42 crc kubenswrapper[4594]: I1129 05:59:42.524613 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec93e11c-8754-4e8c-8e75-d563fb7cef1f-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wnzvc\" (UID: \"ec93e11c-8754-4e8c-8e75-d563fb7cef1f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wnzvc" Nov 29 05:59:42 crc kubenswrapper[4594]: I1129 05:59:42.526135 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ec93e11c-8754-4e8c-8e75-d563fb7cef1f-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wnzvc\" (UID: \"ec93e11c-8754-4e8c-8e75-d563fb7cef1f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wnzvc" Nov 29 05:59:42 crc kubenswrapper[4594]: I1129 05:59:42.535500 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wwcr\" (UniqueName: \"kubernetes.io/projected/ec93e11c-8754-4e8c-8e75-d563fb7cef1f-kube-api-access-8wwcr\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wnzvc\" (UID: \"ec93e11c-8754-4e8c-8e75-d563fb7cef1f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wnzvc" Nov 29 05:59:42 crc kubenswrapper[4594]: I1129 05:59:42.691762 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wnzvc" Nov 29 05:59:43 crc kubenswrapper[4594]: I1129 05:59:43.160063 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wnzvc"] Nov 29 05:59:43 crc kubenswrapper[4594]: I1129 05:59:43.294700 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wnzvc" event={"ID":"ec93e11c-8754-4e8c-8e75-d563fb7cef1f","Type":"ContainerStarted","Data":"a442d7b13627f69e5882b806684dac398fc0eb7b8a78041a867d87242a338347"} Nov 29 05:59:44 crc kubenswrapper[4594]: I1129 05:59:44.306441 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wnzvc" event={"ID":"ec93e11c-8754-4e8c-8e75-d563fb7cef1f","Type":"ContainerStarted","Data":"4ceb1520ad79c0837a943154e6284dcb24d8fc25c39d3266d79a7aadec0e34e9"} Nov 29 05:59:44 crc kubenswrapper[4594]: I1129 05:59:44.323427 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wnzvc" podStartSLOduration=1.6096541370000002 podStartE2EDuration="2.323407898s" podCreationTimestamp="2025-11-29 05:59:42 +0000 UTC" firstStartedPulling="2025-11-29 05:59:43.164592249 +0000 UTC m=+1907.405101469" lastFinishedPulling="2025-11-29 05:59:43.878346011 +0000 UTC m=+1908.118855230" observedRunningTime="2025-11-29 05:59:44.320210886 +0000 UTC m=+1908.560720106" watchObservedRunningTime="2025-11-29 05:59:44.323407898 +0000 UTC m=+1908.563917118" Nov 29 05:59:47 crc kubenswrapper[4594]: I1129 05:59:47.084797 4594 scope.go:117] "RemoveContainer" containerID="112625db7a9e2602ec481072051add75f1091c3a06e6a9540b152d2e23c5722f" Nov 29 05:59:47 crc kubenswrapper[4594]: E1129 05:59:47.085552 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:00:00 crc kubenswrapper[4594]: I1129 06:00:00.137120 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29406600-xdz2l"] Nov 29 06:00:00 crc kubenswrapper[4594]: I1129 06:00:00.139168 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29406600-xdz2l" Nov 29 06:00:00 crc kubenswrapper[4594]: I1129 06:00:00.140682 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 29 06:00:00 crc kubenswrapper[4594]: I1129 06:00:00.140949 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 29 06:00:00 crc kubenswrapper[4594]: I1129 06:00:00.146645 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29406600-xdz2l"] Nov 29 06:00:00 crc kubenswrapper[4594]: I1129 06:00:00.192823 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1cdeaa0f-a263-4df7-b6dc-7aa46a5cf888-config-volume\") pod \"collect-profiles-29406600-xdz2l\" (UID: \"1cdeaa0f-a263-4df7-b6dc-7aa46a5cf888\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406600-xdz2l" Nov 29 06:00:00 crc kubenswrapper[4594]: I1129 06:00:00.192925 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5p4v\" (UniqueName: 
\"kubernetes.io/projected/1cdeaa0f-a263-4df7-b6dc-7aa46a5cf888-kube-api-access-k5p4v\") pod \"collect-profiles-29406600-xdz2l\" (UID: \"1cdeaa0f-a263-4df7-b6dc-7aa46a5cf888\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406600-xdz2l" Nov 29 06:00:00 crc kubenswrapper[4594]: I1129 06:00:00.193018 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1cdeaa0f-a263-4df7-b6dc-7aa46a5cf888-secret-volume\") pod \"collect-profiles-29406600-xdz2l\" (UID: \"1cdeaa0f-a263-4df7-b6dc-7aa46a5cf888\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406600-xdz2l" Nov 29 06:00:00 crc kubenswrapper[4594]: I1129 06:00:00.295683 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5p4v\" (UniqueName: \"kubernetes.io/projected/1cdeaa0f-a263-4df7-b6dc-7aa46a5cf888-kube-api-access-k5p4v\") pod \"collect-profiles-29406600-xdz2l\" (UID: \"1cdeaa0f-a263-4df7-b6dc-7aa46a5cf888\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406600-xdz2l" Nov 29 06:00:00 crc kubenswrapper[4594]: I1129 06:00:00.296190 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1cdeaa0f-a263-4df7-b6dc-7aa46a5cf888-secret-volume\") pod \"collect-profiles-29406600-xdz2l\" (UID: \"1cdeaa0f-a263-4df7-b6dc-7aa46a5cf888\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406600-xdz2l" Nov 29 06:00:00 crc kubenswrapper[4594]: I1129 06:00:00.296659 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1cdeaa0f-a263-4df7-b6dc-7aa46a5cf888-config-volume\") pod \"collect-profiles-29406600-xdz2l\" (UID: \"1cdeaa0f-a263-4df7-b6dc-7aa46a5cf888\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406600-xdz2l" Nov 29 06:00:00 crc 
kubenswrapper[4594]: I1129 06:00:00.297791 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1cdeaa0f-a263-4df7-b6dc-7aa46a5cf888-config-volume\") pod \"collect-profiles-29406600-xdz2l\" (UID: \"1cdeaa0f-a263-4df7-b6dc-7aa46a5cf888\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406600-xdz2l" Nov 29 06:00:00 crc kubenswrapper[4594]: I1129 06:00:00.302576 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1cdeaa0f-a263-4df7-b6dc-7aa46a5cf888-secret-volume\") pod \"collect-profiles-29406600-xdz2l\" (UID: \"1cdeaa0f-a263-4df7-b6dc-7aa46a5cf888\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406600-xdz2l" Nov 29 06:00:00 crc kubenswrapper[4594]: I1129 06:00:00.314184 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5p4v\" (UniqueName: \"kubernetes.io/projected/1cdeaa0f-a263-4df7-b6dc-7aa46a5cf888-kube-api-access-k5p4v\") pod \"collect-profiles-29406600-xdz2l\" (UID: \"1cdeaa0f-a263-4df7-b6dc-7aa46a5cf888\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406600-xdz2l" Nov 29 06:00:00 crc kubenswrapper[4594]: I1129 06:00:00.456720 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29406600-xdz2l" Nov 29 06:00:00 crc kubenswrapper[4594]: I1129 06:00:00.890755 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29406600-xdz2l"] Nov 29 06:00:01 crc kubenswrapper[4594]: I1129 06:00:01.470029 4594 generic.go:334] "Generic (PLEG): container finished" podID="1cdeaa0f-a263-4df7-b6dc-7aa46a5cf888" containerID="bf251573be3f1fab04ba9f80127e4646b742104bba129cdba0648bee4c952979" exitCode=0 Nov 29 06:00:01 crc kubenswrapper[4594]: I1129 06:00:01.470115 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29406600-xdz2l" event={"ID":"1cdeaa0f-a263-4df7-b6dc-7aa46a5cf888","Type":"ContainerDied","Data":"bf251573be3f1fab04ba9f80127e4646b742104bba129cdba0648bee4c952979"} Nov 29 06:00:01 crc kubenswrapper[4594]: I1129 06:00:01.470969 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29406600-xdz2l" event={"ID":"1cdeaa0f-a263-4df7-b6dc-7aa46a5cf888","Type":"ContainerStarted","Data":"ddfc63387d554ef8d55badc0989a30016f7a1a614d4ff159b496bfde7c9b10e2"} Nov 29 06:00:02 crc kubenswrapper[4594]: I1129 06:00:02.084243 4594 scope.go:117] "RemoveContainer" containerID="112625db7a9e2602ec481072051add75f1091c3a06e6a9540b152d2e23c5722f" Nov 29 06:00:02 crc kubenswrapper[4594]: E1129 06:00:02.085044 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:00:02 crc kubenswrapper[4594]: I1129 06:00:02.765645 4594 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29406600-xdz2l" Nov 29 06:00:02 crc kubenswrapper[4594]: I1129 06:00:02.848690 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k5p4v\" (UniqueName: \"kubernetes.io/projected/1cdeaa0f-a263-4df7-b6dc-7aa46a5cf888-kube-api-access-k5p4v\") pod \"1cdeaa0f-a263-4df7-b6dc-7aa46a5cf888\" (UID: \"1cdeaa0f-a263-4df7-b6dc-7aa46a5cf888\") " Nov 29 06:00:02 crc kubenswrapper[4594]: I1129 06:00:02.856551 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1cdeaa0f-a263-4df7-b6dc-7aa46a5cf888-kube-api-access-k5p4v" (OuterVolumeSpecName: "kube-api-access-k5p4v") pod "1cdeaa0f-a263-4df7-b6dc-7aa46a5cf888" (UID: "1cdeaa0f-a263-4df7-b6dc-7aa46a5cf888"). InnerVolumeSpecName "kube-api-access-k5p4v". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 06:00:02 crc kubenswrapper[4594]: I1129 06:00:02.951048 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1cdeaa0f-a263-4df7-b6dc-7aa46a5cf888-secret-volume\") pod \"1cdeaa0f-a263-4df7-b6dc-7aa46a5cf888\" (UID: \"1cdeaa0f-a263-4df7-b6dc-7aa46a5cf888\") " Nov 29 06:00:02 crc kubenswrapper[4594]: I1129 06:00:02.951111 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1cdeaa0f-a263-4df7-b6dc-7aa46a5cf888-config-volume\") pod \"1cdeaa0f-a263-4df7-b6dc-7aa46a5cf888\" (UID: \"1cdeaa0f-a263-4df7-b6dc-7aa46a5cf888\") " Nov 29 06:00:02 crc kubenswrapper[4594]: I1129 06:00:02.951840 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1cdeaa0f-a263-4df7-b6dc-7aa46a5cf888-config-volume" (OuterVolumeSpecName: "config-volume") pod "1cdeaa0f-a263-4df7-b6dc-7aa46a5cf888" (UID: 
"1cdeaa0f-a263-4df7-b6dc-7aa46a5cf888"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 06:00:02 crc kubenswrapper[4594]: I1129 06:00:02.952562 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k5p4v\" (UniqueName: \"kubernetes.io/projected/1cdeaa0f-a263-4df7-b6dc-7aa46a5cf888-kube-api-access-k5p4v\") on node \"crc\" DevicePath \"\"" Nov 29 06:00:02 crc kubenswrapper[4594]: I1129 06:00:02.952586 4594 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1cdeaa0f-a263-4df7-b6dc-7aa46a5cf888-config-volume\") on node \"crc\" DevicePath \"\"" Nov 29 06:00:02 crc kubenswrapper[4594]: I1129 06:00:02.955927 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cdeaa0f-a263-4df7-b6dc-7aa46a5cf888-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1cdeaa0f-a263-4df7-b6dc-7aa46a5cf888" (UID: "1cdeaa0f-a263-4df7-b6dc-7aa46a5cf888"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 06:00:03 crc kubenswrapper[4594]: I1129 06:00:03.054738 4594 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1cdeaa0f-a263-4df7-b6dc-7aa46a5cf888-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 29 06:00:03 crc kubenswrapper[4594]: I1129 06:00:03.491148 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29406600-xdz2l" event={"ID":"1cdeaa0f-a263-4df7-b6dc-7aa46a5cf888","Type":"ContainerDied","Data":"ddfc63387d554ef8d55badc0989a30016f7a1a614d4ff159b496bfde7c9b10e2"} Nov 29 06:00:03 crc kubenswrapper[4594]: I1129 06:00:03.491239 4594 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ddfc63387d554ef8d55badc0989a30016f7a1a614d4ff159b496bfde7c9b10e2" Nov 29 06:00:03 crc kubenswrapper[4594]: I1129 06:00:03.491367 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29406600-xdz2l" Nov 29 06:00:14 crc kubenswrapper[4594]: I1129 06:00:14.083460 4594 scope.go:117] "RemoveContainer" containerID="112625db7a9e2602ec481072051add75f1091c3a06e6a9540b152d2e23c5722f" Nov 29 06:00:14 crc kubenswrapper[4594]: E1129 06:00:14.084582 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:00:26 crc kubenswrapper[4594]: I1129 06:00:26.090734 4594 scope.go:117] "RemoveContainer" containerID="112625db7a9e2602ec481072051add75f1091c3a06e6a9540b152d2e23c5722f" Nov 29 06:00:26 crc kubenswrapper[4594]: E1129 
06:00:26.093133 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:00:37 crc kubenswrapper[4594]: I1129 06:00:37.083311 4594 scope.go:117] "RemoveContainer" containerID="112625db7a9e2602ec481072051add75f1091c3a06e6a9540b152d2e23c5722f" Nov 29 06:00:37 crc kubenswrapper[4594]: E1129 06:00:37.084143 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:00:52 crc kubenswrapper[4594]: I1129 06:00:52.083216 4594 scope.go:117] "RemoveContainer" containerID="112625db7a9e2602ec481072051add75f1091c3a06e6a9540b152d2e23c5722f" Nov 29 06:00:52 crc kubenswrapper[4594]: E1129 06:00:52.084033 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:01:00 crc kubenswrapper[4594]: I1129 06:01:00.147389 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29406601-5wlh9"] Nov 29 06:01:00 crc kubenswrapper[4594]: E1129 
06:01:00.148731 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cdeaa0f-a263-4df7-b6dc-7aa46a5cf888" containerName="collect-profiles" Nov 29 06:01:00 crc kubenswrapper[4594]: I1129 06:01:00.148748 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cdeaa0f-a263-4df7-b6dc-7aa46a5cf888" containerName="collect-profiles" Nov 29 06:01:00 crc kubenswrapper[4594]: I1129 06:01:00.148958 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="1cdeaa0f-a263-4df7-b6dc-7aa46a5cf888" containerName="collect-profiles" Nov 29 06:01:00 crc kubenswrapper[4594]: I1129 06:01:00.149865 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29406601-5wlh9" Nov 29 06:01:00 crc kubenswrapper[4594]: I1129 06:01:00.156016 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29406601-5wlh9"] Nov 29 06:01:00 crc kubenswrapper[4594]: I1129 06:01:00.339682 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzkts\" (UniqueName: \"kubernetes.io/projected/45955c1f-8326-47e3-ba4b-3c6ea134e496-kube-api-access-pzkts\") pod \"keystone-cron-29406601-5wlh9\" (UID: \"45955c1f-8326-47e3-ba4b-3c6ea134e496\") " pod="openstack/keystone-cron-29406601-5wlh9" Nov 29 06:01:00 crc kubenswrapper[4594]: I1129 06:01:00.339743 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45955c1f-8326-47e3-ba4b-3c6ea134e496-combined-ca-bundle\") pod \"keystone-cron-29406601-5wlh9\" (UID: \"45955c1f-8326-47e3-ba4b-3c6ea134e496\") " pod="openstack/keystone-cron-29406601-5wlh9" Nov 29 06:01:00 crc kubenswrapper[4594]: I1129 06:01:00.339777 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/45955c1f-8326-47e3-ba4b-3c6ea134e496-config-data\") pod \"keystone-cron-29406601-5wlh9\" (UID: \"45955c1f-8326-47e3-ba4b-3c6ea134e496\") " pod="openstack/keystone-cron-29406601-5wlh9" Nov 29 06:01:00 crc kubenswrapper[4594]: I1129 06:01:00.339922 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/45955c1f-8326-47e3-ba4b-3c6ea134e496-fernet-keys\") pod \"keystone-cron-29406601-5wlh9\" (UID: \"45955c1f-8326-47e3-ba4b-3c6ea134e496\") " pod="openstack/keystone-cron-29406601-5wlh9" Nov 29 06:01:00 crc kubenswrapper[4594]: I1129 06:01:00.441821 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzkts\" (UniqueName: \"kubernetes.io/projected/45955c1f-8326-47e3-ba4b-3c6ea134e496-kube-api-access-pzkts\") pod \"keystone-cron-29406601-5wlh9\" (UID: \"45955c1f-8326-47e3-ba4b-3c6ea134e496\") " pod="openstack/keystone-cron-29406601-5wlh9" Nov 29 06:01:00 crc kubenswrapper[4594]: I1129 06:01:00.441879 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45955c1f-8326-47e3-ba4b-3c6ea134e496-combined-ca-bundle\") pod \"keystone-cron-29406601-5wlh9\" (UID: \"45955c1f-8326-47e3-ba4b-3c6ea134e496\") " pod="openstack/keystone-cron-29406601-5wlh9" Nov 29 06:01:00 crc kubenswrapper[4594]: I1129 06:01:00.441902 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45955c1f-8326-47e3-ba4b-3c6ea134e496-config-data\") pod \"keystone-cron-29406601-5wlh9\" (UID: \"45955c1f-8326-47e3-ba4b-3c6ea134e496\") " pod="openstack/keystone-cron-29406601-5wlh9" Nov 29 06:01:00 crc kubenswrapper[4594]: I1129 06:01:00.441963 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/45955c1f-8326-47e3-ba4b-3c6ea134e496-fernet-keys\") pod \"keystone-cron-29406601-5wlh9\" (UID: \"45955c1f-8326-47e3-ba4b-3c6ea134e496\") " pod="openstack/keystone-cron-29406601-5wlh9" Nov 29 06:01:00 crc kubenswrapper[4594]: I1129 06:01:00.448167 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/45955c1f-8326-47e3-ba4b-3c6ea134e496-fernet-keys\") pod \"keystone-cron-29406601-5wlh9\" (UID: \"45955c1f-8326-47e3-ba4b-3c6ea134e496\") " pod="openstack/keystone-cron-29406601-5wlh9" Nov 29 06:01:00 crc kubenswrapper[4594]: I1129 06:01:00.448496 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45955c1f-8326-47e3-ba4b-3c6ea134e496-config-data\") pod \"keystone-cron-29406601-5wlh9\" (UID: \"45955c1f-8326-47e3-ba4b-3c6ea134e496\") " pod="openstack/keystone-cron-29406601-5wlh9" Nov 29 06:01:00 crc kubenswrapper[4594]: I1129 06:01:00.448988 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45955c1f-8326-47e3-ba4b-3c6ea134e496-combined-ca-bundle\") pod \"keystone-cron-29406601-5wlh9\" (UID: \"45955c1f-8326-47e3-ba4b-3c6ea134e496\") " pod="openstack/keystone-cron-29406601-5wlh9" Nov 29 06:01:00 crc kubenswrapper[4594]: I1129 06:01:00.456599 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzkts\" (UniqueName: \"kubernetes.io/projected/45955c1f-8326-47e3-ba4b-3c6ea134e496-kube-api-access-pzkts\") pod \"keystone-cron-29406601-5wlh9\" (UID: \"45955c1f-8326-47e3-ba4b-3c6ea134e496\") " pod="openstack/keystone-cron-29406601-5wlh9" Nov 29 06:01:00 crc kubenswrapper[4594]: I1129 06:01:00.477457 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29406601-5wlh9" Nov 29 06:01:00 crc kubenswrapper[4594]: I1129 06:01:00.888692 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29406601-5wlh9"] Nov 29 06:01:00 crc kubenswrapper[4594]: I1129 06:01:00.999665 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29406601-5wlh9" event={"ID":"45955c1f-8326-47e3-ba4b-3c6ea134e496","Type":"ContainerStarted","Data":"d5fe466d61770c56dbbe104f7cc22a9392504461cba7c3bd1edf0c6c5ae8bc26"} Nov 29 06:01:02 crc kubenswrapper[4594]: I1129 06:01:02.009118 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29406601-5wlh9" event={"ID":"45955c1f-8326-47e3-ba4b-3c6ea134e496","Type":"ContainerStarted","Data":"2d46c332b2421bfb63f5f96965b7ff5535435a7575733df6ac3776ea243bffae"} Nov 29 06:01:02 crc kubenswrapper[4594]: I1129 06:01:02.028396 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29406601-5wlh9" podStartSLOduration=2.028379512 podStartE2EDuration="2.028379512s" podCreationTimestamp="2025-11-29 06:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 06:01:02.022564254 +0000 UTC m=+1986.263073475" watchObservedRunningTime="2025-11-29 06:01:02.028379512 +0000 UTC m=+1986.268888732" Nov 29 06:01:03 crc kubenswrapper[4594]: I1129 06:01:03.084574 4594 scope.go:117] "RemoveContainer" containerID="112625db7a9e2602ec481072051add75f1091c3a06e6a9540b152d2e23c5722f" Nov 29 06:01:03 crc kubenswrapper[4594]: E1129 06:01:03.085471 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:01:04 crc kubenswrapper[4594]: I1129 06:01:04.029497 4594 generic.go:334] "Generic (PLEG): container finished" podID="45955c1f-8326-47e3-ba4b-3c6ea134e496" containerID="2d46c332b2421bfb63f5f96965b7ff5535435a7575733df6ac3776ea243bffae" exitCode=0 Nov 29 06:01:04 crc kubenswrapper[4594]: I1129 06:01:04.029545 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29406601-5wlh9" event={"ID":"45955c1f-8326-47e3-ba4b-3c6ea134e496","Type":"ContainerDied","Data":"2d46c332b2421bfb63f5f96965b7ff5535435a7575733df6ac3776ea243bffae"} Nov 29 06:01:05 crc kubenswrapper[4594]: I1129 06:01:05.323765 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29406601-5wlh9" Nov 29 06:01:05 crc kubenswrapper[4594]: I1129 06:01:05.459112 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pzkts\" (UniqueName: \"kubernetes.io/projected/45955c1f-8326-47e3-ba4b-3c6ea134e496-kube-api-access-pzkts\") pod \"45955c1f-8326-47e3-ba4b-3c6ea134e496\" (UID: \"45955c1f-8326-47e3-ba4b-3c6ea134e496\") " Nov 29 06:01:05 crc kubenswrapper[4594]: I1129 06:01:05.459241 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45955c1f-8326-47e3-ba4b-3c6ea134e496-config-data\") pod \"45955c1f-8326-47e3-ba4b-3c6ea134e496\" (UID: \"45955c1f-8326-47e3-ba4b-3c6ea134e496\") " Nov 29 06:01:05 crc kubenswrapper[4594]: I1129 06:01:05.459299 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45955c1f-8326-47e3-ba4b-3c6ea134e496-combined-ca-bundle\") pod \"45955c1f-8326-47e3-ba4b-3c6ea134e496\" (UID: \"45955c1f-8326-47e3-ba4b-3c6ea134e496\") " Nov 29 06:01:05 crc kubenswrapper[4594]: I1129 
06:01:05.459365 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/45955c1f-8326-47e3-ba4b-3c6ea134e496-fernet-keys\") pod \"45955c1f-8326-47e3-ba4b-3c6ea134e496\" (UID: \"45955c1f-8326-47e3-ba4b-3c6ea134e496\") " Nov 29 06:01:05 crc kubenswrapper[4594]: I1129 06:01:05.464371 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45955c1f-8326-47e3-ba4b-3c6ea134e496-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "45955c1f-8326-47e3-ba4b-3c6ea134e496" (UID: "45955c1f-8326-47e3-ba4b-3c6ea134e496"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 06:01:05 crc kubenswrapper[4594]: I1129 06:01:05.465806 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45955c1f-8326-47e3-ba4b-3c6ea134e496-kube-api-access-pzkts" (OuterVolumeSpecName: "kube-api-access-pzkts") pod "45955c1f-8326-47e3-ba4b-3c6ea134e496" (UID: "45955c1f-8326-47e3-ba4b-3c6ea134e496"). InnerVolumeSpecName "kube-api-access-pzkts". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 06:01:05 crc kubenswrapper[4594]: I1129 06:01:05.483704 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45955c1f-8326-47e3-ba4b-3c6ea134e496-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "45955c1f-8326-47e3-ba4b-3c6ea134e496" (UID: "45955c1f-8326-47e3-ba4b-3c6ea134e496"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 06:01:05 crc kubenswrapper[4594]: I1129 06:01:05.502424 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45955c1f-8326-47e3-ba4b-3c6ea134e496-config-data" (OuterVolumeSpecName: "config-data") pod "45955c1f-8326-47e3-ba4b-3c6ea134e496" (UID: "45955c1f-8326-47e3-ba4b-3c6ea134e496"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 06:01:05 crc kubenswrapper[4594]: I1129 06:01:05.561428 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pzkts\" (UniqueName: \"kubernetes.io/projected/45955c1f-8326-47e3-ba4b-3c6ea134e496-kube-api-access-pzkts\") on node \"crc\" DevicePath \"\"" Nov 29 06:01:05 crc kubenswrapper[4594]: I1129 06:01:05.561462 4594 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45955c1f-8326-47e3-ba4b-3c6ea134e496-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 06:01:05 crc kubenswrapper[4594]: I1129 06:01:05.561475 4594 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45955c1f-8326-47e3-ba4b-3c6ea134e496-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 06:01:05 crc kubenswrapper[4594]: I1129 06:01:05.561486 4594 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/45955c1f-8326-47e3-ba4b-3c6ea134e496-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 29 06:01:06 crc kubenswrapper[4594]: I1129 06:01:06.060071 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29406601-5wlh9" event={"ID":"45955c1f-8326-47e3-ba4b-3c6ea134e496","Type":"ContainerDied","Data":"d5fe466d61770c56dbbe104f7cc22a9392504461cba7c3bd1edf0c6c5ae8bc26"} Nov 29 06:01:06 crc kubenswrapper[4594]: I1129 06:01:06.060125 4594 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d5fe466d61770c56dbbe104f7cc22a9392504461cba7c3bd1edf0c6c5ae8bc26" Nov 29 06:01:06 crc kubenswrapper[4594]: I1129 06:01:06.060656 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29406601-5wlh9" Nov 29 06:01:14 crc kubenswrapper[4594]: I1129 06:01:14.084249 4594 scope.go:117] "RemoveContainer" containerID="112625db7a9e2602ec481072051add75f1091c3a06e6a9540b152d2e23c5722f" Nov 29 06:01:14 crc kubenswrapper[4594]: E1129 06:01:14.085139 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:01:26 crc kubenswrapper[4594]: I1129 06:01:26.089106 4594 scope.go:117] "RemoveContainer" containerID="112625db7a9e2602ec481072051add75f1091c3a06e6a9540b152d2e23c5722f" Nov 29 06:01:27 crc kubenswrapper[4594]: I1129 06:01:27.243662 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" event={"ID":"509844d4-3ee1-4059-84bb-6e90200f50c5","Type":"ContainerStarted","Data":"9e18bd53322063d829249c7999cf4c97247a11382fee69b79893aa2fbd67ae57"} Nov 29 06:02:39 crc kubenswrapper[4594]: I1129 06:02:39.968501 4594 generic.go:334] "Generic (PLEG): container finished" podID="ec93e11c-8754-4e8c-8e75-d563fb7cef1f" containerID="4ceb1520ad79c0837a943154e6284dcb24d8fc25c39d3266d79a7aadec0e34e9" exitCode=0 Nov 29 06:02:39 crc kubenswrapper[4594]: I1129 06:02:39.968573 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wnzvc" event={"ID":"ec93e11c-8754-4e8c-8e75-d563fb7cef1f","Type":"ContainerDied","Data":"4ceb1520ad79c0837a943154e6284dcb24d8fc25c39d3266d79a7aadec0e34e9"} Nov 29 06:02:41 crc kubenswrapper[4594]: I1129 06:02:41.319475 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wnzvc" Nov 29 06:02:41 crc kubenswrapper[4594]: I1129 06:02:41.500746 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ec93e11c-8754-4e8c-8e75-d563fb7cef1f-ssh-key\") pod \"ec93e11c-8754-4e8c-8e75-d563fb7cef1f\" (UID: \"ec93e11c-8754-4e8c-8e75-d563fb7cef1f\") " Nov 29 06:02:41 crc kubenswrapper[4594]: I1129 06:02:41.500856 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec93e11c-8754-4e8c-8e75-d563fb7cef1f-libvirt-combined-ca-bundle\") pod \"ec93e11c-8754-4e8c-8e75-d563fb7cef1f\" (UID: \"ec93e11c-8754-4e8c-8e75-d563fb7cef1f\") " Nov 29 06:02:41 crc kubenswrapper[4594]: I1129 06:02:41.501055 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ec93e11c-8754-4e8c-8e75-d563fb7cef1f-inventory\") pod \"ec93e11c-8754-4e8c-8e75-d563fb7cef1f\" (UID: \"ec93e11c-8754-4e8c-8e75-d563fb7cef1f\") " Nov 29 06:02:41 crc kubenswrapper[4594]: I1129 06:02:41.501086 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/ec93e11c-8754-4e8c-8e75-d563fb7cef1f-libvirt-secret-0\") pod \"ec93e11c-8754-4e8c-8e75-d563fb7cef1f\" (UID: \"ec93e11c-8754-4e8c-8e75-d563fb7cef1f\") " Nov 29 06:02:41 crc kubenswrapper[4594]: I1129 06:02:41.501114 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wwcr\" (UniqueName: \"kubernetes.io/projected/ec93e11c-8754-4e8c-8e75-d563fb7cef1f-kube-api-access-8wwcr\") pod \"ec93e11c-8754-4e8c-8e75-d563fb7cef1f\" (UID: \"ec93e11c-8754-4e8c-8e75-d563fb7cef1f\") " Nov 29 06:02:41 crc kubenswrapper[4594]: I1129 06:02:41.508533 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/projected/ec93e11c-8754-4e8c-8e75-d563fb7cef1f-kube-api-access-8wwcr" (OuterVolumeSpecName: "kube-api-access-8wwcr") pod "ec93e11c-8754-4e8c-8e75-d563fb7cef1f" (UID: "ec93e11c-8754-4e8c-8e75-d563fb7cef1f"). InnerVolumeSpecName "kube-api-access-8wwcr". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 06:02:41 crc kubenswrapper[4594]: I1129 06:02:41.509347 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec93e11c-8754-4e8c-8e75-d563fb7cef1f-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "ec93e11c-8754-4e8c-8e75-d563fb7cef1f" (UID: "ec93e11c-8754-4e8c-8e75-d563fb7cef1f"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 06:02:41 crc kubenswrapper[4594]: I1129 06:02:41.526575 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec93e11c-8754-4e8c-8e75-d563fb7cef1f-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "ec93e11c-8754-4e8c-8e75-d563fb7cef1f" (UID: "ec93e11c-8754-4e8c-8e75-d563fb7cef1f"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 06:02:41 crc kubenswrapper[4594]: I1129 06:02:41.528046 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec93e11c-8754-4e8c-8e75-d563fb7cef1f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ec93e11c-8754-4e8c-8e75-d563fb7cef1f" (UID: "ec93e11c-8754-4e8c-8e75-d563fb7cef1f"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 06:02:41 crc kubenswrapper[4594]: I1129 06:02:41.528457 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec93e11c-8754-4e8c-8e75-d563fb7cef1f-inventory" (OuterVolumeSpecName: "inventory") pod "ec93e11c-8754-4e8c-8e75-d563fb7cef1f" (UID: "ec93e11c-8754-4e8c-8e75-d563fb7cef1f"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 06:02:41 crc kubenswrapper[4594]: I1129 06:02:41.604213 4594 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ec93e11c-8754-4e8c-8e75-d563fb7cef1f-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 29 06:02:41 crc kubenswrapper[4594]: I1129 06:02:41.604266 4594 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec93e11c-8754-4e8c-8e75-d563fb7cef1f-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 06:02:41 crc kubenswrapper[4594]: I1129 06:02:41.604281 4594 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ec93e11c-8754-4e8c-8e75-d563fb7cef1f-inventory\") on node \"crc\" DevicePath \"\"" Nov 29 06:02:41 crc kubenswrapper[4594]: I1129 06:02:41.604292 4594 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/ec93e11c-8754-4e8c-8e75-d563fb7cef1f-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Nov 29 06:02:41 crc kubenswrapper[4594]: I1129 06:02:41.604303 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wwcr\" (UniqueName: \"kubernetes.io/projected/ec93e11c-8754-4e8c-8e75-d563fb7cef1f-kube-api-access-8wwcr\") on node \"crc\" DevicePath \"\"" Nov 29 06:02:41 crc kubenswrapper[4594]: I1129 06:02:41.990024 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wnzvc" event={"ID":"ec93e11c-8754-4e8c-8e75-d563fb7cef1f","Type":"ContainerDied","Data":"a442d7b13627f69e5882b806684dac398fc0eb7b8a78041a867d87242a338347"} Nov 29 06:02:41 crc kubenswrapper[4594]: I1129 06:02:41.990099 4594 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a442d7b13627f69e5882b806684dac398fc0eb7b8a78041a867d87242a338347" Nov 29 06:02:41 
crc kubenswrapper[4594]: I1129 06:02:41.990117 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wnzvc" Nov 29 06:02:42 crc kubenswrapper[4594]: I1129 06:02:42.070192 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-txzng"] Nov 29 06:02:42 crc kubenswrapper[4594]: E1129 06:02:42.070639 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45955c1f-8326-47e3-ba4b-3c6ea134e496" containerName="keystone-cron" Nov 29 06:02:42 crc kubenswrapper[4594]: I1129 06:02:42.070661 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="45955c1f-8326-47e3-ba4b-3c6ea134e496" containerName="keystone-cron" Nov 29 06:02:42 crc kubenswrapper[4594]: E1129 06:02:42.070702 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec93e11c-8754-4e8c-8e75-d563fb7cef1f" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Nov 29 06:02:42 crc kubenswrapper[4594]: I1129 06:02:42.070709 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec93e11c-8754-4e8c-8e75-d563fb7cef1f" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Nov 29 06:02:42 crc kubenswrapper[4594]: I1129 06:02:42.070918 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="45955c1f-8326-47e3-ba4b-3c6ea134e496" containerName="keystone-cron" Nov 29 06:02:42 crc kubenswrapper[4594]: I1129 06:02:42.070949 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec93e11c-8754-4e8c-8e75-d563fb7cef1f" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Nov 29 06:02:42 crc kubenswrapper[4594]: I1129 06:02:42.071661 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-txzng" Nov 29 06:02:42 crc kubenswrapper[4594]: I1129 06:02:42.073708 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 29 06:02:42 crc kubenswrapper[4594]: I1129 06:02:42.073998 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 29 06:02:42 crc kubenswrapper[4594]: I1129 06:02:42.075072 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Nov 29 06:02:42 crc kubenswrapper[4594]: I1129 06:02:42.075523 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 29 06:02:42 crc kubenswrapper[4594]: I1129 06:02:42.075799 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Nov 29 06:02:42 crc kubenswrapper[4594]: I1129 06:02:42.075831 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b89h2" Nov 29 06:02:42 crc kubenswrapper[4594]: I1129 06:02:42.078084 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-txzng"] Nov 29 06:02:42 crc kubenswrapper[4594]: I1129 06:02:42.079758 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Nov 29 06:02:42 crc kubenswrapper[4594]: I1129 06:02:42.217310 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/e5b69862-fd4c-4f01-977a-3d7f9bcce932-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-txzng\" (UID: \"e5b69862-fd4c-4f01-977a-3d7f9bcce932\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-txzng" Nov 29 06:02:42 crc kubenswrapper[4594]: 
I1129 06:02:42.217405 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5b69862-fd4c-4f01-977a-3d7f9bcce932-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-txzng\" (UID: \"e5b69862-fd4c-4f01-977a-3d7f9bcce932\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-txzng" Nov 29 06:02:42 crc kubenswrapper[4594]: I1129 06:02:42.217434 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e5b69862-fd4c-4f01-977a-3d7f9bcce932-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-txzng\" (UID: \"e5b69862-fd4c-4f01-977a-3d7f9bcce932\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-txzng" Nov 29 06:02:42 crc kubenswrapper[4594]: I1129 06:02:42.217474 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/e5b69862-fd4c-4f01-977a-3d7f9bcce932-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-txzng\" (UID: \"e5b69862-fd4c-4f01-977a-3d7f9bcce932\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-txzng" Nov 29 06:02:42 crc kubenswrapper[4594]: I1129 06:02:42.217593 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/e5b69862-fd4c-4f01-977a-3d7f9bcce932-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-txzng\" (UID: \"e5b69862-fd4c-4f01-977a-3d7f9bcce932\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-txzng" Nov 29 06:02:42 crc kubenswrapper[4594]: I1129 06:02:42.217653 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/e5b69862-fd4c-4f01-977a-3d7f9bcce932-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-txzng\" (UID: \"e5b69862-fd4c-4f01-977a-3d7f9bcce932\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-txzng" Nov 29 06:02:42 crc kubenswrapper[4594]: I1129 06:02:42.217781 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsztn\" (UniqueName: \"kubernetes.io/projected/e5b69862-fd4c-4f01-977a-3d7f9bcce932-kube-api-access-nsztn\") pod \"nova-edpm-deployment-openstack-edpm-ipam-txzng\" (UID: \"e5b69862-fd4c-4f01-977a-3d7f9bcce932\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-txzng" Nov 29 06:02:42 crc kubenswrapper[4594]: I1129 06:02:42.218173 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/e5b69862-fd4c-4f01-977a-3d7f9bcce932-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-txzng\" (UID: \"e5b69862-fd4c-4f01-977a-3d7f9bcce932\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-txzng" Nov 29 06:02:42 crc kubenswrapper[4594]: I1129 06:02:42.218214 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/e5b69862-fd4c-4f01-977a-3d7f9bcce932-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-txzng\" (UID: \"e5b69862-fd4c-4f01-977a-3d7f9bcce932\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-txzng" Nov 29 06:02:42 crc kubenswrapper[4594]: I1129 06:02:42.321395 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/e5b69862-fd4c-4f01-977a-3d7f9bcce932-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-txzng\" (UID: 
\"e5b69862-fd4c-4f01-977a-3d7f9bcce932\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-txzng" Nov 29 06:02:42 crc kubenswrapper[4594]: I1129 06:02:42.321920 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5b69862-fd4c-4f01-977a-3d7f9bcce932-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-txzng\" (UID: \"e5b69862-fd4c-4f01-977a-3d7f9bcce932\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-txzng" Nov 29 06:02:42 crc kubenswrapper[4594]: I1129 06:02:42.321999 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e5b69862-fd4c-4f01-977a-3d7f9bcce932-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-txzng\" (UID: \"e5b69862-fd4c-4f01-977a-3d7f9bcce932\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-txzng" Nov 29 06:02:42 crc kubenswrapper[4594]: I1129 06:02:42.322045 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/e5b69862-fd4c-4f01-977a-3d7f9bcce932-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-txzng\" (UID: \"e5b69862-fd4c-4f01-977a-3d7f9bcce932\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-txzng" Nov 29 06:02:42 crc kubenswrapper[4594]: I1129 06:02:42.322121 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/e5b69862-fd4c-4f01-977a-3d7f9bcce932-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-txzng\" (UID: \"e5b69862-fd4c-4f01-977a-3d7f9bcce932\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-txzng" Nov 29 06:02:42 crc kubenswrapper[4594]: I1129 06:02:42.322182 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" 
(UniqueName: \"kubernetes.io/secret/e5b69862-fd4c-4f01-977a-3d7f9bcce932-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-txzng\" (UID: \"e5b69862-fd4c-4f01-977a-3d7f9bcce932\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-txzng" Nov 29 06:02:42 crc kubenswrapper[4594]: I1129 06:02:42.322288 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nsztn\" (UniqueName: \"kubernetes.io/projected/e5b69862-fd4c-4f01-977a-3d7f9bcce932-kube-api-access-nsztn\") pod \"nova-edpm-deployment-openstack-edpm-ipam-txzng\" (UID: \"e5b69862-fd4c-4f01-977a-3d7f9bcce932\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-txzng" Nov 29 06:02:42 crc kubenswrapper[4594]: I1129 06:02:42.322479 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/e5b69862-fd4c-4f01-977a-3d7f9bcce932-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-txzng\" (UID: \"e5b69862-fd4c-4f01-977a-3d7f9bcce932\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-txzng" Nov 29 06:02:42 crc kubenswrapper[4594]: I1129 06:02:42.322515 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/e5b69862-fd4c-4f01-977a-3d7f9bcce932-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-txzng\" (UID: \"e5b69862-fd4c-4f01-977a-3d7f9bcce932\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-txzng" Nov 29 06:02:42 crc kubenswrapper[4594]: I1129 06:02:42.323340 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/e5b69862-fd4c-4f01-977a-3d7f9bcce932-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-txzng\" (UID: \"e5b69862-fd4c-4f01-977a-3d7f9bcce932\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-txzng" Nov 29 06:02:42 crc kubenswrapper[4594]: I1129 06:02:42.326951 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e5b69862-fd4c-4f01-977a-3d7f9bcce932-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-txzng\" (UID: \"e5b69862-fd4c-4f01-977a-3d7f9bcce932\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-txzng" Nov 29 06:02:42 crc kubenswrapper[4594]: I1129 06:02:42.327447 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/e5b69862-fd4c-4f01-977a-3d7f9bcce932-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-txzng\" (UID: \"e5b69862-fd4c-4f01-977a-3d7f9bcce932\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-txzng" Nov 29 06:02:42 crc kubenswrapper[4594]: I1129 06:02:42.327447 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/e5b69862-fd4c-4f01-977a-3d7f9bcce932-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-txzng\" (UID: \"e5b69862-fd4c-4f01-977a-3d7f9bcce932\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-txzng" Nov 29 06:02:42 crc kubenswrapper[4594]: I1129 06:02:42.328182 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e5b69862-fd4c-4f01-977a-3d7f9bcce932-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-txzng\" (UID: \"e5b69862-fd4c-4f01-977a-3d7f9bcce932\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-txzng" Nov 29 06:02:42 crc kubenswrapper[4594]: I1129 06:02:42.328238 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e5b69862-fd4c-4f01-977a-3d7f9bcce932-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-txzng\" (UID: \"e5b69862-fd4c-4f01-977a-3d7f9bcce932\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-txzng" Nov 29 06:02:42 crc kubenswrapper[4594]: I1129 06:02:42.328958 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/e5b69862-fd4c-4f01-977a-3d7f9bcce932-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-txzng\" (UID: \"e5b69862-fd4c-4f01-977a-3d7f9bcce932\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-txzng" Nov 29 06:02:42 crc kubenswrapper[4594]: I1129 06:02:42.329574 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/e5b69862-fd4c-4f01-977a-3d7f9bcce932-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-txzng\" (UID: \"e5b69862-fd4c-4f01-977a-3d7f9bcce932\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-txzng" Nov 29 06:02:42 crc kubenswrapper[4594]: I1129 06:02:42.338111 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsztn\" (UniqueName: \"kubernetes.io/projected/e5b69862-fd4c-4f01-977a-3d7f9bcce932-kube-api-access-nsztn\") pod \"nova-edpm-deployment-openstack-edpm-ipam-txzng\" (UID: \"e5b69862-fd4c-4f01-977a-3d7f9bcce932\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-txzng" Nov 29 06:02:42 crc kubenswrapper[4594]: I1129 06:02:42.388877 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-txzng" Nov 29 06:02:42 crc kubenswrapper[4594]: I1129 06:02:42.877417 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-txzng"] Nov 29 06:02:43 crc kubenswrapper[4594]: I1129 06:02:43.004071 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-txzng" event={"ID":"e5b69862-fd4c-4f01-977a-3d7f9bcce932","Type":"ContainerStarted","Data":"64a6879edd3bd59242ee285abc23810f252d06f6e034a6e31677a2535446b057"} Nov 29 06:02:44 crc kubenswrapper[4594]: I1129 06:02:44.016666 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-txzng" event={"ID":"e5b69862-fd4c-4f01-977a-3d7f9bcce932","Type":"ContainerStarted","Data":"298dcfa4f8cdbec9b8d62177a54b2004112141a71f8424c8bb9d2f5a2991cfc2"} Nov 29 06:02:44 crc kubenswrapper[4594]: I1129 06:02:44.042517 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-txzng" podStartSLOduration=1.443978947 podStartE2EDuration="2.042497324s" podCreationTimestamp="2025-11-29 06:02:42 +0000 UTC" firstStartedPulling="2025-11-29 06:02:42.889924316 +0000 UTC m=+2087.130433536" lastFinishedPulling="2025-11-29 06:02:43.488442694 +0000 UTC m=+2087.728951913" observedRunningTime="2025-11-29 06:02:44.033406065 +0000 UTC m=+2088.273915286" watchObservedRunningTime="2025-11-29 06:02:44.042497324 +0000 UTC m=+2088.283006544" Nov 29 06:03:45 crc kubenswrapper[4594]: I1129 06:03:45.800587 4594 patch_prober.go:28] interesting pod/machine-config-daemon-ggz4n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 06:03:45 crc kubenswrapper[4594]: I1129 06:03:45.801298 
4594 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 06:03:52 crc kubenswrapper[4594]: I1129 06:03:52.313651 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wmpkd"] Nov 29 06:03:52 crc kubenswrapper[4594]: I1129 06:03:52.316382 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wmpkd" Nov 29 06:03:52 crc kubenswrapper[4594]: I1129 06:03:52.327029 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wmpkd"] Nov 29 06:03:52 crc kubenswrapper[4594]: I1129 06:03:52.380916 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a06c67e-ef01-4230-8e8d-dbb3701dca49-catalog-content\") pod \"community-operators-wmpkd\" (UID: \"3a06c67e-ef01-4230-8e8d-dbb3701dca49\") " pod="openshift-marketplace/community-operators-wmpkd" Nov 29 06:03:52 crc kubenswrapper[4594]: I1129 06:03:52.381021 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pm9n9\" (UniqueName: \"kubernetes.io/projected/3a06c67e-ef01-4230-8e8d-dbb3701dca49-kube-api-access-pm9n9\") pod \"community-operators-wmpkd\" (UID: \"3a06c67e-ef01-4230-8e8d-dbb3701dca49\") " pod="openshift-marketplace/community-operators-wmpkd" Nov 29 06:03:52 crc kubenswrapper[4594]: I1129 06:03:52.381249 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a06c67e-ef01-4230-8e8d-dbb3701dca49-utilities\") pod 
\"community-operators-wmpkd\" (UID: \"3a06c67e-ef01-4230-8e8d-dbb3701dca49\") " pod="openshift-marketplace/community-operators-wmpkd" Nov 29 06:03:52 crc kubenswrapper[4594]: I1129 06:03:52.483630 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a06c67e-ef01-4230-8e8d-dbb3701dca49-catalog-content\") pod \"community-operators-wmpkd\" (UID: \"3a06c67e-ef01-4230-8e8d-dbb3701dca49\") " pod="openshift-marketplace/community-operators-wmpkd" Nov 29 06:03:52 crc kubenswrapper[4594]: I1129 06:03:52.483707 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pm9n9\" (UniqueName: \"kubernetes.io/projected/3a06c67e-ef01-4230-8e8d-dbb3701dca49-kube-api-access-pm9n9\") pod \"community-operators-wmpkd\" (UID: \"3a06c67e-ef01-4230-8e8d-dbb3701dca49\") " pod="openshift-marketplace/community-operators-wmpkd" Nov 29 06:03:52 crc kubenswrapper[4594]: I1129 06:03:52.483772 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a06c67e-ef01-4230-8e8d-dbb3701dca49-utilities\") pod \"community-operators-wmpkd\" (UID: \"3a06c67e-ef01-4230-8e8d-dbb3701dca49\") " pod="openshift-marketplace/community-operators-wmpkd" Nov 29 06:03:52 crc kubenswrapper[4594]: I1129 06:03:52.484215 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a06c67e-ef01-4230-8e8d-dbb3701dca49-utilities\") pod \"community-operators-wmpkd\" (UID: \"3a06c67e-ef01-4230-8e8d-dbb3701dca49\") " pod="openshift-marketplace/community-operators-wmpkd" Nov 29 06:03:52 crc kubenswrapper[4594]: I1129 06:03:52.484226 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a06c67e-ef01-4230-8e8d-dbb3701dca49-catalog-content\") pod \"community-operators-wmpkd\" (UID: 
\"3a06c67e-ef01-4230-8e8d-dbb3701dca49\") " pod="openshift-marketplace/community-operators-wmpkd" Nov 29 06:03:52 crc kubenswrapper[4594]: I1129 06:03:52.503324 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pm9n9\" (UniqueName: \"kubernetes.io/projected/3a06c67e-ef01-4230-8e8d-dbb3701dca49-kube-api-access-pm9n9\") pod \"community-operators-wmpkd\" (UID: \"3a06c67e-ef01-4230-8e8d-dbb3701dca49\") " pod="openshift-marketplace/community-operators-wmpkd" Nov 29 06:03:52 crc kubenswrapper[4594]: I1129 06:03:52.632976 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wmpkd" Nov 29 06:03:53 crc kubenswrapper[4594]: I1129 06:03:53.143407 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wmpkd"] Nov 29 06:03:53 crc kubenswrapper[4594]: I1129 06:03:53.676799 4594 generic.go:334] "Generic (PLEG): container finished" podID="3a06c67e-ef01-4230-8e8d-dbb3701dca49" containerID="f124024e1ce469e8bbd89b01b3301f05c54e1fd1a9dd3d7b520554bcacfdd870" exitCode=0 Nov 29 06:03:53 crc kubenswrapper[4594]: I1129 06:03:53.678177 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wmpkd" event={"ID":"3a06c67e-ef01-4230-8e8d-dbb3701dca49","Type":"ContainerDied","Data":"f124024e1ce469e8bbd89b01b3301f05c54e1fd1a9dd3d7b520554bcacfdd870"} Nov 29 06:03:53 crc kubenswrapper[4594]: I1129 06:03:53.679034 4594 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 29 06:03:53 crc kubenswrapper[4594]: I1129 06:03:53.679124 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wmpkd" event={"ID":"3a06c67e-ef01-4230-8e8d-dbb3701dca49","Type":"ContainerStarted","Data":"0f5bc3d7eb32dcfa443efa72ed1d2dbefcfbb50651278d1a995ff9a4a8fd3764"} Nov 29 06:03:54 crc kubenswrapper[4594]: I1129 06:03:54.690585 4594 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wmpkd" event={"ID":"3a06c67e-ef01-4230-8e8d-dbb3701dca49","Type":"ContainerStarted","Data":"d99097053c58e9b64c4f7031eaad8e3ae9ee886e1a7b7b25ecfa508c8f26ed5e"} Nov 29 06:03:55 crc kubenswrapper[4594]: I1129 06:03:55.701118 4594 generic.go:334] "Generic (PLEG): container finished" podID="3a06c67e-ef01-4230-8e8d-dbb3701dca49" containerID="d99097053c58e9b64c4f7031eaad8e3ae9ee886e1a7b7b25ecfa508c8f26ed5e" exitCode=0 Nov 29 06:03:55 crc kubenswrapper[4594]: I1129 06:03:55.701178 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wmpkd" event={"ID":"3a06c67e-ef01-4230-8e8d-dbb3701dca49","Type":"ContainerDied","Data":"d99097053c58e9b64c4f7031eaad8e3ae9ee886e1a7b7b25ecfa508c8f26ed5e"} Nov 29 06:03:56 crc kubenswrapper[4594]: I1129 06:03:56.714707 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wmpkd" event={"ID":"3a06c67e-ef01-4230-8e8d-dbb3701dca49","Type":"ContainerStarted","Data":"a2a21ad3785e013a3234ccc915a769a232b5775db3d0d97d2e93375415ba53ae"} Nov 29 06:03:56 crc kubenswrapper[4594]: I1129 06:03:56.737506 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wmpkd" podStartSLOduration=2.195112167 podStartE2EDuration="4.737482333s" podCreationTimestamp="2025-11-29 06:03:52 +0000 UTC" firstStartedPulling="2025-11-29 06:03:53.678641607 +0000 UTC m=+2157.919150828" lastFinishedPulling="2025-11-29 06:03:56.221011774 +0000 UTC m=+2160.461520994" observedRunningTime="2025-11-29 06:03:56.729828959 +0000 UTC m=+2160.970338179" watchObservedRunningTime="2025-11-29 06:03:56.737482333 +0000 UTC m=+2160.977991553" Nov 29 06:04:02 crc kubenswrapper[4594]: I1129 06:04:02.634413 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wmpkd" Nov 29 06:04:02 crc 
kubenswrapper[4594]: I1129 06:04:02.635063 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wmpkd" Nov 29 06:04:02 crc kubenswrapper[4594]: I1129 06:04:02.677649 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wmpkd" Nov 29 06:04:02 crc kubenswrapper[4594]: I1129 06:04:02.816647 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wmpkd" Nov 29 06:04:02 crc kubenswrapper[4594]: I1129 06:04:02.923875 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wmpkd"] Nov 29 06:04:04 crc kubenswrapper[4594]: I1129 06:04:04.796294 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wmpkd" podUID="3a06c67e-ef01-4230-8e8d-dbb3701dca49" containerName="registry-server" containerID="cri-o://a2a21ad3785e013a3234ccc915a769a232b5775db3d0d97d2e93375415ba53ae" gracePeriod=2 Nov 29 06:04:05 crc kubenswrapper[4594]: I1129 06:04:05.258556 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wmpkd" Nov 29 06:04:05 crc kubenswrapper[4594]: I1129 06:04:05.277547 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pm9n9\" (UniqueName: \"kubernetes.io/projected/3a06c67e-ef01-4230-8e8d-dbb3701dca49-kube-api-access-pm9n9\") pod \"3a06c67e-ef01-4230-8e8d-dbb3701dca49\" (UID: \"3a06c67e-ef01-4230-8e8d-dbb3701dca49\") " Nov 29 06:04:05 crc kubenswrapper[4594]: I1129 06:04:05.277652 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a06c67e-ef01-4230-8e8d-dbb3701dca49-utilities\") pod \"3a06c67e-ef01-4230-8e8d-dbb3701dca49\" (UID: \"3a06c67e-ef01-4230-8e8d-dbb3701dca49\") " Nov 29 06:04:05 crc kubenswrapper[4594]: I1129 06:04:05.277925 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a06c67e-ef01-4230-8e8d-dbb3701dca49-catalog-content\") pod \"3a06c67e-ef01-4230-8e8d-dbb3701dca49\" (UID: \"3a06c67e-ef01-4230-8e8d-dbb3701dca49\") " Nov 29 06:04:05 crc kubenswrapper[4594]: I1129 06:04:05.278465 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a06c67e-ef01-4230-8e8d-dbb3701dca49-utilities" (OuterVolumeSpecName: "utilities") pod "3a06c67e-ef01-4230-8e8d-dbb3701dca49" (UID: "3a06c67e-ef01-4230-8e8d-dbb3701dca49"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 06:04:05 crc kubenswrapper[4594]: I1129 06:04:05.278984 4594 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a06c67e-ef01-4230-8e8d-dbb3701dca49-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 06:04:05 crc kubenswrapper[4594]: I1129 06:04:05.288397 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a06c67e-ef01-4230-8e8d-dbb3701dca49-kube-api-access-pm9n9" (OuterVolumeSpecName: "kube-api-access-pm9n9") pod "3a06c67e-ef01-4230-8e8d-dbb3701dca49" (UID: "3a06c67e-ef01-4230-8e8d-dbb3701dca49"). InnerVolumeSpecName "kube-api-access-pm9n9". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 06:04:05 crc kubenswrapper[4594]: I1129 06:04:05.324409 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a06c67e-ef01-4230-8e8d-dbb3701dca49-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3a06c67e-ef01-4230-8e8d-dbb3701dca49" (UID: "3a06c67e-ef01-4230-8e8d-dbb3701dca49"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 06:04:05 crc kubenswrapper[4594]: I1129 06:04:05.380930 4594 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a06c67e-ef01-4230-8e8d-dbb3701dca49-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 06:04:05 crc kubenswrapper[4594]: I1129 06:04:05.380967 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pm9n9\" (UniqueName: \"kubernetes.io/projected/3a06c67e-ef01-4230-8e8d-dbb3701dca49-kube-api-access-pm9n9\") on node \"crc\" DevicePath \"\"" Nov 29 06:04:05 crc kubenswrapper[4594]: I1129 06:04:05.808510 4594 generic.go:334] "Generic (PLEG): container finished" podID="3a06c67e-ef01-4230-8e8d-dbb3701dca49" containerID="a2a21ad3785e013a3234ccc915a769a232b5775db3d0d97d2e93375415ba53ae" exitCode=0 Nov 29 06:04:05 crc kubenswrapper[4594]: I1129 06:04:05.808569 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wmpkd" Nov 29 06:04:05 crc kubenswrapper[4594]: I1129 06:04:05.808588 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wmpkd" event={"ID":"3a06c67e-ef01-4230-8e8d-dbb3701dca49","Type":"ContainerDied","Data":"a2a21ad3785e013a3234ccc915a769a232b5775db3d0d97d2e93375415ba53ae"} Nov 29 06:04:05 crc kubenswrapper[4594]: I1129 06:04:05.809658 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wmpkd" event={"ID":"3a06c67e-ef01-4230-8e8d-dbb3701dca49","Type":"ContainerDied","Data":"0f5bc3d7eb32dcfa443efa72ed1d2dbefcfbb50651278d1a995ff9a4a8fd3764"} Nov 29 06:04:05 crc kubenswrapper[4594]: I1129 06:04:05.809691 4594 scope.go:117] "RemoveContainer" containerID="a2a21ad3785e013a3234ccc915a769a232b5775db3d0d97d2e93375415ba53ae" Nov 29 06:04:05 crc kubenswrapper[4594]: I1129 06:04:05.838635 4594 scope.go:117] "RemoveContainer" 
containerID="d99097053c58e9b64c4f7031eaad8e3ae9ee886e1a7b7b25ecfa508c8f26ed5e" Nov 29 06:04:05 crc kubenswrapper[4594]: I1129 06:04:05.843163 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wmpkd"] Nov 29 06:04:05 crc kubenswrapper[4594]: I1129 06:04:05.860746 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wmpkd"] Nov 29 06:04:05 crc kubenswrapper[4594]: I1129 06:04:05.866869 4594 scope.go:117] "RemoveContainer" containerID="f124024e1ce469e8bbd89b01b3301f05c54e1fd1a9dd3d7b520554bcacfdd870" Nov 29 06:04:05 crc kubenswrapper[4594]: I1129 06:04:05.897944 4594 scope.go:117] "RemoveContainer" containerID="a2a21ad3785e013a3234ccc915a769a232b5775db3d0d97d2e93375415ba53ae" Nov 29 06:04:05 crc kubenswrapper[4594]: E1129 06:04:05.898394 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2a21ad3785e013a3234ccc915a769a232b5775db3d0d97d2e93375415ba53ae\": container with ID starting with a2a21ad3785e013a3234ccc915a769a232b5775db3d0d97d2e93375415ba53ae not found: ID does not exist" containerID="a2a21ad3785e013a3234ccc915a769a232b5775db3d0d97d2e93375415ba53ae" Nov 29 06:04:05 crc kubenswrapper[4594]: I1129 06:04:05.898433 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2a21ad3785e013a3234ccc915a769a232b5775db3d0d97d2e93375415ba53ae"} err="failed to get container status \"a2a21ad3785e013a3234ccc915a769a232b5775db3d0d97d2e93375415ba53ae\": rpc error: code = NotFound desc = could not find container \"a2a21ad3785e013a3234ccc915a769a232b5775db3d0d97d2e93375415ba53ae\": container with ID starting with a2a21ad3785e013a3234ccc915a769a232b5775db3d0d97d2e93375415ba53ae not found: ID does not exist" Nov 29 06:04:05 crc kubenswrapper[4594]: I1129 06:04:05.898460 4594 scope.go:117] "RemoveContainer" 
containerID="d99097053c58e9b64c4f7031eaad8e3ae9ee886e1a7b7b25ecfa508c8f26ed5e" Nov 29 06:04:05 crc kubenswrapper[4594]: E1129 06:04:05.898753 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d99097053c58e9b64c4f7031eaad8e3ae9ee886e1a7b7b25ecfa508c8f26ed5e\": container with ID starting with d99097053c58e9b64c4f7031eaad8e3ae9ee886e1a7b7b25ecfa508c8f26ed5e not found: ID does not exist" containerID="d99097053c58e9b64c4f7031eaad8e3ae9ee886e1a7b7b25ecfa508c8f26ed5e" Nov 29 06:04:05 crc kubenswrapper[4594]: I1129 06:04:05.898814 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d99097053c58e9b64c4f7031eaad8e3ae9ee886e1a7b7b25ecfa508c8f26ed5e"} err="failed to get container status \"d99097053c58e9b64c4f7031eaad8e3ae9ee886e1a7b7b25ecfa508c8f26ed5e\": rpc error: code = NotFound desc = could not find container \"d99097053c58e9b64c4f7031eaad8e3ae9ee886e1a7b7b25ecfa508c8f26ed5e\": container with ID starting with d99097053c58e9b64c4f7031eaad8e3ae9ee886e1a7b7b25ecfa508c8f26ed5e not found: ID does not exist" Nov 29 06:04:05 crc kubenswrapper[4594]: I1129 06:04:05.898859 4594 scope.go:117] "RemoveContainer" containerID="f124024e1ce469e8bbd89b01b3301f05c54e1fd1a9dd3d7b520554bcacfdd870" Nov 29 06:04:05 crc kubenswrapper[4594]: E1129 06:04:05.899124 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f124024e1ce469e8bbd89b01b3301f05c54e1fd1a9dd3d7b520554bcacfdd870\": container with ID starting with f124024e1ce469e8bbd89b01b3301f05c54e1fd1a9dd3d7b520554bcacfdd870 not found: ID does not exist" containerID="f124024e1ce469e8bbd89b01b3301f05c54e1fd1a9dd3d7b520554bcacfdd870" Nov 29 06:04:05 crc kubenswrapper[4594]: I1129 06:04:05.899144 4594 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f124024e1ce469e8bbd89b01b3301f05c54e1fd1a9dd3d7b520554bcacfdd870"} err="failed to get container status \"f124024e1ce469e8bbd89b01b3301f05c54e1fd1a9dd3d7b520554bcacfdd870\": rpc error: code = NotFound desc = could not find container \"f124024e1ce469e8bbd89b01b3301f05c54e1fd1a9dd3d7b520554bcacfdd870\": container with ID starting with f124024e1ce469e8bbd89b01b3301f05c54e1fd1a9dd3d7b520554bcacfdd870 not found: ID does not exist" Nov 29 06:04:06 crc kubenswrapper[4594]: I1129 06:04:06.095675 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a06c67e-ef01-4230-8e8d-dbb3701dca49" path="/var/lib/kubelet/pods/3a06c67e-ef01-4230-8e8d-dbb3701dca49/volumes" Nov 29 06:04:15 crc kubenswrapper[4594]: I1129 06:04:15.800474 4594 patch_prober.go:28] interesting pod/machine-config-daemon-ggz4n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 06:04:15 crc kubenswrapper[4594]: I1129 06:04:15.801080 4594 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 06:04:45 crc kubenswrapper[4594]: I1129 06:04:45.800294 4594 patch_prober.go:28] interesting pod/machine-config-daemon-ggz4n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 06:04:45 crc kubenswrapper[4594]: I1129 06:04:45.800996 4594 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 06:04:45 crc kubenswrapper[4594]: I1129 06:04:45.801054 4594 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" Nov 29 06:04:45 crc kubenswrapper[4594]: I1129 06:04:45.801702 4594 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9e18bd53322063d829249c7999cf4c97247a11382fee69b79893aa2fbd67ae57"} pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 29 06:04:45 crc kubenswrapper[4594]: I1129 06:04:45.801777 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" containerName="machine-config-daemon" containerID="cri-o://9e18bd53322063d829249c7999cf4c97247a11382fee69b79893aa2fbd67ae57" gracePeriod=600 Nov 29 06:04:46 crc kubenswrapper[4594]: I1129 06:04:46.215215 4594 generic.go:334] "Generic (PLEG): container finished" podID="509844d4-3ee1-4059-84bb-6e90200f50c5" containerID="9e18bd53322063d829249c7999cf4c97247a11382fee69b79893aa2fbd67ae57" exitCode=0 Nov 29 06:04:46 crc kubenswrapper[4594]: I1129 06:04:46.215279 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" event={"ID":"509844d4-3ee1-4059-84bb-6e90200f50c5","Type":"ContainerDied","Data":"9e18bd53322063d829249c7999cf4c97247a11382fee69b79893aa2fbd67ae57"} Nov 29 06:04:46 crc kubenswrapper[4594]: I1129 06:04:46.215639 4594 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" event={"ID":"509844d4-3ee1-4059-84bb-6e90200f50c5","Type":"ContainerStarted","Data":"546461d6f586f0461ba676d35ee2bc142491a0e28d0e8ba8d66c6c317b14d8f9"} Nov 29 06:04:46 crc kubenswrapper[4594]: I1129 06:04:46.215659 4594 scope.go:117] "RemoveContainer" containerID="112625db7a9e2602ec481072051add75f1091c3a06e6a9540b152d2e23c5722f" Nov 29 06:04:46 crc kubenswrapper[4594]: I1129 06:04:46.217597 4594 generic.go:334] "Generic (PLEG): container finished" podID="e5b69862-fd4c-4f01-977a-3d7f9bcce932" containerID="298dcfa4f8cdbec9b8d62177a54b2004112141a71f8424c8bb9d2f5a2991cfc2" exitCode=0 Nov 29 06:04:46 crc kubenswrapper[4594]: I1129 06:04:46.217629 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-txzng" event={"ID":"e5b69862-fd4c-4f01-977a-3d7f9bcce932","Type":"ContainerDied","Data":"298dcfa4f8cdbec9b8d62177a54b2004112141a71f8424c8bb9d2f5a2991cfc2"} Nov 29 06:04:47 crc kubenswrapper[4594]: I1129 06:04:47.624953 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-txzng" Nov 29 06:04:47 crc kubenswrapper[4594]: I1129 06:04:47.804612 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nsztn\" (UniqueName: \"kubernetes.io/projected/e5b69862-fd4c-4f01-977a-3d7f9bcce932-kube-api-access-nsztn\") pod \"e5b69862-fd4c-4f01-977a-3d7f9bcce932\" (UID: \"e5b69862-fd4c-4f01-977a-3d7f9bcce932\") " Nov 29 06:04:47 crc kubenswrapper[4594]: I1129 06:04:47.805042 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/e5b69862-fd4c-4f01-977a-3d7f9bcce932-nova-cell1-compute-config-1\") pod \"e5b69862-fd4c-4f01-977a-3d7f9bcce932\" (UID: \"e5b69862-fd4c-4f01-977a-3d7f9bcce932\") " Nov 29 06:04:47 crc kubenswrapper[4594]: I1129 06:04:47.805209 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e5b69862-fd4c-4f01-977a-3d7f9bcce932-ssh-key\") pod \"e5b69862-fd4c-4f01-977a-3d7f9bcce932\" (UID: \"e5b69862-fd4c-4f01-977a-3d7f9bcce932\") " Nov 29 06:04:47 crc kubenswrapper[4594]: I1129 06:04:47.805317 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5b69862-fd4c-4f01-977a-3d7f9bcce932-nova-combined-ca-bundle\") pod \"e5b69862-fd4c-4f01-977a-3d7f9bcce932\" (UID: \"e5b69862-fd4c-4f01-977a-3d7f9bcce932\") " Nov 29 06:04:47 crc kubenswrapper[4594]: I1129 06:04:47.805659 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/e5b69862-fd4c-4f01-977a-3d7f9bcce932-nova-cell1-compute-config-0\") pod \"e5b69862-fd4c-4f01-977a-3d7f9bcce932\" (UID: \"e5b69862-fd4c-4f01-977a-3d7f9bcce932\") " Nov 29 06:04:47 crc kubenswrapper[4594]: I1129 06:04:47.805695 4594 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/e5b69862-fd4c-4f01-977a-3d7f9bcce932-nova-migration-ssh-key-0\") pod \"e5b69862-fd4c-4f01-977a-3d7f9bcce932\" (UID: \"e5b69862-fd4c-4f01-977a-3d7f9bcce932\") " Nov 29 06:04:47 crc kubenswrapper[4594]: I1129 06:04:47.805752 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/e5b69862-fd4c-4f01-977a-3d7f9bcce932-nova-migration-ssh-key-1\") pod \"e5b69862-fd4c-4f01-977a-3d7f9bcce932\" (UID: \"e5b69862-fd4c-4f01-977a-3d7f9bcce932\") " Nov 29 06:04:47 crc kubenswrapper[4594]: I1129 06:04:47.805828 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/e5b69862-fd4c-4f01-977a-3d7f9bcce932-nova-extra-config-0\") pod \"e5b69862-fd4c-4f01-977a-3d7f9bcce932\" (UID: \"e5b69862-fd4c-4f01-977a-3d7f9bcce932\") " Nov 29 06:04:47 crc kubenswrapper[4594]: I1129 06:04:47.805846 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e5b69862-fd4c-4f01-977a-3d7f9bcce932-inventory\") pod \"e5b69862-fd4c-4f01-977a-3d7f9bcce932\" (UID: \"e5b69862-fd4c-4f01-977a-3d7f9bcce932\") " Nov 29 06:04:47 crc kubenswrapper[4594]: I1129 06:04:47.817433 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5b69862-fd4c-4f01-977a-3d7f9bcce932-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "e5b69862-fd4c-4f01-977a-3d7f9bcce932" (UID: "e5b69862-fd4c-4f01-977a-3d7f9bcce932"). InnerVolumeSpecName "nova-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 06:04:47 crc kubenswrapper[4594]: I1129 06:04:47.817678 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5b69862-fd4c-4f01-977a-3d7f9bcce932-kube-api-access-nsztn" (OuterVolumeSpecName: "kube-api-access-nsztn") pod "e5b69862-fd4c-4f01-977a-3d7f9bcce932" (UID: "e5b69862-fd4c-4f01-977a-3d7f9bcce932"). InnerVolumeSpecName "kube-api-access-nsztn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 06:04:47 crc kubenswrapper[4594]: I1129 06:04:47.834178 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5b69862-fd4c-4f01-977a-3d7f9bcce932-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "e5b69862-fd4c-4f01-977a-3d7f9bcce932" (UID: "e5b69862-fd4c-4f01-977a-3d7f9bcce932"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 06:04:47 crc kubenswrapper[4594]: I1129 06:04:47.834594 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5b69862-fd4c-4f01-977a-3d7f9bcce932-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "e5b69862-fd4c-4f01-977a-3d7f9bcce932" (UID: "e5b69862-fd4c-4f01-977a-3d7f9bcce932"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 06:04:47 crc kubenswrapper[4594]: I1129 06:04:47.834690 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5b69862-fd4c-4f01-977a-3d7f9bcce932-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "e5b69862-fd4c-4f01-977a-3d7f9bcce932" (UID: "e5b69862-fd4c-4f01-977a-3d7f9bcce932"). InnerVolumeSpecName "nova-cell1-compute-config-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 06:04:47 crc kubenswrapper[4594]: I1129 06:04:47.836585 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5b69862-fd4c-4f01-977a-3d7f9bcce932-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "e5b69862-fd4c-4f01-977a-3d7f9bcce932" (UID: "e5b69862-fd4c-4f01-977a-3d7f9bcce932"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 06:04:47 crc kubenswrapper[4594]: I1129 06:04:47.844363 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5b69862-fd4c-4f01-977a-3d7f9bcce932-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "e5b69862-fd4c-4f01-977a-3d7f9bcce932" (UID: "e5b69862-fd4c-4f01-977a-3d7f9bcce932"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 06:04:47 crc kubenswrapper[4594]: I1129 06:04:47.846489 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5b69862-fd4c-4f01-977a-3d7f9bcce932-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e5b69862-fd4c-4f01-977a-3d7f9bcce932" (UID: "e5b69862-fd4c-4f01-977a-3d7f9bcce932"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 06:04:47 crc kubenswrapper[4594]: I1129 06:04:47.855159 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5b69862-fd4c-4f01-977a-3d7f9bcce932-inventory" (OuterVolumeSpecName: "inventory") pod "e5b69862-fd4c-4f01-977a-3d7f9bcce932" (UID: "e5b69862-fd4c-4f01-977a-3d7f9bcce932"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 06:04:47 crc kubenswrapper[4594]: I1129 06:04:47.908894 4594 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/e5b69862-fd4c-4f01-977a-3d7f9bcce932-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Nov 29 06:04:47 crc kubenswrapper[4594]: I1129 06:04:47.908932 4594 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e5b69862-fd4c-4f01-977a-3d7f9bcce932-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 29 06:04:47 crc kubenswrapper[4594]: I1129 06:04:47.908946 4594 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5b69862-fd4c-4f01-977a-3d7f9bcce932-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 06:04:47 crc kubenswrapper[4594]: I1129 06:04:47.908956 4594 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/e5b69862-fd4c-4f01-977a-3d7f9bcce932-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Nov 29 06:04:47 crc kubenswrapper[4594]: I1129 06:04:47.908967 4594 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/e5b69862-fd4c-4f01-977a-3d7f9bcce932-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Nov 29 06:04:47 crc kubenswrapper[4594]: I1129 06:04:47.908976 4594 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/e5b69862-fd4c-4f01-977a-3d7f9bcce932-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Nov 29 06:04:47 crc kubenswrapper[4594]: I1129 06:04:47.908987 4594 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/e5b69862-fd4c-4f01-977a-3d7f9bcce932-nova-extra-config-0\") on node \"crc\" 
DevicePath \"\"" Nov 29 06:04:47 crc kubenswrapper[4594]: I1129 06:04:47.908996 4594 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e5b69862-fd4c-4f01-977a-3d7f9bcce932-inventory\") on node \"crc\" DevicePath \"\"" Nov 29 06:04:47 crc kubenswrapper[4594]: I1129 06:04:47.909007 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nsztn\" (UniqueName: \"kubernetes.io/projected/e5b69862-fd4c-4f01-977a-3d7f9bcce932-kube-api-access-nsztn\") on node \"crc\" DevicePath \"\"" Nov 29 06:04:48 crc kubenswrapper[4594]: I1129 06:04:48.237158 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-txzng" event={"ID":"e5b69862-fd4c-4f01-977a-3d7f9bcce932","Type":"ContainerDied","Data":"64a6879edd3bd59242ee285abc23810f252d06f6e034a6e31677a2535446b057"} Nov 29 06:04:48 crc kubenswrapper[4594]: I1129 06:04:48.237550 4594 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="64a6879edd3bd59242ee285abc23810f252d06f6e034a6e31677a2535446b057" Nov 29 06:04:48 crc kubenswrapper[4594]: I1129 06:04:48.237284 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-txzng" Nov 29 06:04:48 crc kubenswrapper[4594]: I1129 06:04:48.331328 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2b9zg"] Nov 29 06:04:48 crc kubenswrapper[4594]: E1129 06:04:48.331818 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5b69862-fd4c-4f01-977a-3d7f9bcce932" containerName="nova-edpm-deployment-openstack-edpm-ipam" Nov 29 06:04:48 crc kubenswrapper[4594]: I1129 06:04:48.331884 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5b69862-fd4c-4f01-977a-3d7f9bcce932" containerName="nova-edpm-deployment-openstack-edpm-ipam" Nov 29 06:04:48 crc kubenswrapper[4594]: E1129 06:04:48.331954 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a06c67e-ef01-4230-8e8d-dbb3701dca49" containerName="registry-server" Nov 29 06:04:48 crc kubenswrapper[4594]: I1129 06:04:48.332014 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a06c67e-ef01-4230-8e8d-dbb3701dca49" containerName="registry-server" Nov 29 06:04:48 crc kubenswrapper[4594]: E1129 06:04:48.332089 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a06c67e-ef01-4230-8e8d-dbb3701dca49" containerName="extract-utilities" Nov 29 06:04:48 crc kubenswrapper[4594]: I1129 06:04:48.332146 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a06c67e-ef01-4230-8e8d-dbb3701dca49" containerName="extract-utilities" Nov 29 06:04:48 crc kubenswrapper[4594]: E1129 06:04:48.332193 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a06c67e-ef01-4230-8e8d-dbb3701dca49" containerName="extract-content" Nov 29 06:04:48 crc kubenswrapper[4594]: I1129 06:04:48.332244 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a06c67e-ef01-4230-8e8d-dbb3701dca49" containerName="extract-content" Nov 29 06:04:48 crc kubenswrapper[4594]: I1129 06:04:48.332564 4594 
memory_manager.go:354] "RemoveStaleState removing state" podUID="e5b69862-fd4c-4f01-977a-3d7f9bcce932" containerName="nova-edpm-deployment-openstack-edpm-ipam" Nov 29 06:04:48 crc kubenswrapper[4594]: I1129 06:04:48.332641 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a06c67e-ef01-4230-8e8d-dbb3701dca49" containerName="registry-server" Nov 29 06:04:48 crc kubenswrapper[4594]: I1129 06:04:48.333306 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2b9zg" Nov 29 06:04:48 crc kubenswrapper[4594]: I1129 06:04:48.335826 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 29 06:04:48 crc kubenswrapper[4594]: I1129 06:04:48.335999 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Nov 29 06:04:48 crc kubenswrapper[4594]: I1129 06:04:48.336303 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b89h2" Nov 29 06:04:48 crc kubenswrapper[4594]: I1129 06:04:48.336464 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 29 06:04:48 crc kubenswrapper[4594]: I1129 06:04:48.338449 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 29 06:04:48 crc kubenswrapper[4594]: I1129 06:04:48.351616 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2b9zg"] Nov 29 06:04:48 crc kubenswrapper[4594]: I1129 06:04:48.422874 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/2e564f32-c761-4816-9715-7636294bd4c4-ceilometer-compute-config-data-1\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-2b9zg\" (UID: \"2e564f32-c761-4816-9715-7636294bd4c4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2b9zg" Nov 29 06:04:48 crc kubenswrapper[4594]: I1129 06:04:48.423007 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/2e564f32-c761-4816-9715-7636294bd4c4-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2b9zg\" (UID: \"2e564f32-c761-4816-9715-7636294bd4c4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2b9zg" Nov 29 06:04:48 crc kubenswrapper[4594]: I1129 06:04:48.423146 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e564f32-c761-4816-9715-7636294bd4c4-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2b9zg\" (UID: \"2e564f32-c761-4816-9715-7636294bd4c4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2b9zg" Nov 29 06:04:48 crc kubenswrapper[4594]: I1129 06:04:48.423301 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2e564f32-c761-4816-9715-7636294bd4c4-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2b9zg\" (UID: \"2e564f32-c761-4816-9715-7636294bd4c4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2b9zg" Nov 29 06:04:48 crc kubenswrapper[4594]: I1129 06:04:48.423454 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhwhc\" (UniqueName: \"kubernetes.io/projected/2e564f32-c761-4816-9715-7636294bd4c4-kube-api-access-hhwhc\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2b9zg\" (UID: \"2e564f32-c761-4816-9715-7636294bd4c4\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2b9zg" Nov 29 06:04:48 crc kubenswrapper[4594]: I1129 06:04:48.423617 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2e564f32-c761-4816-9715-7636294bd4c4-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2b9zg\" (UID: \"2e564f32-c761-4816-9715-7636294bd4c4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2b9zg" Nov 29 06:04:48 crc kubenswrapper[4594]: I1129 06:04:48.423736 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/2e564f32-c761-4816-9715-7636294bd4c4-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2b9zg\" (UID: \"2e564f32-c761-4816-9715-7636294bd4c4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2b9zg" Nov 29 06:04:48 crc kubenswrapper[4594]: I1129 06:04:48.524942 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2e564f32-c761-4816-9715-7636294bd4c4-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2b9zg\" (UID: \"2e564f32-c761-4816-9715-7636294bd4c4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2b9zg" Nov 29 06:04:48 crc kubenswrapper[4594]: I1129 06:04:48.525073 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/2e564f32-c761-4816-9715-7636294bd4c4-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2b9zg\" (UID: \"2e564f32-c761-4816-9715-7636294bd4c4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2b9zg" Nov 29 06:04:48 crc kubenswrapper[4594]: I1129 06:04:48.525249 4594 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/2e564f32-c761-4816-9715-7636294bd4c4-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2b9zg\" (UID: \"2e564f32-c761-4816-9715-7636294bd4c4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2b9zg" Nov 29 06:04:48 crc kubenswrapper[4594]: I1129 06:04:48.525389 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/2e564f32-c761-4816-9715-7636294bd4c4-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2b9zg\" (UID: \"2e564f32-c761-4816-9715-7636294bd4c4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2b9zg" Nov 29 06:04:48 crc kubenswrapper[4594]: I1129 06:04:48.525510 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e564f32-c761-4816-9715-7636294bd4c4-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2b9zg\" (UID: \"2e564f32-c761-4816-9715-7636294bd4c4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2b9zg" Nov 29 06:04:48 crc kubenswrapper[4594]: I1129 06:04:48.525648 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2e564f32-c761-4816-9715-7636294bd4c4-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2b9zg\" (UID: \"2e564f32-c761-4816-9715-7636294bd4c4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2b9zg" Nov 29 06:04:48 crc kubenswrapper[4594]: I1129 06:04:48.526307 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhwhc\" (UniqueName: \"kubernetes.io/projected/2e564f32-c761-4816-9715-7636294bd4c4-kube-api-access-hhwhc\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-2b9zg\" (UID: \"2e564f32-c761-4816-9715-7636294bd4c4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2b9zg" Nov 29 06:04:48 crc kubenswrapper[4594]: I1129 06:04:48.529828 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2e564f32-c761-4816-9715-7636294bd4c4-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2b9zg\" (UID: \"2e564f32-c761-4816-9715-7636294bd4c4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2b9zg" Nov 29 06:04:48 crc kubenswrapper[4594]: I1129 06:04:48.529929 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/2e564f32-c761-4816-9715-7636294bd4c4-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2b9zg\" (UID: \"2e564f32-c761-4816-9715-7636294bd4c4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2b9zg" Nov 29 06:04:48 crc kubenswrapper[4594]: I1129 06:04:48.530461 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/2e564f32-c761-4816-9715-7636294bd4c4-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2b9zg\" (UID: \"2e564f32-c761-4816-9715-7636294bd4c4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2b9zg" Nov 29 06:04:48 crc kubenswrapper[4594]: I1129 06:04:48.530798 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e564f32-c761-4816-9715-7636294bd4c4-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2b9zg\" (UID: \"2e564f32-c761-4816-9715-7636294bd4c4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2b9zg" Nov 29 06:04:48 crc kubenswrapper[4594]: 
I1129 06:04:48.530893 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2e564f32-c761-4816-9715-7636294bd4c4-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2b9zg\" (UID: \"2e564f32-c761-4816-9715-7636294bd4c4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2b9zg" Nov 29 06:04:48 crc kubenswrapper[4594]: I1129 06:04:48.531893 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/2e564f32-c761-4816-9715-7636294bd4c4-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2b9zg\" (UID: \"2e564f32-c761-4816-9715-7636294bd4c4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2b9zg" Nov 29 06:04:48 crc kubenswrapper[4594]: I1129 06:04:48.540752 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhwhc\" (UniqueName: \"kubernetes.io/projected/2e564f32-c761-4816-9715-7636294bd4c4-kube-api-access-hhwhc\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2b9zg\" (UID: \"2e564f32-c761-4816-9715-7636294bd4c4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2b9zg" Nov 29 06:04:48 crc kubenswrapper[4594]: I1129 06:04:48.647342 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2b9zg" Nov 29 06:04:49 crc kubenswrapper[4594]: I1129 06:04:49.108044 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2b9zg"] Nov 29 06:04:49 crc kubenswrapper[4594]: W1129 06:04:49.109060 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2e564f32_c761_4816_9715_7636294bd4c4.slice/crio-60dea1053c0b4c5706256c719a1a31d828e66a6bdad01d7d2f62290115e8de07 WatchSource:0}: Error finding container 60dea1053c0b4c5706256c719a1a31d828e66a6bdad01d7d2f62290115e8de07: Status 404 returned error can't find the container with id 60dea1053c0b4c5706256c719a1a31d828e66a6bdad01d7d2f62290115e8de07 Nov 29 06:04:49 crc kubenswrapper[4594]: I1129 06:04:49.245965 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2b9zg" event={"ID":"2e564f32-c761-4816-9715-7636294bd4c4","Type":"ContainerStarted","Data":"60dea1053c0b4c5706256c719a1a31d828e66a6bdad01d7d2f62290115e8de07"} Nov 29 06:04:50 crc kubenswrapper[4594]: I1129 06:04:50.255627 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2b9zg" event={"ID":"2e564f32-c761-4816-9715-7636294bd4c4","Type":"ContainerStarted","Data":"1498c5ebabddd636315cb3709dbb6802f64c16bf7c7cfb4b606a383927f56084"} Nov 29 06:04:50 crc kubenswrapper[4594]: I1129 06:04:50.271924 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2b9zg" podStartSLOduration=1.325376769 podStartE2EDuration="2.271909271s" podCreationTimestamp="2025-11-29 06:04:48 +0000 UTC" firstStartedPulling="2025-11-29 06:04:49.111362234 +0000 UTC m=+2213.351871453" lastFinishedPulling="2025-11-29 06:04:50.057894745 +0000 UTC m=+2214.298403955" 
observedRunningTime="2025-11-29 06:04:50.267871436 +0000 UTC m=+2214.508380656" watchObservedRunningTime="2025-11-29 06:04:50.271909271 +0000 UTC m=+2214.512418491" Nov 29 06:06:22 crc kubenswrapper[4594]: I1129 06:06:22.150296 4594 generic.go:334] "Generic (PLEG): container finished" podID="2e564f32-c761-4816-9715-7636294bd4c4" containerID="1498c5ebabddd636315cb3709dbb6802f64c16bf7c7cfb4b606a383927f56084" exitCode=0 Nov 29 06:06:22 crc kubenswrapper[4594]: I1129 06:06:22.150363 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2b9zg" event={"ID":"2e564f32-c761-4816-9715-7636294bd4c4","Type":"ContainerDied","Data":"1498c5ebabddd636315cb3709dbb6802f64c16bf7c7cfb4b606a383927f56084"} Nov 29 06:06:23 crc kubenswrapper[4594]: I1129 06:06:23.506885 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2b9zg" Nov 29 06:06:23 crc kubenswrapper[4594]: I1129 06:06:23.560232 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e564f32-c761-4816-9715-7636294bd4c4-telemetry-combined-ca-bundle\") pod \"2e564f32-c761-4816-9715-7636294bd4c4\" (UID: \"2e564f32-c761-4816-9715-7636294bd4c4\") " Nov 29 06:06:23 crc kubenswrapper[4594]: I1129 06:06:23.560318 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/2e564f32-c761-4816-9715-7636294bd4c4-ceilometer-compute-config-data-0\") pod \"2e564f32-c761-4816-9715-7636294bd4c4\" (UID: \"2e564f32-c761-4816-9715-7636294bd4c4\") " Nov 29 06:06:23 crc kubenswrapper[4594]: I1129 06:06:23.560523 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: 
\"kubernetes.io/secret/2e564f32-c761-4816-9715-7636294bd4c4-ceilometer-compute-config-data-2\") pod \"2e564f32-c761-4816-9715-7636294bd4c4\" (UID: \"2e564f32-c761-4816-9715-7636294bd4c4\") " Nov 29 06:06:23 crc kubenswrapper[4594]: I1129 06:06:23.560582 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hhwhc\" (UniqueName: \"kubernetes.io/projected/2e564f32-c761-4816-9715-7636294bd4c4-kube-api-access-hhwhc\") pod \"2e564f32-c761-4816-9715-7636294bd4c4\" (UID: \"2e564f32-c761-4816-9715-7636294bd4c4\") " Nov 29 06:06:23 crc kubenswrapper[4594]: I1129 06:06:23.560647 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/2e564f32-c761-4816-9715-7636294bd4c4-ceilometer-compute-config-data-1\") pod \"2e564f32-c761-4816-9715-7636294bd4c4\" (UID: \"2e564f32-c761-4816-9715-7636294bd4c4\") " Nov 29 06:06:23 crc kubenswrapper[4594]: I1129 06:06:23.561373 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2e564f32-c761-4816-9715-7636294bd4c4-ssh-key\") pod \"2e564f32-c761-4816-9715-7636294bd4c4\" (UID: \"2e564f32-c761-4816-9715-7636294bd4c4\") " Nov 29 06:06:23 crc kubenswrapper[4594]: I1129 06:06:23.561644 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2e564f32-c761-4816-9715-7636294bd4c4-inventory\") pod \"2e564f32-c761-4816-9715-7636294bd4c4\" (UID: \"2e564f32-c761-4816-9715-7636294bd4c4\") " Nov 29 06:06:23 crc kubenswrapper[4594]: I1129 06:06:23.568546 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e564f32-c761-4816-9715-7636294bd4c4-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "2e564f32-c761-4816-9715-7636294bd4c4" (UID: "2e564f32-c761-4816-9715-7636294bd4c4"). 
InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 06:06:23 crc kubenswrapper[4594]: I1129 06:06:23.568545 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e564f32-c761-4816-9715-7636294bd4c4-kube-api-access-hhwhc" (OuterVolumeSpecName: "kube-api-access-hhwhc") pod "2e564f32-c761-4816-9715-7636294bd4c4" (UID: "2e564f32-c761-4816-9715-7636294bd4c4"). InnerVolumeSpecName "kube-api-access-hhwhc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 06:06:23 crc kubenswrapper[4594]: I1129 06:06:23.592574 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e564f32-c761-4816-9715-7636294bd4c4-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "2e564f32-c761-4816-9715-7636294bd4c4" (UID: "2e564f32-c761-4816-9715-7636294bd4c4"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 06:06:23 crc kubenswrapper[4594]: I1129 06:06:23.593111 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e564f32-c761-4816-9715-7636294bd4c4-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "2e564f32-c761-4816-9715-7636294bd4c4" (UID: "2e564f32-c761-4816-9715-7636294bd4c4"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 06:06:23 crc kubenswrapper[4594]: I1129 06:06:23.595023 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e564f32-c761-4816-9715-7636294bd4c4-inventory" (OuterVolumeSpecName: "inventory") pod "2e564f32-c761-4816-9715-7636294bd4c4" (UID: "2e564f32-c761-4816-9715-7636294bd4c4"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 06:06:23 crc kubenswrapper[4594]: I1129 06:06:23.596867 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e564f32-c761-4816-9715-7636294bd4c4-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "2e564f32-c761-4816-9715-7636294bd4c4" (UID: "2e564f32-c761-4816-9715-7636294bd4c4"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 06:06:23 crc kubenswrapper[4594]: I1129 06:06:23.597345 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e564f32-c761-4816-9715-7636294bd4c4-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "2e564f32-c761-4816-9715-7636294bd4c4" (UID: "2e564f32-c761-4816-9715-7636294bd4c4"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 06:06:23 crc kubenswrapper[4594]: I1129 06:06:23.666506 4594 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e564f32-c761-4816-9715-7636294bd4c4-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 06:06:23 crc kubenswrapper[4594]: I1129 06:06:23.666541 4594 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/2e564f32-c761-4816-9715-7636294bd4c4-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Nov 29 06:06:23 crc kubenswrapper[4594]: I1129 06:06:23.666556 4594 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/2e564f32-c761-4816-9715-7636294bd4c4-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Nov 29 06:06:23 crc kubenswrapper[4594]: I1129 06:06:23.666570 4594 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-hhwhc\" (UniqueName: \"kubernetes.io/projected/2e564f32-c761-4816-9715-7636294bd4c4-kube-api-access-hhwhc\") on node \"crc\" DevicePath \"\"" Nov 29 06:06:23 crc kubenswrapper[4594]: I1129 06:06:23.666582 4594 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/2e564f32-c761-4816-9715-7636294bd4c4-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Nov 29 06:06:23 crc kubenswrapper[4594]: I1129 06:06:23.666592 4594 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2e564f32-c761-4816-9715-7636294bd4c4-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 29 06:06:23 crc kubenswrapper[4594]: I1129 06:06:23.666601 4594 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2e564f32-c761-4816-9715-7636294bd4c4-inventory\") on node \"crc\" DevicePath \"\"" Nov 29 06:06:24 crc kubenswrapper[4594]: I1129 06:06:24.173880 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2b9zg" event={"ID":"2e564f32-c761-4816-9715-7636294bd4c4","Type":"ContainerDied","Data":"60dea1053c0b4c5706256c719a1a31d828e66a6bdad01d7d2f62290115e8de07"} Nov 29 06:06:24 crc kubenswrapper[4594]: I1129 06:06:24.174453 4594 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="60dea1053c0b4c5706256c719a1a31d828e66a6bdad01d7d2f62290115e8de07" Nov 29 06:06:24 crc kubenswrapper[4594]: I1129 06:06:24.173917 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2b9zg" Nov 29 06:06:31 crc kubenswrapper[4594]: I1129 06:06:31.174470 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-q2v5m"] Nov 29 06:06:31 crc kubenswrapper[4594]: E1129 06:06:31.175429 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e564f32-c761-4816-9715-7636294bd4c4" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Nov 29 06:06:31 crc kubenswrapper[4594]: I1129 06:06:31.175444 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e564f32-c761-4816-9715-7636294bd4c4" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Nov 29 06:06:31 crc kubenswrapper[4594]: I1129 06:06:31.175695 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e564f32-c761-4816-9715-7636294bd4c4" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Nov 29 06:06:31 crc kubenswrapper[4594]: I1129 06:06:31.177551 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-q2v5m" Nov 29 06:06:31 crc kubenswrapper[4594]: I1129 06:06:31.184961 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-q2v5m"] Nov 29 06:06:31 crc kubenswrapper[4594]: I1129 06:06:31.230407 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9ce76d3-605e-421a-988e-27cb1040252d-utilities\") pod \"redhat-operators-q2v5m\" (UID: \"b9ce76d3-605e-421a-988e-27cb1040252d\") " pod="openshift-marketplace/redhat-operators-q2v5m" Nov 29 06:06:31 crc kubenswrapper[4594]: I1129 06:06:31.230498 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9ce76d3-605e-421a-988e-27cb1040252d-catalog-content\") pod \"redhat-operators-q2v5m\" (UID: \"b9ce76d3-605e-421a-988e-27cb1040252d\") " pod="openshift-marketplace/redhat-operators-q2v5m" Nov 29 06:06:31 crc kubenswrapper[4594]: I1129 06:06:31.230695 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5ml9\" (UniqueName: \"kubernetes.io/projected/b9ce76d3-605e-421a-988e-27cb1040252d-kube-api-access-d5ml9\") pod \"redhat-operators-q2v5m\" (UID: \"b9ce76d3-605e-421a-988e-27cb1040252d\") " pod="openshift-marketplace/redhat-operators-q2v5m" Nov 29 06:06:31 crc kubenswrapper[4594]: I1129 06:06:31.332830 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9ce76d3-605e-421a-988e-27cb1040252d-utilities\") pod \"redhat-operators-q2v5m\" (UID: \"b9ce76d3-605e-421a-988e-27cb1040252d\") " pod="openshift-marketplace/redhat-operators-q2v5m" Nov 29 06:06:31 crc kubenswrapper[4594]: I1129 06:06:31.332917 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9ce76d3-605e-421a-988e-27cb1040252d-catalog-content\") pod \"redhat-operators-q2v5m\" (UID: \"b9ce76d3-605e-421a-988e-27cb1040252d\") " pod="openshift-marketplace/redhat-operators-q2v5m" Nov 29 06:06:31 crc kubenswrapper[4594]: I1129 06:06:31.332992 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5ml9\" (UniqueName: \"kubernetes.io/projected/b9ce76d3-605e-421a-988e-27cb1040252d-kube-api-access-d5ml9\") pod \"redhat-operators-q2v5m\" (UID: \"b9ce76d3-605e-421a-988e-27cb1040252d\") " pod="openshift-marketplace/redhat-operators-q2v5m" Nov 29 06:06:31 crc kubenswrapper[4594]: I1129 06:06:31.333639 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9ce76d3-605e-421a-988e-27cb1040252d-utilities\") pod \"redhat-operators-q2v5m\" (UID: \"b9ce76d3-605e-421a-988e-27cb1040252d\") " pod="openshift-marketplace/redhat-operators-q2v5m" Nov 29 06:06:31 crc kubenswrapper[4594]: I1129 06:06:31.333888 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9ce76d3-605e-421a-988e-27cb1040252d-catalog-content\") pod \"redhat-operators-q2v5m\" (UID: \"b9ce76d3-605e-421a-988e-27cb1040252d\") " pod="openshift-marketplace/redhat-operators-q2v5m" Nov 29 06:06:31 crc kubenswrapper[4594]: I1129 06:06:31.353271 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5ml9\" (UniqueName: \"kubernetes.io/projected/b9ce76d3-605e-421a-988e-27cb1040252d-kube-api-access-d5ml9\") pod \"redhat-operators-q2v5m\" (UID: \"b9ce76d3-605e-421a-988e-27cb1040252d\") " pod="openshift-marketplace/redhat-operators-q2v5m" Nov 29 06:06:31 crc kubenswrapper[4594]: I1129 06:06:31.377172 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7bs88"] Nov 29 06:06:31 crc 
kubenswrapper[4594]: I1129 06:06:31.382481 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7bs88" Nov 29 06:06:31 crc kubenswrapper[4594]: I1129 06:06:31.389047 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7bs88"] Nov 29 06:06:31 crc kubenswrapper[4594]: I1129 06:06:31.435914 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7bf5d8b-0a1e-4ba1-99dc-27f3355a64f9-utilities\") pod \"certified-operators-7bs88\" (UID: \"e7bf5d8b-0a1e-4ba1-99dc-27f3355a64f9\") " pod="openshift-marketplace/certified-operators-7bs88" Nov 29 06:06:31 crc kubenswrapper[4594]: I1129 06:06:31.436024 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n28v5\" (UniqueName: \"kubernetes.io/projected/e7bf5d8b-0a1e-4ba1-99dc-27f3355a64f9-kube-api-access-n28v5\") pod \"certified-operators-7bs88\" (UID: \"e7bf5d8b-0a1e-4ba1-99dc-27f3355a64f9\") " pod="openshift-marketplace/certified-operators-7bs88" Nov 29 06:06:31 crc kubenswrapper[4594]: I1129 06:06:31.436123 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7bf5d8b-0a1e-4ba1-99dc-27f3355a64f9-catalog-content\") pod \"certified-operators-7bs88\" (UID: \"e7bf5d8b-0a1e-4ba1-99dc-27f3355a64f9\") " pod="openshift-marketplace/certified-operators-7bs88" Nov 29 06:06:31 crc kubenswrapper[4594]: I1129 06:06:31.497562 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-q2v5m" Nov 29 06:06:31 crc kubenswrapper[4594]: I1129 06:06:31.538054 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7bf5d8b-0a1e-4ba1-99dc-27f3355a64f9-utilities\") pod \"certified-operators-7bs88\" (UID: \"e7bf5d8b-0a1e-4ba1-99dc-27f3355a64f9\") " pod="openshift-marketplace/certified-operators-7bs88" Nov 29 06:06:31 crc kubenswrapper[4594]: I1129 06:06:31.538140 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n28v5\" (UniqueName: \"kubernetes.io/projected/e7bf5d8b-0a1e-4ba1-99dc-27f3355a64f9-kube-api-access-n28v5\") pod \"certified-operators-7bs88\" (UID: \"e7bf5d8b-0a1e-4ba1-99dc-27f3355a64f9\") " pod="openshift-marketplace/certified-operators-7bs88" Nov 29 06:06:31 crc kubenswrapper[4594]: I1129 06:06:31.538224 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7bf5d8b-0a1e-4ba1-99dc-27f3355a64f9-catalog-content\") pod \"certified-operators-7bs88\" (UID: \"e7bf5d8b-0a1e-4ba1-99dc-27f3355a64f9\") " pod="openshift-marketplace/certified-operators-7bs88" Nov 29 06:06:31 crc kubenswrapper[4594]: I1129 06:06:31.538658 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7bf5d8b-0a1e-4ba1-99dc-27f3355a64f9-utilities\") pod \"certified-operators-7bs88\" (UID: \"e7bf5d8b-0a1e-4ba1-99dc-27f3355a64f9\") " pod="openshift-marketplace/certified-operators-7bs88" Nov 29 06:06:31 crc kubenswrapper[4594]: I1129 06:06:31.538784 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7bf5d8b-0a1e-4ba1-99dc-27f3355a64f9-catalog-content\") pod \"certified-operators-7bs88\" (UID: \"e7bf5d8b-0a1e-4ba1-99dc-27f3355a64f9\") " 
pod="openshift-marketplace/certified-operators-7bs88" Nov 29 06:06:31 crc kubenswrapper[4594]: I1129 06:06:31.555038 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n28v5\" (UniqueName: \"kubernetes.io/projected/e7bf5d8b-0a1e-4ba1-99dc-27f3355a64f9-kube-api-access-n28v5\") pod \"certified-operators-7bs88\" (UID: \"e7bf5d8b-0a1e-4ba1-99dc-27f3355a64f9\") " pod="openshift-marketplace/certified-operators-7bs88" Nov 29 06:06:31 crc kubenswrapper[4594]: I1129 06:06:31.721516 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7bs88" Nov 29 06:06:31 crc kubenswrapper[4594]: I1129 06:06:31.946628 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-q2v5m"] Nov 29 06:06:32 crc kubenswrapper[4594]: I1129 06:06:32.014093 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7bs88"] Nov 29 06:06:32 crc kubenswrapper[4594]: I1129 06:06:32.249452 4594 generic.go:334] "Generic (PLEG): container finished" podID="b9ce76d3-605e-421a-988e-27cb1040252d" containerID="bc82859256969c9357343a5e18001e96a48fee923b4c7ecf815c5f3ec200bba7" exitCode=0 Nov 29 06:06:32 crc kubenswrapper[4594]: I1129 06:06:32.249585 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q2v5m" event={"ID":"b9ce76d3-605e-421a-988e-27cb1040252d","Type":"ContainerDied","Data":"bc82859256969c9357343a5e18001e96a48fee923b4c7ecf815c5f3ec200bba7"} Nov 29 06:06:32 crc kubenswrapper[4594]: I1129 06:06:32.249831 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q2v5m" event={"ID":"b9ce76d3-605e-421a-988e-27cb1040252d","Type":"ContainerStarted","Data":"b17eaa7209c2bd2c748962a781eeeea42b123a72fc474e88041b7bf18a4c05d4"} Nov 29 06:06:32 crc kubenswrapper[4594]: I1129 06:06:32.251418 4594 generic.go:334] "Generic (PLEG): container 
finished" podID="e7bf5d8b-0a1e-4ba1-99dc-27f3355a64f9" containerID="60d0a6954e69e15b4214cfd72933ab9d95daa00228e47393588b191d5c94ceee" exitCode=0 Nov 29 06:06:32 crc kubenswrapper[4594]: I1129 06:06:32.251473 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7bs88" event={"ID":"e7bf5d8b-0a1e-4ba1-99dc-27f3355a64f9","Type":"ContainerDied","Data":"60d0a6954e69e15b4214cfd72933ab9d95daa00228e47393588b191d5c94ceee"} Nov 29 06:06:32 crc kubenswrapper[4594]: I1129 06:06:32.251504 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7bs88" event={"ID":"e7bf5d8b-0a1e-4ba1-99dc-27f3355a64f9","Type":"ContainerStarted","Data":"2991fe311f9aba723695bf6ad8c4e1a9b6fb6538ee1cdafd1009726d946a10d8"} Nov 29 06:06:34 crc kubenswrapper[4594]: I1129 06:06:34.287102 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q2v5m" event={"ID":"b9ce76d3-605e-421a-988e-27cb1040252d","Type":"ContainerStarted","Data":"158fc51a173ef5f6ad1b34cd6ba30ce2f7418c08c4b6ba4408baa0bc902e51cd"} Nov 29 06:06:34 crc kubenswrapper[4594]: I1129 06:06:34.289671 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7bs88" event={"ID":"e7bf5d8b-0a1e-4ba1-99dc-27f3355a64f9","Type":"ContainerStarted","Data":"d8040ffb4451e1c6205c7637abfc236f45463c024b065bba99da49fabdfc1f4f"} Nov 29 06:06:35 crc kubenswrapper[4594]: I1129 06:06:35.305228 4594 generic.go:334] "Generic (PLEG): container finished" podID="e7bf5d8b-0a1e-4ba1-99dc-27f3355a64f9" containerID="d8040ffb4451e1c6205c7637abfc236f45463c024b065bba99da49fabdfc1f4f" exitCode=0 Nov 29 06:06:35 crc kubenswrapper[4594]: I1129 06:06:35.305322 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7bs88" 
event={"ID":"e7bf5d8b-0a1e-4ba1-99dc-27f3355a64f9","Type":"ContainerDied","Data":"d8040ffb4451e1c6205c7637abfc236f45463c024b065bba99da49fabdfc1f4f"} Nov 29 06:06:36 crc kubenswrapper[4594]: I1129 06:06:36.321776 4594 generic.go:334] "Generic (PLEG): container finished" podID="b9ce76d3-605e-421a-988e-27cb1040252d" containerID="158fc51a173ef5f6ad1b34cd6ba30ce2f7418c08c4b6ba4408baa0bc902e51cd" exitCode=0 Nov 29 06:06:36 crc kubenswrapper[4594]: I1129 06:06:36.321884 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q2v5m" event={"ID":"b9ce76d3-605e-421a-988e-27cb1040252d","Type":"ContainerDied","Data":"158fc51a173ef5f6ad1b34cd6ba30ce2f7418c08c4b6ba4408baa0bc902e51cd"} Nov 29 06:06:37 crc kubenswrapper[4594]: I1129 06:06:37.339420 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q2v5m" event={"ID":"b9ce76d3-605e-421a-988e-27cb1040252d","Type":"ContainerStarted","Data":"1667d485be30c5588ddc6c4d063ed8ab5065d1026dc4eb57036b2cc373cff165"} Nov 29 06:06:37 crc kubenswrapper[4594]: I1129 06:06:37.342562 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7bs88" event={"ID":"e7bf5d8b-0a1e-4ba1-99dc-27f3355a64f9","Type":"ContainerStarted","Data":"5d8f309f058a1da757c322350d4b55693098e9741d886fd7c7d444a971860a12"} Nov 29 06:06:37 crc kubenswrapper[4594]: I1129 06:06:37.367031 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-q2v5m" podStartSLOduration=1.759555618 podStartE2EDuration="6.36701452s" podCreationTimestamp="2025-11-29 06:06:31 +0000 UTC" firstStartedPulling="2025-11-29 06:06:32.250934633 +0000 UTC m=+2316.491443853" lastFinishedPulling="2025-11-29 06:06:36.858393535 +0000 UTC m=+2321.098902755" observedRunningTime="2025-11-29 06:06:37.354949508 +0000 UTC m=+2321.595458728" watchObservedRunningTime="2025-11-29 06:06:37.36701452 +0000 UTC m=+2321.607523740" 
Nov 29 06:06:37 crc kubenswrapper[4594]: I1129 06:06:37.384072 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7bs88" podStartSLOduration=2.064656938 podStartE2EDuration="6.384062103s" podCreationTimestamp="2025-11-29 06:06:31 +0000 UTC" firstStartedPulling="2025-11-29 06:06:32.253561773 +0000 UTC m=+2316.494070994" lastFinishedPulling="2025-11-29 06:06:36.572966939 +0000 UTC m=+2320.813476159" observedRunningTime="2025-11-29 06:06:37.371979969 +0000 UTC m=+2321.612489189" watchObservedRunningTime="2025-11-29 06:06:37.384062103 +0000 UTC m=+2321.624571323" Nov 29 06:06:41 crc kubenswrapper[4594]: I1129 06:06:41.498276 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-q2v5m" Nov 29 06:06:41 crc kubenswrapper[4594]: I1129 06:06:41.498774 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-q2v5m" Nov 29 06:06:41 crc kubenswrapper[4594]: I1129 06:06:41.721612 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7bs88" Nov 29 06:06:41 crc kubenswrapper[4594]: I1129 06:06:41.721660 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7bs88" Nov 29 06:06:41 crc kubenswrapper[4594]: I1129 06:06:41.764690 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7bs88" Nov 29 06:06:42 crc kubenswrapper[4594]: I1129 06:06:42.439823 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7bs88" Nov 29 06:06:42 crc kubenswrapper[4594]: I1129 06:06:42.538948 4594 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-q2v5m" podUID="b9ce76d3-605e-421a-988e-27cb1040252d" 
containerName="registry-server" probeResult="failure" output=<
Nov 29 06:06:42 crc kubenswrapper[4594]: timeout: failed to connect service ":50051" within 1s
Nov 29 06:06:42 crc kubenswrapper[4594]: >
Nov 29 06:06:43 crc kubenswrapper[4594]: I1129 06:06:43.563204 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7bs88"]
Nov 29 06:06:44 crc kubenswrapper[4594]: I1129 06:06:44.414282 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7bs88" podUID="e7bf5d8b-0a1e-4ba1-99dc-27f3355a64f9" containerName="registry-server" containerID="cri-o://5d8f309f058a1da757c322350d4b55693098e9741d886fd7c7d444a971860a12" gracePeriod=2
Nov 29 06:06:44 crc kubenswrapper[4594]: I1129 06:06:44.817814 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7bs88"
Nov 29 06:06:44 crc kubenswrapper[4594]: I1129 06:06:44.858202 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n28v5\" (UniqueName: \"kubernetes.io/projected/e7bf5d8b-0a1e-4ba1-99dc-27f3355a64f9-kube-api-access-n28v5\") pod \"e7bf5d8b-0a1e-4ba1-99dc-27f3355a64f9\" (UID: \"e7bf5d8b-0a1e-4ba1-99dc-27f3355a64f9\") "
Nov 29 06:06:44 crc kubenswrapper[4594]: I1129 06:06:44.858408 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7bf5d8b-0a1e-4ba1-99dc-27f3355a64f9-utilities\") pod \"e7bf5d8b-0a1e-4ba1-99dc-27f3355a64f9\" (UID: \"e7bf5d8b-0a1e-4ba1-99dc-27f3355a64f9\") "
Nov 29 06:06:44 crc kubenswrapper[4594]: I1129 06:06:44.858988 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7bf5d8b-0a1e-4ba1-99dc-27f3355a64f9-utilities" (OuterVolumeSpecName: "utilities") pod "e7bf5d8b-0a1e-4ba1-99dc-27f3355a64f9" (UID: "e7bf5d8b-0a1e-4ba1-99dc-27f3355a64f9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 29 06:06:44 crc kubenswrapper[4594]: I1129 06:06:44.859112 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7bf5d8b-0a1e-4ba1-99dc-27f3355a64f9-catalog-content\") pod \"e7bf5d8b-0a1e-4ba1-99dc-27f3355a64f9\" (UID: \"e7bf5d8b-0a1e-4ba1-99dc-27f3355a64f9\") "
Nov 29 06:06:44 crc kubenswrapper[4594]: I1129 06:06:44.867401 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7bf5d8b-0a1e-4ba1-99dc-27f3355a64f9-kube-api-access-n28v5" (OuterVolumeSpecName: "kube-api-access-n28v5") pod "e7bf5d8b-0a1e-4ba1-99dc-27f3355a64f9" (UID: "e7bf5d8b-0a1e-4ba1-99dc-27f3355a64f9"). InnerVolumeSpecName "kube-api-access-n28v5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 06:06:44 crc kubenswrapper[4594]: I1129 06:06:44.910887 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n28v5\" (UniqueName: \"kubernetes.io/projected/e7bf5d8b-0a1e-4ba1-99dc-27f3355a64f9-kube-api-access-n28v5\") on node \"crc\" DevicePath \"\""
Nov 29 06:06:44 crc kubenswrapper[4594]: I1129 06:06:44.910909 4594 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7bf5d8b-0a1e-4ba1-99dc-27f3355a64f9-utilities\") on node \"crc\" DevicePath \"\""
Nov 29 06:06:44 crc kubenswrapper[4594]: I1129 06:06:44.944054 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7bf5d8b-0a1e-4ba1-99dc-27f3355a64f9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e7bf5d8b-0a1e-4ba1-99dc-27f3355a64f9" (UID: "e7bf5d8b-0a1e-4ba1-99dc-27f3355a64f9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 29 06:06:45 crc kubenswrapper[4594]: I1129 06:06:45.013144 4594 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7bf5d8b-0a1e-4ba1-99dc-27f3355a64f9-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 29 06:06:45 crc kubenswrapper[4594]: I1129 06:06:45.424888 4594 generic.go:334] "Generic (PLEG): container finished" podID="e7bf5d8b-0a1e-4ba1-99dc-27f3355a64f9" containerID="5d8f309f058a1da757c322350d4b55693098e9741d886fd7c7d444a971860a12" exitCode=0
Nov 29 06:06:45 crc kubenswrapper[4594]: I1129 06:06:45.424928 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7bs88" event={"ID":"e7bf5d8b-0a1e-4ba1-99dc-27f3355a64f9","Type":"ContainerDied","Data":"5d8f309f058a1da757c322350d4b55693098e9741d886fd7c7d444a971860a12"}
Nov 29 06:06:45 crc kubenswrapper[4594]: I1129 06:06:45.424957 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7bs88" event={"ID":"e7bf5d8b-0a1e-4ba1-99dc-27f3355a64f9","Type":"ContainerDied","Data":"2991fe311f9aba723695bf6ad8c4e1a9b6fb6538ee1cdafd1009726d946a10d8"}
Nov 29 06:06:45 crc kubenswrapper[4594]: I1129 06:06:45.424979 4594 scope.go:117] "RemoveContainer" containerID="5d8f309f058a1da757c322350d4b55693098e9741d886fd7c7d444a971860a12"
Nov 29 06:06:45 crc kubenswrapper[4594]: I1129 06:06:45.424988 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7bs88"
Nov 29 06:06:45 crc kubenswrapper[4594]: I1129 06:06:45.449632 4594 scope.go:117] "RemoveContainer" containerID="d8040ffb4451e1c6205c7637abfc236f45463c024b065bba99da49fabdfc1f4f"
Nov 29 06:06:45 crc kubenswrapper[4594]: I1129 06:06:45.462691 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7bs88"]
Nov 29 06:06:45 crc kubenswrapper[4594]: I1129 06:06:45.473384 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7bs88"]
Nov 29 06:06:45 crc kubenswrapper[4594]: I1129 06:06:45.497205 4594 scope.go:117] "RemoveContainer" containerID="60d0a6954e69e15b4214cfd72933ab9d95daa00228e47393588b191d5c94ceee"
Nov 29 06:06:45 crc kubenswrapper[4594]: I1129 06:06:45.515295 4594 scope.go:117] "RemoveContainer" containerID="5d8f309f058a1da757c322350d4b55693098e9741d886fd7c7d444a971860a12"
Nov 29 06:06:45 crc kubenswrapper[4594]: E1129 06:06:45.515724 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d8f309f058a1da757c322350d4b55693098e9741d886fd7c7d444a971860a12\": container with ID starting with 5d8f309f058a1da757c322350d4b55693098e9741d886fd7c7d444a971860a12 not found: ID does not exist" containerID="5d8f309f058a1da757c322350d4b55693098e9741d886fd7c7d444a971860a12"
Nov 29 06:06:45 crc kubenswrapper[4594]: I1129 06:06:45.515752 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d8f309f058a1da757c322350d4b55693098e9741d886fd7c7d444a971860a12"} err="failed to get container status \"5d8f309f058a1da757c322350d4b55693098e9741d886fd7c7d444a971860a12\": rpc error: code = NotFound desc = could not find container \"5d8f309f058a1da757c322350d4b55693098e9741d886fd7c7d444a971860a12\": container with ID starting with 5d8f309f058a1da757c322350d4b55693098e9741d886fd7c7d444a971860a12 not found: ID does not exist"
Nov 29 06:06:45 crc kubenswrapper[4594]: I1129 06:06:45.515774 4594 scope.go:117] "RemoveContainer" containerID="d8040ffb4451e1c6205c7637abfc236f45463c024b065bba99da49fabdfc1f4f"
Nov 29 06:06:45 crc kubenswrapper[4594]: E1129 06:06:45.516068 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8040ffb4451e1c6205c7637abfc236f45463c024b065bba99da49fabdfc1f4f\": container with ID starting with d8040ffb4451e1c6205c7637abfc236f45463c024b065bba99da49fabdfc1f4f not found: ID does not exist" containerID="d8040ffb4451e1c6205c7637abfc236f45463c024b065bba99da49fabdfc1f4f"
Nov 29 06:06:45 crc kubenswrapper[4594]: I1129 06:06:45.516090 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8040ffb4451e1c6205c7637abfc236f45463c024b065bba99da49fabdfc1f4f"} err="failed to get container status \"d8040ffb4451e1c6205c7637abfc236f45463c024b065bba99da49fabdfc1f4f\": rpc error: code = NotFound desc = could not find container \"d8040ffb4451e1c6205c7637abfc236f45463c024b065bba99da49fabdfc1f4f\": container with ID starting with d8040ffb4451e1c6205c7637abfc236f45463c024b065bba99da49fabdfc1f4f not found: ID does not exist"
Nov 29 06:06:45 crc kubenswrapper[4594]: I1129 06:06:45.516105 4594 scope.go:117] "RemoveContainer" containerID="60d0a6954e69e15b4214cfd72933ab9d95daa00228e47393588b191d5c94ceee"
Nov 29 06:06:45 crc kubenswrapper[4594]: E1129 06:06:45.516357 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60d0a6954e69e15b4214cfd72933ab9d95daa00228e47393588b191d5c94ceee\": container with ID starting with 60d0a6954e69e15b4214cfd72933ab9d95daa00228e47393588b191d5c94ceee not found: ID does not exist" containerID="60d0a6954e69e15b4214cfd72933ab9d95daa00228e47393588b191d5c94ceee"
Nov 29 06:06:45 crc kubenswrapper[4594]: I1129 06:06:45.516376 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60d0a6954e69e15b4214cfd72933ab9d95daa00228e47393588b191d5c94ceee"} err="failed to get container status \"60d0a6954e69e15b4214cfd72933ab9d95daa00228e47393588b191d5c94ceee\": rpc error: code = NotFound desc = could not find container \"60d0a6954e69e15b4214cfd72933ab9d95daa00228e47393588b191d5c94ceee\": container with ID starting with 60d0a6954e69e15b4214cfd72933ab9d95daa00228e47393588b191d5c94ceee not found: ID does not exist"
Nov 29 06:06:46 crc kubenswrapper[4594]: I1129 06:06:46.095759 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7bf5d8b-0a1e-4ba1-99dc-27f3355a64f9" path="/var/lib/kubelet/pods/e7bf5d8b-0a1e-4ba1-99dc-27f3355a64f9/volumes"
Nov 29 06:06:51 crc kubenswrapper[4594]: I1129 06:06:51.541973 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-q2v5m"
Nov 29 06:06:51 crc kubenswrapper[4594]: I1129 06:06:51.583277 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-q2v5m"
Nov 29 06:06:51 crc kubenswrapper[4594]: I1129 06:06:51.781768 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-q2v5m"]
Nov 29 06:06:53 crc kubenswrapper[4594]: I1129 06:06:53.503472 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-q2v5m" podUID="b9ce76d3-605e-421a-988e-27cb1040252d" containerName="registry-server" containerID="cri-o://1667d485be30c5588ddc6c4d063ed8ab5065d1026dc4eb57036b2cc373cff165" gracePeriod=2
Nov 29 06:06:53 crc kubenswrapper[4594]: I1129 06:06:53.927806 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-q2v5m"
Nov 29 06:06:54 crc kubenswrapper[4594]: I1129 06:06:54.035344 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9ce76d3-605e-421a-988e-27cb1040252d-catalog-content\") pod \"b9ce76d3-605e-421a-988e-27cb1040252d\" (UID: \"b9ce76d3-605e-421a-988e-27cb1040252d\") "
Nov 29 06:06:54 crc kubenswrapper[4594]: I1129 06:06:54.035397 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9ce76d3-605e-421a-988e-27cb1040252d-utilities\") pod \"b9ce76d3-605e-421a-988e-27cb1040252d\" (UID: \"b9ce76d3-605e-421a-988e-27cb1040252d\") "
Nov 29 06:06:54 crc kubenswrapper[4594]: I1129 06:06:54.035524 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5ml9\" (UniqueName: \"kubernetes.io/projected/b9ce76d3-605e-421a-988e-27cb1040252d-kube-api-access-d5ml9\") pod \"b9ce76d3-605e-421a-988e-27cb1040252d\" (UID: \"b9ce76d3-605e-421a-988e-27cb1040252d\") "
Nov 29 06:06:54 crc kubenswrapper[4594]: I1129 06:06:54.036172 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9ce76d3-605e-421a-988e-27cb1040252d-utilities" (OuterVolumeSpecName: "utilities") pod "b9ce76d3-605e-421a-988e-27cb1040252d" (UID: "b9ce76d3-605e-421a-988e-27cb1040252d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 29 06:06:54 crc kubenswrapper[4594]: I1129 06:06:54.036360 4594 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9ce76d3-605e-421a-988e-27cb1040252d-utilities\") on node \"crc\" DevicePath \"\""
Nov 29 06:06:54 crc kubenswrapper[4594]: I1129 06:06:54.041763 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9ce76d3-605e-421a-988e-27cb1040252d-kube-api-access-d5ml9" (OuterVolumeSpecName: "kube-api-access-d5ml9") pod "b9ce76d3-605e-421a-988e-27cb1040252d" (UID: "b9ce76d3-605e-421a-988e-27cb1040252d"). InnerVolumeSpecName "kube-api-access-d5ml9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 06:06:54 crc kubenswrapper[4594]: I1129 06:06:54.141717 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d5ml9\" (UniqueName: \"kubernetes.io/projected/b9ce76d3-605e-421a-988e-27cb1040252d-kube-api-access-d5ml9\") on node \"crc\" DevicePath \"\""
Nov 29 06:06:54 crc kubenswrapper[4594]: I1129 06:06:54.141854 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9ce76d3-605e-421a-988e-27cb1040252d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b9ce76d3-605e-421a-988e-27cb1040252d" (UID: "b9ce76d3-605e-421a-988e-27cb1040252d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 29 06:06:54 crc kubenswrapper[4594]: I1129 06:06:54.244788 4594 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9ce76d3-605e-421a-988e-27cb1040252d-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 29 06:06:54 crc kubenswrapper[4594]: I1129 06:06:54.514375 4594 generic.go:334] "Generic (PLEG): container finished" podID="b9ce76d3-605e-421a-988e-27cb1040252d" containerID="1667d485be30c5588ddc6c4d063ed8ab5065d1026dc4eb57036b2cc373cff165" exitCode=0
Nov 29 06:06:54 crc kubenswrapper[4594]: I1129 06:06:54.514536 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q2v5m" event={"ID":"b9ce76d3-605e-421a-988e-27cb1040252d","Type":"ContainerDied","Data":"1667d485be30c5588ddc6c4d063ed8ab5065d1026dc4eb57036b2cc373cff165"}
Nov 29 06:06:54 crc kubenswrapper[4594]: I1129 06:06:54.515247 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q2v5m" event={"ID":"b9ce76d3-605e-421a-988e-27cb1040252d","Type":"ContainerDied","Data":"b17eaa7209c2bd2c748962a781eeeea42b123a72fc474e88041b7bf18a4c05d4"}
Nov 29 06:06:54 crc kubenswrapper[4594]: I1129 06:06:54.514675 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-q2v5m"
Nov 29 06:06:54 crc kubenswrapper[4594]: I1129 06:06:54.515349 4594 scope.go:117] "RemoveContainer" containerID="1667d485be30c5588ddc6c4d063ed8ab5065d1026dc4eb57036b2cc373cff165"
Nov 29 06:06:54 crc kubenswrapper[4594]: I1129 06:06:54.539771 4594 scope.go:117] "RemoveContainer" containerID="158fc51a173ef5f6ad1b34cd6ba30ce2f7418c08c4b6ba4408baa0bc902e51cd"
Nov 29 06:06:54 crc kubenswrapper[4594]: I1129 06:06:54.558603 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-q2v5m"]
Nov 29 06:06:54 crc kubenswrapper[4594]: I1129 06:06:54.567869 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-q2v5m"]
Nov 29 06:06:54 crc kubenswrapper[4594]: I1129 06:06:54.580453 4594 scope.go:117] "RemoveContainer" containerID="bc82859256969c9357343a5e18001e96a48fee923b4c7ecf815c5f3ec200bba7"
Nov 29 06:06:54 crc kubenswrapper[4594]: I1129 06:06:54.601678 4594 scope.go:117] "RemoveContainer" containerID="1667d485be30c5588ddc6c4d063ed8ab5065d1026dc4eb57036b2cc373cff165"
Nov 29 06:06:54 crc kubenswrapper[4594]: E1129 06:06:54.601990 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1667d485be30c5588ddc6c4d063ed8ab5065d1026dc4eb57036b2cc373cff165\": container with ID starting with 1667d485be30c5588ddc6c4d063ed8ab5065d1026dc4eb57036b2cc373cff165 not found: ID does not exist" containerID="1667d485be30c5588ddc6c4d063ed8ab5065d1026dc4eb57036b2cc373cff165"
Nov 29 06:06:54 crc kubenswrapper[4594]: I1129 06:06:54.602034 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1667d485be30c5588ddc6c4d063ed8ab5065d1026dc4eb57036b2cc373cff165"} err="failed to get container status \"1667d485be30c5588ddc6c4d063ed8ab5065d1026dc4eb57036b2cc373cff165\": rpc error: code = NotFound desc = could not find container \"1667d485be30c5588ddc6c4d063ed8ab5065d1026dc4eb57036b2cc373cff165\": container with ID starting with 1667d485be30c5588ddc6c4d063ed8ab5065d1026dc4eb57036b2cc373cff165 not found: ID does not exist"
Nov 29 06:06:54 crc kubenswrapper[4594]: I1129 06:06:54.602060 4594 scope.go:117] "RemoveContainer" containerID="158fc51a173ef5f6ad1b34cd6ba30ce2f7418c08c4b6ba4408baa0bc902e51cd"
Nov 29 06:06:54 crc kubenswrapper[4594]: E1129 06:06:54.602471 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"158fc51a173ef5f6ad1b34cd6ba30ce2f7418c08c4b6ba4408baa0bc902e51cd\": container with ID starting with 158fc51a173ef5f6ad1b34cd6ba30ce2f7418c08c4b6ba4408baa0bc902e51cd not found: ID does not exist" containerID="158fc51a173ef5f6ad1b34cd6ba30ce2f7418c08c4b6ba4408baa0bc902e51cd"
Nov 29 06:06:54 crc kubenswrapper[4594]: I1129 06:06:54.602516 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"158fc51a173ef5f6ad1b34cd6ba30ce2f7418c08c4b6ba4408baa0bc902e51cd"} err="failed to get container status \"158fc51a173ef5f6ad1b34cd6ba30ce2f7418c08c4b6ba4408baa0bc902e51cd\": rpc error: code = NotFound desc = could not find container \"158fc51a173ef5f6ad1b34cd6ba30ce2f7418c08c4b6ba4408baa0bc902e51cd\": container with ID starting with 158fc51a173ef5f6ad1b34cd6ba30ce2f7418c08c4b6ba4408baa0bc902e51cd not found: ID does not exist"
Nov 29 06:06:54 crc kubenswrapper[4594]: I1129 06:06:54.602543 4594 scope.go:117] "RemoveContainer" containerID="bc82859256969c9357343a5e18001e96a48fee923b4c7ecf815c5f3ec200bba7"
Nov 29 06:06:54 crc kubenswrapper[4594]: E1129 06:06:54.602799 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc82859256969c9357343a5e18001e96a48fee923b4c7ecf815c5f3ec200bba7\": container with ID starting with bc82859256969c9357343a5e18001e96a48fee923b4c7ecf815c5f3ec200bba7 not found: ID does not exist" containerID="bc82859256969c9357343a5e18001e96a48fee923b4c7ecf815c5f3ec200bba7"
Nov 29 06:06:54 crc kubenswrapper[4594]: I1129 06:06:54.602820 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc82859256969c9357343a5e18001e96a48fee923b4c7ecf815c5f3ec200bba7"} err="failed to get container status \"bc82859256969c9357343a5e18001e96a48fee923b4c7ecf815c5f3ec200bba7\": rpc error: code = NotFound desc = could not find container \"bc82859256969c9357343a5e18001e96a48fee923b4c7ecf815c5f3ec200bba7\": container with ID starting with bc82859256969c9357343a5e18001e96a48fee923b4c7ecf815c5f3ec200bba7 not found: ID does not exist"
Nov 29 06:06:56 crc kubenswrapper[4594]: I1129 06:06:56.097873 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9ce76d3-605e-421a-988e-27cb1040252d" path="/var/lib/kubelet/pods/b9ce76d3-605e-421a-988e-27cb1040252d/volumes"
Nov 29 06:06:57 crc kubenswrapper[4594]: I1129 06:06:57.545625 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Nov 29 06:06:57 crc kubenswrapper[4594]: I1129 06:06:57.546202 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="0e44006f-3a80-4bf2-aac6-8d5c664d0db8" containerName="prometheus" containerID="cri-o://166e5dee2f459757e2d43ad4d79253879c3974e95bd51eb52e72a8ff3d58434b" gracePeriod=600
Nov 29 06:06:57 crc kubenswrapper[4594]: I1129 06:06:57.546348 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="0e44006f-3a80-4bf2-aac6-8d5c664d0db8" containerName="thanos-sidecar" containerID="cri-o://fc145f1edd82523cedcba03cf3914a13e3aa3771689236806684c1e3f0e0235a" gracePeriod=600
Nov 29 06:06:57 crc kubenswrapper[4594]: I1129 06:06:57.546406 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="0e44006f-3a80-4bf2-aac6-8d5c664d0db8" containerName="config-reloader" containerID="cri-o://be4198c341771d572e7f8cb89a81015d353a914620d6952932c307cf4e696081" gracePeriod=600
Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.499618 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.559951 4594 generic.go:334] "Generic (PLEG): container finished" podID="0e44006f-3a80-4bf2-aac6-8d5c664d0db8" containerID="fc145f1edd82523cedcba03cf3914a13e3aa3771689236806684c1e3f0e0235a" exitCode=0
Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.560220 4594 generic.go:334] "Generic (PLEG): container finished" podID="0e44006f-3a80-4bf2-aac6-8d5c664d0db8" containerID="be4198c341771d572e7f8cb89a81015d353a914620d6952932c307cf4e696081" exitCode=0
Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.560231 4594 generic.go:334] "Generic (PLEG): container finished" podID="0e44006f-3a80-4bf2-aac6-8d5c664d0db8" containerID="166e5dee2f459757e2d43ad4d79253879c3974e95bd51eb52e72a8ff3d58434b" exitCode=0
Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.560135 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"0e44006f-3a80-4bf2-aac6-8d5c664d0db8","Type":"ContainerDied","Data":"fc145f1edd82523cedcba03cf3914a13e3aa3771689236806684c1e3f0e0235a"}
Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.560266 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.560288 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"0e44006f-3a80-4bf2-aac6-8d5c664d0db8","Type":"ContainerDied","Data":"be4198c341771d572e7f8cb89a81015d353a914620d6952932c307cf4e696081"}
Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.560306 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"0e44006f-3a80-4bf2-aac6-8d5c664d0db8","Type":"ContainerDied","Data":"166e5dee2f459757e2d43ad4d79253879c3974e95bd51eb52e72a8ff3d58434b"}
Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.560317 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"0e44006f-3a80-4bf2-aac6-8d5c664d0db8","Type":"ContainerDied","Data":"9a07a121ecff1d9e0616fe2b62eb67c4d6f3b7903061218ed0135e9f517fb32f"}
Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.560334 4594 scope.go:117] "RemoveContainer" containerID="fc145f1edd82523cedcba03cf3914a13e3aa3771689236806684c1e3f0e0235a"
Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.593921 4594 scope.go:117] "RemoveContainer" containerID="be4198c341771d572e7f8cb89a81015d353a914620d6952932c307cf4e696081"
Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.618836 4594 scope.go:117] "RemoveContainer" containerID="166e5dee2f459757e2d43ad4d79253879c3974e95bd51eb52e72a8ff3d58434b"
Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.639443 4594 scope.go:117] "RemoveContainer" containerID="e98a79c3762747cdacae623e61f5f0539a6ff41851db8e295ebe252008ae3d4e"
Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.663924 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0e44006f-3a80-4bf2-aac6-8d5c664d0db8-config\") pod \"0e44006f-3a80-4bf2-aac6-8d5c664d0db8\" (UID: \"0e44006f-3a80-4bf2-aac6-8d5c664d0db8\") "
Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.664002 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0e44006f-3a80-4bf2-aac6-8d5c664d0db8-tls-assets\") pod \"0e44006f-3a80-4bf2-aac6-8d5c664d0db8\" (UID: \"0e44006f-3a80-4bf2-aac6-8d5c664d0db8\") "
Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.664156 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-85b0c002-5ac7-4caa-8203-64f617b7cd7d\") pod \"0e44006f-3a80-4bf2-aac6-8d5c664d0db8\" (UID: \"0e44006f-3a80-4bf2-aac6-8d5c664d0db8\") "
Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.668364 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/0e44006f-3a80-4bf2-aac6-8d5c664d0db8-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"0e44006f-3a80-4bf2-aac6-8d5c664d0db8\" (UID: \"0e44006f-3a80-4bf2-aac6-8d5c664d0db8\") "
Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.668439 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0e44006f-3a80-4bf2-aac6-8d5c664d0db8-prometheus-metric-storage-rulefiles-0\") pod \"0e44006f-3a80-4bf2-aac6-8d5c664d0db8\" (UID: \"0e44006f-3a80-4bf2-aac6-8d5c664d0db8\") "
Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.668514 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j796b\" (UniqueName: \"kubernetes.io/projected/0e44006f-3a80-4bf2-aac6-8d5c664d0db8-kube-api-access-j796b\") pod \"0e44006f-3a80-4bf2-aac6-8d5c664d0db8\" (UID: \"0e44006f-3a80-4bf2-aac6-8d5c664d0db8\") "
Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.668583 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0e44006f-3a80-4bf2-aac6-8d5c664d0db8-config-out\") pod \"0e44006f-3a80-4bf2-aac6-8d5c664d0db8\" (UID: \"0e44006f-3a80-4bf2-aac6-8d5c664d0db8\") "
Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.668643 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/0e44006f-3a80-4bf2-aac6-8d5c664d0db8-thanos-prometheus-http-client-file\") pod \"0e44006f-3a80-4bf2-aac6-8d5c664d0db8\" (UID: \"0e44006f-3a80-4bf2-aac6-8d5c664d0db8\") "
Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.668667 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/0e44006f-3a80-4bf2-aac6-8d5c664d0db8-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"0e44006f-3a80-4bf2-aac6-8d5c664d0db8\" (UID: \"0e44006f-3a80-4bf2-aac6-8d5c664d0db8\") "
Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.668786 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e44006f-3a80-4bf2-aac6-8d5c664d0db8-secret-combined-ca-bundle\") pod \"0e44006f-3a80-4bf2-aac6-8d5c664d0db8\" (UID: \"0e44006f-3a80-4bf2-aac6-8d5c664d0db8\") "
Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.668809 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0e44006f-3a80-4bf2-aac6-8d5c664d0db8-web-config\") pod \"0e44006f-3a80-4bf2-aac6-8d5c664d0db8\" (UID: \"0e44006f-3a80-4bf2-aac6-8d5c664d0db8\") "
Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.670370 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e44006f-3a80-4bf2-aac6-8d5c664d0db8-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "0e44006f-3a80-4bf2-aac6-8d5c664d0db8" (UID: "0e44006f-3a80-4bf2-aac6-8d5c664d0db8"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.666318 4594 scope.go:117] "RemoveContainer" containerID="fc145f1edd82523cedcba03cf3914a13e3aa3771689236806684c1e3f0e0235a"
Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.673921 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e44006f-3a80-4bf2-aac6-8d5c664d0db8-secret-combined-ca-bundle" (OuterVolumeSpecName: "secret-combined-ca-bundle") pod "0e44006f-3a80-4bf2-aac6-8d5c664d0db8" (UID: "0e44006f-3a80-4bf2-aac6-8d5c664d0db8"). InnerVolumeSpecName "secret-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.673994 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e44006f-3a80-4bf2-aac6-8d5c664d0db8-config" (OuterVolumeSpecName: "config") pod "0e44006f-3a80-4bf2-aac6-8d5c664d0db8" (UID: "0e44006f-3a80-4bf2-aac6-8d5c664d0db8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.674140 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e44006f-3a80-4bf2-aac6-8d5c664d0db8-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "0e44006f-3a80-4bf2-aac6-8d5c664d0db8" (UID: "0e44006f-3a80-4bf2-aac6-8d5c664d0db8"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 06:06:58 crc kubenswrapper[4594]: E1129 06:06:58.674216 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc145f1edd82523cedcba03cf3914a13e3aa3771689236806684c1e3f0e0235a\": container with ID starting with fc145f1edd82523cedcba03cf3914a13e3aa3771689236806684c1e3f0e0235a not found: ID does not exist" containerID="fc145f1edd82523cedcba03cf3914a13e3aa3771689236806684c1e3f0e0235a"
Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.674299 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc145f1edd82523cedcba03cf3914a13e3aa3771689236806684c1e3f0e0235a"} err="failed to get container status \"fc145f1edd82523cedcba03cf3914a13e3aa3771689236806684c1e3f0e0235a\": rpc error: code = NotFound desc = could not find container \"fc145f1edd82523cedcba03cf3914a13e3aa3771689236806684c1e3f0e0235a\": container with ID starting with fc145f1edd82523cedcba03cf3914a13e3aa3771689236806684c1e3f0e0235a not found: ID does not exist"
Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.674331 4594 scope.go:117] "RemoveContainer" containerID="be4198c341771d572e7f8cb89a81015d353a914620d6952932c307cf4e696081"
Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.675676 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e44006f-3a80-4bf2-aac6-8d5c664d0db8-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "0e44006f-3a80-4bf2-aac6-8d5c664d0db8" (UID: "0e44006f-3a80-4bf2-aac6-8d5c664d0db8"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.675713 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e44006f-3a80-4bf2-aac6-8d5c664d0db8-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d") pod "0e44006f-3a80-4bf2-aac6-8d5c664d0db8" (UID: "0e44006f-3a80-4bf2-aac6-8d5c664d0db8"). InnerVolumeSpecName "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 06:06:58 crc kubenswrapper[4594]: E1129 06:06:58.675779 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be4198c341771d572e7f8cb89a81015d353a914620d6952932c307cf4e696081\": container with ID starting with be4198c341771d572e7f8cb89a81015d353a914620d6952932c307cf4e696081 not found: ID does not exist" containerID="be4198c341771d572e7f8cb89a81015d353a914620d6952932c307cf4e696081"
Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.675809 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be4198c341771d572e7f8cb89a81015d353a914620d6952932c307cf4e696081"} err="failed to get container status \"be4198c341771d572e7f8cb89a81015d353a914620d6952932c307cf4e696081\": rpc error: code = NotFound desc = could not find container \"be4198c341771d572e7f8cb89a81015d353a914620d6952932c307cf4e696081\": container with ID starting with be4198c341771d572e7f8cb89a81015d353a914620d6952932c307cf4e696081 not found: ID does not exist"
Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.675837 4594 scope.go:117] "RemoveContainer" containerID="166e5dee2f459757e2d43ad4d79253879c3974e95bd51eb52e72a8ff3d58434b"
Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.676109 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e44006f-3a80-4bf2-aac6-8d5c664d0db8-config-out" (OuterVolumeSpecName: "config-out") pod "0e44006f-3a80-4bf2-aac6-8d5c664d0db8" (UID: "0e44006f-3a80-4bf2-aac6-8d5c664d0db8"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 29 06:06:58 crc kubenswrapper[4594]: E1129 06:06:58.678372 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"166e5dee2f459757e2d43ad4d79253879c3974e95bd51eb52e72a8ff3d58434b\": container with ID starting with 166e5dee2f459757e2d43ad4d79253879c3974e95bd51eb52e72a8ff3d58434b not found: ID does not exist" containerID="166e5dee2f459757e2d43ad4d79253879c3974e95bd51eb52e72a8ff3d58434b"
Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.678404 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"166e5dee2f459757e2d43ad4d79253879c3974e95bd51eb52e72a8ff3d58434b"} err="failed to get container status \"166e5dee2f459757e2d43ad4d79253879c3974e95bd51eb52e72a8ff3d58434b\": rpc error: code = NotFound desc = could not find container \"166e5dee2f459757e2d43ad4d79253879c3974e95bd51eb52e72a8ff3d58434b\": container with ID starting with 166e5dee2f459757e2d43ad4d79253879c3974e95bd51eb52e72a8ff3d58434b not found: ID does not exist"
Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.678423 4594 scope.go:117] "RemoveContainer" containerID="e98a79c3762747cdacae623e61f5f0539a6ff41851db8e295ebe252008ae3d4e"
Nov 29 06:06:58 crc kubenswrapper[4594]: E1129 06:06:58.678683 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e98a79c3762747cdacae623e61f5f0539a6ff41851db8e295ebe252008ae3d4e\": container with ID starting with e98a79c3762747cdacae623e61f5f0539a6ff41851db8e295ebe252008ae3d4e not found: ID does not exist" containerID="e98a79c3762747cdacae623e61f5f0539a6ff41851db8e295ebe252008ae3d4e"
Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.678724 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e98a79c3762747cdacae623e61f5f0539a6ff41851db8e295ebe252008ae3d4e"} err="failed to get container status \"e98a79c3762747cdacae623e61f5f0539a6ff41851db8e295ebe252008ae3d4e\": rpc error: code = NotFound desc = could not find container \"e98a79c3762747cdacae623e61f5f0539a6ff41851db8e295ebe252008ae3d4e\": container with ID starting with e98a79c3762747cdacae623e61f5f0539a6ff41851db8e295ebe252008ae3d4e not found: ID does not exist"
Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.678751 4594 scope.go:117] "RemoveContainer" containerID="fc145f1edd82523cedcba03cf3914a13e3aa3771689236806684c1e3f0e0235a"
Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.678963 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc145f1edd82523cedcba03cf3914a13e3aa3771689236806684c1e3f0e0235a"} err="failed to get container status \"fc145f1edd82523cedcba03cf3914a13e3aa3771689236806684c1e3f0e0235a\": rpc error: code = NotFound desc = could not find container \"fc145f1edd82523cedcba03cf3914a13e3aa3771689236806684c1e3f0e0235a\": container with ID starting with fc145f1edd82523cedcba03cf3914a13e3aa3771689236806684c1e3f0e0235a not found: ID does not exist"
Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.678989 4594 scope.go:117] "RemoveContainer" containerID="be4198c341771d572e7f8cb89a81015d353a914620d6952932c307cf4e696081"
Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.679163 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be4198c341771d572e7f8cb89a81015d353a914620d6952932c307cf4e696081"} err="failed to get container status \"be4198c341771d572e7f8cb89a81015d353a914620d6952932c307cf4e696081\": rpc error: code = NotFound desc = could
not find container \"be4198c341771d572e7f8cb89a81015d353a914620d6952932c307cf4e696081\": container with ID starting with be4198c341771d572e7f8cb89a81015d353a914620d6952932c307cf4e696081 not found: ID does not exist" Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.679185 4594 scope.go:117] "RemoveContainer" containerID="166e5dee2f459757e2d43ad4d79253879c3974e95bd51eb52e72a8ff3d58434b" Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.679362 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"166e5dee2f459757e2d43ad4d79253879c3974e95bd51eb52e72a8ff3d58434b"} err="failed to get container status \"166e5dee2f459757e2d43ad4d79253879c3974e95bd51eb52e72a8ff3d58434b\": rpc error: code = NotFound desc = could not find container \"166e5dee2f459757e2d43ad4d79253879c3974e95bd51eb52e72a8ff3d58434b\": container with ID starting with 166e5dee2f459757e2d43ad4d79253879c3974e95bd51eb52e72a8ff3d58434b not found: ID does not exist" Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.679384 4594 scope.go:117] "RemoveContainer" containerID="e98a79c3762747cdacae623e61f5f0539a6ff41851db8e295ebe252008ae3d4e" Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.679574 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e98a79c3762747cdacae623e61f5f0539a6ff41851db8e295ebe252008ae3d4e"} err="failed to get container status \"e98a79c3762747cdacae623e61f5f0539a6ff41851db8e295ebe252008ae3d4e\": rpc error: code = NotFound desc = could not find container \"e98a79c3762747cdacae623e61f5f0539a6ff41851db8e295ebe252008ae3d4e\": container with ID starting with e98a79c3762747cdacae623e61f5f0539a6ff41851db8e295ebe252008ae3d4e not found: ID does not exist" Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.679595 4594 scope.go:117] "RemoveContainer" containerID="fc145f1edd82523cedcba03cf3914a13e3aa3771689236806684c1e3f0e0235a" Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 
06:06:58.679752 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc145f1edd82523cedcba03cf3914a13e3aa3771689236806684c1e3f0e0235a"} err="failed to get container status \"fc145f1edd82523cedcba03cf3914a13e3aa3771689236806684c1e3f0e0235a\": rpc error: code = NotFound desc = could not find container \"fc145f1edd82523cedcba03cf3914a13e3aa3771689236806684c1e3f0e0235a\": container with ID starting with fc145f1edd82523cedcba03cf3914a13e3aa3771689236806684c1e3f0e0235a not found: ID does not exist" Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.679799 4594 scope.go:117] "RemoveContainer" containerID="be4198c341771d572e7f8cb89a81015d353a914620d6952932c307cf4e696081" Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.680010 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be4198c341771d572e7f8cb89a81015d353a914620d6952932c307cf4e696081"} err="failed to get container status \"be4198c341771d572e7f8cb89a81015d353a914620d6952932c307cf4e696081\": rpc error: code = NotFound desc = could not find container \"be4198c341771d572e7f8cb89a81015d353a914620d6952932c307cf4e696081\": container with ID starting with be4198c341771d572e7f8cb89a81015d353a914620d6952932c307cf4e696081 not found: ID does not exist" Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.680061 4594 scope.go:117] "RemoveContainer" containerID="166e5dee2f459757e2d43ad4d79253879c3974e95bd51eb52e72a8ff3d58434b" Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.680302 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"166e5dee2f459757e2d43ad4d79253879c3974e95bd51eb52e72a8ff3d58434b"} err="failed to get container status \"166e5dee2f459757e2d43ad4d79253879c3974e95bd51eb52e72a8ff3d58434b\": rpc error: code = NotFound desc = could not find container \"166e5dee2f459757e2d43ad4d79253879c3974e95bd51eb52e72a8ff3d58434b\": container with ID starting with 
166e5dee2f459757e2d43ad4d79253879c3974e95bd51eb52e72a8ff3d58434b not found: ID does not exist" Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.680324 4594 scope.go:117] "RemoveContainer" containerID="e98a79c3762747cdacae623e61f5f0539a6ff41851db8e295ebe252008ae3d4e" Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.680552 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e98a79c3762747cdacae623e61f5f0539a6ff41851db8e295ebe252008ae3d4e"} err="failed to get container status \"e98a79c3762747cdacae623e61f5f0539a6ff41851db8e295ebe252008ae3d4e\": rpc error: code = NotFound desc = could not find container \"e98a79c3762747cdacae623e61f5f0539a6ff41851db8e295ebe252008ae3d4e\": container with ID starting with e98a79c3762747cdacae623e61f5f0539a6ff41851db8e295ebe252008ae3d4e not found: ID does not exist" Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.682073 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e44006f-3a80-4bf2-aac6-8d5c664d0db8-kube-api-access-j796b" (OuterVolumeSpecName: "kube-api-access-j796b") pod "0e44006f-3a80-4bf2-aac6-8d5c664d0db8" (UID: "0e44006f-3a80-4bf2-aac6-8d5c664d0db8"). InnerVolumeSpecName "kube-api-access-j796b". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.686395 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e44006f-3a80-4bf2-aac6-8d5c664d0db8-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d") pod "0e44006f-3a80-4bf2-aac6-8d5c664d0db8" (UID: "0e44006f-3a80-4bf2-aac6-8d5c664d0db8"). InnerVolumeSpecName "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.693884 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-85b0c002-5ac7-4caa-8203-64f617b7cd7d" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "0e44006f-3a80-4bf2-aac6-8d5c664d0db8" (UID: "0e44006f-3a80-4bf2-aac6-8d5c664d0db8"). InnerVolumeSpecName "pvc-85b0c002-5ac7-4caa-8203-64f617b7cd7d". PluginName "kubernetes.io/csi", VolumeGidValue "" Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.744267 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e44006f-3a80-4bf2-aac6-8d5c664d0db8-web-config" (OuterVolumeSpecName: "web-config") pod "0e44006f-3a80-4bf2-aac6-8d5c664d0db8" (UID: "0e44006f-3a80-4bf2-aac6-8d5c664d0db8"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.772085 4594 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0e44006f-3a80-4bf2-aac6-8d5c664d0db8-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.772118 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j796b\" (UniqueName: \"kubernetes.io/projected/0e44006f-3a80-4bf2-aac6-8d5c664d0db8-kube-api-access-j796b\") on node \"crc\" DevicePath \"\"" Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.772131 4594 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0e44006f-3a80-4bf2-aac6-8d5c664d0db8-config-out\") on node \"crc\" DevicePath \"\"" Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.772143 4594 reconciler_common.go:293] "Volume detached for volume 
\"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/0e44006f-3a80-4bf2-aac6-8d5c664d0db8-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") on node \"crc\" DevicePath \"\"" Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.772155 4594 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/0e44006f-3a80-4bf2-aac6-8d5c664d0db8-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.772168 4594 reconciler_common.go:293] "Volume detached for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e44006f-3a80-4bf2-aac6-8d5c664d0db8-secret-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.772180 4594 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0e44006f-3a80-4bf2-aac6-8d5c664d0db8-web-config\") on node \"crc\" DevicePath \"\"" Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.772189 4594 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/0e44006f-3a80-4bf2-aac6-8d5c664d0db8-config\") on node \"crc\" DevicePath \"\"" Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.772197 4594 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0e44006f-3a80-4bf2-aac6-8d5c664d0db8-tls-assets\") on node \"crc\" DevicePath \"\"" Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.772234 4594 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-85b0c002-5ac7-4caa-8203-64f617b7cd7d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-85b0c002-5ac7-4caa-8203-64f617b7cd7d\") on node \"crc\" " Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.772246 4594 
reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/0e44006f-3a80-4bf2-aac6-8d5c664d0db8-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") on node \"crc\" DevicePath \"\"" Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.791733 4594 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.791862 4594 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-85b0c002-5ac7-4caa-8203-64f617b7cd7d" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-85b0c002-5ac7-4caa-8203-64f617b7cd7d") on node "crc" Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.874103 4594 reconciler_common.go:293] "Volume detached for volume \"pvc-85b0c002-5ac7-4caa-8203-64f617b7cd7d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-85b0c002-5ac7-4caa-8203-64f617b7cd7d\") on node \"crc\" DevicePath \"\"" Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.900555 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.912131 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.927369 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Nov 29 06:06:58 crc kubenswrapper[4594]: E1129 06:06:58.927793 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7bf5d8b-0a1e-4ba1-99dc-27f3355a64f9" containerName="registry-server" Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.927812 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7bf5d8b-0a1e-4ba1-99dc-27f3355a64f9" containerName="registry-server" Nov 29 06:06:58 
crc kubenswrapper[4594]: E1129 06:06:58.927823 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e44006f-3a80-4bf2-aac6-8d5c664d0db8" containerName="prometheus" Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.927829 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e44006f-3a80-4bf2-aac6-8d5c664d0db8" containerName="prometheus" Nov 29 06:06:58 crc kubenswrapper[4594]: E1129 06:06:58.927839 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e44006f-3a80-4bf2-aac6-8d5c664d0db8" containerName="thanos-sidecar" Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.927846 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e44006f-3a80-4bf2-aac6-8d5c664d0db8" containerName="thanos-sidecar" Nov 29 06:06:58 crc kubenswrapper[4594]: E1129 06:06:58.927860 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9ce76d3-605e-421a-988e-27cb1040252d" containerName="extract-utilities" Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.927866 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9ce76d3-605e-421a-988e-27cb1040252d" containerName="extract-utilities" Nov 29 06:06:58 crc kubenswrapper[4594]: E1129 06:06:58.927875 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9ce76d3-605e-421a-988e-27cb1040252d" containerName="registry-server" Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.927880 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9ce76d3-605e-421a-988e-27cb1040252d" containerName="registry-server" Nov 29 06:06:58 crc kubenswrapper[4594]: E1129 06:06:58.927898 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e44006f-3a80-4bf2-aac6-8d5c664d0db8" containerName="init-config-reloader" Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.927903 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e44006f-3a80-4bf2-aac6-8d5c664d0db8" containerName="init-config-reloader" Nov 29 06:06:58 crc 
kubenswrapper[4594]: E1129 06:06:58.927922 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e44006f-3a80-4bf2-aac6-8d5c664d0db8" containerName="config-reloader" Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.927927 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e44006f-3a80-4bf2-aac6-8d5c664d0db8" containerName="config-reloader" Nov 29 06:06:58 crc kubenswrapper[4594]: E1129 06:06:58.927941 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7bf5d8b-0a1e-4ba1-99dc-27f3355a64f9" containerName="extract-utilities" Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.927948 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7bf5d8b-0a1e-4ba1-99dc-27f3355a64f9" containerName="extract-utilities" Nov 29 06:06:58 crc kubenswrapper[4594]: E1129 06:06:58.927957 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7bf5d8b-0a1e-4ba1-99dc-27f3355a64f9" containerName="extract-content" Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.927962 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7bf5d8b-0a1e-4ba1-99dc-27f3355a64f9" containerName="extract-content" Nov 29 06:06:58 crc kubenswrapper[4594]: E1129 06:06:58.927973 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9ce76d3-605e-421a-988e-27cb1040252d" containerName="extract-content" Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.927979 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9ce76d3-605e-421a-988e-27cb1040252d" containerName="extract-content" Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.928147 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e44006f-3a80-4bf2-aac6-8d5c664d0db8" containerName="config-reloader" Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.928160 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e44006f-3a80-4bf2-aac6-8d5c664d0db8" containerName="prometheus" Nov 29 06:06:58 crc 
kubenswrapper[4594]: I1129 06:06:58.928169 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7bf5d8b-0a1e-4ba1-99dc-27f3355a64f9" containerName="registry-server" Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.928181 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e44006f-3a80-4bf2-aac6-8d5c664d0db8" containerName="thanos-sidecar" Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.928194 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9ce76d3-605e-421a-988e-27cb1040252d" containerName="registry-server" Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.929838 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.932328 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-w8z2f" Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.932519 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.932533 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.932983 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.937554 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.940094 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Nov 29 06:06:58 crc kubenswrapper[4594]: I1129 06:06:58.942123 4594 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"prometheus-metric-storage-web-config" Nov 29 06:06:59 crc kubenswrapper[4594]: I1129 06:06:59.079537 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/83de9c5a-c56c-4433-9a32-48972cdd1b46-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"83de9c5a-c56c-4433-9a32-48972cdd1b46\") " pod="openstack/prometheus-metric-storage-0" Nov 29 06:06:59 crc kubenswrapper[4594]: I1129 06:06:59.079626 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njj2r\" (UniqueName: \"kubernetes.io/projected/83de9c5a-c56c-4433-9a32-48972cdd1b46-kube-api-access-njj2r\") pod \"prometheus-metric-storage-0\" (UID: \"83de9c5a-c56c-4433-9a32-48972cdd1b46\") " pod="openstack/prometheus-metric-storage-0" Nov 29 06:06:59 crc kubenswrapper[4594]: I1129 06:06:59.079678 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-85b0c002-5ac7-4caa-8203-64f617b7cd7d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-85b0c002-5ac7-4caa-8203-64f617b7cd7d\") pod \"prometheus-metric-storage-0\" (UID: \"83de9c5a-c56c-4433-9a32-48972cdd1b46\") " pod="openstack/prometheus-metric-storage-0" Nov 29 06:06:59 crc kubenswrapper[4594]: I1129 06:06:59.079845 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/83de9c5a-c56c-4433-9a32-48972cdd1b46-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"83de9c5a-c56c-4433-9a32-48972cdd1b46\") " pod="openstack/prometheus-metric-storage-0" Nov 29 06:06:59 crc kubenswrapper[4594]: I1129 06:06:59.079898 4594 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/83de9c5a-c56c-4433-9a32-48972cdd1b46-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"83de9c5a-c56c-4433-9a32-48972cdd1b46\") " pod="openstack/prometheus-metric-storage-0" Nov 29 06:06:59 crc kubenswrapper[4594]: I1129 06:06:59.079938 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/83de9c5a-c56c-4433-9a32-48972cdd1b46-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"83de9c5a-c56c-4433-9a32-48972cdd1b46\") " pod="openstack/prometheus-metric-storage-0" Nov 29 06:06:59 crc kubenswrapper[4594]: I1129 06:06:59.080042 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/83de9c5a-c56c-4433-9a32-48972cdd1b46-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"83de9c5a-c56c-4433-9a32-48972cdd1b46\") " pod="openstack/prometheus-metric-storage-0" Nov 29 06:06:59 crc kubenswrapper[4594]: I1129 06:06:59.080093 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/83de9c5a-c56c-4433-9a32-48972cdd1b46-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"83de9c5a-c56c-4433-9a32-48972cdd1b46\") " pod="openstack/prometheus-metric-storage-0" Nov 29 06:06:59 crc kubenswrapper[4594]: I1129 06:06:59.080229 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/83de9c5a-c56c-4433-9a32-48972cdd1b46-config\") pod \"prometheus-metric-storage-0\" (UID: \"83de9c5a-c56c-4433-9a32-48972cdd1b46\") " 
pod="openstack/prometheus-metric-storage-0" Nov 29 06:06:59 crc kubenswrapper[4594]: I1129 06:06:59.080389 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/83de9c5a-c56c-4433-9a32-48972cdd1b46-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"83de9c5a-c56c-4433-9a32-48972cdd1b46\") " pod="openstack/prometheus-metric-storage-0" Nov 29 06:06:59 crc kubenswrapper[4594]: I1129 06:06:59.080470 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83de9c5a-c56c-4433-9a32-48972cdd1b46-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"83de9c5a-c56c-4433-9a32-48972cdd1b46\") " pod="openstack/prometheus-metric-storage-0" Nov 29 06:06:59 crc kubenswrapper[4594]: I1129 06:06:59.182942 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/83de9c5a-c56c-4433-9a32-48972cdd1b46-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"83de9c5a-c56c-4433-9a32-48972cdd1b46\") " pod="openstack/prometheus-metric-storage-0" Nov 29 06:06:59 crc kubenswrapper[4594]: I1129 06:06:59.183378 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83de9c5a-c56c-4433-9a32-48972cdd1b46-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"83de9c5a-c56c-4433-9a32-48972cdd1b46\") " pod="openstack/prometheus-metric-storage-0" Nov 29 06:06:59 crc kubenswrapper[4594]: I1129 06:06:59.183405 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: 
\"kubernetes.io/secret/83de9c5a-c56c-4433-9a32-48972cdd1b46-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"83de9c5a-c56c-4433-9a32-48972cdd1b46\") " pod="openstack/prometheus-metric-storage-0" Nov 29 06:06:59 crc kubenswrapper[4594]: I1129 06:06:59.183445 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njj2r\" (UniqueName: \"kubernetes.io/projected/83de9c5a-c56c-4433-9a32-48972cdd1b46-kube-api-access-njj2r\") pod \"prometheus-metric-storage-0\" (UID: \"83de9c5a-c56c-4433-9a32-48972cdd1b46\") " pod="openstack/prometheus-metric-storage-0" Nov 29 06:06:59 crc kubenswrapper[4594]: I1129 06:06:59.183476 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-85b0c002-5ac7-4caa-8203-64f617b7cd7d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-85b0c002-5ac7-4caa-8203-64f617b7cd7d\") pod \"prometheus-metric-storage-0\" (UID: \"83de9c5a-c56c-4433-9a32-48972cdd1b46\") " pod="openstack/prometheus-metric-storage-0" Nov 29 06:06:59 crc kubenswrapper[4594]: I1129 06:06:59.183557 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/83de9c5a-c56c-4433-9a32-48972cdd1b46-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"83de9c5a-c56c-4433-9a32-48972cdd1b46\") " pod="openstack/prometheus-metric-storage-0" Nov 29 06:06:59 crc kubenswrapper[4594]: I1129 06:06:59.183591 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/83de9c5a-c56c-4433-9a32-48972cdd1b46-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"83de9c5a-c56c-4433-9a32-48972cdd1b46\") " pod="openstack/prometheus-metric-storage-0" Nov 29 
06:06:59 crc kubenswrapper[4594]: I1129 06:06:59.183628 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/83de9c5a-c56c-4433-9a32-48972cdd1b46-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"83de9c5a-c56c-4433-9a32-48972cdd1b46\") " pod="openstack/prometheus-metric-storage-0" Nov 29 06:06:59 crc kubenswrapper[4594]: I1129 06:06:59.183670 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/83de9c5a-c56c-4433-9a32-48972cdd1b46-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"83de9c5a-c56c-4433-9a32-48972cdd1b46\") " pod="openstack/prometheus-metric-storage-0" Nov 29 06:06:59 crc kubenswrapper[4594]: I1129 06:06:59.183703 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/83de9c5a-c56c-4433-9a32-48972cdd1b46-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"83de9c5a-c56c-4433-9a32-48972cdd1b46\") " pod="openstack/prometheus-metric-storage-0" Nov 29 06:06:59 crc kubenswrapper[4594]: I1129 06:06:59.183831 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/83de9c5a-c56c-4433-9a32-48972cdd1b46-config\") pod \"prometheus-metric-storage-0\" (UID: \"83de9c5a-c56c-4433-9a32-48972cdd1b46\") " pod="openstack/prometheus-metric-storage-0" Nov 29 06:06:59 crc kubenswrapper[4594]: I1129 06:06:59.185504 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/83de9c5a-c56c-4433-9a32-48972cdd1b46-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"83de9c5a-c56c-4433-9a32-48972cdd1b46\") " pod="openstack/prometheus-metric-storage-0" Nov 29 06:06:59 crc 
kubenswrapper[4594]: I1129 06:06:59.187631 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/83de9c5a-c56c-4433-9a32-48972cdd1b46-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"83de9c5a-c56c-4433-9a32-48972cdd1b46\") " pod="openstack/prometheus-metric-storage-0" Nov 29 06:06:59 crc kubenswrapper[4594]: I1129 06:06:59.188747 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/83de9c5a-c56c-4433-9a32-48972cdd1b46-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"83de9c5a-c56c-4433-9a32-48972cdd1b46\") " pod="openstack/prometheus-metric-storage-0" Nov 29 06:06:59 crc kubenswrapper[4594]: I1129 06:06:59.189117 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/83de9c5a-c56c-4433-9a32-48972cdd1b46-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"83de9c5a-c56c-4433-9a32-48972cdd1b46\") " pod="openstack/prometheus-metric-storage-0" Nov 29 06:06:59 crc kubenswrapper[4594]: I1129 06:06:59.189829 4594 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Nov 29 06:06:59 crc kubenswrapper[4594]: I1129 06:06:59.189865 4594 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-85b0c002-5ac7-4caa-8203-64f617b7cd7d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-85b0c002-5ac7-4caa-8203-64f617b7cd7d\") pod \"prometheus-metric-storage-0\" (UID: \"83de9c5a-c56c-4433-9a32-48972cdd1b46\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/740def2cd7269ac7c6453fb83dbbc5807ffe59ef523e7a581c6b5220b8504e7f/globalmount\"" pod="openstack/prometheus-metric-storage-0" Nov 29 06:06:59 crc kubenswrapper[4594]: I1129 06:06:59.190407 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83de9c5a-c56c-4433-9a32-48972cdd1b46-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"83de9c5a-c56c-4433-9a32-48972cdd1b46\") " pod="openstack/prometheus-metric-storage-0" Nov 29 06:06:59 crc kubenswrapper[4594]: I1129 06:06:59.191895 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/83de9c5a-c56c-4433-9a32-48972cdd1b46-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"83de9c5a-c56c-4433-9a32-48972cdd1b46\") " pod="openstack/prometheus-metric-storage-0" Nov 29 06:06:59 crc kubenswrapper[4594]: I1129 06:06:59.192592 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/83de9c5a-c56c-4433-9a32-48972cdd1b46-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"83de9c5a-c56c-4433-9a32-48972cdd1b46\") " pod="openstack/prometheus-metric-storage-0" Nov 29 06:06:59 crc kubenswrapper[4594]: I1129 06:06:59.195582 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" 
(UniqueName: \"kubernetes.io/secret/83de9c5a-c56c-4433-9a32-48972cdd1b46-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"83de9c5a-c56c-4433-9a32-48972cdd1b46\") " pod="openstack/prometheus-metric-storage-0" Nov 29 06:06:59 crc kubenswrapper[4594]: I1129 06:06:59.200781 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/83de9c5a-c56c-4433-9a32-48972cdd1b46-config\") pod \"prometheus-metric-storage-0\" (UID: \"83de9c5a-c56c-4433-9a32-48972cdd1b46\") " pod="openstack/prometheus-metric-storage-0" Nov 29 06:06:59 crc kubenswrapper[4594]: I1129 06:06:59.201589 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njj2r\" (UniqueName: \"kubernetes.io/projected/83de9c5a-c56c-4433-9a32-48972cdd1b46-kube-api-access-njj2r\") pod \"prometheus-metric-storage-0\" (UID: \"83de9c5a-c56c-4433-9a32-48972cdd1b46\") " pod="openstack/prometheus-metric-storage-0" Nov 29 06:06:59 crc kubenswrapper[4594]: I1129 06:06:59.236444 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-85b0c002-5ac7-4caa-8203-64f617b7cd7d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-85b0c002-5ac7-4caa-8203-64f617b7cd7d\") pod \"prometheus-metric-storage-0\" (UID: \"83de9c5a-c56c-4433-9a32-48972cdd1b46\") " pod="openstack/prometheus-metric-storage-0" Nov 29 06:06:59 crc kubenswrapper[4594]: I1129 06:06:59.245150 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Nov 29 06:06:59 crc kubenswrapper[4594]: I1129 06:06:59.684970 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Nov 29 06:07:00 crc kubenswrapper[4594]: I1129 06:07:00.094597 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e44006f-3a80-4bf2-aac6-8d5c664d0db8" path="/var/lib/kubelet/pods/0e44006f-3a80-4bf2-aac6-8d5c664d0db8/volumes" Nov 29 06:07:00 crc kubenswrapper[4594]: I1129 06:07:00.611890 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"83de9c5a-c56c-4433-9a32-48972cdd1b46","Type":"ContainerStarted","Data":"8fa6817f5604bfa999c3d71a64c86b1c8aa38ef2c3dbbe338ca3ad7a89e6041d"} Nov 29 06:07:03 crc kubenswrapper[4594]: I1129 06:07:03.642348 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"83de9c5a-c56c-4433-9a32-48972cdd1b46","Type":"ContainerStarted","Data":"691bbe657bd1a7a2f0b251ce7c413c6c8dbf6465ca251c9a6291076565ed44b8"} Nov 29 06:07:08 crc kubenswrapper[4594]: I1129 06:07:08.689211 4594 generic.go:334] "Generic (PLEG): container finished" podID="83de9c5a-c56c-4433-9a32-48972cdd1b46" containerID="691bbe657bd1a7a2f0b251ce7c413c6c8dbf6465ca251c9a6291076565ed44b8" exitCode=0 Nov 29 06:07:08 crc kubenswrapper[4594]: I1129 06:07:08.689392 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"83de9c5a-c56c-4433-9a32-48972cdd1b46","Type":"ContainerDied","Data":"691bbe657bd1a7a2f0b251ce7c413c6c8dbf6465ca251c9a6291076565ed44b8"} Nov 29 06:07:09 crc kubenswrapper[4594]: I1129 06:07:09.714395 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"83de9c5a-c56c-4433-9a32-48972cdd1b46","Type":"ContainerStarted","Data":"0c67c4aaafd4f37f7ab68cedb043cbe75402a3f59cf6116b8d4bb46be163987e"} Nov 29 
06:07:12 crc kubenswrapper[4594]: I1129 06:07:12.753381 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"83de9c5a-c56c-4433-9a32-48972cdd1b46","Type":"ContainerStarted","Data":"2a4819f88d0f38d6761ec18e576573eac1b7395b2269b53506e3479434f326ee"} Nov 29 06:07:12 crc kubenswrapper[4594]: I1129 06:07:12.753993 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"83de9c5a-c56c-4433-9a32-48972cdd1b46","Type":"ContainerStarted","Data":"13de321a4427dc2df47851f06bfbfc934614bf0206c20958a35e38b9b08aac89"} Nov 29 06:07:12 crc kubenswrapper[4594]: I1129 06:07:12.786005 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=14.785987288 podStartE2EDuration="14.785987288s" podCreationTimestamp="2025-11-29 06:06:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 06:07:12.775790761 +0000 UTC m=+2357.016299980" watchObservedRunningTime="2025-11-29 06:07:12.785987288 +0000 UTC m=+2357.026496508" Nov 29 06:07:14 crc kubenswrapper[4594]: I1129 06:07:14.245863 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Nov 29 06:07:14 crc kubenswrapper[4594]: I1129 06:07:14.245923 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Nov 29 06:07:14 crc kubenswrapper[4594]: I1129 06:07:14.251903 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Nov 29 06:07:14 crc kubenswrapper[4594]: I1129 06:07:14.785283 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Nov 29 06:07:15 crc kubenswrapper[4594]: I1129 06:07:15.800846 4594 
patch_prober.go:28] interesting pod/machine-config-daemon-ggz4n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 06:07:15 crc kubenswrapper[4594]: I1129 06:07:15.800909 4594 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 06:07:44 crc kubenswrapper[4594]: I1129 06:07:44.449606 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Nov 29 06:07:44 crc kubenswrapper[4594]: I1129 06:07:44.451593 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Nov 29 06:07:44 crc kubenswrapper[4594]: I1129 06:07:44.453712 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Nov 29 06:07:44 crc kubenswrapper[4594]: I1129 06:07:44.454184 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Nov 29 06:07:44 crc kubenswrapper[4594]: I1129 06:07:44.454387 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-lm66r" Nov 29 06:07:44 crc kubenswrapper[4594]: I1129 06:07:44.457416 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Nov 29 06:07:44 crc kubenswrapper[4594]: I1129 06:07:44.462847 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Nov 29 06:07:44 crc kubenswrapper[4594]: I1129 06:07:44.584182 4594 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad\") " pod="openstack/tempest-tests-tempest" Nov 29 06:07:44 crc kubenswrapper[4594]: I1129 06:07:44.584245 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad\") " pod="openstack/tempest-tests-tempest" Nov 29 06:07:44 crc kubenswrapper[4594]: I1129 06:07:44.584329 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjmdh\" (UniqueName: \"kubernetes.io/projected/ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad-kube-api-access-zjmdh\") pod \"tempest-tests-tempest\" (UID: \"ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad\") " pod="openstack/tempest-tests-tempest" Nov 29 06:07:44 crc kubenswrapper[4594]: I1129 06:07:44.584377 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad-config-data\") pod \"tempest-tests-tempest\" (UID: \"ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad\") " pod="openstack/tempest-tests-tempest" Nov 29 06:07:44 crc kubenswrapper[4594]: I1129 06:07:44.584401 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad\") " pod="openstack/tempest-tests-tempest" Nov 29 06:07:44 crc kubenswrapper[4594]: I1129 
06:07:44.584434 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad\") " pod="openstack/tempest-tests-tempest" Nov 29 06:07:44 crc kubenswrapper[4594]: I1129 06:07:44.584452 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad\") " pod="openstack/tempest-tests-tempest" Nov 29 06:07:44 crc kubenswrapper[4594]: I1129 06:07:44.584610 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad\") " pod="openstack/tempest-tests-tempest" Nov 29 06:07:44 crc kubenswrapper[4594]: I1129 06:07:44.584720 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"tempest-tests-tempest\" (UID: \"ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad\") " pod="openstack/tempest-tests-tempest" Nov 29 06:07:44 crc kubenswrapper[4594]: I1129 06:07:44.687863 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"tempest-tests-tempest\" (UID: \"ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad\") " pod="openstack/tempest-tests-tempest" Nov 29 06:07:44 crc kubenswrapper[4594]: I1129 06:07:44.688098 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad\") " pod="openstack/tempest-tests-tempest" Nov 29 06:07:44 crc kubenswrapper[4594]: I1129 06:07:44.688227 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad\") " pod="openstack/tempest-tests-tempest" Nov 29 06:07:44 crc kubenswrapper[4594]: I1129 06:07:44.688421 4594 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"tempest-tests-tempest\" (UID: \"ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/tempest-tests-tempest" Nov 29 06:07:44 crc kubenswrapper[4594]: I1129 06:07:44.688802 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad\") " pod="openstack/tempest-tests-tempest" Nov 29 06:07:44 crc kubenswrapper[4594]: I1129 06:07:44.689590 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjmdh\" (UniqueName: \"kubernetes.io/projected/ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad-kube-api-access-zjmdh\") pod \"tempest-tests-tempest\" (UID: \"ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad\") " pod="openstack/tempest-tests-tempest" Nov 29 06:07:44 crc kubenswrapper[4594]: I1129 06:07:44.689637 4594 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad-config-data\") pod \"tempest-tests-tempest\" (UID: \"ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad\") " pod="openstack/tempest-tests-tempest" Nov 29 06:07:44 crc kubenswrapper[4594]: I1129 06:07:44.689664 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad\") " pod="openstack/tempest-tests-tempest" Nov 29 06:07:44 crc kubenswrapper[4594]: I1129 06:07:44.689698 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad\") " pod="openstack/tempest-tests-tempest" Nov 29 06:07:44 crc kubenswrapper[4594]: I1129 06:07:44.689724 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad\") " pod="openstack/tempest-tests-tempest" Nov 29 06:07:44 crc kubenswrapper[4594]: I1129 06:07:44.689819 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad\") " pod="openstack/tempest-tests-tempest" Nov 29 06:07:44 crc kubenswrapper[4594]: I1129 06:07:44.690481 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: 
\"kubernetes.io/empty-dir/ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad\") " pod="openstack/tempest-tests-tempest" Nov 29 06:07:44 crc kubenswrapper[4594]: I1129 06:07:44.691487 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad\") " pod="openstack/tempest-tests-tempest" Nov 29 06:07:44 crc kubenswrapper[4594]: I1129 06:07:44.691645 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad-config-data\") pod \"tempest-tests-tempest\" (UID: \"ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad\") " pod="openstack/tempest-tests-tempest" Nov 29 06:07:44 crc kubenswrapper[4594]: I1129 06:07:44.697763 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad\") " pod="openstack/tempest-tests-tempest" Nov 29 06:07:44 crc kubenswrapper[4594]: I1129 06:07:44.698832 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad\") " pod="openstack/tempest-tests-tempest" Nov 29 06:07:44 crc kubenswrapper[4594]: I1129 06:07:44.699191 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad\") " 
pod="openstack/tempest-tests-tempest" Nov 29 06:07:44 crc kubenswrapper[4594]: I1129 06:07:44.705826 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjmdh\" (UniqueName: \"kubernetes.io/projected/ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad-kube-api-access-zjmdh\") pod \"tempest-tests-tempest\" (UID: \"ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad\") " pod="openstack/tempest-tests-tempest" Nov 29 06:07:44 crc kubenswrapper[4594]: I1129 06:07:44.717102 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"tempest-tests-tempest\" (UID: \"ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad\") " pod="openstack/tempest-tests-tempest" Nov 29 06:07:44 crc kubenswrapper[4594]: I1129 06:07:44.771867 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Nov 29 06:07:45 crc kubenswrapper[4594]: I1129 06:07:45.196603 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Nov 29 06:07:45 crc kubenswrapper[4594]: I1129 06:07:45.800674 4594 patch_prober.go:28] interesting pod/machine-config-daemon-ggz4n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 06:07:45 crc kubenswrapper[4594]: I1129 06:07:45.801744 4594 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 06:07:46 crc kubenswrapper[4594]: I1129 06:07:46.110682 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" 
event={"ID":"ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad","Type":"ContainerStarted","Data":"0618685a982b09ad35417b449a632512cfac05da0caf927554bcc12abf3ab382"} Nov 29 06:07:59 crc kubenswrapper[4594]: I1129 06:07:59.437784 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Nov 29 06:08:00 crc kubenswrapper[4594]: I1129 06:08:00.283601 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad","Type":"ContainerStarted","Data":"79da2dc66d89449b2bbdafcb72784e37c6308918ff21277920f29722555a1081"} Nov 29 06:08:00 crc kubenswrapper[4594]: I1129 06:08:00.304036 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=3.075828011 podStartE2EDuration="17.304024086s" podCreationTimestamp="2025-11-29 06:07:43 +0000 UTC" firstStartedPulling="2025-11-29 06:07:45.206523758 +0000 UTC m=+2389.447032978" lastFinishedPulling="2025-11-29 06:07:59.434719833 +0000 UTC m=+2403.675229053" observedRunningTime="2025-11-29 06:08:00.299462447 +0000 UTC m=+2404.539971667" watchObservedRunningTime="2025-11-29 06:08:00.304024086 +0000 UTC m=+2404.544533305" Nov 29 06:08:15 crc kubenswrapper[4594]: I1129 06:08:15.800335 4594 patch_prober.go:28] interesting pod/machine-config-daemon-ggz4n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 06:08:15 crc kubenswrapper[4594]: I1129 06:08:15.801073 4594 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Nov 29 06:08:15 crc kubenswrapper[4594]: I1129 06:08:15.801147 4594 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" Nov 29 06:08:15 crc kubenswrapper[4594]: I1129 06:08:15.801955 4594 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"546461d6f586f0461ba676d35ee2bc142491a0e28d0e8ba8d66c6c317b14d8f9"} pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 29 06:08:15 crc kubenswrapper[4594]: I1129 06:08:15.802017 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" containerName="machine-config-daemon" containerID="cri-o://546461d6f586f0461ba676d35ee2bc142491a0e28d0e8ba8d66c6c317b14d8f9" gracePeriod=600 Nov 29 06:08:15 crc kubenswrapper[4594]: E1129 06:08:15.936209 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:08:16 crc kubenswrapper[4594]: I1129 06:08:16.461709 4594 generic.go:334] "Generic (PLEG): container finished" podID="509844d4-3ee1-4059-84bb-6e90200f50c5" containerID="546461d6f586f0461ba676d35ee2bc142491a0e28d0e8ba8d66c6c317b14d8f9" exitCode=0 Nov 29 06:08:16 crc kubenswrapper[4594]: I1129 06:08:16.461914 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" 
event={"ID":"509844d4-3ee1-4059-84bb-6e90200f50c5","Type":"ContainerDied","Data":"546461d6f586f0461ba676d35ee2bc142491a0e28d0e8ba8d66c6c317b14d8f9"} Nov 29 06:08:16 crc kubenswrapper[4594]: I1129 06:08:16.462094 4594 scope.go:117] "RemoveContainer" containerID="9e18bd53322063d829249c7999cf4c97247a11382fee69b79893aa2fbd67ae57" Nov 29 06:08:16 crc kubenswrapper[4594]: I1129 06:08:16.463112 4594 scope.go:117] "RemoveContainer" containerID="546461d6f586f0461ba676d35ee2bc142491a0e28d0e8ba8d66c6c317b14d8f9" Nov 29 06:08:16 crc kubenswrapper[4594]: E1129 06:08:16.463618 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:08:28 crc kubenswrapper[4594]: I1129 06:08:28.084433 4594 scope.go:117] "RemoveContainer" containerID="546461d6f586f0461ba676d35ee2bc142491a0e28d0e8ba8d66c6c317b14d8f9" Nov 29 06:08:28 crc kubenswrapper[4594]: E1129 06:08:28.085444 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:08:40 crc kubenswrapper[4594]: I1129 06:08:40.084950 4594 scope.go:117] "RemoveContainer" containerID="546461d6f586f0461ba676d35ee2bc142491a0e28d0e8ba8d66c6c317b14d8f9" Nov 29 06:08:40 crc kubenswrapper[4594]: E1129 06:08:40.086481 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:08:53 crc kubenswrapper[4594]: I1129 06:08:53.084235 4594 scope.go:117] "RemoveContainer" containerID="546461d6f586f0461ba676d35ee2bc142491a0e28d0e8ba8d66c6c317b14d8f9" Nov 29 06:08:53 crc kubenswrapper[4594]: E1129 06:08:53.085701 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:09:04 crc kubenswrapper[4594]: I1129 06:09:04.085014 4594 scope.go:117] "RemoveContainer" containerID="546461d6f586f0461ba676d35ee2bc142491a0e28d0e8ba8d66c6c317b14d8f9" Nov 29 06:09:04 crc kubenswrapper[4594]: E1129 06:09:04.087145 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:09:16 crc kubenswrapper[4594]: I1129 06:09:16.091763 4594 scope.go:117] "RemoveContainer" containerID="546461d6f586f0461ba676d35ee2bc142491a0e28d0e8ba8d66c6c317b14d8f9" Nov 29 06:09:16 crc kubenswrapper[4594]: E1129 06:09:16.092887 4594 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:09:29 crc kubenswrapper[4594]: I1129 06:09:29.083276 4594 scope.go:117] "RemoveContainer" containerID="546461d6f586f0461ba676d35ee2bc142491a0e28d0e8ba8d66c6c317b14d8f9" Nov 29 06:09:29 crc kubenswrapper[4594]: E1129 06:09:29.084119 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:09:42 crc kubenswrapper[4594]: I1129 06:09:42.083945 4594 scope.go:117] "RemoveContainer" containerID="546461d6f586f0461ba676d35ee2bc142491a0e28d0e8ba8d66c6c317b14d8f9" Nov 29 06:09:42 crc kubenswrapper[4594]: E1129 06:09:42.085114 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:09:55 crc kubenswrapper[4594]: I1129 06:09:55.082970 4594 scope.go:117] "RemoveContainer" containerID="546461d6f586f0461ba676d35ee2bc142491a0e28d0e8ba8d66c6c317b14d8f9" Nov 29 06:09:55 crc kubenswrapper[4594]: E1129 06:09:55.083673 4594 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:10:10 crc kubenswrapper[4594]: I1129 06:10:10.084518 4594 scope.go:117] "RemoveContainer" containerID="546461d6f586f0461ba676d35ee2bc142491a0e28d0e8ba8d66c6c317b14d8f9" Nov 29 06:10:10 crc kubenswrapper[4594]: E1129 06:10:10.085948 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:10:25 crc kubenswrapper[4594]: I1129 06:10:25.084204 4594 scope.go:117] "RemoveContainer" containerID="546461d6f586f0461ba676d35ee2bc142491a0e28d0e8ba8d66c6c317b14d8f9" Nov 29 06:10:25 crc kubenswrapper[4594]: E1129 06:10:25.085162 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:10:38 crc kubenswrapper[4594]: I1129 06:10:38.083496 4594 scope.go:117] "RemoveContainer" containerID="546461d6f586f0461ba676d35ee2bc142491a0e28d0e8ba8d66c6c317b14d8f9" Nov 29 06:10:38 crc kubenswrapper[4594]: E1129 06:10:38.084276 4594 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:10:53 crc kubenswrapper[4594]: I1129 06:10:53.083905 4594 scope.go:117] "RemoveContainer" containerID="546461d6f586f0461ba676d35ee2bc142491a0e28d0e8ba8d66c6c317b14d8f9" Nov 29 06:10:53 crc kubenswrapper[4594]: E1129 06:10:53.084770 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:11:05 crc kubenswrapper[4594]: I1129 06:11:05.084396 4594 scope.go:117] "RemoveContainer" containerID="546461d6f586f0461ba676d35ee2bc142491a0e28d0e8ba8d66c6c317b14d8f9" Nov 29 06:11:05 crc kubenswrapper[4594]: E1129 06:11:05.085722 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:11:17 crc kubenswrapper[4594]: I1129 06:11:17.085951 4594 scope.go:117] "RemoveContainer" containerID="546461d6f586f0461ba676d35ee2bc142491a0e28d0e8ba8d66c6c317b14d8f9" Nov 29 06:11:17 crc kubenswrapper[4594]: E1129 
06:11:17.086834 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:11:31 crc kubenswrapper[4594]: I1129 06:11:31.083305 4594 scope.go:117] "RemoveContainer" containerID="546461d6f586f0461ba676d35ee2bc142491a0e28d0e8ba8d66c6c317b14d8f9" Nov 29 06:11:31 crc kubenswrapper[4594]: E1129 06:11:31.084377 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:11:42 crc kubenswrapper[4594]: I1129 06:11:42.083824 4594 scope.go:117] "RemoveContainer" containerID="546461d6f586f0461ba676d35ee2bc142491a0e28d0e8ba8d66c6c317b14d8f9" Nov 29 06:11:42 crc kubenswrapper[4594]: E1129 06:11:42.084737 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:11:48 crc kubenswrapper[4594]: I1129 06:11:48.815752 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-b4x4z"] Nov 29 06:11:48 crc kubenswrapper[4594]: 
I1129 06:11:48.818223 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b4x4z" Nov 29 06:11:48 crc kubenswrapper[4594]: I1129 06:11:48.830836 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-b4x4z"] Nov 29 06:11:48 crc kubenswrapper[4594]: I1129 06:11:48.885973 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjd9v\" (UniqueName: \"kubernetes.io/projected/23986239-352d-4716-9c73-5647e64ac645-kube-api-access-fjd9v\") pod \"redhat-marketplace-b4x4z\" (UID: \"23986239-352d-4716-9c73-5647e64ac645\") " pod="openshift-marketplace/redhat-marketplace-b4x4z" Nov 29 06:11:48 crc kubenswrapper[4594]: I1129 06:11:48.886023 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23986239-352d-4716-9c73-5647e64ac645-utilities\") pod \"redhat-marketplace-b4x4z\" (UID: \"23986239-352d-4716-9c73-5647e64ac645\") " pod="openshift-marketplace/redhat-marketplace-b4x4z" Nov 29 06:11:48 crc kubenswrapper[4594]: I1129 06:11:48.886091 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23986239-352d-4716-9c73-5647e64ac645-catalog-content\") pod \"redhat-marketplace-b4x4z\" (UID: \"23986239-352d-4716-9c73-5647e64ac645\") " pod="openshift-marketplace/redhat-marketplace-b4x4z" Nov 29 06:11:48 crc kubenswrapper[4594]: I1129 06:11:48.988778 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjd9v\" (UniqueName: \"kubernetes.io/projected/23986239-352d-4716-9c73-5647e64ac645-kube-api-access-fjd9v\") pod \"redhat-marketplace-b4x4z\" (UID: \"23986239-352d-4716-9c73-5647e64ac645\") " pod="openshift-marketplace/redhat-marketplace-b4x4z" Nov 29 06:11:48 crc 
kubenswrapper[4594]: I1129 06:11:48.988857 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23986239-352d-4716-9c73-5647e64ac645-utilities\") pod \"redhat-marketplace-b4x4z\" (UID: \"23986239-352d-4716-9c73-5647e64ac645\") " pod="openshift-marketplace/redhat-marketplace-b4x4z" Nov 29 06:11:48 crc kubenswrapper[4594]: I1129 06:11:48.988910 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23986239-352d-4716-9c73-5647e64ac645-catalog-content\") pod \"redhat-marketplace-b4x4z\" (UID: \"23986239-352d-4716-9c73-5647e64ac645\") " pod="openshift-marketplace/redhat-marketplace-b4x4z" Nov 29 06:11:48 crc kubenswrapper[4594]: I1129 06:11:48.989486 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23986239-352d-4716-9c73-5647e64ac645-utilities\") pod \"redhat-marketplace-b4x4z\" (UID: \"23986239-352d-4716-9c73-5647e64ac645\") " pod="openshift-marketplace/redhat-marketplace-b4x4z" Nov 29 06:11:48 crc kubenswrapper[4594]: I1129 06:11:48.989533 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23986239-352d-4716-9c73-5647e64ac645-catalog-content\") pod \"redhat-marketplace-b4x4z\" (UID: \"23986239-352d-4716-9c73-5647e64ac645\") " pod="openshift-marketplace/redhat-marketplace-b4x4z" Nov 29 06:11:49 crc kubenswrapper[4594]: I1129 06:11:49.009105 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjd9v\" (UniqueName: \"kubernetes.io/projected/23986239-352d-4716-9c73-5647e64ac645-kube-api-access-fjd9v\") pod \"redhat-marketplace-b4x4z\" (UID: \"23986239-352d-4716-9c73-5647e64ac645\") " pod="openshift-marketplace/redhat-marketplace-b4x4z" Nov 29 06:11:49 crc kubenswrapper[4594]: I1129 06:11:49.142381 4594 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b4x4z" Nov 29 06:11:49 crc kubenswrapper[4594]: I1129 06:11:49.584161 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-b4x4z"] Nov 29 06:11:50 crc kubenswrapper[4594]: I1129 06:11:50.504176 4594 generic.go:334] "Generic (PLEG): container finished" podID="23986239-352d-4716-9c73-5647e64ac645" containerID="74c4d669abc0ba68262d9f0cabf9bbb55ea46b15000c064152075e942d121d87" exitCode=0 Nov 29 06:11:50 crc kubenswrapper[4594]: I1129 06:11:50.504333 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b4x4z" event={"ID":"23986239-352d-4716-9c73-5647e64ac645","Type":"ContainerDied","Data":"74c4d669abc0ba68262d9f0cabf9bbb55ea46b15000c064152075e942d121d87"} Nov 29 06:11:50 crc kubenswrapper[4594]: I1129 06:11:50.504382 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b4x4z" event={"ID":"23986239-352d-4716-9c73-5647e64ac645","Type":"ContainerStarted","Data":"d74e119fa4292cd908497f65c2009b51f0310966f6f008a948059f9df90b4e1c"} Nov 29 06:11:50 crc kubenswrapper[4594]: I1129 06:11:50.507555 4594 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 29 06:11:51 crc kubenswrapper[4594]: I1129 06:11:51.532439 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b4x4z" event={"ID":"23986239-352d-4716-9c73-5647e64ac645","Type":"ContainerStarted","Data":"2881edae052e80fdfa1b2ca397024f1acbccdb6509a0f0f05f20f16bc6a7f3de"} Nov 29 06:11:52 crc kubenswrapper[4594]: I1129 06:11:52.543610 4594 generic.go:334] "Generic (PLEG): container finished" podID="23986239-352d-4716-9c73-5647e64ac645" containerID="2881edae052e80fdfa1b2ca397024f1acbccdb6509a0f0f05f20f16bc6a7f3de" exitCode=0 Nov 29 06:11:52 crc kubenswrapper[4594]: I1129 06:11:52.543757 4594 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b4x4z" event={"ID":"23986239-352d-4716-9c73-5647e64ac645","Type":"ContainerDied","Data":"2881edae052e80fdfa1b2ca397024f1acbccdb6509a0f0f05f20f16bc6a7f3de"} Nov 29 06:11:53 crc kubenswrapper[4594]: I1129 06:11:53.557507 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b4x4z" event={"ID":"23986239-352d-4716-9c73-5647e64ac645","Type":"ContainerStarted","Data":"ae7019ac13500e4a3711ca46783c964a577766aa391fa838b1a6a87849d2cc45"} Nov 29 06:11:53 crc kubenswrapper[4594]: I1129 06:11:53.581785 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-b4x4z" podStartSLOduration=3.038404244 podStartE2EDuration="5.581763014s" podCreationTimestamp="2025-11-29 06:11:48 +0000 UTC" firstStartedPulling="2025-11-29 06:11:50.507332965 +0000 UTC m=+2634.747842185" lastFinishedPulling="2025-11-29 06:11:53.050691734 +0000 UTC m=+2637.291200955" observedRunningTime="2025-11-29 06:11:53.57383713 +0000 UTC m=+2637.814346351" watchObservedRunningTime="2025-11-29 06:11:53.581763014 +0000 UTC m=+2637.822272234" Nov 29 06:11:57 crc kubenswrapper[4594]: I1129 06:11:57.084675 4594 scope.go:117] "RemoveContainer" containerID="546461d6f586f0461ba676d35ee2bc142491a0e28d0e8ba8d66c6c317b14d8f9" Nov 29 06:11:57 crc kubenswrapper[4594]: E1129 06:11:57.085669 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:11:59 crc kubenswrapper[4594]: I1129 06:11:59.143325 4594 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-b4x4z" Nov 29 06:11:59 crc kubenswrapper[4594]: I1129 06:11:59.144227 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-b4x4z" Nov 29 06:11:59 crc kubenswrapper[4594]: I1129 06:11:59.187248 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-b4x4z" Nov 29 06:11:59 crc kubenswrapper[4594]: I1129 06:11:59.654805 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-b4x4z" Nov 29 06:11:59 crc kubenswrapper[4594]: I1129 06:11:59.713795 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-b4x4z"] Nov 29 06:12:01 crc kubenswrapper[4594]: I1129 06:12:01.634519 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-b4x4z" podUID="23986239-352d-4716-9c73-5647e64ac645" containerName="registry-server" containerID="cri-o://ae7019ac13500e4a3711ca46783c964a577766aa391fa838b1a6a87849d2cc45" gracePeriod=2 Nov 29 06:12:02 crc kubenswrapper[4594]: I1129 06:12:02.077643 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b4x4z" Nov 29 06:12:02 crc kubenswrapper[4594]: I1129 06:12:02.200461 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23986239-352d-4716-9c73-5647e64ac645-utilities\") pod \"23986239-352d-4716-9c73-5647e64ac645\" (UID: \"23986239-352d-4716-9c73-5647e64ac645\") " Nov 29 06:12:02 crc kubenswrapper[4594]: I1129 06:12:02.200619 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjd9v\" (UniqueName: \"kubernetes.io/projected/23986239-352d-4716-9c73-5647e64ac645-kube-api-access-fjd9v\") pod \"23986239-352d-4716-9c73-5647e64ac645\" (UID: \"23986239-352d-4716-9c73-5647e64ac645\") " Nov 29 06:12:02 crc kubenswrapper[4594]: I1129 06:12:02.200666 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23986239-352d-4716-9c73-5647e64ac645-catalog-content\") pod \"23986239-352d-4716-9c73-5647e64ac645\" (UID: \"23986239-352d-4716-9c73-5647e64ac645\") " Nov 29 06:12:02 crc kubenswrapper[4594]: I1129 06:12:02.201300 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23986239-352d-4716-9c73-5647e64ac645-utilities" (OuterVolumeSpecName: "utilities") pod "23986239-352d-4716-9c73-5647e64ac645" (UID: "23986239-352d-4716-9c73-5647e64ac645"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 06:12:02 crc kubenswrapper[4594]: I1129 06:12:02.201564 4594 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23986239-352d-4716-9c73-5647e64ac645-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 06:12:02 crc kubenswrapper[4594]: I1129 06:12:02.209559 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23986239-352d-4716-9c73-5647e64ac645-kube-api-access-fjd9v" (OuterVolumeSpecName: "kube-api-access-fjd9v") pod "23986239-352d-4716-9c73-5647e64ac645" (UID: "23986239-352d-4716-9c73-5647e64ac645"). InnerVolumeSpecName "kube-api-access-fjd9v". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 06:12:02 crc kubenswrapper[4594]: I1129 06:12:02.216234 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23986239-352d-4716-9c73-5647e64ac645-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "23986239-352d-4716-9c73-5647e64ac645" (UID: "23986239-352d-4716-9c73-5647e64ac645"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 06:12:02 crc kubenswrapper[4594]: I1129 06:12:02.302853 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fjd9v\" (UniqueName: \"kubernetes.io/projected/23986239-352d-4716-9c73-5647e64ac645-kube-api-access-fjd9v\") on node \"crc\" DevicePath \"\"" Nov 29 06:12:02 crc kubenswrapper[4594]: I1129 06:12:02.302895 4594 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23986239-352d-4716-9c73-5647e64ac645-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 06:12:02 crc kubenswrapper[4594]: I1129 06:12:02.646625 4594 generic.go:334] "Generic (PLEG): container finished" podID="23986239-352d-4716-9c73-5647e64ac645" containerID="ae7019ac13500e4a3711ca46783c964a577766aa391fa838b1a6a87849d2cc45" exitCode=0 Nov 29 06:12:02 crc kubenswrapper[4594]: I1129 06:12:02.646681 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b4x4z" Nov 29 06:12:02 crc kubenswrapper[4594]: I1129 06:12:02.646729 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b4x4z" event={"ID":"23986239-352d-4716-9c73-5647e64ac645","Type":"ContainerDied","Data":"ae7019ac13500e4a3711ca46783c964a577766aa391fa838b1a6a87849d2cc45"} Nov 29 06:12:02 crc kubenswrapper[4594]: I1129 06:12:02.647168 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b4x4z" event={"ID":"23986239-352d-4716-9c73-5647e64ac645","Type":"ContainerDied","Data":"d74e119fa4292cd908497f65c2009b51f0310966f6f008a948059f9df90b4e1c"} Nov 29 06:12:02 crc kubenswrapper[4594]: I1129 06:12:02.647193 4594 scope.go:117] "RemoveContainer" containerID="ae7019ac13500e4a3711ca46783c964a577766aa391fa838b1a6a87849d2cc45" Nov 29 06:12:02 crc kubenswrapper[4594]: I1129 06:12:02.671232 4594 scope.go:117] "RemoveContainer" 
containerID="2881edae052e80fdfa1b2ca397024f1acbccdb6509a0f0f05f20f16bc6a7f3de" Nov 29 06:12:02 crc kubenswrapper[4594]: I1129 06:12:02.681805 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-b4x4z"] Nov 29 06:12:02 crc kubenswrapper[4594]: I1129 06:12:02.692578 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-b4x4z"] Nov 29 06:12:02 crc kubenswrapper[4594]: I1129 06:12:02.702310 4594 scope.go:117] "RemoveContainer" containerID="74c4d669abc0ba68262d9f0cabf9bbb55ea46b15000c064152075e942d121d87" Nov 29 06:12:02 crc kubenswrapper[4594]: I1129 06:12:02.732932 4594 scope.go:117] "RemoveContainer" containerID="ae7019ac13500e4a3711ca46783c964a577766aa391fa838b1a6a87849d2cc45" Nov 29 06:12:02 crc kubenswrapper[4594]: E1129 06:12:02.733303 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae7019ac13500e4a3711ca46783c964a577766aa391fa838b1a6a87849d2cc45\": container with ID starting with ae7019ac13500e4a3711ca46783c964a577766aa391fa838b1a6a87849d2cc45 not found: ID does not exist" containerID="ae7019ac13500e4a3711ca46783c964a577766aa391fa838b1a6a87849d2cc45" Nov 29 06:12:02 crc kubenswrapper[4594]: I1129 06:12:02.733353 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae7019ac13500e4a3711ca46783c964a577766aa391fa838b1a6a87849d2cc45"} err="failed to get container status \"ae7019ac13500e4a3711ca46783c964a577766aa391fa838b1a6a87849d2cc45\": rpc error: code = NotFound desc = could not find container \"ae7019ac13500e4a3711ca46783c964a577766aa391fa838b1a6a87849d2cc45\": container with ID starting with ae7019ac13500e4a3711ca46783c964a577766aa391fa838b1a6a87849d2cc45 not found: ID does not exist" Nov 29 06:12:02 crc kubenswrapper[4594]: I1129 06:12:02.733388 4594 scope.go:117] "RemoveContainer" 
containerID="2881edae052e80fdfa1b2ca397024f1acbccdb6509a0f0f05f20f16bc6a7f3de" Nov 29 06:12:02 crc kubenswrapper[4594]: E1129 06:12:02.733691 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2881edae052e80fdfa1b2ca397024f1acbccdb6509a0f0f05f20f16bc6a7f3de\": container with ID starting with 2881edae052e80fdfa1b2ca397024f1acbccdb6509a0f0f05f20f16bc6a7f3de not found: ID does not exist" containerID="2881edae052e80fdfa1b2ca397024f1acbccdb6509a0f0f05f20f16bc6a7f3de" Nov 29 06:12:02 crc kubenswrapper[4594]: I1129 06:12:02.733733 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2881edae052e80fdfa1b2ca397024f1acbccdb6509a0f0f05f20f16bc6a7f3de"} err="failed to get container status \"2881edae052e80fdfa1b2ca397024f1acbccdb6509a0f0f05f20f16bc6a7f3de\": rpc error: code = NotFound desc = could not find container \"2881edae052e80fdfa1b2ca397024f1acbccdb6509a0f0f05f20f16bc6a7f3de\": container with ID starting with 2881edae052e80fdfa1b2ca397024f1acbccdb6509a0f0f05f20f16bc6a7f3de not found: ID does not exist" Nov 29 06:12:02 crc kubenswrapper[4594]: I1129 06:12:02.733758 4594 scope.go:117] "RemoveContainer" containerID="74c4d669abc0ba68262d9f0cabf9bbb55ea46b15000c064152075e942d121d87" Nov 29 06:12:02 crc kubenswrapper[4594]: E1129 06:12:02.734019 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74c4d669abc0ba68262d9f0cabf9bbb55ea46b15000c064152075e942d121d87\": container with ID starting with 74c4d669abc0ba68262d9f0cabf9bbb55ea46b15000c064152075e942d121d87 not found: ID does not exist" containerID="74c4d669abc0ba68262d9f0cabf9bbb55ea46b15000c064152075e942d121d87" Nov 29 06:12:02 crc kubenswrapper[4594]: I1129 06:12:02.734045 4594 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"74c4d669abc0ba68262d9f0cabf9bbb55ea46b15000c064152075e942d121d87"} err="failed to get container status \"74c4d669abc0ba68262d9f0cabf9bbb55ea46b15000c064152075e942d121d87\": rpc error: code = NotFound desc = could not find container \"74c4d669abc0ba68262d9f0cabf9bbb55ea46b15000c064152075e942d121d87\": container with ID starting with 74c4d669abc0ba68262d9f0cabf9bbb55ea46b15000c064152075e942d121d87 not found: ID does not exist" Nov 29 06:12:04 crc kubenswrapper[4594]: I1129 06:12:04.094331 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23986239-352d-4716-9c73-5647e64ac645" path="/var/lib/kubelet/pods/23986239-352d-4716-9c73-5647e64ac645/volumes" Nov 29 06:12:10 crc kubenswrapper[4594]: I1129 06:12:10.083068 4594 scope.go:117] "RemoveContainer" containerID="546461d6f586f0461ba676d35ee2bc142491a0e28d0e8ba8d66c6c317b14d8f9" Nov 29 06:12:10 crc kubenswrapper[4594]: E1129 06:12:10.083809 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:12:25 crc kubenswrapper[4594]: I1129 06:12:25.082960 4594 scope.go:117] "RemoveContainer" containerID="546461d6f586f0461ba676d35ee2bc142491a0e28d0e8ba8d66c6c317b14d8f9" Nov 29 06:12:25 crc kubenswrapper[4594]: E1129 06:12:25.083787 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:12:40 crc kubenswrapper[4594]: I1129 06:12:40.083866 4594 scope.go:117] "RemoveContainer" containerID="546461d6f586f0461ba676d35ee2bc142491a0e28d0e8ba8d66c6c317b14d8f9" Nov 29 06:12:40 crc kubenswrapper[4594]: E1129 06:12:40.085437 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:12:53 crc kubenswrapper[4594]: I1129 06:12:53.083796 4594 scope.go:117] "RemoveContainer" containerID="546461d6f586f0461ba676d35ee2bc142491a0e28d0e8ba8d66c6c317b14d8f9" Nov 29 06:12:53 crc kubenswrapper[4594]: E1129 06:12:53.084674 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:13:07 crc kubenswrapper[4594]: I1129 06:13:07.083462 4594 scope.go:117] "RemoveContainer" containerID="546461d6f586f0461ba676d35ee2bc142491a0e28d0e8ba8d66c6c317b14d8f9" Nov 29 06:13:07 crc kubenswrapper[4594]: E1129 06:13:07.084379 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:13:19 crc kubenswrapper[4594]: I1129 06:13:19.083551 4594 scope.go:117] "RemoveContainer" containerID="546461d6f586f0461ba676d35ee2bc142491a0e28d0e8ba8d66c6c317b14d8f9" Nov 29 06:13:19 crc kubenswrapper[4594]: I1129 06:13:19.400217 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" event={"ID":"509844d4-3ee1-4059-84bb-6e90200f50c5","Type":"ContainerStarted","Data":"df364d319c5e658882315707e2d6b30a3b4e584bc43035131218f0fc9ff1f120"} Nov 29 06:14:30 crc kubenswrapper[4594]: I1129 06:14:30.885834 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-pbx4n"] Nov 29 06:14:30 crc kubenswrapper[4594]: E1129 06:14:30.887917 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23986239-352d-4716-9c73-5647e64ac645" containerName="extract-utilities" Nov 29 06:14:30 crc kubenswrapper[4594]: I1129 06:14:30.887937 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="23986239-352d-4716-9c73-5647e64ac645" containerName="extract-utilities" Nov 29 06:14:30 crc kubenswrapper[4594]: E1129 06:14:30.887955 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23986239-352d-4716-9c73-5647e64ac645" containerName="extract-content" Nov 29 06:14:30 crc kubenswrapper[4594]: I1129 06:14:30.887963 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="23986239-352d-4716-9c73-5647e64ac645" containerName="extract-content" Nov 29 06:14:30 crc kubenswrapper[4594]: E1129 06:14:30.887977 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23986239-352d-4716-9c73-5647e64ac645" containerName="registry-server" Nov 29 06:14:30 crc kubenswrapper[4594]: I1129 06:14:30.887982 4594 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="23986239-352d-4716-9c73-5647e64ac645" containerName="registry-server" Nov 29 06:14:30 crc kubenswrapper[4594]: I1129 06:14:30.888235 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="23986239-352d-4716-9c73-5647e64ac645" containerName="registry-server" Nov 29 06:14:30 crc kubenswrapper[4594]: I1129 06:14:30.889830 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pbx4n" Nov 29 06:14:30 crc kubenswrapper[4594]: I1129 06:14:30.896041 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7022d6b8-f6cf-4289-9857-20da0610d929-catalog-content\") pod \"community-operators-pbx4n\" (UID: \"7022d6b8-f6cf-4289-9857-20da0610d929\") " pod="openshift-marketplace/community-operators-pbx4n" Nov 29 06:14:30 crc kubenswrapper[4594]: I1129 06:14:30.896667 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7022d6b8-f6cf-4289-9857-20da0610d929-utilities\") pod \"community-operators-pbx4n\" (UID: \"7022d6b8-f6cf-4289-9857-20da0610d929\") " pod="openshift-marketplace/community-operators-pbx4n" Nov 29 06:14:30 crc kubenswrapper[4594]: I1129 06:14:30.896696 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78rmk\" (UniqueName: \"kubernetes.io/projected/7022d6b8-f6cf-4289-9857-20da0610d929-kube-api-access-78rmk\") pod \"community-operators-pbx4n\" (UID: \"7022d6b8-f6cf-4289-9857-20da0610d929\") " pod="openshift-marketplace/community-operators-pbx4n" Nov 29 06:14:30 crc kubenswrapper[4594]: I1129 06:14:30.913385 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pbx4n"] Nov 29 06:14:30 crc kubenswrapper[4594]: I1129 06:14:30.999430 4594 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7022d6b8-f6cf-4289-9857-20da0610d929-utilities\") pod \"community-operators-pbx4n\" (UID: \"7022d6b8-f6cf-4289-9857-20da0610d929\") " pod="openshift-marketplace/community-operators-pbx4n" Nov 29 06:14:30 crc kubenswrapper[4594]: I1129 06:14:30.999504 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78rmk\" (UniqueName: \"kubernetes.io/projected/7022d6b8-f6cf-4289-9857-20da0610d929-kube-api-access-78rmk\") pod \"community-operators-pbx4n\" (UID: \"7022d6b8-f6cf-4289-9857-20da0610d929\") " pod="openshift-marketplace/community-operators-pbx4n" Nov 29 06:14:30 crc kubenswrapper[4594]: I1129 06:14:30.999608 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7022d6b8-f6cf-4289-9857-20da0610d929-catalog-content\") pod \"community-operators-pbx4n\" (UID: \"7022d6b8-f6cf-4289-9857-20da0610d929\") " pod="openshift-marketplace/community-operators-pbx4n" Nov 29 06:14:31 crc kubenswrapper[4594]: I1129 06:14:31.000347 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7022d6b8-f6cf-4289-9857-20da0610d929-utilities\") pod \"community-operators-pbx4n\" (UID: \"7022d6b8-f6cf-4289-9857-20da0610d929\") " pod="openshift-marketplace/community-operators-pbx4n" Nov 29 06:14:31 crc kubenswrapper[4594]: I1129 06:14:31.000522 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7022d6b8-f6cf-4289-9857-20da0610d929-catalog-content\") pod \"community-operators-pbx4n\" (UID: \"7022d6b8-f6cf-4289-9857-20da0610d929\") " pod="openshift-marketplace/community-operators-pbx4n" Nov 29 06:14:31 crc kubenswrapper[4594]: I1129 06:14:31.021727 4594 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-78rmk\" (UniqueName: \"kubernetes.io/projected/7022d6b8-f6cf-4289-9857-20da0610d929-kube-api-access-78rmk\") pod \"community-operators-pbx4n\" (UID: \"7022d6b8-f6cf-4289-9857-20da0610d929\") " pod="openshift-marketplace/community-operators-pbx4n" Nov 29 06:14:31 crc kubenswrapper[4594]: I1129 06:14:31.224200 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pbx4n" Nov 29 06:14:31 crc kubenswrapper[4594]: I1129 06:14:31.714067 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pbx4n"] Nov 29 06:14:32 crc kubenswrapper[4594]: I1129 06:14:32.068937 4594 generic.go:334] "Generic (PLEG): container finished" podID="7022d6b8-f6cf-4289-9857-20da0610d929" containerID="100d8efcc4e065d9801c50e76bd8b131290ff2f408ba86deb437cf78ec400e95" exitCode=0 Nov 29 06:14:32 crc kubenswrapper[4594]: I1129 06:14:32.069061 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pbx4n" event={"ID":"7022d6b8-f6cf-4289-9857-20da0610d929","Type":"ContainerDied","Data":"100d8efcc4e065d9801c50e76bd8b131290ff2f408ba86deb437cf78ec400e95"} Nov 29 06:14:32 crc kubenswrapper[4594]: I1129 06:14:32.069392 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pbx4n" event={"ID":"7022d6b8-f6cf-4289-9857-20da0610d929","Type":"ContainerStarted","Data":"76ff9ee78f33c2df2f9d5e1f16d5475191bda1787565d45a79bfae03df716e54"} Nov 29 06:14:34 crc kubenswrapper[4594]: I1129 06:14:34.092488 4594 generic.go:334] "Generic (PLEG): container finished" podID="7022d6b8-f6cf-4289-9857-20da0610d929" containerID="0f22d335a61f33bef60a7041567db89fcc17bf76e24a47ac7a07f758ab662e82" exitCode=0 Nov 29 06:14:34 crc kubenswrapper[4594]: I1129 06:14:34.093565 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pbx4n" 
event={"ID":"7022d6b8-f6cf-4289-9857-20da0610d929","Type":"ContainerDied","Data":"0f22d335a61f33bef60a7041567db89fcc17bf76e24a47ac7a07f758ab662e82"} Nov 29 06:14:35 crc kubenswrapper[4594]: I1129 06:14:35.106378 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pbx4n" event={"ID":"7022d6b8-f6cf-4289-9857-20da0610d929","Type":"ContainerStarted","Data":"89e867fc5ab262df8de0174d760b683f92d60391e1bd5092ee40af1b6c0743e7"} Nov 29 06:14:35 crc kubenswrapper[4594]: I1129 06:14:35.126221 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-pbx4n" podStartSLOduration=2.616035886 podStartE2EDuration="5.126200946s" podCreationTimestamp="2025-11-29 06:14:30 +0000 UTC" firstStartedPulling="2025-11-29 06:14:32.074195635 +0000 UTC m=+2796.314704855" lastFinishedPulling="2025-11-29 06:14:34.584360695 +0000 UTC m=+2798.824869915" observedRunningTime="2025-11-29 06:14:35.124774444 +0000 UTC m=+2799.365283663" watchObservedRunningTime="2025-11-29 06:14:35.126200946 +0000 UTC m=+2799.366710166" Nov 29 06:14:41 crc kubenswrapper[4594]: I1129 06:14:41.224995 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-pbx4n" Nov 29 06:14:41 crc kubenswrapper[4594]: I1129 06:14:41.225499 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-pbx4n" Nov 29 06:14:41 crc kubenswrapper[4594]: I1129 06:14:41.264388 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-pbx4n" Nov 29 06:14:42 crc kubenswrapper[4594]: I1129 06:14:42.200361 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-pbx4n" Nov 29 06:14:42 crc kubenswrapper[4594]: I1129 06:14:42.250926 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-pbx4n"] Nov 29 06:14:44 crc kubenswrapper[4594]: I1129 06:14:44.177214 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-pbx4n" podUID="7022d6b8-f6cf-4289-9857-20da0610d929" containerName="registry-server" containerID="cri-o://89e867fc5ab262df8de0174d760b683f92d60391e1bd5092ee40af1b6c0743e7" gracePeriod=2 Nov 29 06:14:44 crc kubenswrapper[4594]: I1129 06:14:44.589941 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pbx4n" Nov 29 06:14:44 crc kubenswrapper[4594]: I1129 06:14:44.789170 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7022d6b8-f6cf-4289-9857-20da0610d929-catalog-content\") pod \"7022d6b8-f6cf-4289-9857-20da0610d929\" (UID: \"7022d6b8-f6cf-4289-9857-20da0610d929\") " Nov 29 06:14:44 crc kubenswrapper[4594]: I1129 06:14:44.789234 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7022d6b8-f6cf-4289-9857-20da0610d929-utilities\") pod \"7022d6b8-f6cf-4289-9857-20da0610d929\" (UID: \"7022d6b8-f6cf-4289-9857-20da0610d929\") " Nov 29 06:14:44 crc kubenswrapper[4594]: I1129 06:14:44.789349 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78rmk\" (UniqueName: \"kubernetes.io/projected/7022d6b8-f6cf-4289-9857-20da0610d929-kube-api-access-78rmk\") pod \"7022d6b8-f6cf-4289-9857-20da0610d929\" (UID: \"7022d6b8-f6cf-4289-9857-20da0610d929\") " Nov 29 06:14:44 crc kubenswrapper[4594]: I1129 06:14:44.790045 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7022d6b8-f6cf-4289-9857-20da0610d929-utilities" (OuterVolumeSpecName: "utilities") pod "7022d6b8-f6cf-4289-9857-20da0610d929" (UID: 
"7022d6b8-f6cf-4289-9857-20da0610d929"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 06:14:44 crc kubenswrapper[4594]: I1129 06:14:44.795556 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7022d6b8-f6cf-4289-9857-20da0610d929-kube-api-access-78rmk" (OuterVolumeSpecName: "kube-api-access-78rmk") pod "7022d6b8-f6cf-4289-9857-20da0610d929" (UID: "7022d6b8-f6cf-4289-9857-20da0610d929"). InnerVolumeSpecName "kube-api-access-78rmk". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 06:14:44 crc kubenswrapper[4594]: I1129 06:14:44.832325 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7022d6b8-f6cf-4289-9857-20da0610d929-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7022d6b8-f6cf-4289-9857-20da0610d929" (UID: "7022d6b8-f6cf-4289-9857-20da0610d929"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 06:14:44 crc kubenswrapper[4594]: I1129 06:14:44.892867 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78rmk\" (UniqueName: \"kubernetes.io/projected/7022d6b8-f6cf-4289-9857-20da0610d929-kube-api-access-78rmk\") on node \"crc\" DevicePath \"\"" Nov 29 06:14:44 crc kubenswrapper[4594]: I1129 06:14:44.892897 4594 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7022d6b8-f6cf-4289-9857-20da0610d929-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 06:14:44 crc kubenswrapper[4594]: I1129 06:14:44.892907 4594 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7022d6b8-f6cf-4289-9857-20da0610d929-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 06:14:45 crc kubenswrapper[4594]: I1129 06:14:45.190998 4594 generic.go:334] "Generic (PLEG): container finished" 
podID="7022d6b8-f6cf-4289-9857-20da0610d929" containerID="89e867fc5ab262df8de0174d760b683f92d60391e1bd5092ee40af1b6c0743e7" exitCode=0 Nov 29 06:14:45 crc kubenswrapper[4594]: I1129 06:14:45.191048 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pbx4n" event={"ID":"7022d6b8-f6cf-4289-9857-20da0610d929","Type":"ContainerDied","Data":"89e867fc5ab262df8de0174d760b683f92d60391e1bd5092ee40af1b6c0743e7"} Nov 29 06:14:45 crc kubenswrapper[4594]: I1129 06:14:45.191091 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pbx4n" event={"ID":"7022d6b8-f6cf-4289-9857-20da0610d929","Type":"ContainerDied","Data":"76ff9ee78f33c2df2f9d5e1f16d5475191bda1787565d45a79bfae03df716e54"} Nov 29 06:14:45 crc kubenswrapper[4594]: I1129 06:14:45.191137 4594 scope.go:117] "RemoveContainer" containerID="89e867fc5ab262df8de0174d760b683f92d60391e1bd5092ee40af1b6c0743e7" Nov 29 06:14:45 crc kubenswrapper[4594]: I1129 06:14:45.191140 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pbx4n" Nov 29 06:14:45 crc kubenswrapper[4594]: I1129 06:14:45.211239 4594 scope.go:117] "RemoveContainer" containerID="0f22d335a61f33bef60a7041567db89fcc17bf76e24a47ac7a07f758ab662e82" Nov 29 06:14:45 crc kubenswrapper[4594]: I1129 06:14:45.231283 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pbx4n"] Nov 29 06:14:45 crc kubenswrapper[4594]: I1129 06:14:45.239837 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-pbx4n"] Nov 29 06:14:45 crc kubenswrapper[4594]: I1129 06:14:45.247120 4594 scope.go:117] "RemoveContainer" containerID="100d8efcc4e065d9801c50e76bd8b131290ff2f408ba86deb437cf78ec400e95" Nov 29 06:14:45 crc kubenswrapper[4594]: I1129 06:14:45.278220 4594 scope.go:117] "RemoveContainer" containerID="89e867fc5ab262df8de0174d760b683f92d60391e1bd5092ee40af1b6c0743e7" Nov 29 06:14:45 crc kubenswrapper[4594]: E1129 06:14:45.278736 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89e867fc5ab262df8de0174d760b683f92d60391e1bd5092ee40af1b6c0743e7\": container with ID starting with 89e867fc5ab262df8de0174d760b683f92d60391e1bd5092ee40af1b6c0743e7 not found: ID does not exist" containerID="89e867fc5ab262df8de0174d760b683f92d60391e1bd5092ee40af1b6c0743e7" Nov 29 06:14:45 crc kubenswrapper[4594]: I1129 06:14:45.278786 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89e867fc5ab262df8de0174d760b683f92d60391e1bd5092ee40af1b6c0743e7"} err="failed to get container status \"89e867fc5ab262df8de0174d760b683f92d60391e1bd5092ee40af1b6c0743e7\": rpc error: code = NotFound desc = could not find container \"89e867fc5ab262df8de0174d760b683f92d60391e1bd5092ee40af1b6c0743e7\": container with ID starting with 89e867fc5ab262df8de0174d760b683f92d60391e1bd5092ee40af1b6c0743e7 not 
found: ID does not exist" Nov 29 06:14:45 crc kubenswrapper[4594]: I1129 06:14:45.278820 4594 scope.go:117] "RemoveContainer" containerID="0f22d335a61f33bef60a7041567db89fcc17bf76e24a47ac7a07f758ab662e82" Nov 29 06:14:45 crc kubenswrapper[4594]: E1129 06:14:45.279109 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f22d335a61f33bef60a7041567db89fcc17bf76e24a47ac7a07f758ab662e82\": container with ID starting with 0f22d335a61f33bef60a7041567db89fcc17bf76e24a47ac7a07f758ab662e82 not found: ID does not exist" containerID="0f22d335a61f33bef60a7041567db89fcc17bf76e24a47ac7a07f758ab662e82" Nov 29 06:14:45 crc kubenswrapper[4594]: I1129 06:14:45.279137 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f22d335a61f33bef60a7041567db89fcc17bf76e24a47ac7a07f758ab662e82"} err="failed to get container status \"0f22d335a61f33bef60a7041567db89fcc17bf76e24a47ac7a07f758ab662e82\": rpc error: code = NotFound desc = could not find container \"0f22d335a61f33bef60a7041567db89fcc17bf76e24a47ac7a07f758ab662e82\": container with ID starting with 0f22d335a61f33bef60a7041567db89fcc17bf76e24a47ac7a07f758ab662e82 not found: ID does not exist" Nov 29 06:14:45 crc kubenswrapper[4594]: I1129 06:14:45.279154 4594 scope.go:117] "RemoveContainer" containerID="100d8efcc4e065d9801c50e76bd8b131290ff2f408ba86deb437cf78ec400e95" Nov 29 06:14:45 crc kubenswrapper[4594]: E1129 06:14:45.279511 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"100d8efcc4e065d9801c50e76bd8b131290ff2f408ba86deb437cf78ec400e95\": container with ID starting with 100d8efcc4e065d9801c50e76bd8b131290ff2f408ba86deb437cf78ec400e95 not found: ID does not exist" containerID="100d8efcc4e065d9801c50e76bd8b131290ff2f408ba86deb437cf78ec400e95" Nov 29 06:14:45 crc kubenswrapper[4594]: I1129 06:14:45.279561 4594 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"100d8efcc4e065d9801c50e76bd8b131290ff2f408ba86deb437cf78ec400e95"} err="failed to get container status \"100d8efcc4e065d9801c50e76bd8b131290ff2f408ba86deb437cf78ec400e95\": rpc error: code = NotFound desc = could not find container \"100d8efcc4e065d9801c50e76bd8b131290ff2f408ba86deb437cf78ec400e95\": container with ID starting with 100d8efcc4e065d9801c50e76bd8b131290ff2f408ba86deb437cf78ec400e95 not found: ID does not exist" Nov 29 06:14:46 crc kubenswrapper[4594]: I1129 06:14:46.094514 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7022d6b8-f6cf-4289-9857-20da0610d929" path="/var/lib/kubelet/pods/7022d6b8-f6cf-4289-9857-20da0610d929/volumes" Nov 29 06:15:00 crc kubenswrapper[4594]: I1129 06:15:00.142131 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29406615-7gr97"] Nov 29 06:15:00 crc kubenswrapper[4594]: E1129 06:15:00.143187 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7022d6b8-f6cf-4289-9857-20da0610d929" containerName="registry-server" Nov 29 06:15:00 crc kubenswrapper[4594]: I1129 06:15:00.143206 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="7022d6b8-f6cf-4289-9857-20da0610d929" containerName="registry-server" Nov 29 06:15:00 crc kubenswrapper[4594]: E1129 06:15:00.143217 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7022d6b8-f6cf-4289-9857-20da0610d929" containerName="extract-utilities" Nov 29 06:15:00 crc kubenswrapper[4594]: I1129 06:15:00.143223 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="7022d6b8-f6cf-4289-9857-20da0610d929" containerName="extract-utilities" Nov 29 06:15:00 crc kubenswrapper[4594]: E1129 06:15:00.143274 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7022d6b8-f6cf-4289-9857-20da0610d929" containerName="extract-content" Nov 29 06:15:00 crc 
kubenswrapper[4594]: I1129 06:15:00.143280 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="7022d6b8-f6cf-4289-9857-20da0610d929" containerName="extract-content" Nov 29 06:15:00 crc kubenswrapper[4594]: I1129 06:15:00.143530 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="7022d6b8-f6cf-4289-9857-20da0610d929" containerName="registry-server" Nov 29 06:15:00 crc kubenswrapper[4594]: I1129 06:15:00.144339 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29406615-7gr97" Nov 29 06:15:00 crc kubenswrapper[4594]: I1129 06:15:00.145931 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 29 06:15:00 crc kubenswrapper[4594]: I1129 06:15:00.145988 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 29 06:15:00 crc kubenswrapper[4594]: I1129 06:15:00.161610 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29406615-7gr97"] Nov 29 06:15:00 crc kubenswrapper[4594]: I1129 06:15:00.229952 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f34377fb-70b8-4ee2-b053-62bd1c12b2fa-secret-volume\") pod \"collect-profiles-29406615-7gr97\" (UID: \"f34377fb-70b8-4ee2-b053-62bd1c12b2fa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406615-7gr97" Nov 29 06:15:00 crc kubenswrapper[4594]: I1129 06:15:00.230242 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmzjr\" (UniqueName: \"kubernetes.io/projected/f34377fb-70b8-4ee2-b053-62bd1c12b2fa-kube-api-access-kmzjr\") pod \"collect-profiles-29406615-7gr97\" (UID: \"f34377fb-70b8-4ee2-b053-62bd1c12b2fa\") 
" pod="openshift-operator-lifecycle-manager/collect-profiles-29406615-7gr97" Nov 29 06:15:00 crc kubenswrapper[4594]: I1129 06:15:00.230417 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f34377fb-70b8-4ee2-b053-62bd1c12b2fa-config-volume\") pod \"collect-profiles-29406615-7gr97\" (UID: \"f34377fb-70b8-4ee2-b053-62bd1c12b2fa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406615-7gr97" Nov 29 06:15:00 crc kubenswrapper[4594]: I1129 06:15:00.332729 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmzjr\" (UniqueName: \"kubernetes.io/projected/f34377fb-70b8-4ee2-b053-62bd1c12b2fa-kube-api-access-kmzjr\") pod \"collect-profiles-29406615-7gr97\" (UID: \"f34377fb-70b8-4ee2-b053-62bd1c12b2fa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406615-7gr97" Nov 29 06:15:00 crc kubenswrapper[4594]: I1129 06:15:00.333050 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f34377fb-70b8-4ee2-b053-62bd1c12b2fa-config-volume\") pod \"collect-profiles-29406615-7gr97\" (UID: \"f34377fb-70b8-4ee2-b053-62bd1c12b2fa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406615-7gr97" Nov 29 06:15:00 crc kubenswrapper[4594]: I1129 06:15:00.333382 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f34377fb-70b8-4ee2-b053-62bd1c12b2fa-secret-volume\") pod \"collect-profiles-29406615-7gr97\" (UID: \"f34377fb-70b8-4ee2-b053-62bd1c12b2fa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406615-7gr97" Nov 29 06:15:00 crc kubenswrapper[4594]: I1129 06:15:00.334092 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/f34377fb-70b8-4ee2-b053-62bd1c12b2fa-config-volume\") pod \"collect-profiles-29406615-7gr97\" (UID: \"f34377fb-70b8-4ee2-b053-62bd1c12b2fa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406615-7gr97" Nov 29 06:15:00 crc kubenswrapper[4594]: I1129 06:15:00.342466 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f34377fb-70b8-4ee2-b053-62bd1c12b2fa-secret-volume\") pod \"collect-profiles-29406615-7gr97\" (UID: \"f34377fb-70b8-4ee2-b053-62bd1c12b2fa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406615-7gr97" Nov 29 06:15:00 crc kubenswrapper[4594]: I1129 06:15:00.350592 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmzjr\" (UniqueName: \"kubernetes.io/projected/f34377fb-70b8-4ee2-b053-62bd1c12b2fa-kube-api-access-kmzjr\") pod \"collect-profiles-29406615-7gr97\" (UID: \"f34377fb-70b8-4ee2-b053-62bd1c12b2fa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406615-7gr97" Nov 29 06:15:00 crc kubenswrapper[4594]: I1129 06:15:00.467448 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29406615-7gr97" Nov 29 06:15:00 crc kubenswrapper[4594]: I1129 06:15:00.903701 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29406615-7gr97"] Nov 29 06:15:01 crc kubenswrapper[4594]: I1129 06:15:01.347832 4594 generic.go:334] "Generic (PLEG): container finished" podID="f34377fb-70b8-4ee2-b053-62bd1c12b2fa" containerID="890bf8d88ba462e003eaf0d919fbc904ce0a2000ca1acd80a2b70db616c10631" exitCode=0 Nov 29 06:15:01 crc kubenswrapper[4594]: I1129 06:15:01.347902 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29406615-7gr97" event={"ID":"f34377fb-70b8-4ee2-b053-62bd1c12b2fa","Type":"ContainerDied","Data":"890bf8d88ba462e003eaf0d919fbc904ce0a2000ca1acd80a2b70db616c10631"} Nov 29 06:15:01 crc kubenswrapper[4594]: I1129 06:15:01.347956 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29406615-7gr97" event={"ID":"f34377fb-70b8-4ee2-b053-62bd1c12b2fa","Type":"ContainerStarted","Data":"69f4ff846af2cdfe9b7b66edb376d1daff94220f15328cf40fe9ff134555d777"} Nov 29 06:15:02 crc kubenswrapper[4594]: I1129 06:15:02.698554 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29406615-7gr97" Nov 29 06:15:02 crc kubenswrapper[4594]: I1129 06:15:02.789675 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f34377fb-70b8-4ee2-b053-62bd1c12b2fa-config-volume\") pod \"f34377fb-70b8-4ee2-b053-62bd1c12b2fa\" (UID: \"f34377fb-70b8-4ee2-b053-62bd1c12b2fa\") " Nov 29 06:15:02 crc kubenswrapper[4594]: I1129 06:15:02.789798 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f34377fb-70b8-4ee2-b053-62bd1c12b2fa-secret-volume\") pod \"f34377fb-70b8-4ee2-b053-62bd1c12b2fa\" (UID: \"f34377fb-70b8-4ee2-b053-62bd1c12b2fa\") " Nov 29 06:15:02 crc kubenswrapper[4594]: I1129 06:15:02.789896 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kmzjr\" (UniqueName: \"kubernetes.io/projected/f34377fb-70b8-4ee2-b053-62bd1c12b2fa-kube-api-access-kmzjr\") pod \"f34377fb-70b8-4ee2-b053-62bd1c12b2fa\" (UID: \"f34377fb-70b8-4ee2-b053-62bd1c12b2fa\") " Nov 29 06:15:02 crc kubenswrapper[4594]: I1129 06:15:02.790503 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f34377fb-70b8-4ee2-b053-62bd1c12b2fa-config-volume" (OuterVolumeSpecName: "config-volume") pod "f34377fb-70b8-4ee2-b053-62bd1c12b2fa" (UID: "f34377fb-70b8-4ee2-b053-62bd1c12b2fa"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 06:15:02 crc kubenswrapper[4594]: I1129 06:15:02.794835 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f34377fb-70b8-4ee2-b053-62bd1c12b2fa-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f34377fb-70b8-4ee2-b053-62bd1c12b2fa" (UID: "f34377fb-70b8-4ee2-b053-62bd1c12b2fa"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 06:15:02 crc kubenswrapper[4594]: I1129 06:15:02.795544 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f34377fb-70b8-4ee2-b053-62bd1c12b2fa-kube-api-access-kmzjr" (OuterVolumeSpecName: "kube-api-access-kmzjr") pod "f34377fb-70b8-4ee2-b053-62bd1c12b2fa" (UID: "f34377fb-70b8-4ee2-b053-62bd1c12b2fa"). InnerVolumeSpecName "kube-api-access-kmzjr". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 06:15:02 crc kubenswrapper[4594]: I1129 06:15:02.892300 4594 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f34377fb-70b8-4ee2-b053-62bd1c12b2fa-config-volume\") on node \"crc\" DevicePath \"\"" Nov 29 06:15:02 crc kubenswrapper[4594]: I1129 06:15:02.892335 4594 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f34377fb-70b8-4ee2-b053-62bd1c12b2fa-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 29 06:15:02 crc kubenswrapper[4594]: I1129 06:15:02.892346 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kmzjr\" (UniqueName: \"kubernetes.io/projected/f34377fb-70b8-4ee2-b053-62bd1c12b2fa-kube-api-access-kmzjr\") on node \"crc\" DevicePath \"\"" Nov 29 06:15:03 crc kubenswrapper[4594]: I1129 06:15:03.369120 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29406615-7gr97" event={"ID":"f34377fb-70b8-4ee2-b053-62bd1c12b2fa","Type":"ContainerDied","Data":"69f4ff846af2cdfe9b7b66edb376d1daff94220f15328cf40fe9ff134555d777"} Nov 29 06:15:03 crc kubenswrapper[4594]: I1129 06:15:03.369169 4594 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="69f4ff846af2cdfe9b7b66edb376d1daff94220f15328cf40fe9ff134555d777" Nov 29 06:15:03 crc kubenswrapper[4594]: I1129 06:15:03.369181 4594 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29406615-7gr97" Nov 29 06:15:03 crc kubenswrapper[4594]: I1129 06:15:03.774571 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29406570-47pck"] Nov 29 06:15:03 crc kubenswrapper[4594]: I1129 06:15:03.783607 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29406570-47pck"] Nov 29 06:15:04 crc kubenswrapper[4594]: I1129 06:15:04.107454 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5059cab-0e22-479f-9079-b031f405e547" path="/var/lib/kubelet/pods/b5059cab-0e22-479f-9079-b031f405e547/volumes" Nov 29 06:15:10 crc kubenswrapper[4594]: I1129 06:15:10.387106 4594 scope.go:117] "RemoveContainer" containerID="8eae648ce95bfb31bd79da0a697d004da9431becd62d6f6dddac172b6a954eab" Nov 29 06:15:42 crc kubenswrapper[4594]: E1129 06:15:42.924224 4594 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 192.168.25.120:56012->192.168.25.120:45015: write tcp 192.168.25.120:56012->192.168.25.120:45015: write: broken pipe Nov 29 06:15:45 crc kubenswrapper[4594]: I1129 06:15:45.800294 4594 patch_prober.go:28] interesting pod/machine-config-daemon-ggz4n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 06:15:45 crc kubenswrapper[4594]: I1129 06:15:45.800683 4594 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 06:16:15 crc kubenswrapper[4594]: I1129 06:16:15.800066 
4594 patch_prober.go:28] interesting pod/machine-config-daemon-ggz4n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 06:16:15 crc kubenswrapper[4594]: I1129 06:16:15.800599 4594 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 06:16:44 crc kubenswrapper[4594]: I1129 06:16:44.790226 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gqg5z"] Nov 29 06:16:44 crc kubenswrapper[4594]: E1129 06:16:44.794954 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f34377fb-70b8-4ee2-b053-62bd1c12b2fa" containerName="collect-profiles" Nov 29 06:16:44 crc kubenswrapper[4594]: I1129 06:16:44.794980 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="f34377fb-70b8-4ee2-b053-62bd1c12b2fa" containerName="collect-profiles" Nov 29 06:16:44 crc kubenswrapper[4594]: I1129 06:16:44.795177 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="f34377fb-70b8-4ee2-b053-62bd1c12b2fa" containerName="collect-profiles" Nov 29 06:16:44 crc kubenswrapper[4594]: I1129 06:16:44.797187 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gqg5z" Nov 29 06:16:44 crc kubenswrapper[4594]: I1129 06:16:44.798126 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gqg5z"] Nov 29 06:16:44 crc kubenswrapper[4594]: I1129 06:16:44.955593 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4f2zj\" (UniqueName: \"kubernetes.io/projected/9561a595-3ec9-4888-929c-fd68470080a0-kube-api-access-4f2zj\") pod \"redhat-operators-gqg5z\" (UID: \"9561a595-3ec9-4888-929c-fd68470080a0\") " pod="openshift-marketplace/redhat-operators-gqg5z" Nov 29 06:16:44 crc kubenswrapper[4594]: I1129 06:16:44.955673 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9561a595-3ec9-4888-929c-fd68470080a0-utilities\") pod \"redhat-operators-gqg5z\" (UID: \"9561a595-3ec9-4888-929c-fd68470080a0\") " pod="openshift-marketplace/redhat-operators-gqg5z" Nov 29 06:16:44 crc kubenswrapper[4594]: I1129 06:16:44.955709 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9561a595-3ec9-4888-929c-fd68470080a0-catalog-content\") pod \"redhat-operators-gqg5z\" (UID: \"9561a595-3ec9-4888-929c-fd68470080a0\") " pod="openshift-marketplace/redhat-operators-gqg5z" Nov 29 06:16:45 crc kubenswrapper[4594]: I1129 06:16:45.059615 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4f2zj\" (UniqueName: \"kubernetes.io/projected/9561a595-3ec9-4888-929c-fd68470080a0-kube-api-access-4f2zj\") pod \"redhat-operators-gqg5z\" (UID: \"9561a595-3ec9-4888-929c-fd68470080a0\") " pod="openshift-marketplace/redhat-operators-gqg5z" Nov 29 06:16:45 crc kubenswrapper[4594]: I1129 06:16:45.060175 4594 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9561a595-3ec9-4888-929c-fd68470080a0-utilities\") pod \"redhat-operators-gqg5z\" (UID: \"9561a595-3ec9-4888-929c-fd68470080a0\") " pod="openshift-marketplace/redhat-operators-gqg5z" Nov 29 06:16:45 crc kubenswrapper[4594]: I1129 06:16:45.060292 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9561a595-3ec9-4888-929c-fd68470080a0-catalog-content\") pod \"redhat-operators-gqg5z\" (UID: \"9561a595-3ec9-4888-929c-fd68470080a0\") " pod="openshift-marketplace/redhat-operators-gqg5z" Nov 29 06:16:45 crc kubenswrapper[4594]: I1129 06:16:45.060739 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9561a595-3ec9-4888-929c-fd68470080a0-utilities\") pod \"redhat-operators-gqg5z\" (UID: \"9561a595-3ec9-4888-929c-fd68470080a0\") " pod="openshift-marketplace/redhat-operators-gqg5z" Nov 29 06:16:45 crc kubenswrapper[4594]: I1129 06:16:45.060794 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9561a595-3ec9-4888-929c-fd68470080a0-catalog-content\") pod \"redhat-operators-gqg5z\" (UID: \"9561a595-3ec9-4888-929c-fd68470080a0\") " pod="openshift-marketplace/redhat-operators-gqg5z" Nov 29 06:16:45 crc kubenswrapper[4594]: I1129 06:16:45.084418 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4f2zj\" (UniqueName: \"kubernetes.io/projected/9561a595-3ec9-4888-929c-fd68470080a0-kube-api-access-4f2zj\") pod \"redhat-operators-gqg5z\" (UID: \"9561a595-3ec9-4888-929c-fd68470080a0\") " pod="openshift-marketplace/redhat-operators-gqg5z" Nov 29 06:16:45 crc kubenswrapper[4594]: I1129 06:16:45.126524 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gqg5z" Nov 29 06:16:45 crc kubenswrapper[4594]: I1129 06:16:45.548588 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gqg5z"] Nov 29 06:16:45 crc kubenswrapper[4594]: I1129 06:16:45.800708 4594 patch_prober.go:28] interesting pod/machine-config-daemon-ggz4n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 06:16:45 crc kubenswrapper[4594]: I1129 06:16:45.800966 4594 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 06:16:45 crc kubenswrapper[4594]: I1129 06:16:45.801017 4594 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" Nov 29 06:16:45 crc kubenswrapper[4594]: I1129 06:16:45.801976 4594 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"df364d319c5e658882315707e2d6b30a3b4e584bc43035131218f0fc9ff1f120"} pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 29 06:16:45 crc kubenswrapper[4594]: I1129 06:16:45.802032 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" containerName="machine-config-daemon" 
containerID="cri-o://df364d319c5e658882315707e2d6b30a3b4e584bc43035131218f0fc9ff1f120" gracePeriod=600 Nov 29 06:16:46 crc kubenswrapper[4594]: I1129 06:16:46.357418 4594 generic.go:334] "Generic (PLEG): container finished" podID="9561a595-3ec9-4888-929c-fd68470080a0" containerID="c2c3a5c357e36ae31b15c9ab01c376bfcc695856c2a10159c6d869cb6f8be31d" exitCode=0 Nov 29 06:16:46 crc kubenswrapper[4594]: I1129 06:16:46.357511 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gqg5z" event={"ID":"9561a595-3ec9-4888-929c-fd68470080a0","Type":"ContainerDied","Data":"c2c3a5c357e36ae31b15c9ab01c376bfcc695856c2a10159c6d869cb6f8be31d"} Nov 29 06:16:46 crc kubenswrapper[4594]: I1129 06:16:46.358022 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gqg5z" event={"ID":"9561a595-3ec9-4888-929c-fd68470080a0","Type":"ContainerStarted","Data":"0e68b77fdb42414139b6f78b7f9c202ef4f67ee0a5af6d6b593b1a41092801f5"} Nov 29 06:16:46 crc kubenswrapper[4594]: I1129 06:16:46.361101 4594 generic.go:334] "Generic (PLEG): container finished" podID="509844d4-3ee1-4059-84bb-6e90200f50c5" containerID="df364d319c5e658882315707e2d6b30a3b4e584bc43035131218f0fc9ff1f120" exitCode=0 Nov 29 06:16:46 crc kubenswrapper[4594]: I1129 06:16:46.361163 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" event={"ID":"509844d4-3ee1-4059-84bb-6e90200f50c5","Type":"ContainerDied","Data":"df364d319c5e658882315707e2d6b30a3b4e584bc43035131218f0fc9ff1f120"} Nov 29 06:16:46 crc kubenswrapper[4594]: I1129 06:16:46.361208 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" event={"ID":"509844d4-3ee1-4059-84bb-6e90200f50c5","Type":"ContainerStarted","Data":"b7d448c5be68a1e0abebb2f8f28d640c84bc9ec06c2c2355d2d751cdd4a23919"} Nov 29 06:16:46 crc kubenswrapper[4594]: I1129 06:16:46.361236 4594 
scope.go:117] "RemoveContainer" containerID="546461d6f586f0461ba676d35ee2bc142491a0e28d0e8ba8d66c6c317b14d8f9" Nov 29 06:16:48 crc kubenswrapper[4594]: I1129 06:16:48.392157 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gqg5z" event={"ID":"9561a595-3ec9-4888-929c-fd68470080a0","Type":"ContainerStarted","Data":"0e37866660d9ad07907c072b61877f1bcfe3b633bab033e679b842333d14a0ca"} Nov 29 06:16:50 crc kubenswrapper[4594]: I1129 06:16:50.413429 4594 generic.go:334] "Generic (PLEG): container finished" podID="9561a595-3ec9-4888-929c-fd68470080a0" containerID="0e37866660d9ad07907c072b61877f1bcfe3b633bab033e679b842333d14a0ca" exitCode=0 Nov 29 06:16:50 crc kubenswrapper[4594]: I1129 06:16:50.413539 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gqg5z" event={"ID":"9561a595-3ec9-4888-929c-fd68470080a0","Type":"ContainerDied","Data":"0e37866660d9ad07907c072b61877f1bcfe3b633bab033e679b842333d14a0ca"} Nov 29 06:16:51 crc kubenswrapper[4594]: I1129 06:16:51.425846 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gqg5z" event={"ID":"9561a595-3ec9-4888-929c-fd68470080a0","Type":"ContainerStarted","Data":"a754a3170f3747393213f37679084686d1517c61d13305db83e354d6222669b3"} Nov 29 06:16:51 crc kubenswrapper[4594]: I1129 06:16:51.443836 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gqg5z" podStartSLOduration=2.824839446 podStartE2EDuration="7.443821556s" podCreationTimestamp="2025-11-29 06:16:44 +0000 UTC" firstStartedPulling="2025-11-29 06:16:46.360967288 +0000 UTC m=+2930.601476508" lastFinishedPulling="2025-11-29 06:16:50.979949408 +0000 UTC m=+2935.220458618" observedRunningTime="2025-11-29 06:16:51.44369605 +0000 UTC m=+2935.684205260" watchObservedRunningTime="2025-11-29 06:16:51.443821556 +0000 UTC m=+2935.684330776" Nov 29 06:16:55 crc kubenswrapper[4594]: 
I1129 06:16:55.126760 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gqg5z" Nov 29 06:16:55 crc kubenswrapper[4594]: I1129 06:16:55.128408 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gqg5z" Nov 29 06:16:56 crc kubenswrapper[4594]: I1129 06:16:56.173895 4594 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gqg5z" podUID="9561a595-3ec9-4888-929c-fd68470080a0" containerName="registry-server" probeResult="failure" output=< Nov 29 06:16:56 crc kubenswrapper[4594]: timeout: failed to connect service ":50051" within 1s Nov 29 06:16:56 crc kubenswrapper[4594]: > Nov 29 06:17:05 crc kubenswrapper[4594]: I1129 06:17:05.166229 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gqg5z" Nov 29 06:17:05 crc kubenswrapper[4594]: I1129 06:17:05.202400 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gqg5z" Nov 29 06:17:05 crc kubenswrapper[4594]: I1129 06:17:05.404846 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gqg5z"] Nov 29 06:17:06 crc kubenswrapper[4594]: I1129 06:17:06.570513 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gqg5z" podUID="9561a595-3ec9-4888-929c-fd68470080a0" containerName="registry-server" containerID="cri-o://a754a3170f3747393213f37679084686d1517c61d13305db83e354d6222669b3" gracePeriod=2 Nov 29 06:17:06 crc kubenswrapper[4594]: I1129 06:17:06.933837 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gqg5z" Nov 29 06:17:07 crc kubenswrapper[4594]: I1129 06:17:07.063310 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4f2zj\" (UniqueName: \"kubernetes.io/projected/9561a595-3ec9-4888-929c-fd68470080a0-kube-api-access-4f2zj\") pod \"9561a595-3ec9-4888-929c-fd68470080a0\" (UID: \"9561a595-3ec9-4888-929c-fd68470080a0\") " Nov 29 06:17:07 crc kubenswrapper[4594]: I1129 06:17:07.063451 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9561a595-3ec9-4888-929c-fd68470080a0-utilities\") pod \"9561a595-3ec9-4888-929c-fd68470080a0\" (UID: \"9561a595-3ec9-4888-929c-fd68470080a0\") " Nov 29 06:17:07 crc kubenswrapper[4594]: I1129 06:17:07.063886 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9561a595-3ec9-4888-929c-fd68470080a0-catalog-content\") pod \"9561a595-3ec9-4888-929c-fd68470080a0\" (UID: \"9561a595-3ec9-4888-929c-fd68470080a0\") " Nov 29 06:17:07 crc kubenswrapper[4594]: I1129 06:17:07.064137 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9561a595-3ec9-4888-929c-fd68470080a0-utilities" (OuterVolumeSpecName: "utilities") pod "9561a595-3ec9-4888-929c-fd68470080a0" (UID: "9561a595-3ec9-4888-929c-fd68470080a0"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 06:17:07 crc kubenswrapper[4594]: I1129 06:17:07.064815 4594 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9561a595-3ec9-4888-929c-fd68470080a0-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 06:17:07 crc kubenswrapper[4594]: I1129 06:17:07.071463 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9561a595-3ec9-4888-929c-fd68470080a0-kube-api-access-4f2zj" (OuterVolumeSpecName: "kube-api-access-4f2zj") pod "9561a595-3ec9-4888-929c-fd68470080a0" (UID: "9561a595-3ec9-4888-929c-fd68470080a0"). InnerVolumeSpecName "kube-api-access-4f2zj". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 06:17:07 crc kubenswrapper[4594]: I1129 06:17:07.146673 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9561a595-3ec9-4888-929c-fd68470080a0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9561a595-3ec9-4888-929c-fd68470080a0" (UID: "9561a595-3ec9-4888-929c-fd68470080a0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 06:17:07 crc kubenswrapper[4594]: I1129 06:17:07.168241 4594 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9561a595-3ec9-4888-929c-fd68470080a0-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 06:17:07 crc kubenswrapper[4594]: I1129 06:17:07.168288 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4f2zj\" (UniqueName: \"kubernetes.io/projected/9561a595-3ec9-4888-929c-fd68470080a0-kube-api-access-4f2zj\") on node \"crc\" DevicePath \"\"" Nov 29 06:17:07 crc kubenswrapper[4594]: I1129 06:17:07.582098 4594 generic.go:334] "Generic (PLEG): container finished" podID="9561a595-3ec9-4888-929c-fd68470080a0" containerID="a754a3170f3747393213f37679084686d1517c61d13305db83e354d6222669b3" exitCode=0 Nov 29 06:17:07 crc kubenswrapper[4594]: I1129 06:17:07.582149 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gqg5z" event={"ID":"9561a595-3ec9-4888-929c-fd68470080a0","Type":"ContainerDied","Data":"a754a3170f3747393213f37679084686d1517c61d13305db83e354d6222669b3"} Nov 29 06:17:07 crc kubenswrapper[4594]: I1129 06:17:07.582186 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gqg5z" event={"ID":"9561a595-3ec9-4888-929c-fd68470080a0","Type":"ContainerDied","Data":"0e68b77fdb42414139b6f78b7f9c202ef4f67ee0a5af6d6b593b1a41092801f5"} Nov 29 06:17:07 crc kubenswrapper[4594]: I1129 06:17:07.582207 4594 scope.go:117] "RemoveContainer" containerID="a754a3170f3747393213f37679084686d1517c61d13305db83e354d6222669b3" Nov 29 06:17:07 crc kubenswrapper[4594]: I1129 06:17:07.582208 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gqg5z" Nov 29 06:17:07 crc kubenswrapper[4594]: I1129 06:17:07.609446 4594 scope.go:117] "RemoveContainer" containerID="0e37866660d9ad07907c072b61877f1bcfe3b633bab033e679b842333d14a0ca" Nov 29 06:17:07 crc kubenswrapper[4594]: I1129 06:17:07.613367 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gqg5z"] Nov 29 06:17:07 crc kubenswrapper[4594]: I1129 06:17:07.620765 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gqg5z"] Nov 29 06:17:07 crc kubenswrapper[4594]: I1129 06:17:07.652243 4594 scope.go:117] "RemoveContainer" containerID="c2c3a5c357e36ae31b15c9ab01c376bfcc695856c2a10159c6d869cb6f8be31d" Nov 29 06:17:07 crc kubenswrapper[4594]: I1129 06:17:07.670510 4594 scope.go:117] "RemoveContainer" containerID="a754a3170f3747393213f37679084686d1517c61d13305db83e354d6222669b3" Nov 29 06:17:07 crc kubenswrapper[4594]: E1129 06:17:07.670855 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a754a3170f3747393213f37679084686d1517c61d13305db83e354d6222669b3\": container with ID starting with a754a3170f3747393213f37679084686d1517c61d13305db83e354d6222669b3 not found: ID does not exist" containerID="a754a3170f3747393213f37679084686d1517c61d13305db83e354d6222669b3" Nov 29 06:17:07 crc kubenswrapper[4594]: I1129 06:17:07.670905 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a754a3170f3747393213f37679084686d1517c61d13305db83e354d6222669b3"} err="failed to get container status \"a754a3170f3747393213f37679084686d1517c61d13305db83e354d6222669b3\": rpc error: code = NotFound desc = could not find container \"a754a3170f3747393213f37679084686d1517c61d13305db83e354d6222669b3\": container with ID starting with a754a3170f3747393213f37679084686d1517c61d13305db83e354d6222669b3 not found: ID does 
not exist" Nov 29 06:17:07 crc kubenswrapper[4594]: I1129 06:17:07.670930 4594 scope.go:117] "RemoveContainer" containerID="0e37866660d9ad07907c072b61877f1bcfe3b633bab033e679b842333d14a0ca" Nov 29 06:17:07 crc kubenswrapper[4594]: E1129 06:17:07.671246 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e37866660d9ad07907c072b61877f1bcfe3b633bab033e679b842333d14a0ca\": container with ID starting with 0e37866660d9ad07907c072b61877f1bcfe3b633bab033e679b842333d14a0ca not found: ID does not exist" containerID="0e37866660d9ad07907c072b61877f1bcfe3b633bab033e679b842333d14a0ca" Nov 29 06:17:07 crc kubenswrapper[4594]: I1129 06:17:07.671395 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e37866660d9ad07907c072b61877f1bcfe3b633bab033e679b842333d14a0ca"} err="failed to get container status \"0e37866660d9ad07907c072b61877f1bcfe3b633bab033e679b842333d14a0ca\": rpc error: code = NotFound desc = could not find container \"0e37866660d9ad07907c072b61877f1bcfe3b633bab033e679b842333d14a0ca\": container with ID starting with 0e37866660d9ad07907c072b61877f1bcfe3b633bab033e679b842333d14a0ca not found: ID does not exist" Nov 29 06:17:07 crc kubenswrapper[4594]: I1129 06:17:07.671438 4594 scope.go:117] "RemoveContainer" containerID="c2c3a5c357e36ae31b15c9ab01c376bfcc695856c2a10159c6d869cb6f8be31d" Nov 29 06:17:07 crc kubenswrapper[4594]: E1129 06:17:07.672332 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2c3a5c357e36ae31b15c9ab01c376bfcc695856c2a10159c6d869cb6f8be31d\": container with ID starting with c2c3a5c357e36ae31b15c9ab01c376bfcc695856c2a10159c6d869cb6f8be31d not found: ID does not exist" containerID="c2c3a5c357e36ae31b15c9ab01c376bfcc695856c2a10159c6d869cb6f8be31d" Nov 29 06:17:07 crc kubenswrapper[4594]: I1129 06:17:07.672373 4594 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2c3a5c357e36ae31b15c9ab01c376bfcc695856c2a10159c6d869cb6f8be31d"} err="failed to get container status \"c2c3a5c357e36ae31b15c9ab01c376bfcc695856c2a10159c6d869cb6f8be31d\": rpc error: code = NotFound desc = could not find container \"c2c3a5c357e36ae31b15c9ab01c376bfcc695856c2a10159c6d869cb6f8be31d\": container with ID starting with c2c3a5c357e36ae31b15c9ab01c376bfcc695856c2a10159c6d869cb6f8be31d not found: ID does not exist" Nov 29 06:17:08 crc kubenswrapper[4594]: I1129 06:17:08.095968 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9561a595-3ec9-4888-929c-fd68470080a0" path="/var/lib/kubelet/pods/9561a595-3ec9-4888-929c-fd68470080a0/volumes" Nov 29 06:17:54 crc kubenswrapper[4594]: I1129 06:17:53.999665 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-phxcr"] Nov 29 06:17:54 crc kubenswrapper[4594]: E1129 06:17:54.000846 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9561a595-3ec9-4888-929c-fd68470080a0" containerName="extract-utilities" Nov 29 06:17:54 crc kubenswrapper[4594]: I1129 06:17:54.000862 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="9561a595-3ec9-4888-929c-fd68470080a0" containerName="extract-utilities" Nov 29 06:17:54 crc kubenswrapper[4594]: E1129 06:17:54.000875 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9561a595-3ec9-4888-929c-fd68470080a0" containerName="registry-server" Nov 29 06:17:54 crc kubenswrapper[4594]: I1129 06:17:54.000882 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="9561a595-3ec9-4888-929c-fd68470080a0" containerName="registry-server" Nov 29 06:17:54 crc kubenswrapper[4594]: E1129 06:17:54.000893 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9561a595-3ec9-4888-929c-fd68470080a0" containerName="extract-content" Nov 29 06:17:54 crc kubenswrapper[4594]: I1129 06:17:54.000899 4594 
state_mem.go:107] "Deleted CPUSet assignment" podUID="9561a595-3ec9-4888-929c-fd68470080a0" containerName="extract-content" Nov 29 06:17:54 crc kubenswrapper[4594]: I1129 06:17:54.001084 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="9561a595-3ec9-4888-929c-fd68470080a0" containerName="registry-server" Nov 29 06:17:54 crc kubenswrapper[4594]: I1129 06:17:54.003088 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-phxcr" Nov 29 06:17:54 crc kubenswrapper[4594]: I1129 06:17:54.010295 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-phxcr"] Nov 29 06:17:54 crc kubenswrapper[4594]: I1129 06:17:54.053044 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kds5q\" (UniqueName: \"kubernetes.io/projected/6708eeb4-701a-4071-907c-f13930dbf468-kube-api-access-kds5q\") pod \"certified-operators-phxcr\" (UID: \"6708eeb4-701a-4071-907c-f13930dbf468\") " pod="openshift-marketplace/certified-operators-phxcr" Nov 29 06:17:54 crc kubenswrapper[4594]: I1129 06:17:54.053195 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6708eeb4-701a-4071-907c-f13930dbf468-catalog-content\") pod \"certified-operators-phxcr\" (UID: \"6708eeb4-701a-4071-907c-f13930dbf468\") " pod="openshift-marketplace/certified-operators-phxcr" Nov 29 06:17:54 crc kubenswrapper[4594]: I1129 06:17:54.053672 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6708eeb4-701a-4071-907c-f13930dbf468-utilities\") pod \"certified-operators-phxcr\" (UID: \"6708eeb4-701a-4071-907c-f13930dbf468\") " pod="openshift-marketplace/certified-operators-phxcr" Nov 29 06:17:54 crc kubenswrapper[4594]: I1129 
06:17:54.158315 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kds5q\" (UniqueName: \"kubernetes.io/projected/6708eeb4-701a-4071-907c-f13930dbf468-kube-api-access-kds5q\") pod \"certified-operators-phxcr\" (UID: \"6708eeb4-701a-4071-907c-f13930dbf468\") " pod="openshift-marketplace/certified-operators-phxcr" Nov 29 06:17:54 crc kubenswrapper[4594]: I1129 06:17:54.158378 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6708eeb4-701a-4071-907c-f13930dbf468-catalog-content\") pod \"certified-operators-phxcr\" (UID: \"6708eeb4-701a-4071-907c-f13930dbf468\") " pod="openshift-marketplace/certified-operators-phxcr" Nov 29 06:17:54 crc kubenswrapper[4594]: I1129 06:17:54.158446 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6708eeb4-701a-4071-907c-f13930dbf468-utilities\") pod \"certified-operators-phxcr\" (UID: \"6708eeb4-701a-4071-907c-f13930dbf468\") " pod="openshift-marketplace/certified-operators-phxcr" Nov 29 06:17:54 crc kubenswrapper[4594]: I1129 06:17:54.158963 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6708eeb4-701a-4071-907c-f13930dbf468-utilities\") pod \"certified-operators-phxcr\" (UID: \"6708eeb4-701a-4071-907c-f13930dbf468\") " pod="openshift-marketplace/certified-operators-phxcr" Nov 29 06:17:54 crc kubenswrapper[4594]: I1129 06:17:54.159044 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6708eeb4-701a-4071-907c-f13930dbf468-catalog-content\") pod \"certified-operators-phxcr\" (UID: \"6708eeb4-701a-4071-907c-f13930dbf468\") " pod="openshift-marketplace/certified-operators-phxcr" Nov 29 06:17:54 crc kubenswrapper[4594]: I1129 06:17:54.185724 4594 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kds5q\" (UniqueName: \"kubernetes.io/projected/6708eeb4-701a-4071-907c-f13930dbf468-kube-api-access-kds5q\") pod \"certified-operators-phxcr\" (UID: \"6708eeb4-701a-4071-907c-f13930dbf468\") " pod="openshift-marketplace/certified-operators-phxcr" Nov 29 06:17:54 crc kubenswrapper[4594]: I1129 06:17:54.321699 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-phxcr" Nov 29 06:17:54 crc kubenswrapper[4594]: I1129 06:17:54.785215 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-phxcr"] Nov 29 06:17:55 crc kubenswrapper[4594]: I1129 06:17:55.071858 4594 generic.go:334] "Generic (PLEG): container finished" podID="6708eeb4-701a-4071-907c-f13930dbf468" containerID="1b5667000561cd87097c56356e8b3d6ca462d8c737fd65d7e23f602009a77ac3" exitCode=0 Nov 29 06:17:55 crc kubenswrapper[4594]: I1129 06:17:55.072053 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-phxcr" event={"ID":"6708eeb4-701a-4071-907c-f13930dbf468","Type":"ContainerDied","Data":"1b5667000561cd87097c56356e8b3d6ca462d8c737fd65d7e23f602009a77ac3"} Nov 29 06:17:55 crc kubenswrapper[4594]: I1129 06:17:55.072180 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-phxcr" event={"ID":"6708eeb4-701a-4071-907c-f13930dbf468","Type":"ContainerStarted","Data":"57f41860273cd2763aa6b945ad4345474044f0a2192a8b481748ae8271221628"} Nov 29 06:17:55 crc kubenswrapper[4594]: I1129 06:17:55.074212 4594 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 29 06:17:57 crc kubenswrapper[4594]: I1129 06:17:57.102315 4594 generic.go:334] "Generic (PLEG): container finished" podID="6708eeb4-701a-4071-907c-f13930dbf468" 
containerID="bad22c46007edcdc56182e89a338768971a26beddacc516631571dec3a4d7647" exitCode=0 Nov 29 06:17:57 crc kubenswrapper[4594]: I1129 06:17:57.102884 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-phxcr" event={"ID":"6708eeb4-701a-4071-907c-f13930dbf468","Type":"ContainerDied","Data":"bad22c46007edcdc56182e89a338768971a26beddacc516631571dec3a4d7647"} Nov 29 06:17:58 crc kubenswrapper[4594]: I1129 06:17:58.117722 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-phxcr" event={"ID":"6708eeb4-701a-4071-907c-f13930dbf468","Type":"ContainerStarted","Data":"1248ece46cc0b476041b58174a721a36637f006169fb60aea8cc705da618e5ef"} Nov 29 06:17:58 crc kubenswrapper[4594]: I1129 06:17:58.143110 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-phxcr" podStartSLOduration=2.5498393889999997 podStartE2EDuration="5.143083144s" podCreationTimestamp="2025-11-29 06:17:53 +0000 UTC" firstStartedPulling="2025-11-29 06:17:55.073942686 +0000 UTC m=+2999.314451906" lastFinishedPulling="2025-11-29 06:17:57.667186441 +0000 UTC m=+3001.907695661" observedRunningTime="2025-11-29 06:17:58.13709634 +0000 UTC m=+3002.377605559" watchObservedRunningTime="2025-11-29 06:17:58.143083144 +0000 UTC m=+3002.383592365" Nov 29 06:18:04 crc kubenswrapper[4594]: I1129 06:18:04.323000 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-phxcr" Nov 29 06:18:04 crc kubenswrapper[4594]: I1129 06:18:04.323570 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-phxcr" Nov 29 06:18:04 crc kubenswrapper[4594]: I1129 06:18:04.363061 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-phxcr" Nov 29 06:18:05 crc kubenswrapper[4594]: I1129 
06:18:05.242750 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-phxcr" Nov 29 06:18:05 crc kubenswrapper[4594]: I1129 06:18:05.773323 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-phxcr"] Nov 29 06:18:07 crc kubenswrapper[4594]: I1129 06:18:07.236007 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-phxcr" podUID="6708eeb4-701a-4071-907c-f13930dbf468" containerName="registry-server" containerID="cri-o://1248ece46cc0b476041b58174a721a36637f006169fb60aea8cc705da618e5ef" gracePeriod=2 Nov 29 06:18:07 crc kubenswrapper[4594]: I1129 06:18:07.680087 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-phxcr" Nov 29 06:18:07 crc kubenswrapper[4594]: I1129 06:18:07.789310 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6708eeb4-701a-4071-907c-f13930dbf468-catalog-content\") pod \"6708eeb4-701a-4071-907c-f13930dbf468\" (UID: \"6708eeb4-701a-4071-907c-f13930dbf468\") " Nov 29 06:18:07 crc kubenswrapper[4594]: I1129 06:18:07.789476 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kds5q\" (UniqueName: \"kubernetes.io/projected/6708eeb4-701a-4071-907c-f13930dbf468-kube-api-access-kds5q\") pod \"6708eeb4-701a-4071-907c-f13930dbf468\" (UID: \"6708eeb4-701a-4071-907c-f13930dbf468\") " Nov 29 06:18:07 crc kubenswrapper[4594]: I1129 06:18:07.789700 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6708eeb4-701a-4071-907c-f13930dbf468-utilities\") pod \"6708eeb4-701a-4071-907c-f13930dbf468\" (UID: \"6708eeb4-701a-4071-907c-f13930dbf468\") " Nov 29 06:18:07 crc kubenswrapper[4594]: 
I1129 06:18:07.790828 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6708eeb4-701a-4071-907c-f13930dbf468-utilities" (OuterVolumeSpecName: "utilities") pod "6708eeb4-701a-4071-907c-f13930dbf468" (UID: "6708eeb4-701a-4071-907c-f13930dbf468"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 06:18:07 crc kubenswrapper[4594]: I1129 06:18:07.795216 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6708eeb4-701a-4071-907c-f13930dbf468-kube-api-access-kds5q" (OuterVolumeSpecName: "kube-api-access-kds5q") pod "6708eeb4-701a-4071-907c-f13930dbf468" (UID: "6708eeb4-701a-4071-907c-f13930dbf468"). InnerVolumeSpecName "kube-api-access-kds5q". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 06:18:07 crc kubenswrapper[4594]: I1129 06:18:07.830173 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6708eeb4-701a-4071-907c-f13930dbf468-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6708eeb4-701a-4071-907c-f13930dbf468" (UID: "6708eeb4-701a-4071-907c-f13930dbf468"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 06:18:07 crc kubenswrapper[4594]: I1129 06:18:07.892359 4594 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6708eeb4-701a-4071-907c-f13930dbf468-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 06:18:07 crc kubenswrapper[4594]: I1129 06:18:07.892386 4594 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6708eeb4-701a-4071-907c-f13930dbf468-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 06:18:07 crc kubenswrapper[4594]: I1129 06:18:07.892400 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kds5q\" (UniqueName: \"kubernetes.io/projected/6708eeb4-701a-4071-907c-f13930dbf468-kube-api-access-kds5q\") on node \"crc\" DevicePath \"\"" Nov 29 06:18:08 crc kubenswrapper[4594]: I1129 06:18:08.249751 4594 generic.go:334] "Generic (PLEG): container finished" podID="6708eeb4-701a-4071-907c-f13930dbf468" containerID="1248ece46cc0b476041b58174a721a36637f006169fb60aea8cc705da618e5ef" exitCode=0 Nov 29 06:18:08 crc kubenswrapper[4594]: I1129 06:18:08.249806 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-phxcr" event={"ID":"6708eeb4-701a-4071-907c-f13930dbf468","Type":"ContainerDied","Data":"1248ece46cc0b476041b58174a721a36637f006169fb60aea8cc705da618e5ef"} Nov 29 06:18:08 crc kubenswrapper[4594]: I1129 06:18:08.249851 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-phxcr" event={"ID":"6708eeb4-701a-4071-907c-f13930dbf468","Type":"ContainerDied","Data":"57f41860273cd2763aa6b945ad4345474044f0a2192a8b481748ae8271221628"} Nov 29 06:18:08 crc kubenswrapper[4594]: I1129 06:18:08.249857 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-phxcr" Nov 29 06:18:08 crc kubenswrapper[4594]: I1129 06:18:08.249877 4594 scope.go:117] "RemoveContainer" containerID="1248ece46cc0b476041b58174a721a36637f006169fb60aea8cc705da618e5ef" Nov 29 06:18:08 crc kubenswrapper[4594]: I1129 06:18:08.276437 4594 scope.go:117] "RemoveContainer" containerID="bad22c46007edcdc56182e89a338768971a26beddacc516631571dec3a4d7647" Nov 29 06:18:08 crc kubenswrapper[4594]: I1129 06:18:08.276774 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-phxcr"] Nov 29 06:18:08 crc kubenswrapper[4594]: I1129 06:18:08.288545 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-phxcr"] Nov 29 06:18:08 crc kubenswrapper[4594]: I1129 06:18:08.293151 4594 scope.go:117] "RemoveContainer" containerID="1b5667000561cd87097c56356e8b3d6ca462d8c737fd65d7e23f602009a77ac3" Nov 29 06:18:08 crc kubenswrapper[4594]: I1129 06:18:08.335998 4594 scope.go:117] "RemoveContainer" containerID="1248ece46cc0b476041b58174a721a36637f006169fb60aea8cc705da618e5ef" Nov 29 06:18:08 crc kubenswrapper[4594]: E1129 06:18:08.336439 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1248ece46cc0b476041b58174a721a36637f006169fb60aea8cc705da618e5ef\": container with ID starting with 1248ece46cc0b476041b58174a721a36637f006169fb60aea8cc705da618e5ef not found: ID does not exist" containerID="1248ece46cc0b476041b58174a721a36637f006169fb60aea8cc705da618e5ef" Nov 29 06:18:08 crc kubenswrapper[4594]: I1129 06:18:08.336472 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1248ece46cc0b476041b58174a721a36637f006169fb60aea8cc705da618e5ef"} err="failed to get container status \"1248ece46cc0b476041b58174a721a36637f006169fb60aea8cc705da618e5ef\": rpc error: code = NotFound desc = could not find 
container \"1248ece46cc0b476041b58174a721a36637f006169fb60aea8cc705da618e5ef\": container with ID starting with 1248ece46cc0b476041b58174a721a36637f006169fb60aea8cc705da618e5ef not found: ID does not exist" Nov 29 06:18:08 crc kubenswrapper[4594]: I1129 06:18:08.336495 4594 scope.go:117] "RemoveContainer" containerID="bad22c46007edcdc56182e89a338768971a26beddacc516631571dec3a4d7647" Nov 29 06:18:08 crc kubenswrapper[4594]: E1129 06:18:08.336844 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bad22c46007edcdc56182e89a338768971a26beddacc516631571dec3a4d7647\": container with ID starting with bad22c46007edcdc56182e89a338768971a26beddacc516631571dec3a4d7647 not found: ID does not exist" containerID="bad22c46007edcdc56182e89a338768971a26beddacc516631571dec3a4d7647" Nov 29 06:18:08 crc kubenswrapper[4594]: I1129 06:18:08.336932 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bad22c46007edcdc56182e89a338768971a26beddacc516631571dec3a4d7647"} err="failed to get container status \"bad22c46007edcdc56182e89a338768971a26beddacc516631571dec3a4d7647\": rpc error: code = NotFound desc = could not find container \"bad22c46007edcdc56182e89a338768971a26beddacc516631571dec3a4d7647\": container with ID starting with bad22c46007edcdc56182e89a338768971a26beddacc516631571dec3a4d7647 not found: ID does not exist" Nov 29 06:18:08 crc kubenswrapper[4594]: I1129 06:18:08.337009 4594 scope.go:117] "RemoveContainer" containerID="1b5667000561cd87097c56356e8b3d6ca462d8c737fd65d7e23f602009a77ac3" Nov 29 06:18:08 crc kubenswrapper[4594]: E1129 06:18:08.337352 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b5667000561cd87097c56356e8b3d6ca462d8c737fd65d7e23f602009a77ac3\": container with ID starting with 1b5667000561cd87097c56356e8b3d6ca462d8c737fd65d7e23f602009a77ac3 not found: ID does 
not exist" containerID="1b5667000561cd87097c56356e8b3d6ca462d8c737fd65d7e23f602009a77ac3" Nov 29 06:18:08 crc kubenswrapper[4594]: I1129 06:18:08.337379 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b5667000561cd87097c56356e8b3d6ca462d8c737fd65d7e23f602009a77ac3"} err="failed to get container status \"1b5667000561cd87097c56356e8b3d6ca462d8c737fd65d7e23f602009a77ac3\": rpc error: code = NotFound desc = could not find container \"1b5667000561cd87097c56356e8b3d6ca462d8c737fd65d7e23f602009a77ac3\": container with ID starting with 1b5667000561cd87097c56356e8b3d6ca462d8c737fd65d7e23f602009a77ac3 not found: ID does not exist" Nov 29 06:18:10 crc kubenswrapper[4594]: I1129 06:18:10.098694 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6708eeb4-701a-4071-907c-f13930dbf468" path="/var/lib/kubelet/pods/6708eeb4-701a-4071-907c-f13930dbf468/volumes" Nov 29 06:19:15 crc kubenswrapper[4594]: I1129 06:19:15.800631 4594 patch_prober.go:28] interesting pod/machine-config-daemon-ggz4n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 06:19:15 crc kubenswrapper[4594]: I1129 06:19:15.801052 4594 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 06:19:45 crc kubenswrapper[4594]: I1129 06:19:45.800559 4594 patch_prober.go:28] interesting pod/machine-config-daemon-ggz4n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Nov 29 06:19:45 crc kubenswrapper[4594]: I1129 06:19:45.802074 4594 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 06:20:15 crc kubenswrapper[4594]: I1129 06:20:15.800572 4594 patch_prober.go:28] interesting pod/machine-config-daemon-ggz4n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 06:20:15 crc kubenswrapper[4594]: I1129 06:20:15.801156 4594 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 06:20:15 crc kubenswrapper[4594]: I1129 06:20:15.801218 4594 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" Nov 29 06:20:15 crc kubenswrapper[4594]: I1129 06:20:15.801994 4594 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b7d448c5be68a1e0abebb2f8f28d640c84bc9ec06c2c2355d2d751cdd4a23919"} pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 29 06:20:15 crc kubenswrapper[4594]: I1129 06:20:15.802045 4594 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" containerName="machine-config-daemon" containerID="cri-o://b7d448c5be68a1e0abebb2f8f28d640c84bc9ec06c2c2355d2d751cdd4a23919" gracePeriod=600 Nov 29 06:20:15 crc kubenswrapper[4594]: E1129 06:20:15.922509 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:20:16 crc kubenswrapper[4594]: I1129 06:20:16.628950 4594 generic.go:334] "Generic (PLEG): container finished" podID="509844d4-3ee1-4059-84bb-6e90200f50c5" containerID="b7d448c5be68a1e0abebb2f8f28d640c84bc9ec06c2c2355d2d751cdd4a23919" exitCode=0 Nov 29 06:20:16 crc kubenswrapper[4594]: I1129 06:20:16.629007 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" event={"ID":"509844d4-3ee1-4059-84bb-6e90200f50c5","Type":"ContainerDied","Data":"b7d448c5be68a1e0abebb2f8f28d640c84bc9ec06c2c2355d2d751cdd4a23919"} Nov 29 06:20:16 crc kubenswrapper[4594]: I1129 06:20:16.629062 4594 scope.go:117] "RemoveContainer" containerID="df364d319c5e658882315707e2d6b30a3b4e584bc43035131218f0fc9ff1f120" Nov 29 06:20:16 crc kubenswrapper[4594]: I1129 06:20:16.629982 4594 scope.go:117] "RemoveContainer" containerID="b7d448c5be68a1e0abebb2f8f28d640c84bc9ec06c2c2355d2d751cdd4a23919" Nov 29 06:20:16 crc kubenswrapper[4594]: E1129 06:20:16.630582 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:20:30 crc kubenswrapper[4594]: I1129 06:20:30.084353 4594 scope.go:117] "RemoveContainer" containerID="b7d448c5be68a1e0abebb2f8f28d640c84bc9ec06c2c2355d2d751cdd4a23919" Nov 29 06:20:30 crc kubenswrapper[4594]: E1129 06:20:30.085356 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:20:44 crc kubenswrapper[4594]: I1129 06:20:44.084718 4594 scope.go:117] "RemoveContainer" containerID="b7d448c5be68a1e0abebb2f8f28d640c84bc9ec06c2c2355d2d751cdd4a23919" Nov 29 06:20:44 crc kubenswrapper[4594]: E1129 06:20:44.086247 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:20:58 crc kubenswrapper[4594]: I1129 06:20:58.084552 4594 scope.go:117] "RemoveContainer" containerID="b7d448c5be68a1e0abebb2f8f28d640c84bc9ec06c2c2355d2d751cdd4a23919" Nov 29 06:20:58 crc kubenswrapper[4594]: E1129 06:20:58.085710 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:21:09 crc kubenswrapper[4594]: I1129 06:21:09.083583 4594 scope.go:117] "RemoveContainer" containerID="b7d448c5be68a1e0abebb2f8f28d640c84bc9ec06c2c2355d2d751cdd4a23919" Nov 29 06:21:09 crc kubenswrapper[4594]: E1129 06:21:09.084451 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:21:21 crc kubenswrapper[4594]: I1129 06:21:21.083077 4594 scope.go:117] "RemoveContainer" containerID="b7d448c5be68a1e0abebb2f8f28d640c84bc9ec06c2c2355d2d751cdd4a23919" Nov 29 06:21:21 crc kubenswrapper[4594]: E1129 06:21:21.084434 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:21:33 crc kubenswrapper[4594]: I1129 06:21:33.084302 4594 scope.go:117] "RemoveContainer" containerID="b7d448c5be68a1e0abebb2f8f28d640c84bc9ec06c2c2355d2d751cdd4a23919" Nov 29 06:21:33 crc kubenswrapper[4594]: E1129 06:21:33.085209 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:21:46 crc kubenswrapper[4594]: I1129 06:21:46.091332 4594 scope.go:117] "RemoveContainer" containerID="b7d448c5be68a1e0abebb2f8f28d640c84bc9ec06c2c2355d2d751cdd4a23919" Nov 29 06:21:46 crc kubenswrapper[4594]: E1129 06:21:46.092572 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:22:01 crc kubenswrapper[4594]: I1129 06:22:01.083865 4594 scope.go:117] "RemoveContainer" containerID="b7d448c5be68a1e0abebb2f8f28d640c84bc9ec06c2c2355d2d751cdd4a23919" Nov 29 06:22:01 crc kubenswrapper[4594]: E1129 06:22:01.085662 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:22:14 crc kubenswrapper[4594]: I1129 06:22:14.083483 4594 scope.go:117] "RemoveContainer" containerID="b7d448c5be68a1e0abebb2f8f28d640c84bc9ec06c2c2355d2d751cdd4a23919" Nov 29 06:22:14 crc kubenswrapper[4594]: E1129 06:22:14.085215 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:22:29 crc kubenswrapper[4594]: I1129 06:22:29.083887 4594 scope.go:117] "RemoveContainer" containerID="b7d448c5be68a1e0abebb2f8f28d640c84bc9ec06c2c2355d2d751cdd4a23919" Nov 29 06:22:29 crc kubenswrapper[4594]: E1129 06:22:29.085313 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:22:41 crc kubenswrapper[4594]: I1129 06:22:41.083582 4594 scope.go:117] "RemoveContainer" containerID="b7d448c5be68a1e0abebb2f8f28d640c84bc9ec06c2c2355d2d751cdd4a23919" Nov 29 06:22:41 crc kubenswrapper[4594]: E1129 06:22:41.084672 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:22:48 crc kubenswrapper[4594]: I1129 06:22:48.318415 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-x4phq"] Nov 29 06:22:48 crc kubenswrapper[4594]: E1129 06:22:48.319772 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6708eeb4-701a-4071-907c-f13930dbf468" 
containerName="extract-content" Nov 29 06:22:48 crc kubenswrapper[4594]: I1129 06:22:48.319791 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="6708eeb4-701a-4071-907c-f13930dbf468" containerName="extract-content" Nov 29 06:22:48 crc kubenswrapper[4594]: E1129 06:22:48.319829 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6708eeb4-701a-4071-907c-f13930dbf468" containerName="registry-server" Nov 29 06:22:48 crc kubenswrapper[4594]: I1129 06:22:48.319835 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="6708eeb4-701a-4071-907c-f13930dbf468" containerName="registry-server" Nov 29 06:22:48 crc kubenswrapper[4594]: E1129 06:22:48.319870 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6708eeb4-701a-4071-907c-f13930dbf468" containerName="extract-utilities" Nov 29 06:22:48 crc kubenswrapper[4594]: I1129 06:22:48.319877 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="6708eeb4-701a-4071-907c-f13930dbf468" containerName="extract-utilities" Nov 29 06:22:48 crc kubenswrapper[4594]: I1129 06:22:48.320136 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="6708eeb4-701a-4071-907c-f13930dbf468" containerName="registry-server" Nov 29 06:22:48 crc kubenswrapper[4594]: I1129 06:22:48.322100 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x4phq" Nov 29 06:22:48 crc kubenswrapper[4594]: I1129 06:22:48.328945 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x4phq"] Nov 29 06:22:48 crc kubenswrapper[4594]: I1129 06:22:48.388676 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62sds\" (UniqueName: \"kubernetes.io/projected/a72b4f19-1095-4efe-8a47-702b419b741e-kube-api-access-62sds\") pod \"redhat-marketplace-x4phq\" (UID: \"a72b4f19-1095-4efe-8a47-702b419b741e\") " pod="openshift-marketplace/redhat-marketplace-x4phq" Nov 29 06:22:48 crc kubenswrapper[4594]: I1129 06:22:48.389169 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a72b4f19-1095-4efe-8a47-702b419b741e-catalog-content\") pod \"redhat-marketplace-x4phq\" (UID: \"a72b4f19-1095-4efe-8a47-702b419b741e\") " pod="openshift-marketplace/redhat-marketplace-x4phq" Nov 29 06:22:48 crc kubenswrapper[4594]: I1129 06:22:48.389294 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a72b4f19-1095-4efe-8a47-702b419b741e-utilities\") pod \"redhat-marketplace-x4phq\" (UID: \"a72b4f19-1095-4efe-8a47-702b419b741e\") " pod="openshift-marketplace/redhat-marketplace-x4phq" Nov 29 06:22:48 crc kubenswrapper[4594]: I1129 06:22:48.490985 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a72b4f19-1095-4efe-8a47-702b419b741e-utilities\") pod \"redhat-marketplace-x4phq\" (UID: \"a72b4f19-1095-4efe-8a47-702b419b741e\") " pod="openshift-marketplace/redhat-marketplace-x4phq" Nov 29 06:22:48 crc kubenswrapper[4594]: I1129 06:22:48.491229 4594 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-62sds\" (UniqueName: \"kubernetes.io/projected/a72b4f19-1095-4efe-8a47-702b419b741e-kube-api-access-62sds\") pod \"redhat-marketplace-x4phq\" (UID: \"a72b4f19-1095-4efe-8a47-702b419b741e\") " pod="openshift-marketplace/redhat-marketplace-x4phq" Nov 29 06:22:48 crc kubenswrapper[4594]: I1129 06:22:48.491306 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a72b4f19-1095-4efe-8a47-702b419b741e-catalog-content\") pod \"redhat-marketplace-x4phq\" (UID: \"a72b4f19-1095-4efe-8a47-702b419b741e\") " pod="openshift-marketplace/redhat-marketplace-x4phq" Nov 29 06:22:48 crc kubenswrapper[4594]: I1129 06:22:48.491736 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a72b4f19-1095-4efe-8a47-702b419b741e-utilities\") pod \"redhat-marketplace-x4phq\" (UID: \"a72b4f19-1095-4efe-8a47-702b419b741e\") " pod="openshift-marketplace/redhat-marketplace-x4phq" Nov 29 06:22:48 crc kubenswrapper[4594]: I1129 06:22:48.491802 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a72b4f19-1095-4efe-8a47-702b419b741e-catalog-content\") pod \"redhat-marketplace-x4phq\" (UID: \"a72b4f19-1095-4efe-8a47-702b419b741e\") " pod="openshift-marketplace/redhat-marketplace-x4phq" Nov 29 06:22:48 crc kubenswrapper[4594]: I1129 06:22:48.511713 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62sds\" (UniqueName: \"kubernetes.io/projected/a72b4f19-1095-4efe-8a47-702b419b741e-kube-api-access-62sds\") pod \"redhat-marketplace-x4phq\" (UID: \"a72b4f19-1095-4efe-8a47-702b419b741e\") " pod="openshift-marketplace/redhat-marketplace-x4phq" Nov 29 06:22:48 crc kubenswrapper[4594]: I1129 06:22:48.641278 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x4phq" Nov 29 06:22:49 crc kubenswrapper[4594]: I1129 06:22:49.077692 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x4phq"] Nov 29 06:22:49 crc kubenswrapper[4594]: I1129 06:22:49.150854 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x4phq" event={"ID":"a72b4f19-1095-4efe-8a47-702b419b741e","Type":"ContainerStarted","Data":"4dc3d19a6755810a8b8877f859776a79f3128a4401de6751043b64751934018d"} Nov 29 06:22:50 crc kubenswrapper[4594]: I1129 06:22:50.163141 4594 generic.go:334] "Generic (PLEG): container finished" podID="a72b4f19-1095-4efe-8a47-702b419b741e" containerID="f9214ddeb2e450b274d6aadadc74d86c04e96ce9333d222ee02faf4b2fd687c5" exitCode=0 Nov 29 06:22:50 crc kubenswrapper[4594]: I1129 06:22:50.163241 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x4phq" event={"ID":"a72b4f19-1095-4efe-8a47-702b419b741e","Type":"ContainerDied","Data":"f9214ddeb2e450b274d6aadadc74d86c04e96ce9333d222ee02faf4b2fd687c5"} Nov 29 06:22:51 crc kubenswrapper[4594]: I1129 06:22:51.173741 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x4phq" event={"ID":"a72b4f19-1095-4efe-8a47-702b419b741e","Type":"ContainerStarted","Data":"a6ecbe8306411a0de400f0a06eac739fc3cc70b22aba4c42bf98b461c82ea2bb"} Nov 29 06:22:52 crc kubenswrapper[4594]: I1129 06:22:52.184095 4594 generic.go:334] "Generic (PLEG): container finished" podID="a72b4f19-1095-4efe-8a47-702b419b741e" containerID="a6ecbe8306411a0de400f0a06eac739fc3cc70b22aba4c42bf98b461c82ea2bb" exitCode=0 Nov 29 06:22:52 crc kubenswrapper[4594]: I1129 06:22:52.184144 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x4phq" 
event={"ID":"a72b4f19-1095-4efe-8a47-702b419b741e","Type":"ContainerDied","Data":"a6ecbe8306411a0de400f0a06eac739fc3cc70b22aba4c42bf98b461c82ea2bb"} Nov 29 06:22:53 crc kubenswrapper[4594]: I1129 06:22:53.198571 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x4phq" event={"ID":"a72b4f19-1095-4efe-8a47-702b419b741e","Type":"ContainerStarted","Data":"54d803a43c39f304a08e0053fb765c1c1f697f221d1b3aa31a045b4f04081dc4"} Nov 29 06:22:53 crc kubenswrapper[4594]: I1129 06:22:53.219527 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-x4phq" podStartSLOduration=2.6853314040000003 podStartE2EDuration="5.219479659s" podCreationTimestamp="2025-11-29 06:22:48 +0000 UTC" firstStartedPulling="2025-11-29 06:22:50.165402168 +0000 UTC m=+3294.405911388" lastFinishedPulling="2025-11-29 06:22:52.699550423 +0000 UTC m=+3296.940059643" observedRunningTime="2025-11-29 06:22:53.215052597 +0000 UTC m=+3297.455561817" watchObservedRunningTime="2025-11-29 06:22:53.219479659 +0000 UTC m=+3297.459988870" Nov 29 06:22:55 crc kubenswrapper[4594]: I1129 06:22:55.083909 4594 scope.go:117] "RemoveContainer" containerID="b7d448c5be68a1e0abebb2f8f28d640c84bc9ec06c2c2355d2d751cdd4a23919" Nov 29 06:22:55 crc kubenswrapper[4594]: E1129 06:22:55.084546 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:22:58 crc kubenswrapper[4594]: I1129 06:22:58.641698 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-x4phq" Nov 29 06:22:58 crc 
kubenswrapper[4594]: I1129 06:22:58.642436 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-x4phq" Nov 29 06:22:58 crc kubenswrapper[4594]: I1129 06:22:58.682745 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-x4phq" Nov 29 06:22:59 crc kubenswrapper[4594]: I1129 06:22:59.310000 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-x4phq" Nov 29 06:22:59 crc kubenswrapper[4594]: I1129 06:22:59.353513 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x4phq"] Nov 29 06:23:01 crc kubenswrapper[4594]: I1129 06:23:01.291017 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-x4phq" podUID="a72b4f19-1095-4efe-8a47-702b419b741e" containerName="registry-server" containerID="cri-o://54d803a43c39f304a08e0053fb765c1c1f697f221d1b3aa31a045b4f04081dc4" gracePeriod=2 Nov 29 06:23:01 crc kubenswrapper[4594]: I1129 06:23:01.716563 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x4phq" Nov 29 06:23:01 crc kubenswrapper[4594]: I1129 06:23:01.901623 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a72b4f19-1095-4efe-8a47-702b419b741e-catalog-content\") pod \"a72b4f19-1095-4efe-8a47-702b419b741e\" (UID: \"a72b4f19-1095-4efe-8a47-702b419b741e\") " Nov 29 06:23:01 crc kubenswrapper[4594]: I1129 06:23:01.901951 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a72b4f19-1095-4efe-8a47-702b419b741e-utilities\") pod \"a72b4f19-1095-4efe-8a47-702b419b741e\" (UID: \"a72b4f19-1095-4efe-8a47-702b419b741e\") " Nov 29 06:23:01 crc kubenswrapper[4594]: I1129 06:23:01.902086 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-62sds\" (UniqueName: \"kubernetes.io/projected/a72b4f19-1095-4efe-8a47-702b419b741e-kube-api-access-62sds\") pod \"a72b4f19-1095-4efe-8a47-702b419b741e\" (UID: \"a72b4f19-1095-4efe-8a47-702b419b741e\") " Nov 29 06:23:01 crc kubenswrapper[4594]: I1129 06:23:01.902839 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a72b4f19-1095-4efe-8a47-702b419b741e-utilities" (OuterVolumeSpecName: "utilities") pod "a72b4f19-1095-4efe-8a47-702b419b741e" (UID: "a72b4f19-1095-4efe-8a47-702b419b741e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 06:23:01 crc kubenswrapper[4594]: I1129 06:23:01.909469 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a72b4f19-1095-4efe-8a47-702b419b741e-kube-api-access-62sds" (OuterVolumeSpecName: "kube-api-access-62sds") pod "a72b4f19-1095-4efe-8a47-702b419b741e" (UID: "a72b4f19-1095-4efe-8a47-702b419b741e"). InnerVolumeSpecName "kube-api-access-62sds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 06:23:01 crc kubenswrapper[4594]: I1129 06:23:01.916186 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a72b4f19-1095-4efe-8a47-702b419b741e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a72b4f19-1095-4efe-8a47-702b419b741e" (UID: "a72b4f19-1095-4efe-8a47-702b419b741e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 06:23:02 crc kubenswrapper[4594]: I1129 06:23:02.004958 4594 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a72b4f19-1095-4efe-8a47-702b419b741e-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 06:23:02 crc kubenswrapper[4594]: I1129 06:23:02.004989 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-62sds\" (UniqueName: \"kubernetes.io/projected/a72b4f19-1095-4efe-8a47-702b419b741e-kube-api-access-62sds\") on node \"crc\" DevicePath \"\"" Nov 29 06:23:02 crc kubenswrapper[4594]: I1129 06:23:02.005001 4594 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a72b4f19-1095-4efe-8a47-702b419b741e-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 06:23:02 crc kubenswrapper[4594]: I1129 06:23:02.302002 4594 generic.go:334] "Generic (PLEG): container finished" podID="a72b4f19-1095-4efe-8a47-702b419b741e" containerID="54d803a43c39f304a08e0053fb765c1c1f697f221d1b3aa31a045b4f04081dc4" exitCode=0 Nov 29 06:23:02 crc kubenswrapper[4594]: I1129 06:23:02.302093 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x4phq" Nov 29 06:23:02 crc kubenswrapper[4594]: I1129 06:23:02.302091 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x4phq" event={"ID":"a72b4f19-1095-4efe-8a47-702b419b741e","Type":"ContainerDied","Data":"54d803a43c39f304a08e0053fb765c1c1f697f221d1b3aa31a045b4f04081dc4"} Nov 29 06:23:02 crc kubenswrapper[4594]: I1129 06:23:02.303236 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x4phq" event={"ID":"a72b4f19-1095-4efe-8a47-702b419b741e","Type":"ContainerDied","Data":"4dc3d19a6755810a8b8877f859776a79f3128a4401de6751043b64751934018d"} Nov 29 06:23:02 crc kubenswrapper[4594]: I1129 06:23:02.303284 4594 scope.go:117] "RemoveContainer" containerID="54d803a43c39f304a08e0053fb765c1c1f697f221d1b3aa31a045b4f04081dc4" Nov 29 06:23:02 crc kubenswrapper[4594]: I1129 06:23:02.329986 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x4phq"] Nov 29 06:23:02 crc kubenswrapper[4594]: I1129 06:23:02.330680 4594 scope.go:117] "RemoveContainer" containerID="a6ecbe8306411a0de400f0a06eac739fc3cc70b22aba4c42bf98b461c82ea2bb" Nov 29 06:23:02 crc kubenswrapper[4594]: I1129 06:23:02.339211 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-x4phq"] Nov 29 06:23:02 crc kubenswrapper[4594]: I1129 06:23:02.363362 4594 scope.go:117] "RemoveContainer" containerID="f9214ddeb2e450b274d6aadadc74d86c04e96ce9333d222ee02faf4b2fd687c5" Nov 29 06:23:02 crc kubenswrapper[4594]: I1129 06:23:02.404164 4594 scope.go:117] "RemoveContainer" containerID="54d803a43c39f304a08e0053fb765c1c1f697f221d1b3aa31a045b4f04081dc4" Nov 29 06:23:02 crc kubenswrapper[4594]: E1129 06:23:02.405440 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"54d803a43c39f304a08e0053fb765c1c1f697f221d1b3aa31a045b4f04081dc4\": container with ID starting with 54d803a43c39f304a08e0053fb765c1c1f697f221d1b3aa31a045b4f04081dc4 not found: ID does not exist" containerID="54d803a43c39f304a08e0053fb765c1c1f697f221d1b3aa31a045b4f04081dc4" Nov 29 06:23:02 crc kubenswrapper[4594]: I1129 06:23:02.405499 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54d803a43c39f304a08e0053fb765c1c1f697f221d1b3aa31a045b4f04081dc4"} err="failed to get container status \"54d803a43c39f304a08e0053fb765c1c1f697f221d1b3aa31a045b4f04081dc4\": rpc error: code = NotFound desc = could not find container \"54d803a43c39f304a08e0053fb765c1c1f697f221d1b3aa31a045b4f04081dc4\": container with ID starting with 54d803a43c39f304a08e0053fb765c1c1f697f221d1b3aa31a045b4f04081dc4 not found: ID does not exist" Nov 29 06:23:02 crc kubenswrapper[4594]: I1129 06:23:02.405538 4594 scope.go:117] "RemoveContainer" containerID="a6ecbe8306411a0de400f0a06eac739fc3cc70b22aba4c42bf98b461c82ea2bb" Nov 29 06:23:02 crc kubenswrapper[4594]: E1129 06:23:02.405987 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6ecbe8306411a0de400f0a06eac739fc3cc70b22aba4c42bf98b461c82ea2bb\": container with ID starting with a6ecbe8306411a0de400f0a06eac739fc3cc70b22aba4c42bf98b461c82ea2bb not found: ID does not exist" containerID="a6ecbe8306411a0de400f0a06eac739fc3cc70b22aba4c42bf98b461c82ea2bb" Nov 29 06:23:02 crc kubenswrapper[4594]: I1129 06:23:02.406029 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6ecbe8306411a0de400f0a06eac739fc3cc70b22aba4c42bf98b461c82ea2bb"} err="failed to get container status \"a6ecbe8306411a0de400f0a06eac739fc3cc70b22aba4c42bf98b461c82ea2bb\": rpc error: code = NotFound desc = could not find container \"a6ecbe8306411a0de400f0a06eac739fc3cc70b22aba4c42bf98b461c82ea2bb\": container with ID 
starting with a6ecbe8306411a0de400f0a06eac739fc3cc70b22aba4c42bf98b461c82ea2bb not found: ID does not exist" Nov 29 06:23:02 crc kubenswrapper[4594]: I1129 06:23:02.406064 4594 scope.go:117] "RemoveContainer" containerID="f9214ddeb2e450b274d6aadadc74d86c04e96ce9333d222ee02faf4b2fd687c5" Nov 29 06:23:02 crc kubenswrapper[4594]: E1129 06:23:02.406459 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9214ddeb2e450b274d6aadadc74d86c04e96ce9333d222ee02faf4b2fd687c5\": container with ID starting with f9214ddeb2e450b274d6aadadc74d86c04e96ce9333d222ee02faf4b2fd687c5 not found: ID does not exist" containerID="f9214ddeb2e450b274d6aadadc74d86c04e96ce9333d222ee02faf4b2fd687c5" Nov 29 06:23:02 crc kubenswrapper[4594]: I1129 06:23:02.406535 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9214ddeb2e450b274d6aadadc74d86c04e96ce9333d222ee02faf4b2fd687c5"} err="failed to get container status \"f9214ddeb2e450b274d6aadadc74d86c04e96ce9333d222ee02faf4b2fd687c5\": rpc error: code = NotFound desc = could not find container \"f9214ddeb2e450b274d6aadadc74d86c04e96ce9333d222ee02faf4b2fd687c5\": container with ID starting with f9214ddeb2e450b274d6aadadc74d86c04e96ce9333d222ee02faf4b2fd687c5 not found: ID does not exist" Nov 29 06:23:04 crc kubenswrapper[4594]: I1129 06:23:04.103886 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a72b4f19-1095-4efe-8a47-702b419b741e" path="/var/lib/kubelet/pods/a72b4f19-1095-4efe-8a47-702b419b741e/volumes" Nov 29 06:23:07 crc kubenswrapper[4594]: I1129 06:23:07.083664 4594 scope.go:117] "RemoveContainer" containerID="b7d448c5be68a1e0abebb2f8f28d640c84bc9ec06c2c2355d2d751cdd4a23919" Nov 29 06:23:07 crc kubenswrapper[4594]: E1129 06:23:07.084567 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:23:18 crc kubenswrapper[4594]: I1129 06:23:18.084220 4594 scope.go:117] "RemoveContainer" containerID="b7d448c5be68a1e0abebb2f8f28d640c84bc9ec06c2c2355d2d751cdd4a23919" Nov 29 06:23:18 crc kubenswrapper[4594]: E1129 06:23:18.085164 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:23:33 crc kubenswrapper[4594]: I1129 06:23:33.084380 4594 scope.go:117] "RemoveContainer" containerID="b7d448c5be68a1e0abebb2f8f28d640c84bc9ec06c2c2355d2d751cdd4a23919" Nov 29 06:23:33 crc kubenswrapper[4594]: E1129 06:23:33.085392 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:23:44 crc kubenswrapper[4594]: I1129 06:23:44.084348 4594 scope.go:117] "RemoveContainer" containerID="b7d448c5be68a1e0abebb2f8f28d640c84bc9ec06c2c2355d2d751cdd4a23919" Nov 29 06:23:44 crc kubenswrapper[4594]: E1129 06:23:44.085425 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:23:58 crc kubenswrapper[4594]: I1129 06:23:58.083850 4594 scope.go:117] "RemoveContainer" containerID="b7d448c5be68a1e0abebb2f8f28d640c84bc9ec06c2c2355d2d751cdd4a23919" Nov 29 06:23:58 crc kubenswrapper[4594]: E1129 06:23:58.084685 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:24:11 crc kubenswrapper[4594]: I1129 06:24:11.083778 4594 scope.go:117] "RemoveContainer" containerID="b7d448c5be68a1e0abebb2f8f28d640c84bc9ec06c2c2355d2d751cdd4a23919" Nov 29 06:24:11 crc kubenswrapper[4594]: E1129 06:24:11.086764 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:24:23 crc kubenswrapper[4594]: I1129 06:24:23.084077 4594 scope.go:117] "RemoveContainer" containerID="b7d448c5be68a1e0abebb2f8f28d640c84bc9ec06c2c2355d2d751cdd4a23919" Nov 29 06:24:23 crc kubenswrapper[4594]: E1129 06:24:23.084987 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:24:30 crc kubenswrapper[4594]: I1129 06:24:30.640966 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hqrxt"] Nov 29 06:24:30 crc kubenswrapper[4594]: E1129 06:24:30.642033 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a72b4f19-1095-4efe-8a47-702b419b741e" containerName="registry-server" Nov 29 06:24:30 crc kubenswrapper[4594]: I1129 06:24:30.642050 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="a72b4f19-1095-4efe-8a47-702b419b741e" containerName="registry-server" Nov 29 06:24:30 crc kubenswrapper[4594]: E1129 06:24:30.642076 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a72b4f19-1095-4efe-8a47-702b419b741e" containerName="extract-content" Nov 29 06:24:30 crc kubenswrapper[4594]: I1129 06:24:30.642082 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="a72b4f19-1095-4efe-8a47-702b419b741e" containerName="extract-content" Nov 29 06:24:30 crc kubenswrapper[4594]: E1129 06:24:30.642112 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a72b4f19-1095-4efe-8a47-702b419b741e" containerName="extract-utilities" Nov 29 06:24:30 crc kubenswrapper[4594]: I1129 06:24:30.642119 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="a72b4f19-1095-4efe-8a47-702b419b741e" containerName="extract-utilities" Nov 29 06:24:30 crc kubenswrapper[4594]: I1129 06:24:30.643089 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="a72b4f19-1095-4efe-8a47-702b419b741e" containerName="registry-server" Nov 29 06:24:30 crc kubenswrapper[4594]: I1129 06:24:30.644875 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hqrxt" Nov 29 06:24:30 crc kubenswrapper[4594]: I1129 06:24:30.656627 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hqrxt"] Nov 29 06:24:30 crc kubenswrapper[4594]: I1129 06:24:30.680497 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxll2\" (UniqueName: \"kubernetes.io/projected/b9f1bdf8-188a-4fdb-bf69-41579c5827ce-kube-api-access-dxll2\") pod \"community-operators-hqrxt\" (UID: \"b9f1bdf8-188a-4fdb-bf69-41579c5827ce\") " pod="openshift-marketplace/community-operators-hqrxt" Nov 29 06:24:30 crc kubenswrapper[4594]: I1129 06:24:30.680681 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9f1bdf8-188a-4fdb-bf69-41579c5827ce-utilities\") pod \"community-operators-hqrxt\" (UID: \"b9f1bdf8-188a-4fdb-bf69-41579c5827ce\") " pod="openshift-marketplace/community-operators-hqrxt" Nov 29 06:24:30 crc kubenswrapper[4594]: I1129 06:24:30.680978 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9f1bdf8-188a-4fdb-bf69-41579c5827ce-catalog-content\") pod \"community-operators-hqrxt\" (UID: \"b9f1bdf8-188a-4fdb-bf69-41579c5827ce\") " pod="openshift-marketplace/community-operators-hqrxt" Nov 29 06:24:30 crc kubenswrapper[4594]: I1129 06:24:30.783420 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9f1bdf8-188a-4fdb-bf69-41579c5827ce-utilities\") pod \"community-operators-hqrxt\" (UID: \"b9f1bdf8-188a-4fdb-bf69-41579c5827ce\") " pod="openshift-marketplace/community-operators-hqrxt" Nov 29 06:24:30 crc kubenswrapper[4594]: I1129 06:24:30.783647 4594 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9f1bdf8-188a-4fdb-bf69-41579c5827ce-catalog-content\") pod \"community-operators-hqrxt\" (UID: \"b9f1bdf8-188a-4fdb-bf69-41579c5827ce\") " pod="openshift-marketplace/community-operators-hqrxt" Nov 29 06:24:30 crc kubenswrapper[4594]: I1129 06:24:30.783764 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxll2\" (UniqueName: \"kubernetes.io/projected/b9f1bdf8-188a-4fdb-bf69-41579c5827ce-kube-api-access-dxll2\") pod \"community-operators-hqrxt\" (UID: \"b9f1bdf8-188a-4fdb-bf69-41579c5827ce\") " pod="openshift-marketplace/community-operators-hqrxt" Nov 29 06:24:30 crc kubenswrapper[4594]: I1129 06:24:30.783985 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9f1bdf8-188a-4fdb-bf69-41579c5827ce-utilities\") pod \"community-operators-hqrxt\" (UID: \"b9f1bdf8-188a-4fdb-bf69-41579c5827ce\") " pod="openshift-marketplace/community-operators-hqrxt" Nov 29 06:24:30 crc kubenswrapper[4594]: I1129 06:24:30.784055 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9f1bdf8-188a-4fdb-bf69-41579c5827ce-catalog-content\") pod \"community-operators-hqrxt\" (UID: \"b9f1bdf8-188a-4fdb-bf69-41579c5827ce\") " pod="openshift-marketplace/community-operators-hqrxt" Nov 29 06:24:30 crc kubenswrapper[4594]: I1129 06:24:30.802083 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxll2\" (UniqueName: \"kubernetes.io/projected/b9f1bdf8-188a-4fdb-bf69-41579c5827ce-kube-api-access-dxll2\") pod \"community-operators-hqrxt\" (UID: \"b9f1bdf8-188a-4fdb-bf69-41579c5827ce\") " pod="openshift-marketplace/community-operators-hqrxt" Nov 29 06:24:30 crc kubenswrapper[4594]: I1129 06:24:30.962138 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hqrxt" Nov 29 06:24:31 crc kubenswrapper[4594]: I1129 06:24:31.444245 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hqrxt"] Nov 29 06:24:32 crc kubenswrapper[4594]: I1129 06:24:32.219805 4594 generic.go:334] "Generic (PLEG): container finished" podID="b9f1bdf8-188a-4fdb-bf69-41579c5827ce" containerID="f2ce493da201709373c8129cca382fe92e2e8ce86d74be135bcb0c2c9069fd5d" exitCode=0 Nov 29 06:24:32 crc kubenswrapper[4594]: I1129 06:24:32.219878 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hqrxt" event={"ID":"b9f1bdf8-188a-4fdb-bf69-41579c5827ce","Type":"ContainerDied","Data":"f2ce493da201709373c8129cca382fe92e2e8ce86d74be135bcb0c2c9069fd5d"} Nov 29 06:24:32 crc kubenswrapper[4594]: I1129 06:24:32.219923 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hqrxt" event={"ID":"b9f1bdf8-188a-4fdb-bf69-41579c5827ce","Type":"ContainerStarted","Data":"61f4cef81ea9b5579b27c553de19386453144a6b11ce9243d0c2efc879956d2d"} Nov 29 06:24:32 crc kubenswrapper[4594]: I1129 06:24:32.222545 4594 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 29 06:24:37 crc kubenswrapper[4594]: I1129 06:24:37.295497 4594 generic.go:334] "Generic (PLEG): container finished" podID="b9f1bdf8-188a-4fdb-bf69-41579c5827ce" containerID="23a586dcc952a45fa0bfe16e6e02a737b470009ac72ae370658865fa47bec107" exitCode=0 Nov 29 06:24:37 crc kubenswrapper[4594]: I1129 06:24:37.296273 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hqrxt" event={"ID":"b9f1bdf8-188a-4fdb-bf69-41579c5827ce","Type":"ContainerDied","Data":"23a586dcc952a45fa0bfe16e6e02a737b470009ac72ae370658865fa47bec107"} Nov 29 06:24:38 crc kubenswrapper[4594]: I1129 06:24:38.083395 4594 scope.go:117] "RemoveContainer" 
containerID="b7d448c5be68a1e0abebb2f8f28d640c84bc9ec06c2c2355d2d751cdd4a23919" Nov 29 06:24:38 crc kubenswrapper[4594]: E1129 06:24:38.084042 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:24:38 crc kubenswrapper[4594]: I1129 06:24:38.309819 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hqrxt" event={"ID":"b9f1bdf8-188a-4fdb-bf69-41579c5827ce","Type":"ContainerStarted","Data":"9c845707edb8c44d0072b457b2a963395211d30efb4a3c2e8e1d89f3da4cfb04"} Nov 29 06:24:38 crc kubenswrapper[4594]: I1129 06:24:38.334090 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hqrxt" podStartSLOduration=2.674407914 podStartE2EDuration="8.334069791s" podCreationTimestamp="2025-11-29 06:24:30 +0000 UTC" firstStartedPulling="2025-11-29 06:24:32.222239269 +0000 UTC m=+3396.462748480" lastFinishedPulling="2025-11-29 06:24:37.881901146 +0000 UTC m=+3402.122410357" observedRunningTime="2025-11-29 06:24:38.326467139 +0000 UTC m=+3402.566976349" watchObservedRunningTime="2025-11-29 06:24:38.334069791 +0000 UTC m=+3402.574579012" Nov 29 06:24:40 crc kubenswrapper[4594]: I1129 06:24:40.962483 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hqrxt" Nov 29 06:24:40 crc kubenswrapper[4594]: I1129 06:24:40.963358 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hqrxt" Nov 29 06:24:41 crc kubenswrapper[4594]: I1129 06:24:41.002612 4594 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hqrxt" Nov 29 06:24:49 crc kubenswrapper[4594]: I1129 06:24:49.083302 4594 scope.go:117] "RemoveContainer" containerID="b7d448c5be68a1e0abebb2f8f28d640c84bc9ec06c2c2355d2d751cdd4a23919" Nov 29 06:24:49 crc kubenswrapper[4594]: E1129 06:24:49.084203 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:24:51 crc kubenswrapper[4594]: I1129 06:24:51.006944 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hqrxt" Nov 29 06:24:51 crc kubenswrapper[4594]: I1129 06:24:51.060160 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hqrxt"] Nov 29 06:24:51 crc kubenswrapper[4594]: I1129 06:24:51.106229 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zwhtm"] Nov 29 06:24:51 crc kubenswrapper[4594]: I1129 06:24:51.106594 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-zwhtm" podUID="0f40ab58-b977-4f3d-a122-691a05dd14cf" containerName="registry-server" containerID="cri-o://d6d5b0495122c81e21e6a5beafb23514d6c3a77723831a29fc164acf18301c84" gracePeriod=2 Nov 29 06:24:51 crc kubenswrapper[4594]: I1129 06:24:51.452504 4594 generic.go:334] "Generic (PLEG): container finished" podID="0f40ab58-b977-4f3d-a122-691a05dd14cf" containerID="d6d5b0495122c81e21e6a5beafb23514d6c3a77723831a29fc164acf18301c84" exitCode=0 Nov 29 06:24:51 crc kubenswrapper[4594]: I1129 
06:24:51.452627 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zwhtm" event={"ID":"0f40ab58-b977-4f3d-a122-691a05dd14cf","Type":"ContainerDied","Data":"d6d5b0495122c81e21e6a5beafb23514d6c3a77723831a29fc164acf18301c84"} Nov 29 06:24:51 crc kubenswrapper[4594]: I1129 06:24:51.565753 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zwhtm" Nov 29 06:24:51 crc kubenswrapper[4594]: I1129 06:24:51.673289 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f40ab58-b977-4f3d-a122-691a05dd14cf-utilities\") pod \"0f40ab58-b977-4f3d-a122-691a05dd14cf\" (UID: \"0f40ab58-b977-4f3d-a122-691a05dd14cf\") " Nov 29 06:24:51 crc kubenswrapper[4594]: I1129 06:24:51.673406 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f40ab58-b977-4f3d-a122-691a05dd14cf-catalog-content\") pod \"0f40ab58-b977-4f3d-a122-691a05dd14cf\" (UID: \"0f40ab58-b977-4f3d-a122-691a05dd14cf\") " Nov 29 06:24:51 crc kubenswrapper[4594]: I1129 06:24:51.673593 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b8h27\" (UniqueName: \"kubernetes.io/projected/0f40ab58-b977-4f3d-a122-691a05dd14cf-kube-api-access-b8h27\") pod \"0f40ab58-b977-4f3d-a122-691a05dd14cf\" (UID: \"0f40ab58-b977-4f3d-a122-691a05dd14cf\") " Nov 29 06:24:51 crc kubenswrapper[4594]: I1129 06:24:51.674658 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f40ab58-b977-4f3d-a122-691a05dd14cf-utilities" (OuterVolumeSpecName: "utilities") pod "0f40ab58-b977-4f3d-a122-691a05dd14cf" (UID: "0f40ab58-b977-4f3d-a122-691a05dd14cf"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 06:24:51 crc kubenswrapper[4594]: I1129 06:24:51.680168 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f40ab58-b977-4f3d-a122-691a05dd14cf-kube-api-access-b8h27" (OuterVolumeSpecName: "kube-api-access-b8h27") pod "0f40ab58-b977-4f3d-a122-691a05dd14cf" (UID: "0f40ab58-b977-4f3d-a122-691a05dd14cf"). InnerVolumeSpecName "kube-api-access-b8h27". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 06:24:51 crc kubenswrapper[4594]: I1129 06:24:51.713200 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f40ab58-b977-4f3d-a122-691a05dd14cf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0f40ab58-b977-4f3d-a122-691a05dd14cf" (UID: "0f40ab58-b977-4f3d-a122-691a05dd14cf"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 06:24:51 crc kubenswrapper[4594]: I1129 06:24:51.775727 4594 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f40ab58-b977-4f3d-a122-691a05dd14cf-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 06:24:51 crc kubenswrapper[4594]: I1129 06:24:51.775758 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b8h27\" (UniqueName: \"kubernetes.io/projected/0f40ab58-b977-4f3d-a122-691a05dd14cf-kube-api-access-b8h27\") on node \"crc\" DevicePath \"\"" Nov 29 06:24:51 crc kubenswrapper[4594]: I1129 06:24:51.775771 4594 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f40ab58-b977-4f3d-a122-691a05dd14cf-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 06:24:52 crc kubenswrapper[4594]: I1129 06:24:52.467894 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zwhtm" 
event={"ID":"0f40ab58-b977-4f3d-a122-691a05dd14cf","Type":"ContainerDied","Data":"ae25a0dba737b8a61408467a5091c86f36df56ce3564dea6dd123633f01a0cbb"} Nov 29 06:24:52 crc kubenswrapper[4594]: I1129 06:24:52.468211 4594 scope.go:117] "RemoveContainer" containerID="d6d5b0495122c81e21e6a5beafb23514d6c3a77723831a29fc164acf18301c84" Nov 29 06:24:52 crc kubenswrapper[4594]: I1129 06:24:52.467990 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zwhtm" Nov 29 06:24:52 crc kubenswrapper[4594]: I1129 06:24:52.495075 4594 scope.go:117] "RemoveContainer" containerID="4417d6f8cc464a2dde368fceb8d5825095c6e7dd207568017107c130fd998799" Nov 29 06:24:52 crc kubenswrapper[4594]: I1129 06:24:52.495832 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zwhtm"] Nov 29 06:24:52 crc kubenswrapper[4594]: I1129 06:24:52.504051 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-zwhtm"] Nov 29 06:24:52 crc kubenswrapper[4594]: I1129 06:24:52.517015 4594 scope.go:117] "RemoveContainer" containerID="b081875b24f2557ba78bf4ca604cc83ca645e99e38c74ad794067a174bdad3cf" Nov 29 06:24:54 crc kubenswrapper[4594]: I1129 06:24:54.093954 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f40ab58-b977-4f3d-a122-691a05dd14cf" path="/var/lib/kubelet/pods/0f40ab58-b977-4f3d-a122-691a05dd14cf/volumes" Nov 29 06:25:02 crc kubenswrapper[4594]: I1129 06:25:02.083043 4594 scope.go:117] "RemoveContainer" containerID="b7d448c5be68a1e0abebb2f8f28d640c84bc9ec06c2c2355d2d751cdd4a23919" Nov 29 06:25:02 crc kubenswrapper[4594]: E1129 06:25:02.083968 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:25:17 crc kubenswrapper[4594]: I1129 06:25:17.083456 4594 scope.go:117] "RemoveContainer" containerID="b7d448c5be68a1e0abebb2f8f28d640c84bc9ec06c2c2355d2d751cdd4a23919" Nov 29 06:25:17 crc kubenswrapper[4594]: I1129 06:25:17.746076 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" event={"ID":"509844d4-3ee1-4059-84bb-6e90200f50c5","Type":"ContainerStarted","Data":"b53fc2e43ea71880847371ec8848d6b5ccb0f0cb3990cf3b77060a262ca641e5"} Nov 29 06:27:45 crc kubenswrapper[4594]: I1129 06:27:45.800092 4594 patch_prober.go:28] interesting pod/machine-config-daemon-ggz4n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 06:27:45 crc kubenswrapper[4594]: I1129 06:27:45.800673 4594 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 06:28:15 crc kubenswrapper[4594]: I1129 06:28:15.800032 4594 patch_prober.go:28] interesting pod/machine-config-daemon-ggz4n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 06:28:15 crc kubenswrapper[4594]: I1129 06:28:15.800603 4594 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 06:28:45 crc kubenswrapper[4594]: I1129 06:28:45.800665 4594 patch_prober.go:28] interesting pod/machine-config-daemon-ggz4n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 06:28:45 crc kubenswrapper[4594]: I1129 06:28:45.801339 4594 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 06:28:45 crc kubenswrapper[4594]: I1129 06:28:45.801390 4594 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" Nov 29 06:28:45 crc kubenswrapper[4594]: I1129 06:28:45.802508 4594 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b53fc2e43ea71880847371ec8848d6b5ccb0f0cb3990cf3b77060a262ca641e5"} pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 29 06:28:45 crc kubenswrapper[4594]: I1129 06:28:45.802612 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" containerName="machine-config-daemon" 
containerID="cri-o://b53fc2e43ea71880847371ec8848d6b5ccb0f0cb3990cf3b77060a262ca641e5" gracePeriod=600 Nov 29 06:28:46 crc kubenswrapper[4594]: I1129 06:28:46.743990 4594 generic.go:334] "Generic (PLEG): container finished" podID="509844d4-3ee1-4059-84bb-6e90200f50c5" containerID="b53fc2e43ea71880847371ec8848d6b5ccb0f0cb3990cf3b77060a262ca641e5" exitCode=0 Nov 29 06:28:46 crc kubenswrapper[4594]: I1129 06:28:46.744103 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" event={"ID":"509844d4-3ee1-4059-84bb-6e90200f50c5","Type":"ContainerDied","Data":"b53fc2e43ea71880847371ec8848d6b5ccb0f0cb3990cf3b77060a262ca641e5"} Nov 29 06:28:46 crc kubenswrapper[4594]: I1129 06:28:46.744607 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" event={"ID":"509844d4-3ee1-4059-84bb-6e90200f50c5","Type":"ContainerStarted","Data":"10fd441ce81e44bfd8f98a7b676e40e142716adff562f75e1eca845e725191d3"} Nov 29 06:28:46 crc kubenswrapper[4594]: I1129 06:28:46.744634 4594 scope.go:117] "RemoveContainer" containerID="b7d448c5be68a1e0abebb2f8f28d640c84bc9ec06c2c2355d2d751cdd4a23919" Nov 29 06:29:18 crc kubenswrapper[4594]: I1129 06:29:18.972536 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4hw86"] Nov 29 06:29:18 crc kubenswrapper[4594]: E1129 06:29:18.973550 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f40ab58-b977-4f3d-a122-691a05dd14cf" containerName="registry-server" Nov 29 06:29:18 crc kubenswrapper[4594]: I1129 06:29:18.973565 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f40ab58-b977-4f3d-a122-691a05dd14cf" containerName="registry-server" Nov 29 06:29:18 crc kubenswrapper[4594]: E1129 06:29:18.973580 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f40ab58-b977-4f3d-a122-691a05dd14cf" containerName="extract-utilities" Nov 29 
06:29:18 crc kubenswrapper[4594]: I1129 06:29:18.973588 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f40ab58-b977-4f3d-a122-691a05dd14cf" containerName="extract-utilities" Nov 29 06:29:18 crc kubenswrapper[4594]: E1129 06:29:18.973604 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f40ab58-b977-4f3d-a122-691a05dd14cf" containerName="extract-content" Nov 29 06:29:18 crc kubenswrapper[4594]: I1129 06:29:18.973611 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f40ab58-b977-4f3d-a122-691a05dd14cf" containerName="extract-content" Nov 29 06:29:18 crc kubenswrapper[4594]: I1129 06:29:18.973888 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f40ab58-b977-4f3d-a122-691a05dd14cf" containerName="registry-server" Nov 29 06:29:18 crc kubenswrapper[4594]: I1129 06:29:18.975283 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4hw86" Nov 29 06:29:18 crc kubenswrapper[4594]: I1129 06:29:18.981485 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4hw86"] Nov 29 06:29:19 crc kubenswrapper[4594]: I1129 06:29:19.039195 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gcb9\" (UniqueName: \"kubernetes.io/projected/0275fda6-178e-4685-a802-665793283eb2-kube-api-access-7gcb9\") pod \"certified-operators-4hw86\" (UID: \"0275fda6-178e-4685-a802-665793283eb2\") " pod="openshift-marketplace/certified-operators-4hw86" Nov 29 06:29:19 crc kubenswrapper[4594]: I1129 06:29:19.039644 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0275fda6-178e-4685-a802-665793283eb2-catalog-content\") pod \"certified-operators-4hw86\" (UID: \"0275fda6-178e-4685-a802-665793283eb2\") " pod="openshift-marketplace/certified-operators-4hw86" 
Nov 29 06:29:19 crc kubenswrapper[4594]: I1129 06:29:19.039948 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0275fda6-178e-4685-a802-665793283eb2-utilities\") pod \"certified-operators-4hw86\" (UID: \"0275fda6-178e-4685-a802-665793283eb2\") " pod="openshift-marketplace/certified-operators-4hw86" Nov 29 06:29:19 crc kubenswrapper[4594]: I1129 06:29:19.141914 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gcb9\" (UniqueName: \"kubernetes.io/projected/0275fda6-178e-4685-a802-665793283eb2-kube-api-access-7gcb9\") pod \"certified-operators-4hw86\" (UID: \"0275fda6-178e-4685-a802-665793283eb2\") " pod="openshift-marketplace/certified-operators-4hw86" Nov 29 06:29:19 crc kubenswrapper[4594]: I1129 06:29:19.142097 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0275fda6-178e-4685-a802-665793283eb2-catalog-content\") pod \"certified-operators-4hw86\" (UID: \"0275fda6-178e-4685-a802-665793283eb2\") " pod="openshift-marketplace/certified-operators-4hw86" Nov 29 06:29:19 crc kubenswrapper[4594]: I1129 06:29:19.142246 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0275fda6-178e-4685-a802-665793283eb2-utilities\") pod \"certified-operators-4hw86\" (UID: \"0275fda6-178e-4685-a802-665793283eb2\") " pod="openshift-marketplace/certified-operators-4hw86" Nov 29 06:29:19 crc kubenswrapper[4594]: I1129 06:29:19.143217 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0275fda6-178e-4685-a802-665793283eb2-catalog-content\") pod \"certified-operators-4hw86\" (UID: \"0275fda6-178e-4685-a802-665793283eb2\") " pod="openshift-marketplace/certified-operators-4hw86" Nov 29 06:29:19 
crc kubenswrapper[4594]: I1129 06:29:19.143531 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0275fda6-178e-4685-a802-665793283eb2-utilities\") pod \"certified-operators-4hw86\" (UID: \"0275fda6-178e-4685-a802-665793283eb2\") " pod="openshift-marketplace/certified-operators-4hw86" Nov 29 06:29:19 crc kubenswrapper[4594]: I1129 06:29:19.166092 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gcb9\" (UniqueName: \"kubernetes.io/projected/0275fda6-178e-4685-a802-665793283eb2-kube-api-access-7gcb9\") pod \"certified-operators-4hw86\" (UID: \"0275fda6-178e-4685-a802-665793283eb2\") " pod="openshift-marketplace/certified-operators-4hw86" Nov 29 06:29:19 crc kubenswrapper[4594]: I1129 06:29:19.297055 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4hw86" Nov 29 06:29:19 crc kubenswrapper[4594]: I1129 06:29:19.741826 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4hw86"] Nov 29 06:29:20 crc kubenswrapper[4594]: I1129 06:29:20.155466 4594 generic.go:334] "Generic (PLEG): container finished" podID="0275fda6-178e-4685-a802-665793283eb2" containerID="f82f06d599152a68b092d503f8caf68ecdd196be25aed5c31900ecc703dfe14f" exitCode=0 Nov 29 06:29:20 crc kubenswrapper[4594]: I1129 06:29:20.155578 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4hw86" event={"ID":"0275fda6-178e-4685-a802-665793283eb2","Type":"ContainerDied","Data":"f82f06d599152a68b092d503f8caf68ecdd196be25aed5c31900ecc703dfe14f"} Nov 29 06:29:20 crc kubenswrapper[4594]: I1129 06:29:20.155802 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4hw86" 
event={"ID":"0275fda6-178e-4685-a802-665793283eb2","Type":"ContainerStarted","Data":"8550c5aa739290911c29ed1c85db4f4b08afd693f1d45bd1f55e87b51b50fa1d"} Nov 29 06:29:21 crc kubenswrapper[4594]: I1129 06:29:21.165268 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4hw86" event={"ID":"0275fda6-178e-4685-a802-665793283eb2","Type":"ContainerStarted","Data":"59ee939056412acc83d479608c166fbafd737e1283cf2344f2bb982e9d37cb9a"} Nov 29 06:29:22 crc kubenswrapper[4594]: I1129 06:29:22.175537 4594 generic.go:334] "Generic (PLEG): container finished" podID="0275fda6-178e-4685-a802-665793283eb2" containerID="59ee939056412acc83d479608c166fbafd737e1283cf2344f2bb982e9d37cb9a" exitCode=0 Nov 29 06:29:22 crc kubenswrapper[4594]: I1129 06:29:22.175648 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4hw86" event={"ID":"0275fda6-178e-4685-a802-665793283eb2","Type":"ContainerDied","Data":"59ee939056412acc83d479608c166fbafd737e1283cf2344f2bb982e9d37cb9a"} Nov 29 06:29:23 crc kubenswrapper[4594]: I1129 06:29:23.191445 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4hw86" event={"ID":"0275fda6-178e-4685-a802-665793283eb2","Type":"ContainerStarted","Data":"8b4ef09160f4be08882b7c1882546eae6f8606b87a2899eeb16fe80a36f70f0b"} Nov 29 06:29:23 crc kubenswrapper[4594]: I1129 06:29:23.217638 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4hw86" podStartSLOduration=2.567222823 podStartE2EDuration="5.217622022s" podCreationTimestamp="2025-11-29 06:29:18 +0000 UTC" firstStartedPulling="2025-11-29 06:29:20.15799492 +0000 UTC m=+3684.398504141" lastFinishedPulling="2025-11-29 06:29:22.808394119 +0000 UTC m=+3687.048903340" observedRunningTime="2025-11-29 06:29:23.210006865 +0000 UTC m=+3687.450516086" watchObservedRunningTime="2025-11-29 06:29:23.217622022 +0000 UTC 
m=+3687.458131242" Nov 29 06:29:29 crc kubenswrapper[4594]: I1129 06:29:29.297863 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4hw86" Nov 29 06:29:29 crc kubenswrapper[4594]: I1129 06:29:29.298638 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4hw86" Nov 29 06:29:29 crc kubenswrapper[4594]: I1129 06:29:29.345599 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4hw86" Nov 29 06:29:30 crc kubenswrapper[4594]: I1129 06:29:30.312681 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4hw86" Nov 29 06:29:30 crc kubenswrapper[4594]: I1129 06:29:30.366015 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4hw86"] Nov 29 06:29:32 crc kubenswrapper[4594]: I1129 06:29:32.303717 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-4hw86" podUID="0275fda6-178e-4685-a802-665793283eb2" containerName="registry-server" containerID="cri-o://8b4ef09160f4be08882b7c1882546eae6f8606b87a2899eeb16fe80a36f70f0b" gracePeriod=2 Nov 29 06:29:32 crc kubenswrapper[4594]: I1129 06:29:32.767145 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4hw86" Nov 29 06:29:32 crc kubenswrapper[4594]: I1129 06:29:32.876234 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0275fda6-178e-4685-a802-665793283eb2-catalog-content\") pod \"0275fda6-178e-4685-a802-665793283eb2\" (UID: \"0275fda6-178e-4685-a802-665793283eb2\") " Nov 29 06:29:32 crc kubenswrapper[4594]: I1129 06:29:32.876367 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0275fda6-178e-4685-a802-665793283eb2-utilities\") pod \"0275fda6-178e-4685-a802-665793283eb2\" (UID: \"0275fda6-178e-4685-a802-665793283eb2\") " Nov 29 06:29:32 crc kubenswrapper[4594]: I1129 06:29:32.876464 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7gcb9\" (UniqueName: \"kubernetes.io/projected/0275fda6-178e-4685-a802-665793283eb2-kube-api-access-7gcb9\") pod \"0275fda6-178e-4685-a802-665793283eb2\" (UID: \"0275fda6-178e-4685-a802-665793283eb2\") " Nov 29 06:29:32 crc kubenswrapper[4594]: I1129 06:29:32.877096 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0275fda6-178e-4685-a802-665793283eb2-utilities" (OuterVolumeSpecName: "utilities") pod "0275fda6-178e-4685-a802-665793283eb2" (UID: "0275fda6-178e-4685-a802-665793283eb2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 06:29:32 crc kubenswrapper[4594]: I1129 06:29:32.882016 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0275fda6-178e-4685-a802-665793283eb2-kube-api-access-7gcb9" (OuterVolumeSpecName: "kube-api-access-7gcb9") pod "0275fda6-178e-4685-a802-665793283eb2" (UID: "0275fda6-178e-4685-a802-665793283eb2"). InnerVolumeSpecName "kube-api-access-7gcb9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 06:29:32 crc kubenswrapper[4594]: I1129 06:29:32.917461 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0275fda6-178e-4685-a802-665793283eb2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0275fda6-178e-4685-a802-665793283eb2" (UID: "0275fda6-178e-4685-a802-665793283eb2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 06:29:32 crc kubenswrapper[4594]: I1129 06:29:32.978460 4594 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0275fda6-178e-4685-a802-665793283eb2-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 06:29:32 crc kubenswrapper[4594]: I1129 06:29:32.978505 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7gcb9\" (UniqueName: \"kubernetes.io/projected/0275fda6-178e-4685-a802-665793283eb2-kube-api-access-7gcb9\") on node \"crc\" DevicePath \"\"" Nov 29 06:29:32 crc kubenswrapper[4594]: I1129 06:29:32.978527 4594 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0275fda6-178e-4685-a802-665793283eb2-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 06:29:33 crc kubenswrapper[4594]: I1129 06:29:33.316277 4594 generic.go:334] "Generic (PLEG): container finished" podID="0275fda6-178e-4685-a802-665793283eb2" containerID="8b4ef09160f4be08882b7c1882546eae6f8606b87a2899eeb16fe80a36f70f0b" exitCode=0 Nov 29 06:29:33 crc kubenswrapper[4594]: I1129 06:29:33.316350 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4hw86" event={"ID":"0275fda6-178e-4685-a802-665793283eb2","Type":"ContainerDied","Data":"8b4ef09160f4be08882b7c1882546eae6f8606b87a2899eeb16fe80a36f70f0b"} Nov 29 06:29:33 crc kubenswrapper[4594]: I1129 06:29:33.316369 4594 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4hw86" Nov 29 06:29:33 crc kubenswrapper[4594]: I1129 06:29:33.316401 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4hw86" event={"ID":"0275fda6-178e-4685-a802-665793283eb2","Type":"ContainerDied","Data":"8550c5aa739290911c29ed1c85db4f4b08afd693f1d45bd1f55e87b51b50fa1d"} Nov 29 06:29:33 crc kubenswrapper[4594]: I1129 06:29:33.316431 4594 scope.go:117] "RemoveContainer" containerID="8b4ef09160f4be08882b7c1882546eae6f8606b87a2899eeb16fe80a36f70f0b" Nov 29 06:29:33 crc kubenswrapper[4594]: I1129 06:29:33.344906 4594 scope.go:117] "RemoveContainer" containerID="59ee939056412acc83d479608c166fbafd737e1283cf2344f2bb982e9d37cb9a" Nov 29 06:29:33 crc kubenswrapper[4594]: I1129 06:29:33.347011 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4hw86"] Nov 29 06:29:33 crc kubenswrapper[4594]: I1129 06:29:33.356336 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-4hw86"] Nov 29 06:29:33 crc kubenswrapper[4594]: I1129 06:29:33.375438 4594 scope.go:117] "RemoveContainer" containerID="f82f06d599152a68b092d503f8caf68ecdd196be25aed5c31900ecc703dfe14f" Nov 29 06:29:33 crc kubenswrapper[4594]: I1129 06:29:33.402179 4594 scope.go:117] "RemoveContainer" containerID="8b4ef09160f4be08882b7c1882546eae6f8606b87a2899eeb16fe80a36f70f0b" Nov 29 06:29:33 crc kubenswrapper[4594]: E1129 06:29:33.402731 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b4ef09160f4be08882b7c1882546eae6f8606b87a2899eeb16fe80a36f70f0b\": container with ID starting with 8b4ef09160f4be08882b7c1882546eae6f8606b87a2899eeb16fe80a36f70f0b not found: ID does not exist" containerID="8b4ef09160f4be08882b7c1882546eae6f8606b87a2899eeb16fe80a36f70f0b" Nov 29 06:29:33 crc kubenswrapper[4594]: I1129 06:29:33.402783 
4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b4ef09160f4be08882b7c1882546eae6f8606b87a2899eeb16fe80a36f70f0b"} err="failed to get container status \"8b4ef09160f4be08882b7c1882546eae6f8606b87a2899eeb16fe80a36f70f0b\": rpc error: code = NotFound desc = could not find container \"8b4ef09160f4be08882b7c1882546eae6f8606b87a2899eeb16fe80a36f70f0b\": container with ID starting with 8b4ef09160f4be08882b7c1882546eae6f8606b87a2899eeb16fe80a36f70f0b not found: ID does not exist" Nov 29 06:29:33 crc kubenswrapper[4594]: I1129 06:29:33.402818 4594 scope.go:117] "RemoveContainer" containerID="59ee939056412acc83d479608c166fbafd737e1283cf2344f2bb982e9d37cb9a" Nov 29 06:29:33 crc kubenswrapper[4594]: E1129 06:29:33.403207 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59ee939056412acc83d479608c166fbafd737e1283cf2344f2bb982e9d37cb9a\": container with ID starting with 59ee939056412acc83d479608c166fbafd737e1283cf2344f2bb982e9d37cb9a not found: ID does not exist" containerID="59ee939056412acc83d479608c166fbafd737e1283cf2344f2bb982e9d37cb9a" Nov 29 06:29:33 crc kubenswrapper[4594]: I1129 06:29:33.403248 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59ee939056412acc83d479608c166fbafd737e1283cf2344f2bb982e9d37cb9a"} err="failed to get container status \"59ee939056412acc83d479608c166fbafd737e1283cf2344f2bb982e9d37cb9a\": rpc error: code = NotFound desc = could not find container \"59ee939056412acc83d479608c166fbafd737e1283cf2344f2bb982e9d37cb9a\": container with ID starting with 59ee939056412acc83d479608c166fbafd737e1283cf2344f2bb982e9d37cb9a not found: ID does not exist" Nov 29 06:29:33 crc kubenswrapper[4594]: I1129 06:29:33.403291 4594 scope.go:117] "RemoveContainer" containerID="f82f06d599152a68b092d503f8caf68ecdd196be25aed5c31900ecc703dfe14f" Nov 29 06:29:33 crc kubenswrapper[4594]: E1129 
06:29:33.403611 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f82f06d599152a68b092d503f8caf68ecdd196be25aed5c31900ecc703dfe14f\": container with ID starting with f82f06d599152a68b092d503f8caf68ecdd196be25aed5c31900ecc703dfe14f not found: ID does not exist" containerID="f82f06d599152a68b092d503f8caf68ecdd196be25aed5c31900ecc703dfe14f" Nov 29 06:29:33 crc kubenswrapper[4594]: I1129 06:29:33.403641 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f82f06d599152a68b092d503f8caf68ecdd196be25aed5c31900ecc703dfe14f"} err="failed to get container status \"f82f06d599152a68b092d503f8caf68ecdd196be25aed5c31900ecc703dfe14f\": rpc error: code = NotFound desc = could not find container \"f82f06d599152a68b092d503f8caf68ecdd196be25aed5c31900ecc703dfe14f\": container with ID starting with f82f06d599152a68b092d503f8caf68ecdd196be25aed5c31900ecc703dfe14f not found: ID does not exist" Nov 29 06:29:34 crc kubenswrapper[4594]: I1129 06:29:34.097380 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0275fda6-178e-4685-a802-665793283eb2" path="/var/lib/kubelet/pods/0275fda6-178e-4685-a802-665793283eb2/volumes" Nov 29 06:30:00 crc kubenswrapper[4594]: I1129 06:30:00.159978 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29406630-7cp9m"] Nov 29 06:30:00 crc kubenswrapper[4594]: E1129 06:30:00.161318 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0275fda6-178e-4685-a802-665793283eb2" containerName="extract-content" Nov 29 06:30:00 crc kubenswrapper[4594]: I1129 06:30:00.161339 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="0275fda6-178e-4685-a802-665793283eb2" containerName="extract-content" Nov 29 06:30:00 crc kubenswrapper[4594]: E1129 06:30:00.161358 4594 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="0275fda6-178e-4685-a802-665793283eb2" containerName="registry-server" Nov 29 06:30:00 crc kubenswrapper[4594]: I1129 06:30:00.161366 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="0275fda6-178e-4685-a802-665793283eb2" containerName="registry-server" Nov 29 06:30:00 crc kubenswrapper[4594]: E1129 06:30:00.161393 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0275fda6-178e-4685-a802-665793283eb2" containerName="extract-utilities" Nov 29 06:30:00 crc kubenswrapper[4594]: I1129 06:30:00.161401 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="0275fda6-178e-4685-a802-665793283eb2" containerName="extract-utilities" Nov 29 06:30:00 crc kubenswrapper[4594]: I1129 06:30:00.161648 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="0275fda6-178e-4685-a802-665793283eb2" containerName="registry-server" Nov 29 06:30:00 crc kubenswrapper[4594]: I1129 06:30:00.162646 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29406630-7cp9m" Nov 29 06:30:00 crc kubenswrapper[4594]: I1129 06:30:00.165025 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 29 06:30:00 crc kubenswrapper[4594]: I1129 06:30:00.170237 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 29 06:30:00 crc kubenswrapper[4594]: I1129 06:30:00.174218 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29406630-7cp9m"] Nov 29 06:30:00 crc kubenswrapper[4594]: I1129 06:30:00.222927 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fz76r\" (UniqueName: \"kubernetes.io/projected/14f992ea-a4f5-440c-9d1d-9e0548f7658d-kube-api-access-fz76r\") pod 
\"collect-profiles-29406630-7cp9m\" (UID: \"14f992ea-a4f5-440c-9d1d-9e0548f7658d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406630-7cp9m" Nov 29 06:30:00 crc kubenswrapper[4594]: I1129 06:30:00.223017 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/14f992ea-a4f5-440c-9d1d-9e0548f7658d-secret-volume\") pod \"collect-profiles-29406630-7cp9m\" (UID: \"14f992ea-a4f5-440c-9d1d-9e0548f7658d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406630-7cp9m" Nov 29 06:30:00 crc kubenswrapper[4594]: I1129 06:30:00.223072 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/14f992ea-a4f5-440c-9d1d-9e0548f7658d-config-volume\") pod \"collect-profiles-29406630-7cp9m\" (UID: \"14f992ea-a4f5-440c-9d1d-9e0548f7658d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406630-7cp9m" Nov 29 06:30:00 crc kubenswrapper[4594]: I1129 06:30:00.325814 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fz76r\" (UniqueName: \"kubernetes.io/projected/14f992ea-a4f5-440c-9d1d-9e0548f7658d-kube-api-access-fz76r\") pod \"collect-profiles-29406630-7cp9m\" (UID: \"14f992ea-a4f5-440c-9d1d-9e0548f7658d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406630-7cp9m" Nov 29 06:30:00 crc kubenswrapper[4594]: I1129 06:30:00.325936 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/14f992ea-a4f5-440c-9d1d-9e0548f7658d-secret-volume\") pod \"collect-profiles-29406630-7cp9m\" (UID: \"14f992ea-a4f5-440c-9d1d-9e0548f7658d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406630-7cp9m" Nov 29 06:30:00 crc kubenswrapper[4594]: I1129 06:30:00.325996 4594 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/14f992ea-a4f5-440c-9d1d-9e0548f7658d-config-volume\") pod \"collect-profiles-29406630-7cp9m\" (UID: \"14f992ea-a4f5-440c-9d1d-9e0548f7658d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406630-7cp9m" Nov 29 06:30:00 crc kubenswrapper[4594]: I1129 06:30:00.327101 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/14f992ea-a4f5-440c-9d1d-9e0548f7658d-config-volume\") pod \"collect-profiles-29406630-7cp9m\" (UID: \"14f992ea-a4f5-440c-9d1d-9e0548f7658d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406630-7cp9m" Nov 29 06:30:00 crc kubenswrapper[4594]: I1129 06:30:00.511012 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/14f992ea-a4f5-440c-9d1d-9e0548f7658d-secret-volume\") pod \"collect-profiles-29406630-7cp9m\" (UID: \"14f992ea-a4f5-440c-9d1d-9e0548f7658d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406630-7cp9m" Nov 29 06:30:00 crc kubenswrapper[4594]: I1129 06:30:00.512013 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fz76r\" (UniqueName: \"kubernetes.io/projected/14f992ea-a4f5-440c-9d1d-9e0548f7658d-kube-api-access-fz76r\") pod \"collect-profiles-29406630-7cp9m\" (UID: \"14f992ea-a4f5-440c-9d1d-9e0548f7658d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406630-7cp9m" Nov 29 06:30:00 crc kubenswrapper[4594]: I1129 06:30:00.783995 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29406630-7cp9m" Nov 29 06:30:01 crc kubenswrapper[4594]: I1129 06:30:01.217234 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29406630-7cp9m"] Nov 29 06:30:01 crc kubenswrapper[4594]: I1129 06:30:01.625967 4594 generic.go:334] "Generic (PLEG): container finished" podID="14f992ea-a4f5-440c-9d1d-9e0548f7658d" containerID="65a3f6d0b73cb221ab4a0b9c70216275c5b60871ff80d4e3a6183f3bfe37668e" exitCode=0 Nov 29 06:30:01 crc kubenswrapper[4594]: I1129 06:30:01.626081 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29406630-7cp9m" event={"ID":"14f992ea-a4f5-440c-9d1d-9e0548f7658d","Type":"ContainerDied","Data":"65a3f6d0b73cb221ab4a0b9c70216275c5b60871ff80d4e3a6183f3bfe37668e"} Nov 29 06:30:01 crc kubenswrapper[4594]: I1129 06:30:01.626210 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29406630-7cp9m" event={"ID":"14f992ea-a4f5-440c-9d1d-9e0548f7658d","Type":"ContainerStarted","Data":"0b3dd464e015bdca02ee2b091b25d50f2cbae4fe00fa037427beb1e9ccf79f85"} Nov 29 06:30:02 crc kubenswrapper[4594]: I1129 06:30:02.949635 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29406630-7cp9m" Nov 29 06:30:03 crc kubenswrapper[4594]: I1129 06:30:03.104941 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/14f992ea-a4f5-440c-9d1d-9e0548f7658d-config-volume\") pod \"14f992ea-a4f5-440c-9d1d-9e0548f7658d\" (UID: \"14f992ea-a4f5-440c-9d1d-9e0548f7658d\") " Nov 29 06:30:03 crc kubenswrapper[4594]: I1129 06:30:03.105215 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fz76r\" (UniqueName: \"kubernetes.io/projected/14f992ea-a4f5-440c-9d1d-9e0548f7658d-kube-api-access-fz76r\") pod \"14f992ea-a4f5-440c-9d1d-9e0548f7658d\" (UID: \"14f992ea-a4f5-440c-9d1d-9e0548f7658d\") " Nov 29 06:30:03 crc kubenswrapper[4594]: I1129 06:30:03.105338 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/14f992ea-a4f5-440c-9d1d-9e0548f7658d-secret-volume\") pod \"14f992ea-a4f5-440c-9d1d-9e0548f7658d\" (UID: \"14f992ea-a4f5-440c-9d1d-9e0548f7658d\") " Nov 29 06:30:03 crc kubenswrapper[4594]: I1129 06:30:03.106323 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14f992ea-a4f5-440c-9d1d-9e0548f7658d-config-volume" (OuterVolumeSpecName: "config-volume") pod "14f992ea-a4f5-440c-9d1d-9e0548f7658d" (UID: "14f992ea-a4f5-440c-9d1d-9e0548f7658d"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 06:30:03 crc kubenswrapper[4594]: I1129 06:30:03.112831 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14f992ea-a4f5-440c-9d1d-9e0548f7658d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "14f992ea-a4f5-440c-9d1d-9e0548f7658d" (UID: "14f992ea-a4f5-440c-9d1d-9e0548f7658d"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 06:30:03 crc kubenswrapper[4594]: I1129 06:30:03.113288 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14f992ea-a4f5-440c-9d1d-9e0548f7658d-kube-api-access-fz76r" (OuterVolumeSpecName: "kube-api-access-fz76r") pod "14f992ea-a4f5-440c-9d1d-9e0548f7658d" (UID: "14f992ea-a4f5-440c-9d1d-9e0548f7658d"). InnerVolumeSpecName "kube-api-access-fz76r". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 06:30:03 crc kubenswrapper[4594]: I1129 06:30:03.209012 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fz76r\" (UniqueName: \"kubernetes.io/projected/14f992ea-a4f5-440c-9d1d-9e0548f7658d-kube-api-access-fz76r\") on node \"crc\" DevicePath \"\"" Nov 29 06:30:03 crc kubenswrapper[4594]: I1129 06:30:03.209125 4594 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/14f992ea-a4f5-440c-9d1d-9e0548f7658d-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 29 06:30:03 crc kubenswrapper[4594]: I1129 06:30:03.209137 4594 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/14f992ea-a4f5-440c-9d1d-9e0548f7658d-config-volume\") on node \"crc\" DevicePath \"\"" Nov 29 06:30:03 crc kubenswrapper[4594]: I1129 06:30:03.649590 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29406630-7cp9m" event={"ID":"14f992ea-a4f5-440c-9d1d-9e0548f7658d","Type":"ContainerDied","Data":"0b3dd464e015bdca02ee2b091b25d50f2cbae4fe00fa037427beb1e9ccf79f85"} Nov 29 06:30:03 crc kubenswrapper[4594]: I1129 06:30:03.649643 4594 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b3dd464e015bdca02ee2b091b25d50f2cbae4fe00fa037427beb1e9ccf79f85" Nov 29 06:30:03 crc kubenswrapper[4594]: I1129 06:30:03.649672 4594 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29406630-7cp9m" Nov 29 06:30:04 crc kubenswrapper[4594]: I1129 06:30:04.012445 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29406585-kz8m2"] Nov 29 06:30:04 crc kubenswrapper[4594]: I1129 06:30:04.018393 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29406585-kz8m2"] Nov 29 06:30:04 crc kubenswrapper[4594]: I1129 06:30:04.095310 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17adfde2-389b-491b-8c00-8293e37021b4" path="/var/lib/kubelet/pods/17adfde2-389b-491b-8c00-8293e37021b4/volumes" Nov 29 06:30:10 crc kubenswrapper[4594]: I1129 06:30:10.782760 4594 scope.go:117] "RemoveContainer" containerID="b2afaff1905126455942d8e0ed084ec1dc7230e82c53f586130ed351e00ec8ca" Nov 29 06:31:15 crc kubenswrapper[4594]: I1129 06:31:15.800228 4594 patch_prober.go:28] interesting pod/machine-config-daemon-ggz4n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 06:31:15 crc kubenswrapper[4594]: I1129 06:31:15.800913 4594 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 06:31:45 crc kubenswrapper[4594]: I1129 06:31:45.800597 4594 patch_prober.go:28] interesting pod/machine-config-daemon-ggz4n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Nov 29 06:31:45 crc kubenswrapper[4594]: I1129 06:31:45.801229 4594 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 06:32:15 crc kubenswrapper[4594]: I1129 06:32:15.800140 4594 patch_prober.go:28] interesting pod/machine-config-daemon-ggz4n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 06:32:15 crc kubenswrapper[4594]: I1129 06:32:15.800940 4594 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 06:32:15 crc kubenswrapper[4594]: I1129 06:32:15.801006 4594 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" Nov 29 06:32:15 crc kubenswrapper[4594]: I1129 06:32:15.802165 4594 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"10fd441ce81e44bfd8f98a7b676e40e142716adff562f75e1eca845e725191d3"} pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 29 06:32:15 crc kubenswrapper[4594]: I1129 06:32:15.802240 4594 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" containerName="machine-config-daemon" containerID="cri-o://10fd441ce81e44bfd8f98a7b676e40e142716adff562f75e1eca845e725191d3" gracePeriod=600 Nov 29 06:32:15 crc kubenswrapper[4594]: E1129 06:32:15.920563 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:32:15 crc kubenswrapper[4594]: I1129 06:32:15.960374 4594 generic.go:334] "Generic (PLEG): container finished" podID="509844d4-3ee1-4059-84bb-6e90200f50c5" containerID="10fd441ce81e44bfd8f98a7b676e40e142716adff562f75e1eca845e725191d3" exitCode=0 Nov 29 06:32:15 crc kubenswrapper[4594]: I1129 06:32:15.960435 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" event={"ID":"509844d4-3ee1-4059-84bb-6e90200f50c5","Type":"ContainerDied","Data":"10fd441ce81e44bfd8f98a7b676e40e142716adff562f75e1eca845e725191d3"} Nov 29 06:32:15 crc kubenswrapper[4594]: I1129 06:32:15.960480 4594 scope.go:117] "RemoveContainer" containerID="b53fc2e43ea71880847371ec8848d6b5ccb0f0cb3990cf3b77060a262ca641e5" Nov 29 06:32:15 crc kubenswrapper[4594]: I1129 06:32:15.961066 4594 scope.go:117] "RemoveContainer" containerID="10fd441ce81e44bfd8f98a7b676e40e142716adff562f75e1eca845e725191d3" Nov 29 06:32:15 crc kubenswrapper[4594]: E1129 06:32:15.961436 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:32:27 crc kubenswrapper[4594]: I1129 06:32:27.083737 4594 scope.go:117] "RemoveContainer" containerID="10fd441ce81e44bfd8f98a7b676e40e142716adff562f75e1eca845e725191d3" Nov 29 06:32:27 crc kubenswrapper[4594]: E1129 06:32:27.084950 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:32:41 crc kubenswrapper[4594]: I1129 06:32:41.083853 4594 scope.go:117] "RemoveContainer" containerID="10fd441ce81e44bfd8f98a7b676e40e142716adff562f75e1eca845e725191d3" Nov 29 06:32:41 crc kubenswrapper[4594]: E1129 06:32:41.084662 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:32:43 crc kubenswrapper[4594]: I1129 06:32:43.539450 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-dwvmh"] Nov 29 06:32:43 crc kubenswrapper[4594]: E1129 06:32:43.540318 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14f992ea-a4f5-440c-9d1d-9e0548f7658d" containerName="collect-profiles" Nov 29 06:32:43 crc kubenswrapper[4594]: I1129 
06:32:43.540331 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="14f992ea-a4f5-440c-9d1d-9e0548f7658d" containerName="collect-profiles" Nov 29 06:32:43 crc kubenswrapper[4594]: I1129 06:32:43.540517 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="14f992ea-a4f5-440c-9d1d-9e0548f7658d" containerName="collect-profiles" Nov 29 06:32:43 crc kubenswrapper[4594]: I1129 06:32:43.541765 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dwvmh" Nov 29 06:32:43 crc kubenswrapper[4594]: I1129 06:32:43.550314 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dwvmh"] Nov 29 06:32:43 crc kubenswrapper[4594]: I1129 06:32:43.598045 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsz7v\" (UniqueName: \"kubernetes.io/projected/17d4400f-ba47-4569-b7fb-a6103999082b-kube-api-access-rsz7v\") pod \"redhat-operators-dwvmh\" (UID: \"17d4400f-ba47-4569-b7fb-a6103999082b\") " pod="openshift-marketplace/redhat-operators-dwvmh" Nov 29 06:32:43 crc kubenswrapper[4594]: I1129 06:32:43.598224 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17d4400f-ba47-4569-b7fb-a6103999082b-catalog-content\") pod \"redhat-operators-dwvmh\" (UID: \"17d4400f-ba47-4569-b7fb-a6103999082b\") " pod="openshift-marketplace/redhat-operators-dwvmh" Nov 29 06:32:43 crc kubenswrapper[4594]: I1129 06:32:43.598246 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17d4400f-ba47-4569-b7fb-a6103999082b-utilities\") pod \"redhat-operators-dwvmh\" (UID: \"17d4400f-ba47-4569-b7fb-a6103999082b\") " pod="openshift-marketplace/redhat-operators-dwvmh" Nov 29 06:32:43 crc kubenswrapper[4594]: I1129 
06:32:43.700814 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rsz7v\" (UniqueName: \"kubernetes.io/projected/17d4400f-ba47-4569-b7fb-a6103999082b-kube-api-access-rsz7v\") pod \"redhat-operators-dwvmh\" (UID: \"17d4400f-ba47-4569-b7fb-a6103999082b\") " pod="openshift-marketplace/redhat-operators-dwvmh" Nov 29 06:32:43 crc kubenswrapper[4594]: I1129 06:32:43.701090 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17d4400f-ba47-4569-b7fb-a6103999082b-catalog-content\") pod \"redhat-operators-dwvmh\" (UID: \"17d4400f-ba47-4569-b7fb-a6103999082b\") " pod="openshift-marketplace/redhat-operators-dwvmh" Nov 29 06:32:43 crc kubenswrapper[4594]: I1129 06:32:43.701120 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17d4400f-ba47-4569-b7fb-a6103999082b-utilities\") pod \"redhat-operators-dwvmh\" (UID: \"17d4400f-ba47-4569-b7fb-a6103999082b\") " pod="openshift-marketplace/redhat-operators-dwvmh" Nov 29 06:32:43 crc kubenswrapper[4594]: I1129 06:32:43.701630 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17d4400f-ba47-4569-b7fb-a6103999082b-utilities\") pod \"redhat-operators-dwvmh\" (UID: \"17d4400f-ba47-4569-b7fb-a6103999082b\") " pod="openshift-marketplace/redhat-operators-dwvmh" Nov 29 06:32:43 crc kubenswrapper[4594]: I1129 06:32:43.701697 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17d4400f-ba47-4569-b7fb-a6103999082b-catalog-content\") pod \"redhat-operators-dwvmh\" (UID: \"17d4400f-ba47-4569-b7fb-a6103999082b\") " pod="openshift-marketplace/redhat-operators-dwvmh" Nov 29 06:32:43 crc kubenswrapper[4594]: I1129 06:32:43.720643 4594 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-rsz7v\" (UniqueName: \"kubernetes.io/projected/17d4400f-ba47-4569-b7fb-a6103999082b-kube-api-access-rsz7v\") pod \"redhat-operators-dwvmh\" (UID: \"17d4400f-ba47-4569-b7fb-a6103999082b\") " pod="openshift-marketplace/redhat-operators-dwvmh" Nov 29 06:32:43 crc kubenswrapper[4594]: I1129 06:32:43.858411 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dwvmh" Nov 29 06:32:44 crc kubenswrapper[4594]: I1129 06:32:44.279855 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dwvmh"] Nov 29 06:32:45 crc kubenswrapper[4594]: I1129 06:32:45.235650 4594 generic.go:334] "Generic (PLEG): container finished" podID="17d4400f-ba47-4569-b7fb-a6103999082b" containerID="4a38e00eae5e017a791500f390524a72401c8ef4871860619c57357e4cd80530" exitCode=0 Nov 29 06:32:45 crc kubenswrapper[4594]: I1129 06:32:45.235779 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dwvmh" event={"ID":"17d4400f-ba47-4569-b7fb-a6103999082b","Type":"ContainerDied","Data":"4a38e00eae5e017a791500f390524a72401c8ef4871860619c57357e4cd80530"} Nov 29 06:32:45 crc kubenswrapper[4594]: I1129 06:32:45.236065 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dwvmh" event={"ID":"17d4400f-ba47-4569-b7fb-a6103999082b","Type":"ContainerStarted","Data":"49c89679900e96b10baef8e2980edef0ff2bac35ccd4f167e111217cb1229517"} Nov 29 06:32:45 crc kubenswrapper[4594]: I1129 06:32:45.238372 4594 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 29 06:32:46 crc kubenswrapper[4594]: I1129 06:32:46.249176 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dwvmh" 
event={"ID":"17d4400f-ba47-4569-b7fb-a6103999082b","Type":"ContainerStarted","Data":"300313c65357d8443a853f2c9a192f4965ad74a9e94620074ba9d3ef6b492dea"} Nov 29 06:32:48 crc kubenswrapper[4594]: I1129 06:32:48.268151 4594 generic.go:334] "Generic (PLEG): container finished" podID="17d4400f-ba47-4569-b7fb-a6103999082b" containerID="300313c65357d8443a853f2c9a192f4965ad74a9e94620074ba9d3ef6b492dea" exitCode=0 Nov 29 06:32:48 crc kubenswrapper[4594]: I1129 06:32:48.268241 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dwvmh" event={"ID":"17d4400f-ba47-4569-b7fb-a6103999082b","Type":"ContainerDied","Data":"300313c65357d8443a853f2c9a192f4965ad74a9e94620074ba9d3ef6b492dea"} Nov 29 06:32:48 crc kubenswrapper[4594]: I1129 06:32:48.752133 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-grr8v"] Nov 29 06:32:48 crc kubenswrapper[4594]: I1129 06:32:48.756325 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-grr8v" Nov 29 06:32:48 crc kubenswrapper[4594]: I1129 06:32:48.773919 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-grr8v"] Nov 29 06:32:48 crc kubenswrapper[4594]: I1129 06:32:48.917891 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d45c650-437b-489f-a183-ab9b158d4b98-catalog-content\") pod \"redhat-marketplace-grr8v\" (UID: \"4d45c650-437b-489f-a183-ab9b158d4b98\") " pod="openshift-marketplace/redhat-marketplace-grr8v" Nov 29 06:32:48 crc kubenswrapper[4594]: I1129 06:32:48.917985 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d45c650-437b-489f-a183-ab9b158d4b98-utilities\") pod \"redhat-marketplace-grr8v\" (UID: \"4d45c650-437b-489f-a183-ab9b158d4b98\") " pod="openshift-marketplace/redhat-marketplace-grr8v" Nov 29 06:32:48 crc kubenswrapper[4594]: I1129 06:32:48.918146 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7rws\" (UniqueName: \"kubernetes.io/projected/4d45c650-437b-489f-a183-ab9b158d4b98-kube-api-access-f7rws\") pod \"redhat-marketplace-grr8v\" (UID: \"4d45c650-437b-489f-a183-ab9b158d4b98\") " pod="openshift-marketplace/redhat-marketplace-grr8v" Nov 29 06:32:49 crc kubenswrapper[4594]: I1129 06:32:49.020171 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d45c650-437b-489f-a183-ab9b158d4b98-catalog-content\") pod \"redhat-marketplace-grr8v\" (UID: \"4d45c650-437b-489f-a183-ab9b158d4b98\") " pod="openshift-marketplace/redhat-marketplace-grr8v" Nov 29 06:32:49 crc kubenswrapper[4594]: I1129 06:32:49.020434 4594 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d45c650-437b-489f-a183-ab9b158d4b98-utilities\") pod \"redhat-marketplace-grr8v\" (UID: \"4d45c650-437b-489f-a183-ab9b158d4b98\") " pod="openshift-marketplace/redhat-marketplace-grr8v" Nov 29 06:32:49 crc kubenswrapper[4594]: I1129 06:32:49.020580 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7rws\" (UniqueName: \"kubernetes.io/projected/4d45c650-437b-489f-a183-ab9b158d4b98-kube-api-access-f7rws\") pod \"redhat-marketplace-grr8v\" (UID: \"4d45c650-437b-489f-a183-ab9b158d4b98\") " pod="openshift-marketplace/redhat-marketplace-grr8v" Nov 29 06:32:49 crc kubenswrapper[4594]: I1129 06:32:49.020755 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d45c650-437b-489f-a183-ab9b158d4b98-catalog-content\") pod \"redhat-marketplace-grr8v\" (UID: \"4d45c650-437b-489f-a183-ab9b158d4b98\") " pod="openshift-marketplace/redhat-marketplace-grr8v" Nov 29 06:32:49 crc kubenswrapper[4594]: I1129 06:32:49.021017 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d45c650-437b-489f-a183-ab9b158d4b98-utilities\") pod \"redhat-marketplace-grr8v\" (UID: \"4d45c650-437b-489f-a183-ab9b158d4b98\") " pod="openshift-marketplace/redhat-marketplace-grr8v" Nov 29 06:32:49 crc kubenswrapper[4594]: I1129 06:32:49.038809 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7rws\" (UniqueName: \"kubernetes.io/projected/4d45c650-437b-489f-a183-ab9b158d4b98-kube-api-access-f7rws\") pod \"redhat-marketplace-grr8v\" (UID: \"4d45c650-437b-489f-a183-ab9b158d4b98\") " pod="openshift-marketplace/redhat-marketplace-grr8v" Nov 29 06:32:49 crc kubenswrapper[4594]: I1129 06:32:49.085513 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-grr8v" Nov 29 06:32:49 crc kubenswrapper[4594]: I1129 06:32:49.293171 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dwvmh" event={"ID":"17d4400f-ba47-4569-b7fb-a6103999082b","Type":"ContainerStarted","Data":"d924979672926ed9dd900047714b42f420a721aab14b10cb82561e1a4d6c2c72"} Nov 29 06:32:49 crc kubenswrapper[4594]: I1129 06:32:49.316104 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-dwvmh" podStartSLOduration=2.6834461 podStartE2EDuration="6.316086758s" podCreationTimestamp="2025-11-29 06:32:43 +0000 UTC" firstStartedPulling="2025-11-29 06:32:45.238095238 +0000 UTC m=+3889.478604459" lastFinishedPulling="2025-11-29 06:32:48.870735896 +0000 UTC m=+3893.111245117" observedRunningTime="2025-11-29 06:32:49.307920526 +0000 UTC m=+3893.548429736" watchObservedRunningTime="2025-11-29 06:32:49.316086758 +0000 UTC m=+3893.556595978" Nov 29 06:32:49 crc kubenswrapper[4594]: I1129 06:32:49.529693 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-grr8v"] Nov 29 06:32:50 crc kubenswrapper[4594]: I1129 06:32:50.307006 4594 generic.go:334] "Generic (PLEG): container finished" podID="4d45c650-437b-489f-a183-ab9b158d4b98" containerID="3e055b848070d02bc28d3dcad819348409597d34147131085ffbe921f95e4359" exitCode=0 Nov 29 06:32:50 crc kubenswrapper[4594]: I1129 06:32:50.307067 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-grr8v" event={"ID":"4d45c650-437b-489f-a183-ab9b158d4b98","Type":"ContainerDied","Data":"3e055b848070d02bc28d3dcad819348409597d34147131085ffbe921f95e4359"} Nov 29 06:32:50 crc kubenswrapper[4594]: I1129 06:32:50.307101 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-grr8v" 
event={"ID":"4d45c650-437b-489f-a183-ab9b158d4b98","Type":"ContainerStarted","Data":"0305e91632ea3124a6e1c559de44cd8c65b47be27a88c6ae378e78822530116b"} Nov 29 06:32:51 crc kubenswrapper[4594]: I1129 06:32:51.316497 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-grr8v" event={"ID":"4d45c650-437b-489f-a183-ab9b158d4b98","Type":"ContainerStarted","Data":"ff742a4db480370e1be7767e767e32acf50398b19c3ba7bbb56ccd441da103e7"} Nov 29 06:32:52 crc kubenswrapper[4594]: I1129 06:32:52.329763 4594 generic.go:334] "Generic (PLEG): container finished" podID="4d45c650-437b-489f-a183-ab9b158d4b98" containerID="ff742a4db480370e1be7767e767e32acf50398b19c3ba7bbb56ccd441da103e7" exitCode=0 Nov 29 06:32:52 crc kubenswrapper[4594]: I1129 06:32:52.329828 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-grr8v" event={"ID":"4d45c650-437b-489f-a183-ab9b158d4b98","Type":"ContainerDied","Data":"ff742a4db480370e1be7767e767e32acf50398b19c3ba7bbb56ccd441da103e7"} Nov 29 06:32:53 crc kubenswrapper[4594]: I1129 06:32:53.341157 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-grr8v" event={"ID":"4d45c650-437b-489f-a183-ab9b158d4b98","Type":"ContainerStarted","Data":"f245c3d09591640705b01935b02547109d1b1c1c8b70f5cfb690fd0a70efb934"} Nov 29 06:32:53 crc kubenswrapper[4594]: I1129 06:32:53.359705 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-grr8v" podStartSLOduration=2.796404418 podStartE2EDuration="5.359688517s" podCreationTimestamp="2025-11-29 06:32:48 +0000 UTC" firstStartedPulling="2025-11-29 06:32:50.309089879 +0000 UTC m=+3894.549599099" lastFinishedPulling="2025-11-29 06:32:52.872373978 +0000 UTC m=+3897.112883198" observedRunningTime="2025-11-29 06:32:53.354092057 +0000 UTC m=+3897.594601276" watchObservedRunningTime="2025-11-29 06:32:53.359688517 +0000 UTC 
m=+3897.600197738" Nov 29 06:32:53 crc kubenswrapper[4594]: I1129 06:32:53.859300 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-dwvmh" Nov 29 06:32:53 crc kubenswrapper[4594]: I1129 06:32:53.859376 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-dwvmh" Nov 29 06:32:54 crc kubenswrapper[4594]: I1129 06:32:54.896826 4594 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-dwvmh" podUID="17d4400f-ba47-4569-b7fb-a6103999082b" containerName="registry-server" probeResult="failure" output=< Nov 29 06:32:54 crc kubenswrapper[4594]: timeout: failed to connect service ":50051" within 1s Nov 29 06:32:54 crc kubenswrapper[4594]: > Nov 29 06:32:56 crc kubenswrapper[4594]: I1129 06:32:56.090029 4594 scope.go:117] "RemoveContainer" containerID="10fd441ce81e44bfd8f98a7b676e40e142716adff562f75e1eca845e725191d3" Nov 29 06:32:56 crc kubenswrapper[4594]: E1129 06:32:56.090337 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:32:59 crc kubenswrapper[4594]: I1129 06:32:59.085979 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-grr8v" Nov 29 06:32:59 crc kubenswrapper[4594]: I1129 06:32:59.086743 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-grr8v" Nov 29 06:32:59 crc kubenswrapper[4594]: I1129 06:32:59.132145 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-marketplace-grr8v" Nov 29 06:32:59 crc kubenswrapper[4594]: I1129 06:32:59.452249 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-grr8v" Nov 29 06:32:59 crc kubenswrapper[4594]: I1129 06:32:59.490337 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-grr8v"] Nov 29 06:33:01 crc kubenswrapper[4594]: I1129 06:33:01.433098 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-grr8v" podUID="4d45c650-437b-489f-a183-ab9b158d4b98" containerName="registry-server" containerID="cri-o://f245c3d09591640705b01935b02547109d1b1c1c8b70f5cfb690fd0a70efb934" gracePeriod=2 Nov 29 06:33:01 crc kubenswrapper[4594]: I1129 06:33:01.855835 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-grr8v" Nov 29 06:33:01 crc kubenswrapper[4594]: I1129 06:33:01.938216 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f7rws\" (UniqueName: \"kubernetes.io/projected/4d45c650-437b-489f-a183-ab9b158d4b98-kube-api-access-f7rws\") pod \"4d45c650-437b-489f-a183-ab9b158d4b98\" (UID: \"4d45c650-437b-489f-a183-ab9b158d4b98\") " Nov 29 06:33:01 crc kubenswrapper[4594]: I1129 06:33:01.938296 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d45c650-437b-489f-a183-ab9b158d4b98-utilities\") pod \"4d45c650-437b-489f-a183-ab9b158d4b98\" (UID: \"4d45c650-437b-489f-a183-ab9b158d4b98\") " Nov 29 06:33:01 crc kubenswrapper[4594]: I1129 06:33:01.938350 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d45c650-437b-489f-a183-ab9b158d4b98-catalog-content\") pod 
\"4d45c650-437b-489f-a183-ab9b158d4b98\" (UID: \"4d45c650-437b-489f-a183-ab9b158d4b98\") " Nov 29 06:33:01 crc kubenswrapper[4594]: I1129 06:33:01.938805 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d45c650-437b-489f-a183-ab9b158d4b98-utilities" (OuterVolumeSpecName: "utilities") pod "4d45c650-437b-489f-a183-ab9b158d4b98" (UID: "4d45c650-437b-489f-a183-ab9b158d4b98"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 06:33:01 crc kubenswrapper[4594]: I1129 06:33:01.939158 4594 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d45c650-437b-489f-a183-ab9b158d4b98-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 06:33:01 crc kubenswrapper[4594]: I1129 06:33:01.945495 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d45c650-437b-489f-a183-ab9b158d4b98-kube-api-access-f7rws" (OuterVolumeSpecName: "kube-api-access-f7rws") pod "4d45c650-437b-489f-a183-ab9b158d4b98" (UID: "4d45c650-437b-489f-a183-ab9b158d4b98"). InnerVolumeSpecName "kube-api-access-f7rws". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 06:33:01 crc kubenswrapper[4594]: I1129 06:33:01.955555 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d45c650-437b-489f-a183-ab9b158d4b98-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4d45c650-437b-489f-a183-ab9b158d4b98" (UID: "4d45c650-437b-489f-a183-ab9b158d4b98"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 06:33:02 crc kubenswrapper[4594]: I1129 06:33:02.041604 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f7rws\" (UniqueName: \"kubernetes.io/projected/4d45c650-437b-489f-a183-ab9b158d4b98-kube-api-access-f7rws\") on node \"crc\" DevicePath \"\"" Nov 29 06:33:02 crc kubenswrapper[4594]: I1129 06:33:02.041640 4594 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d45c650-437b-489f-a183-ab9b158d4b98-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 06:33:02 crc kubenswrapper[4594]: I1129 06:33:02.446303 4594 generic.go:334] "Generic (PLEG): container finished" podID="4d45c650-437b-489f-a183-ab9b158d4b98" containerID="f245c3d09591640705b01935b02547109d1b1c1c8b70f5cfb690fd0a70efb934" exitCode=0 Nov 29 06:33:02 crc kubenswrapper[4594]: I1129 06:33:02.446378 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-grr8v" event={"ID":"4d45c650-437b-489f-a183-ab9b158d4b98","Type":"ContainerDied","Data":"f245c3d09591640705b01935b02547109d1b1c1c8b70f5cfb690fd0a70efb934"} Nov 29 06:33:02 crc kubenswrapper[4594]: I1129 06:33:02.446724 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-grr8v" event={"ID":"4d45c650-437b-489f-a183-ab9b158d4b98","Type":"ContainerDied","Data":"0305e91632ea3124a6e1c559de44cd8c65b47be27a88c6ae378e78822530116b"} Nov 29 06:33:02 crc kubenswrapper[4594]: I1129 06:33:02.446758 4594 scope.go:117] "RemoveContainer" containerID="f245c3d09591640705b01935b02547109d1b1c1c8b70f5cfb690fd0a70efb934" Nov 29 06:33:02 crc kubenswrapper[4594]: I1129 06:33:02.446450 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-grr8v" Nov 29 06:33:02 crc kubenswrapper[4594]: I1129 06:33:02.470502 4594 scope.go:117] "RemoveContainer" containerID="ff742a4db480370e1be7767e767e32acf50398b19c3ba7bbb56ccd441da103e7" Nov 29 06:33:02 crc kubenswrapper[4594]: I1129 06:33:02.476322 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-grr8v"] Nov 29 06:33:02 crc kubenswrapper[4594]: I1129 06:33:02.484016 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-grr8v"] Nov 29 06:33:02 crc kubenswrapper[4594]: I1129 06:33:02.489275 4594 scope.go:117] "RemoveContainer" containerID="3e055b848070d02bc28d3dcad819348409597d34147131085ffbe921f95e4359" Nov 29 06:33:02 crc kubenswrapper[4594]: I1129 06:33:02.530432 4594 scope.go:117] "RemoveContainer" containerID="f245c3d09591640705b01935b02547109d1b1c1c8b70f5cfb690fd0a70efb934" Nov 29 06:33:02 crc kubenswrapper[4594]: E1129 06:33:02.530882 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f245c3d09591640705b01935b02547109d1b1c1c8b70f5cfb690fd0a70efb934\": container with ID starting with f245c3d09591640705b01935b02547109d1b1c1c8b70f5cfb690fd0a70efb934 not found: ID does not exist" containerID="f245c3d09591640705b01935b02547109d1b1c1c8b70f5cfb690fd0a70efb934" Nov 29 06:33:02 crc kubenswrapper[4594]: I1129 06:33:02.530934 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f245c3d09591640705b01935b02547109d1b1c1c8b70f5cfb690fd0a70efb934"} err="failed to get container status \"f245c3d09591640705b01935b02547109d1b1c1c8b70f5cfb690fd0a70efb934\": rpc error: code = NotFound desc = could not find container \"f245c3d09591640705b01935b02547109d1b1c1c8b70f5cfb690fd0a70efb934\": container with ID starting with f245c3d09591640705b01935b02547109d1b1c1c8b70f5cfb690fd0a70efb934 not found: 
ID does not exist" Nov 29 06:33:02 crc kubenswrapper[4594]: I1129 06:33:02.530969 4594 scope.go:117] "RemoveContainer" containerID="ff742a4db480370e1be7767e767e32acf50398b19c3ba7bbb56ccd441da103e7" Nov 29 06:33:02 crc kubenswrapper[4594]: E1129 06:33:02.531361 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff742a4db480370e1be7767e767e32acf50398b19c3ba7bbb56ccd441da103e7\": container with ID starting with ff742a4db480370e1be7767e767e32acf50398b19c3ba7bbb56ccd441da103e7 not found: ID does not exist" containerID="ff742a4db480370e1be7767e767e32acf50398b19c3ba7bbb56ccd441da103e7" Nov 29 06:33:02 crc kubenswrapper[4594]: I1129 06:33:02.531401 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff742a4db480370e1be7767e767e32acf50398b19c3ba7bbb56ccd441da103e7"} err="failed to get container status \"ff742a4db480370e1be7767e767e32acf50398b19c3ba7bbb56ccd441da103e7\": rpc error: code = NotFound desc = could not find container \"ff742a4db480370e1be7767e767e32acf50398b19c3ba7bbb56ccd441da103e7\": container with ID starting with ff742a4db480370e1be7767e767e32acf50398b19c3ba7bbb56ccd441da103e7 not found: ID does not exist" Nov 29 06:33:02 crc kubenswrapper[4594]: I1129 06:33:02.531429 4594 scope.go:117] "RemoveContainer" containerID="3e055b848070d02bc28d3dcad819348409597d34147131085ffbe921f95e4359" Nov 29 06:33:02 crc kubenswrapper[4594]: E1129 06:33:02.531697 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e055b848070d02bc28d3dcad819348409597d34147131085ffbe921f95e4359\": container with ID starting with 3e055b848070d02bc28d3dcad819348409597d34147131085ffbe921f95e4359 not found: ID does not exist" containerID="3e055b848070d02bc28d3dcad819348409597d34147131085ffbe921f95e4359" Nov 29 06:33:02 crc kubenswrapper[4594]: I1129 06:33:02.531730 4594 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e055b848070d02bc28d3dcad819348409597d34147131085ffbe921f95e4359"} err="failed to get container status \"3e055b848070d02bc28d3dcad819348409597d34147131085ffbe921f95e4359\": rpc error: code = NotFound desc = could not find container \"3e055b848070d02bc28d3dcad819348409597d34147131085ffbe921f95e4359\": container with ID starting with 3e055b848070d02bc28d3dcad819348409597d34147131085ffbe921f95e4359 not found: ID does not exist" Nov 29 06:33:03 crc kubenswrapper[4594]: I1129 06:33:03.898147 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-dwvmh" Nov 29 06:33:03 crc kubenswrapper[4594]: I1129 06:33:03.937297 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-dwvmh" Nov 29 06:33:04 crc kubenswrapper[4594]: I1129 06:33:04.094875 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d45c650-437b-489f-a183-ab9b158d4b98" path="/var/lib/kubelet/pods/4d45c650-437b-489f-a183-ab9b158d4b98/volumes" Nov 29 06:33:04 crc kubenswrapper[4594]: I1129 06:33:04.769693 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dwvmh"] Nov 29 06:33:05 crc kubenswrapper[4594]: I1129 06:33:05.476618 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-dwvmh" podUID="17d4400f-ba47-4569-b7fb-a6103999082b" containerName="registry-server" containerID="cri-o://d924979672926ed9dd900047714b42f420a721aab14b10cb82561e1a4d6c2c72" gracePeriod=2 Nov 29 06:33:05 crc kubenswrapper[4594]: I1129 06:33:05.916786 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dwvmh" Nov 29 06:33:06 crc kubenswrapper[4594]: I1129 06:33:06.018444 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rsz7v\" (UniqueName: \"kubernetes.io/projected/17d4400f-ba47-4569-b7fb-a6103999082b-kube-api-access-rsz7v\") pod \"17d4400f-ba47-4569-b7fb-a6103999082b\" (UID: \"17d4400f-ba47-4569-b7fb-a6103999082b\") " Nov 29 06:33:06 crc kubenswrapper[4594]: I1129 06:33:06.018743 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17d4400f-ba47-4569-b7fb-a6103999082b-catalog-content\") pod \"17d4400f-ba47-4569-b7fb-a6103999082b\" (UID: \"17d4400f-ba47-4569-b7fb-a6103999082b\") " Nov 29 06:33:06 crc kubenswrapper[4594]: I1129 06:33:06.018783 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17d4400f-ba47-4569-b7fb-a6103999082b-utilities\") pod \"17d4400f-ba47-4569-b7fb-a6103999082b\" (UID: \"17d4400f-ba47-4569-b7fb-a6103999082b\") " Nov 29 06:33:06 crc kubenswrapper[4594]: I1129 06:33:06.019431 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17d4400f-ba47-4569-b7fb-a6103999082b-utilities" (OuterVolumeSpecName: "utilities") pod "17d4400f-ba47-4569-b7fb-a6103999082b" (UID: "17d4400f-ba47-4569-b7fb-a6103999082b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 06:33:06 crc kubenswrapper[4594]: I1129 06:33:06.023862 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17d4400f-ba47-4569-b7fb-a6103999082b-kube-api-access-rsz7v" (OuterVolumeSpecName: "kube-api-access-rsz7v") pod "17d4400f-ba47-4569-b7fb-a6103999082b" (UID: "17d4400f-ba47-4569-b7fb-a6103999082b"). InnerVolumeSpecName "kube-api-access-rsz7v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 06:33:06 crc kubenswrapper[4594]: I1129 06:33:06.095838 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17d4400f-ba47-4569-b7fb-a6103999082b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "17d4400f-ba47-4569-b7fb-a6103999082b" (UID: "17d4400f-ba47-4569-b7fb-a6103999082b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 06:33:06 crc kubenswrapper[4594]: I1129 06:33:06.121610 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rsz7v\" (UniqueName: \"kubernetes.io/projected/17d4400f-ba47-4569-b7fb-a6103999082b-kube-api-access-rsz7v\") on node \"crc\" DevicePath \"\"" Nov 29 06:33:06 crc kubenswrapper[4594]: I1129 06:33:06.121639 4594 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17d4400f-ba47-4569-b7fb-a6103999082b-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 06:33:06 crc kubenswrapper[4594]: I1129 06:33:06.121649 4594 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17d4400f-ba47-4569-b7fb-a6103999082b-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 06:33:06 crc kubenswrapper[4594]: I1129 06:33:06.493871 4594 generic.go:334] "Generic (PLEG): container finished" podID="17d4400f-ba47-4569-b7fb-a6103999082b" containerID="d924979672926ed9dd900047714b42f420a721aab14b10cb82561e1a4d6c2c72" exitCode=0 Nov 29 06:33:06 crc kubenswrapper[4594]: I1129 06:33:06.493931 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dwvmh" event={"ID":"17d4400f-ba47-4569-b7fb-a6103999082b","Type":"ContainerDied","Data":"d924979672926ed9dd900047714b42f420a721aab14b10cb82561e1a4d6c2c72"} Nov 29 06:33:06 crc kubenswrapper[4594]: I1129 06:33:06.493970 4594 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-dwvmh" event={"ID":"17d4400f-ba47-4569-b7fb-a6103999082b","Type":"ContainerDied","Data":"49c89679900e96b10baef8e2980edef0ff2bac35ccd4f167e111217cb1229517"} Nov 29 06:33:06 crc kubenswrapper[4594]: I1129 06:33:06.494017 4594 scope.go:117] "RemoveContainer" containerID="d924979672926ed9dd900047714b42f420a721aab14b10cb82561e1a4d6c2c72" Nov 29 06:33:06 crc kubenswrapper[4594]: I1129 06:33:06.494570 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dwvmh" Nov 29 06:33:06 crc kubenswrapper[4594]: I1129 06:33:06.516226 4594 scope.go:117] "RemoveContainer" containerID="300313c65357d8443a853f2c9a192f4965ad74a9e94620074ba9d3ef6b492dea" Nov 29 06:33:06 crc kubenswrapper[4594]: I1129 06:33:06.536022 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dwvmh"] Nov 29 06:33:06 crc kubenswrapper[4594]: I1129 06:33:06.543695 4594 scope.go:117] "RemoveContainer" containerID="4a38e00eae5e017a791500f390524a72401c8ef4871860619c57357e4cd80530" Nov 29 06:33:06 crc kubenswrapper[4594]: I1129 06:33:06.544416 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-dwvmh"] Nov 29 06:33:06 crc kubenswrapper[4594]: I1129 06:33:06.597892 4594 scope.go:117] "RemoveContainer" containerID="d924979672926ed9dd900047714b42f420a721aab14b10cb82561e1a4d6c2c72" Nov 29 06:33:06 crc kubenswrapper[4594]: E1129 06:33:06.598354 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d924979672926ed9dd900047714b42f420a721aab14b10cb82561e1a4d6c2c72\": container with ID starting with d924979672926ed9dd900047714b42f420a721aab14b10cb82561e1a4d6c2c72 not found: ID does not exist" containerID="d924979672926ed9dd900047714b42f420a721aab14b10cb82561e1a4d6c2c72" Nov 29 06:33:06 crc kubenswrapper[4594]: I1129 06:33:06.598398 4594 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d924979672926ed9dd900047714b42f420a721aab14b10cb82561e1a4d6c2c72"} err="failed to get container status \"d924979672926ed9dd900047714b42f420a721aab14b10cb82561e1a4d6c2c72\": rpc error: code = NotFound desc = could not find container \"d924979672926ed9dd900047714b42f420a721aab14b10cb82561e1a4d6c2c72\": container with ID starting with d924979672926ed9dd900047714b42f420a721aab14b10cb82561e1a4d6c2c72 not found: ID does not exist" Nov 29 06:33:06 crc kubenswrapper[4594]: I1129 06:33:06.598428 4594 scope.go:117] "RemoveContainer" containerID="300313c65357d8443a853f2c9a192f4965ad74a9e94620074ba9d3ef6b492dea" Nov 29 06:33:06 crc kubenswrapper[4594]: E1129 06:33:06.598859 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"300313c65357d8443a853f2c9a192f4965ad74a9e94620074ba9d3ef6b492dea\": container with ID starting with 300313c65357d8443a853f2c9a192f4965ad74a9e94620074ba9d3ef6b492dea not found: ID does not exist" containerID="300313c65357d8443a853f2c9a192f4965ad74a9e94620074ba9d3ef6b492dea" Nov 29 06:33:06 crc kubenswrapper[4594]: I1129 06:33:06.598900 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"300313c65357d8443a853f2c9a192f4965ad74a9e94620074ba9d3ef6b492dea"} err="failed to get container status \"300313c65357d8443a853f2c9a192f4965ad74a9e94620074ba9d3ef6b492dea\": rpc error: code = NotFound desc = could not find container \"300313c65357d8443a853f2c9a192f4965ad74a9e94620074ba9d3ef6b492dea\": container with ID starting with 300313c65357d8443a853f2c9a192f4965ad74a9e94620074ba9d3ef6b492dea not found: ID does not exist" Nov 29 06:33:06 crc kubenswrapper[4594]: I1129 06:33:06.598930 4594 scope.go:117] "RemoveContainer" containerID="4a38e00eae5e017a791500f390524a72401c8ef4871860619c57357e4cd80530" Nov 29 06:33:06 crc kubenswrapper[4594]: E1129 
06:33:06.599239 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a38e00eae5e017a791500f390524a72401c8ef4871860619c57357e4cd80530\": container with ID starting with 4a38e00eae5e017a791500f390524a72401c8ef4871860619c57357e4cd80530 not found: ID does not exist" containerID="4a38e00eae5e017a791500f390524a72401c8ef4871860619c57357e4cd80530" Nov 29 06:33:06 crc kubenswrapper[4594]: I1129 06:33:06.599284 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a38e00eae5e017a791500f390524a72401c8ef4871860619c57357e4cd80530"} err="failed to get container status \"4a38e00eae5e017a791500f390524a72401c8ef4871860619c57357e4cd80530\": rpc error: code = NotFound desc = could not find container \"4a38e00eae5e017a791500f390524a72401c8ef4871860619c57357e4cd80530\": container with ID starting with 4a38e00eae5e017a791500f390524a72401c8ef4871860619c57357e4cd80530 not found: ID does not exist" Nov 29 06:33:08 crc kubenswrapper[4594]: I1129 06:33:08.093138 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17d4400f-ba47-4569-b7fb-a6103999082b" path="/var/lib/kubelet/pods/17d4400f-ba47-4569-b7fb-a6103999082b/volumes" Nov 29 06:33:10 crc kubenswrapper[4594]: I1129 06:33:10.084244 4594 scope.go:117] "RemoveContainer" containerID="10fd441ce81e44bfd8f98a7b676e40e142716adff562f75e1eca845e725191d3" Nov 29 06:33:10 crc kubenswrapper[4594]: E1129 06:33:10.084633 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:33:21 crc kubenswrapper[4594]: I1129 06:33:21.083649 
4594 scope.go:117] "RemoveContainer" containerID="10fd441ce81e44bfd8f98a7b676e40e142716adff562f75e1eca845e725191d3" Nov 29 06:33:21 crc kubenswrapper[4594]: E1129 06:33:21.084417 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:33:34 crc kubenswrapper[4594]: I1129 06:33:34.083916 4594 scope.go:117] "RemoveContainer" containerID="10fd441ce81e44bfd8f98a7b676e40e142716adff562f75e1eca845e725191d3" Nov 29 06:33:34 crc kubenswrapper[4594]: E1129 06:33:34.084983 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:33:48 crc kubenswrapper[4594]: I1129 06:33:48.085021 4594 scope.go:117] "RemoveContainer" containerID="10fd441ce81e44bfd8f98a7b676e40e142716adff562f75e1eca845e725191d3" Nov 29 06:33:48 crc kubenswrapper[4594]: E1129 06:33:48.086212 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:33:59 crc kubenswrapper[4594]: I1129 
06:33:59.083471 4594 scope.go:117] "RemoveContainer" containerID="10fd441ce81e44bfd8f98a7b676e40e142716adff562f75e1eca845e725191d3" Nov 29 06:33:59 crc kubenswrapper[4594]: E1129 06:33:59.084197 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:34:11 crc kubenswrapper[4594]: I1129 06:34:11.084794 4594 scope.go:117] "RemoveContainer" containerID="10fd441ce81e44bfd8f98a7b676e40e142716adff562f75e1eca845e725191d3" Nov 29 06:34:11 crc kubenswrapper[4594]: E1129 06:34:11.085815 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:34:25 crc kubenswrapper[4594]: I1129 06:34:25.084834 4594 scope.go:117] "RemoveContainer" containerID="10fd441ce81e44bfd8f98a7b676e40e142716adff562f75e1eca845e725191d3" Nov 29 06:34:25 crc kubenswrapper[4594]: E1129 06:34:25.086191 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:34:38 crc 
kubenswrapper[4594]: I1129 06:34:38.084867 4594 scope.go:117] "RemoveContainer" containerID="10fd441ce81e44bfd8f98a7b676e40e142716adff562f75e1eca845e725191d3" Nov 29 06:34:38 crc kubenswrapper[4594]: E1129 06:34:38.086824 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:34:53 crc kubenswrapper[4594]: I1129 06:34:53.083564 4594 scope.go:117] "RemoveContainer" containerID="10fd441ce81e44bfd8f98a7b676e40e142716adff562f75e1eca845e725191d3" Nov 29 06:34:53 crc kubenswrapper[4594]: E1129 06:34:53.084649 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:35:05 crc kubenswrapper[4594]: I1129 06:35:05.083560 4594 scope.go:117] "RemoveContainer" containerID="10fd441ce81e44bfd8f98a7b676e40e142716adff562f75e1eca845e725191d3" Nov 29 06:35:05 crc kubenswrapper[4594]: E1129 06:35:05.084366 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 
29 06:35:18 crc kubenswrapper[4594]: I1129 06:35:18.083573 4594 scope.go:117] "RemoveContainer" containerID="10fd441ce81e44bfd8f98a7b676e40e142716adff562f75e1eca845e725191d3" Nov 29 06:35:18 crc kubenswrapper[4594]: E1129 06:35:18.084545 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:35:31 crc kubenswrapper[4594]: I1129 06:35:31.084295 4594 scope.go:117] "RemoveContainer" containerID="10fd441ce81e44bfd8f98a7b676e40e142716adff562f75e1eca845e725191d3" Nov 29 06:35:31 crc kubenswrapper[4594]: E1129 06:35:31.085582 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:35:32 crc kubenswrapper[4594]: I1129 06:35:32.981204 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-h7bnh"] Nov 29 06:35:32 crc kubenswrapper[4594]: E1129 06:35:32.981868 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17d4400f-ba47-4569-b7fb-a6103999082b" containerName="extract-content" Nov 29 06:35:32 crc kubenswrapper[4594]: I1129 06:35:32.981882 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="17d4400f-ba47-4569-b7fb-a6103999082b" containerName="extract-content" Nov 29 06:35:32 crc kubenswrapper[4594]: E1129 06:35:32.981891 4594 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="17d4400f-ba47-4569-b7fb-a6103999082b" containerName="extract-utilities" Nov 29 06:35:32 crc kubenswrapper[4594]: I1129 06:35:32.981897 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="17d4400f-ba47-4569-b7fb-a6103999082b" containerName="extract-utilities" Nov 29 06:35:32 crc kubenswrapper[4594]: E1129 06:35:32.981910 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17d4400f-ba47-4569-b7fb-a6103999082b" containerName="registry-server" Nov 29 06:35:32 crc kubenswrapper[4594]: I1129 06:35:32.981915 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="17d4400f-ba47-4569-b7fb-a6103999082b" containerName="registry-server" Nov 29 06:35:32 crc kubenswrapper[4594]: E1129 06:35:32.981934 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d45c650-437b-489f-a183-ab9b158d4b98" containerName="extract-content" Nov 29 06:35:32 crc kubenswrapper[4594]: I1129 06:35:32.981939 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d45c650-437b-489f-a183-ab9b158d4b98" containerName="extract-content" Nov 29 06:35:32 crc kubenswrapper[4594]: E1129 06:35:32.981947 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d45c650-437b-489f-a183-ab9b158d4b98" containerName="extract-utilities" Nov 29 06:35:32 crc kubenswrapper[4594]: I1129 06:35:32.981953 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d45c650-437b-489f-a183-ab9b158d4b98" containerName="extract-utilities" Nov 29 06:35:32 crc kubenswrapper[4594]: E1129 06:35:32.981975 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d45c650-437b-489f-a183-ab9b158d4b98" containerName="registry-server" Nov 29 06:35:32 crc kubenswrapper[4594]: I1129 06:35:32.981980 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d45c650-437b-489f-a183-ab9b158d4b98" containerName="registry-server" Nov 29 06:35:32 crc kubenswrapper[4594]: I1129 06:35:32.982146 4594 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="17d4400f-ba47-4569-b7fb-a6103999082b" containerName="registry-server" Nov 29 06:35:32 crc kubenswrapper[4594]: I1129 06:35:32.982165 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d45c650-437b-489f-a183-ab9b158d4b98" containerName="registry-server" Nov 29 06:35:32 crc kubenswrapper[4594]: I1129 06:35:32.983495 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-h7bnh" Nov 29 06:35:32 crc kubenswrapper[4594]: I1129 06:35:32.992265 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-h7bnh"] Nov 29 06:35:33 crc kubenswrapper[4594]: I1129 06:35:33.142670 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a6429ab-e885-4db9-934d-8d588f4dee25-catalog-content\") pod \"community-operators-h7bnh\" (UID: \"0a6429ab-e885-4db9-934d-8d588f4dee25\") " pod="openshift-marketplace/community-operators-h7bnh" Nov 29 06:35:33 crc kubenswrapper[4594]: I1129 06:35:33.142871 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rnkf\" (UniqueName: \"kubernetes.io/projected/0a6429ab-e885-4db9-934d-8d588f4dee25-kube-api-access-8rnkf\") pod \"community-operators-h7bnh\" (UID: \"0a6429ab-e885-4db9-934d-8d588f4dee25\") " pod="openshift-marketplace/community-operators-h7bnh" Nov 29 06:35:33 crc kubenswrapper[4594]: I1129 06:35:33.142932 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a6429ab-e885-4db9-934d-8d588f4dee25-utilities\") pod \"community-operators-h7bnh\" (UID: \"0a6429ab-e885-4db9-934d-8d588f4dee25\") " pod="openshift-marketplace/community-operators-h7bnh" Nov 29 06:35:33 crc kubenswrapper[4594]: I1129 06:35:33.245132 4594 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rnkf\" (UniqueName: \"kubernetes.io/projected/0a6429ab-e885-4db9-934d-8d588f4dee25-kube-api-access-8rnkf\") pod \"community-operators-h7bnh\" (UID: \"0a6429ab-e885-4db9-934d-8d588f4dee25\") " pod="openshift-marketplace/community-operators-h7bnh" Nov 29 06:35:33 crc kubenswrapper[4594]: I1129 06:35:33.245227 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a6429ab-e885-4db9-934d-8d588f4dee25-utilities\") pod \"community-operators-h7bnh\" (UID: \"0a6429ab-e885-4db9-934d-8d588f4dee25\") " pod="openshift-marketplace/community-operators-h7bnh" Nov 29 06:35:33 crc kubenswrapper[4594]: I1129 06:35:33.245372 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a6429ab-e885-4db9-934d-8d588f4dee25-catalog-content\") pod \"community-operators-h7bnh\" (UID: \"0a6429ab-e885-4db9-934d-8d588f4dee25\") " pod="openshift-marketplace/community-operators-h7bnh" Nov 29 06:35:33 crc kubenswrapper[4594]: I1129 06:35:33.245842 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a6429ab-e885-4db9-934d-8d588f4dee25-utilities\") pod \"community-operators-h7bnh\" (UID: \"0a6429ab-e885-4db9-934d-8d588f4dee25\") " pod="openshift-marketplace/community-operators-h7bnh" Nov 29 06:35:33 crc kubenswrapper[4594]: I1129 06:35:33.245871 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a6429ab-e885-4db9-934d-8d588f4dee25-catalog-content\") pod \"community-operators-h7bnh\" (UID: \"0a6429ab-e885-4db9-934d-8d588f4dee25\") " pod="openshift-marketplace/community-operators-h7bnh" Nov 29 06:35:33 crc kubenswrapper[4594]: I1129 06:35:33.265942 4594 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-8rnkf\" (UniqueName: \"kubernetes.io/projected/0a6429ab-e885-4db9-934d-8d588f4dee25-kube-api-access-8rnkf\") pod \"community-operators-h7bnh\" (UID: \"0a6429ab-e885-4db9-934d-8d588f4dee25\") " pod="openshift-marketplace/community-operators-h7bnh" Nov 29 06:35:33 crc kubenswrapper[4594]: I1129 06:35:33.300922 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-h7bnh" Nov 29 06:35:33 crc kubenswrapper[4594]: I1129 06:35:33.776396 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-h7bnh"] Nov 29 06:35:33 crc kubenswrapper[4594]: I1129 06:35:33.907201 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h7bnh" event={"ID":"0a6429ab-e885-4db9-934d-8d588f4dee25","Type":"ContainerStarted","Data":"128ae9ddfa5309bf414793c4dfdb1ebe89ef1e34f1a3c50e0c1a1b3d07c1bb96"} Nov 29 06:35:34 crc kubenswrapper[4594]: I1129 06:35:34.920080 4594 generic.go:334] "Generic (PLEG): container finished" podID="0a6429ab-e885-4db9-934d-8d588f4dee25" containerID="4a4f514bb127c3563188a340635ed66995ac84d74b1b04c87f039836b6f199d5" exitCode=0 Nov 29 06:35:34 crc kubenswrapper[4594]: I1129 06:35:34.920176 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h7bnh" event={"ID":"0a6429ab-e885-4db9-934d-8d588f4dee25","Type":"ContainerDied","Data":"4a4f514bb127c3563188a340635ed66995ac84d74b1b04c87f039836b6f199d5"} Nov 29 06:35:36 crc kubenswrapper[4594]: I1129 06:35:36.938825 4594 generic.go:334] "Generic (PLEG): container finished" podID="0a6429ab-e885-4db9-934d-8d588f4dee25" containerID="8058fd5b80bd552757fa759035690c5dcf79e57e4d8b4fef791d9c265d404ea7" exitCode=0 Nov 29 06:35:36 crc kubenswrapper[4594]: I1129 06:35:36.938925 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h7bnh" 
event={"ID":"0a6429ab-e885-4db9-934d-8d588f4dee25","Type":"ContainerDied","Data":"8058fd5b80bd552757fa759035690c5dcf79e57e4d8b4fef791d9c265d404ea7"} Nov 29 06:35:37 crc kubenswrapper[4594]: I1129 06:35:37.950709 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h7bnh" event={"ID":"0a6429ab-e885-4db9-934d-8d588f4dee25","Type":"ContainerStarted","Data":"771ac5b0de73220b41429383a827b5cdde3baa64490ea135193b358d63e08799"} Nov 29 06:35:37 crc kubenswrapper[4594]: I1129 06:35:37.976046 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-h7bnh" podStartSLOduration=3.455021174 podStartE2EDuration="5.976023625s" podCreationTimestamp="2025-11-29 06:35:32 +0000 UTC" firstStartedPulling="2025-11-29 06:35:34.921856342 +0000 UTC m=+4059.162365561" lastFinishedPulling="2025-11-29 06:35:37.442858791 +0000 UTC m=+4061.683368012" observedRunningTime="2025-11-29 06:35:37.965430702 +0000 UTC m=+4062.205939922" watchObservedRunningTime="2025-11-29 06:35:37.976023625 +0000 UTC m=+4062.216532845" Nov 29 06:35:42 crc kubenswrapper[4594]: I1129 06:35:42.083893 4594 scope.go:117] "RemoveContainer" containerID="10fd441ce81e44bfd8f98a7b676e40e142716adff562f75e1eca845e725191d3" Nov 29 06:35:42 crc kubenswrapper[4594]: E1129 06:35:42.084508 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:35:43 crc kubenswrapper[4594]: I1129 06:35:43.301997 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-h7bnh" Nov 29 06:35:43 crc 
kubenswrapper[4594]: I1129 06:35:43.302461 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-h7bnh" Nov 29 06:35:43 crc kubenswrapper[4594]: I1129 06:35:43.344773 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-h7bnh" Nov 29 06:35:44 crc kubenswrapper[4594]: I1129 06:35:44.057680 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-h7bnh" Nov 29 06:35:44 crc kubenswrapper[4594]: I1129 06:35:44.110228 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-h7bnh"] Nov 29 06:35:46 crc kubenswrapper[4594]: I1129 06:35:46.035833 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-h7bnh" podUID="0a6429ab-e885-4db9-934d-8d588f4dee25" containerName="registry-server" containerID="cri-o://771ac5b0de73220b41429383a827b5cdde3baa64490ea135193b358d63e08799" gracePeriod=2 Nov 29 06:35:46 crc kubenswrapper[4594]: I1129 06:35:46.481163 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-h7bnh" Nov 29 06:35:46 crc kubenswrapper[4594]: I1129 06:35:46.665830 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a6429ab-e885-4db9-934d-8d588f4dee25-catalog-content\") pod \"0a6429ab-e885-4db9-934d-8d588f4dee25\" (UID: \"0a6429ab-e885-4db9-934d-8d588f4dee25\") " Nov 29 06:35:46 crc kubenswrapper[4594]: I1129 06:35:46.665887 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8rnkf\" (UniqueName: \"kubernetes.io/projected/0a6429ab-e885-4db9-934d-8d588f4dee25-kube-api-access-8rnkf\") pod \"0a6429ab-e885-4db9-934d-8d588f4dee25\" (UID: \"0a6429ab-e885-4db9-934d-8d588f4dee25\") " Nov 29 06:35:46 crc kubenswrapper[4594]: I1129 06:35:46.666437 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a6429ab-e885-4db9-934d-8d588f4dee25-utilities\") pod \"0a6429ab-e885-4db9-934d-8d588f4dee25\" (UID: \"0a6429ab-e885-4db9-934d-8d588f4dee25\") " Nov 29 06:35:46 crc kubenswrapper[4594]: I1129 06:35:46.667167 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a6429ab-e885-4db9-934d-8d588f4dee25-utilities" (OuterVolumeSpecName: "utilities") pod "0a6429ab-e885-4db9-934d-8d588f4dee25" (UID: "0a6429ab-e885-4db9-934d-8d588f4dee25"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 06:35:46 crc kubenswrapper[4594]: I1129 06:35:46.672770 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a6429ab-e885-4db9-934d-8d588f4dee25-kube-api-access-8rnkf" (OuterVolumeSpecName: "kube-api-access-8rnkf") pod "0a6429ab-e885-4db9-934d-8d588f4dee25" (UID: "0a6429ab-e885-4db9-934d-8d588f4dee25"). InnerVolumeSpecName "kube-api-access-8rnkf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 06:35:46 crc kubenswrapper[4594]: I1129 06:35:46.705784 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a6429ab-e885-4db9-934d-8d588f4dee25-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0a6429ab-e885-4db9-934d-8d588f4dee25" (UID: "0a6429ab-e885-4db9-934d-8d588f4dee25"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 06:35:46 crc kubenswrapper[4594]: I1129 06:35:46.767917 4594 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a6429ab-e885-4db9-934d-8d588f4dee25-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 06:35:46 crc kubenswrapper[4594]: I1129 06:35:46.767942 4594 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a6429ab-e885-4db9-934d-8d588f4dee25-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 06:35:46 crc kubenswrapper[4594]: I1129 06:35:46.767953 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8rnkf\" (UniqueName: \"kubernetes.io/projected/0a6429ab-e885-4db9-934d-8d588f4dee25-kube-api-access-8rnkf\") on node \"crc\" DevicePath \"\"" Nov 29 06:35:47 crc kubenswrapper[4594]: I1129 06:35:47.050028 4594 generic.go:334] "Generic (PLEG): container finished" podID="0a6429ab-e885-4db9-934d-8d588f4dee25" containerID="771ac5b0de73220b41429383a827b5cdde3baa64490ea135193b358d63e08799" exitCode=0 Nov 29 06:35:47 crc kubenswrapper[4594]: I1129 06:35:47.050091 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h7bnh" event={"ID":"0a6429ab-e885-4db9-934d-8d588f4dee25","Type":"ContainerDied","Data":"771ac5b0de73220b41429383a827b5cdde3baa64490ea135193b358d63e08799"} Nov 29 06:35:47 crc kubenswrapper[4594]: I1129 06:35:47.050142 4594 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-h7bnh" event={"ID":"0a6429ab-e885-4db9-934d-8d588f4dee25","Type":"ContainerDied","Data":"128ae9ddfa5309bf414793c4dfdb1ebe89ef1e34f1a3c50e0c1a1b3d07c1bb96"} Nov 29 06:35:47 crc kubenswrapper[4594]: I1129 06:35:47.050164 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-h7bnh" Nov 29 06:35:47 crc kubenswrapper[4594]: I1129 06:35:47.050167 4594 scope.go:117] "RemoveContainer" containerID="771ac5b0de73220b41429383a827b5cdde3baa64490ea135193b358d63e08799" Nov 29 06:35:47 crc kubenswrapper[4594]: I1129 06:35:47.082186 4594 scope.go:117] "RemoveContainer" containerID="8058fd5b80bd552757fa759035690c5dcf79e57e4d8b4fef791d9c265d404ea7" Nov 29 06:35:47 crc kubenswrapper[4594]: I1129 06:35:47.085306 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-h7bnh"] Nov 29 06:35:47 crc kubenswrapper[4594]: I1129 06:35:47.095199 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-h7bnh"] Nov 29 06:35:47 crc kubenswrapper[4594]: I1129 06:35:47.110679 4594 scope.go:117] "RemoveContainer" containerID="4a4f514bb127c3563188a340635ed66995ac84d74b1b04c87f039836b6f199d5" Nov 29 06:35:47 crc kubenswrapper[4594]: I1129 06:35:47.141896 4594 scope.go:117] "RemoveContainer" containerID="771ac5b0de73220b41429383a827b5cdde3baa64490ea135193b358d63e08799" Nov 29 06:35:47 crc kubenswrapper[4594]: E1129 06:35:47.142280 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"771ac5b0de73220b41429383a827b5cdde3baa64490ea135193b358d63e08799\": container with ID starting with 771ac5b0de73220b41429383a827b5cdde3baa64490ea135193b358d63e08799 not found: ID does not exist" containerID="771ac5b0de73220b41429383a827b5cdde3baa64490ea135193b358d63e08799" Nov 29 06:35:47 crc kubenswrapper[4594]: I1129 
06:35:47.142326 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"771ac5b0de73220b41429383a827b5cdde3baa64490ea135193b358d63e08799"} err="failed to get container status \"771ac5b0de73220b41429383a827b5cdde3baa64490ea135193b358d63e08799\": rpc error: code = NotFound desc = could not find container \"771ac5b0de73220b41429383a827b5cdde3baa64490ea135193b358d63e08799\": container with ID starting with 771ac5b0de73220b41429383a827b5cdde3baa64490ea135193b358d63e08799 not found: ID does not exist" Nov 29 06:35:47 crc kubenswrapper[4594]: I1129 06:35:47.142355 4594 scope.go:117] "RemoveContainer" containerID="8058fd5b80bd552757fa759035690c5dcf79e57e4d8b4fef791d9c265d404ea7" Nov 29 06:35:47 crc kubenswrapper[4594]: E1129 06:35:47.142722 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8058fd5b80bd552757fa759035690c5dcf79e57e4d8b4fef791d9c265d404ea7\": container with ID starting with 8058fd5b80bd552757fa759035690c5dcf79e57e4d8b4fef791d9c265d404ea7 not found: ID does not exist" containerID="8058fd5b80bd552757fa759035690c5dcf79e57e4d8b4fef791d9c265d404ea7" Nov 29 06:35:47 crc kubenswrapper[4594]: I1129 06:35:47.142758 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8058fd5b80bd552757fa759035690c5dcf79e57e4d8b4fef791d9c265d404ea7"} err="failed to get container status \"8058fd5b80bd552757fa759035690c5dcf79e57e4d8b4fef791d9c265d404ea7\": rpc error: code = NotFound desc = could not find container \"8058fd5b80bd552757fa759035690c5dcf79e57e4d8b4fef791d9c265d404ea7\": container with ID starting with 8058fd5b80bd552757fa759035690c5dcf79e57e4d8b4fef791d9c265d404ea7 not found: ID does not exist" Nov 29 06:35:47 crc kubenswrapper[4594]: I1129 06:35:47.142785 4594 scope.go:117] "RemoveContainer" containerID="4a4f514bb127c3563188a340635ed66995ac84d74b1b04c87f039836b6f199d5" Nov 29 06:35:47 crc 
kubenswrapper[4594]: E1129 06:35:47.143030 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a4f514bb127c3563188a340635ed66995ac84d74b1b04c87f039836b6f199d5\": container with ID starting with 4a4f514bb127c3563188a340635ed66995ac84d74b1b04c87f039836b6f199d5 not found: ID does not exist" containerID="4a4f514bb127c3563188a340635ed66995ac84d74b1b04c87f039836b6f199d5"
Nov 29 06:35:47 crc kubenswrapper[4594]: I1129 06:35:47.143057 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a4f514bb127c3563188a340635ed66995ac84d74b1b04c87f039836b6f199d5"} err="failed to get container status \"4a4f514bb127c3563188a340635ed66995ac84d74b1b04c87f039836b6f199d5\": rpc error: code = NotFound desc = could not find container \"4a4f514bb127c3563188a340635ed66995ac84d74b1b04c87f039836b6f199d5\": container with ID starting with 4a4f514bb127c3563188a340635ed66995ac84d74b1b04c87f039836b6f199d5 not found: ID does not exist"
Nov 29 06:35:48 crc kubenswrapper[4594]: I1129 06:35:48.095870 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a6429ab-e885-4db9-934d-8d588f4dee25" path="/var/lib/kubelet/pods/0a6429ab-e885-4db9-934d-8d588f4dee25/volumes"
Nov 29 06:35:53 crc kubenswrapper[4594]: I1129 06:35:53.084531 4594 scope.go:117] "RemoveContainer" containerID="10fd441ce81e44bfd8f98a7b676e40e142716adff562f75e1eca845e725191d3"
Nov 29 06:35:53 crc kubenswrapper[4594]: E1129 06:35:53.085858 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5"
Nov 29 06:36:04 crc kubenswrapper[4594]: I1129 06:36:04.084096 4594 scope.go:117] "RemoveContainer" containerID="10fd441ce81e44bfd8f98a7b676e40e142716adff562f75e1eca845e725191d3"
Nov 29 06:36:04 crc kubenswrapper[4594]: E1129 06:36:04.085070 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5"
Nov 29 06:36:11 crc kubenswrapper[4594]: E1129 06:36:11.736026 4594 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 192.168.25.120:37668->192.168.25.120:45015: write tcp 192.168.25.120:37668->192.168.25.120:45015: write: broken pipe
Nov 29 06:36:12 crc kubenswrapper[4594]: E1129 06:36:12.537142 4594 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 192.168.25.120:37708->192.168.25.120:45015: read tcp 192.168.25.120:37708->192.168.25.120:45015: read: connection reset by peer
Nov 29 06:36:17 crc kubenswrapper[4594]: I1129 06:36:17.083094 4594 scope.go:117] "RemoveContainer" containerID="10fd441ce81e44bfd8f98a7b676e40e142716adff562f75e1eca845e725191d3"
Nov 29 06:36:17 crc kubenswrapper[4594]: E1129 06:36:17.084192 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5"
Nov 29 06:36:30 crc kubenswrapper[4594]: I1129 06:36:30.084662 4594 scope.go:117] "RemoveContainer" containerID="10fd441ce81e44bfd8f98a7b676e40e142716adff562f75e1eca845e725191d3"
Nov 29 06:36:30 crc kubenswrapper[4594]: E1129 06:36:30.085709 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5"
Nov 29 06:36:45 crc kubenswrapper[4594]: I1129 06:36:45.084468 4594 scope.go:117] "RemoveContainer" containerID="10fd441ce81e44bfd8f98a7b676e40e142716adff562f75e1eca845e725191d3"
Nov 29 06:36:45 crc kubenswrapper[4594]: E1129 06:36:45.085560 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5"
Nov 29 06:36:57 crc kubenswrapper[4594]: I1129 06:36:57.083641 4594 scope.go:117] "RemoveContainer" containerID="10fd441ce81e44bfd8f98a7b676e40e142716adff562f75e1eca845e725191d3"
Nov 29 06:36:57 crc kubenswrapper[4594]: E1129 06:36:57.084943 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5"
Nov 29 06:37:11 crc kubenswrapper[4594]: I1129 06:37:11.083760 4594 scope.go:117] "RemoveContainer" containerID="10fd441ce81e44bfd8f98a7b676e40e142716adff562f75e1eca845e725191d3"
Nov 29 06:37:11 crc kubenswrapper[4594]: E1129 06:37:11.085683 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5"
Nov 29 06:37:24 crc kubenswrapper[4594]: I1129 06:37:24.084086 4594 scope.go:117] "RemoveContainer" containerID="10fd441ce81e44bfd8f98a7b676e40e142716adff562f75e1eca845e725191d3"
Nov 29 06:37:25 crc kubenswrapper[4594]: I1129 06:37:25.051739 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" event={"ID":"509844d4-3ee1-4059-84bb-6e90200f50c5","Type":"ContainerStarted","Data":"1357d3affb462aa55998dd14fc93fd9a1c6bafde00c8c879ab529e678190969a"}
Nov 29 06:39:33 crc kubenswrapper[4594]: I1129 06:39:33.386804 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-nsf2g"]
Nov 29 06:39:33 crc kubenswrapper[4594]: E1129 06:39:33.387855 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a6429ab-e885-4db9-934d-8d588f4dee25" containerName="extract-utilities"
Nov 29 06:39:33 crc kubenswrapper[4594]: I1129 06:39:33.387871 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a6429ab-e885-4db9-934d-8d588f4dee25" containerName="extract-utilities"
Nov 29 06:39:33 crc kubenswrapper[4594]: E1129 06:39:33.387885 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a6429ab-e885-4db9-934d-8d588f4dee25" containerName="registry-server"
Nov 29 06:39:33 crc kubenswrapper[4594]: I1129 06:39:33.387892 4594 state_mem.go:107] "Deleted CPUSet assignment"
podUID="0a6429ab-e885-4db9-934d-8d588f4dee25" containerName="registry-server"
Nov 29 06:39:33 crc kubenswrapper[4594]: E1129 06:39:33.387909 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a6429ab-e885-4db9-934d-8d588f4dee25" containerName="extract-content"
Nov 29 06:39:33 crc kubenswrapper[4594]: I1129 06:39:33.387915 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a6429ab-e885-4db9-934d-8d588f4dee25" containerName="extract-content"
Nov 29 06:39:33 crc kubenswrapper[4594]: I1129 06:39:33.388177 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a6429ab-e885-4db9-934d-8d588f4dee25" containerName="registry-server"
Nov 29 06:39:33 crc kubenswrapper[4594]: I1129 06:39:33.390078 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nsf2g"
Nov 29 06:39:33 crc kubenswrapper[4594]: I1129 06:39:33.417862 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nsf2g"]
Nov 29 06:39:33 crc kubenswrapper[4594]: I1129 06:39:33.540689 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/715d426b-5a54-428a-9343-7cd18ce3ab78-utilities\") pod \"certified-operators-nsf2g\" (UID: \"715d426b-5a54-428a-9343-7cd18ce3ab78\") " pod="openshift-marketplace/certified-operators-nsf2g"
Nov 29 06:39:33 crc kubenswrapper[4594]: I1129 06:39:33.540829 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/715d426b-5a54-428a-9343-7cd18ce3ab78-catalog-content\") pod \"certified-operators-nsf2g\" (UID: \"715d426b-5a54-428a-9343-7cd18ce3ab78\") " pod="openshift-marketplace/certified-operators-nsf2g"
Nov 29 06:39:33 crc kubenswrapper[4594]: I1129 06:39:33.540883 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwtjd\" (UniqueName: \"kubernetes.io/projected/715d426b-5a54-428a-9343-7cd18ce3ab78-kube-api-access-rwtjd\") pod \"certified-operators-nsf2g\" (UID: \"715d426b-5a54-428a-9343-7cd18ce3ab78\") " pod="openshift-marketplace/certified-operators-nsf2g"
Nov 29 06:39:33 crc kubenswrapper[4594]: I1129 06:39:33.643086 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/715d426b-5a54-428a-9343-7cd18ce3ab78-catalog-content\") pod \"certified-operators-nsf2g\" (UID: \"715d426b-5a54-428a-9343-7cd18ce3ab78\") " pod="openshift-marketplace/certified-operators-nsf2g"
Nov 29 06:39:33 crc kubenswrapper[4594]: I1129 06:39:33.643162 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwtjd\" (UniqueName: \"kubernetes.io/projected/715d426b-5a54-428a-9343-7cd18ce3ab78-kube-api-access-rwtjd\") pod \"certified-operators-nsf2g\" (UID: \"715d426b-5a54-428a-9343-7cd18ce3ab78\") " pod="openshift-marketplace/certified-operators-nsf2g"
Nov 29 06:39:33 crc kubenswrapper[4594]: I1129 06:39:33.643233 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/715d426b-5a54-428a-9343-7cd18ce3ab78-utilities\") pod \"certified-operators-nsf2g\" (UID: \"715d426b-5a54-428a-9343-7cd18ce3ab78\") " pod="openshift-marketplace/certified-operators-nsf2g"
Nov 29 06:39:33 crc kubenswrapper[4594]: I1129 06:39:33.643633 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/715d426b-5a54-428a-9343-7cd18ce3ab78-catalog-content\") pod \"certified-operators-nsf2g\" (UID: \"715d426b-5a54-428a-9343-7cd18ce3ab78\") " pod="openshift-marketplace/certified-operators-nsf2g"
Nov 29 06:39:33 crc kubenswrapper[4594]: I1129 06:39:33.643747 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/715d426b-5a54-428a-9343-7cd18ce3ab78-utilities\") pod \"certified-operators-nsf2g\" (UID: \"715d426b-5a54-428a-9343-7cd18ce3ab78\") " pod="openshift-marketplace/certified-operators-nsf2g"
Nov 29 06:39:33 crc kubenswrapper[4594]: I1129 06:39:33.725322 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwtjd\" (UniqueName: \"kubernetes.io/projected/715d426b-5a54-428a-9343-7cd18ce3ab78-kube-api-access-rwtjd\") pod \"certified-operators-nsf2g\" (UID: \"715d426b-5a54-428a-9343-7cd18ce3ab78\") " pod="openshift-marketplace/certified-operators-nsf2g"
Nov 29 06:39:34 crc kubenswrapper[4594]: I1129 06:39:34.016803 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nsf2g"
Nov 29 06:39:34 crc kubenswrapper[4594]: I1129 06:39:34.455949 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nsf2g"]
Nov 29 06:39:35 crc kubenswrapper[4594]: I1129 06:39:35.305104 4594 generic.go:334] "Generic (PLEG): container finished" podID="715d426b-5a54-428a-9343-7cd18ce3ab78" containerID="534741b5a7675fd97f438ab7af3d3a9eadd3ef8d7a5af2b8f074b47494a767f4" exitCode=0
Nov 29 06:39:35 crc kubenswrapper[4594]: I1129 06:39:35.305206 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nsf2g" event={"ID":"715d426b-5a54-428a-9343-7cd18ce3ab78","Type":"ContainerDied","Data":"534741b5a7675fd97f438ab7af3d3a9eadd3ef8d7a5af2b8f074b47494a767f4"}
Nov 29 06:39:35 crc kubenswrapper[4594]: I1129 06:39:35.305572 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nsf2g" event={"ID":"715d426b-5a54-428a-9343-7cd18ce3ab78","Type":"ContainerStarted","Data":"72a8ee0062e642406587e32bdfdf6f080a4bdaa01e2b594efa54acdf7e1e9924"}
Nov 29 06:39:35 crc kubenswrapper[4594]: I1129 06:39:35.307814 4594 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Nov 29 06:39:36 crc kubenswrapper[4594]: I1129 06:39:36.319731 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nsf2g" event={"ID":"715d426b-5a54-428a-9343-7cd18ce3ab78","Type":"ContainerStarted","Data":"c824f70c89cc17409e3926a1943410bc915e9b0b37c9fd2a26a4c9473670baee"}
Nov 29 06:39:37 crc kubenswrapper[4594]: I1129 06:39:37.330490 4594 generic.go:334] "Generic (PLEG): container finished" podID="715d426b-5a54-428a-9343-7cd18ce3ab78" containerID="c824f70c89cc17409e3926a1943410bc915e9b0b37c9fd2a26a4c9473670baee" exitCode=0
Nov 29 06:39:37 crc kubenswrapper[4594]: I1129 06:39:37.330547 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nsf2g" event={"ID":"715d426b-5a54-428a-9343-7cd18ce3ab78","Type":"ContainerDied","Data":"c824f70c89cc17409e3926a1943410bc915e9b0b37c9fd2a26a4c9473670baee"}
Nov 29 06:39:38 crc kubenswrapper[4594]: I1129 06:39:38.340164 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nsf2g" event={"ID":"715d426b-5a54-428a-9343-7cd18ce3ab78","Type":"ContainerStarted","Data":"79b05d9b092a8e004a2586242eabd6660a8ab6a460aa934ad5aea7487b6d58c1"}
Nov 29 06:39:38 crc kubenswrapper[4594]: I1129 06:39:38.362819 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-nsf2g" podStartSLOduration=2.6949477379999998 podStartE2EDuration="5.362805111s" podCreationTimestamp="2025-11-29 06:39:33 +0000 UTC" firstStartedPulling="2025-11-29 06:39:35.307526929 +0000 UTC m=+4299.548036150" lastFinishedPulling="2025-11-29 06:39:37.975384303 +0000 UTC m=+4302.215893523" observedRunningTime="2025-11-29 06:39:38.359530787 +0000 UTC m=+4302.600040007" watchObservedRunningTime="2025-11-29 06:39:38.362805111 +0000 UTC
m=+4302.603314332"
Nov 29 06:39:40 crc kubenswrapper[4594]: I1129 06:39:40.358998 4594 generic.go:334] "Generic (PLEG): container finished" podID="ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad" containerID="79da2dc66d89449b2bbdafcb72784e37c6308918ff21277920f29722555a1081" exitCode=0
Nov 29 06:39:40 crc kubenswrapper[4594]: I1129 06:39:40.359106 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad","Type":"ContainerDied","Data":"79da2dc66d89449b2bbdafcb72784e37c6308918ff21277920f29722555a1081"}
Nov 29 06:39:41 crc kubenswrapper[4594]: I1129 06:39:41.700578 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest"
Nov 29 06:39:41 crc kubenswrapper[4594]: I1129 06:39:41.729362 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad-config-data\") pod \"ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad\" (UID: \"ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad\") "
Nov 29 06:39:41 crc kubenswrapper[4594]: I1129 06:39:41.729488 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zjmdh\" (UniqueName: \"kubernetes.io/projected/ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad-kube-api-access-zjmdh\") pod \"ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad\" (UID: \"ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad\") "
Nov 29 06:39:41 crc kubenswrapper[4594]: I1129 06:39:41.729577 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad-openstack-config\") pod \"ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad\" (UID: \"ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad\") "
Nov 29 06:39:41 crc kubenswrapper[4594]: I1129 06:39:41.729639 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad-openstack-config-secret\") pod \"ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad\" (UID: \"ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad\") "
Nov 29 06:39:41 crc kubenswrapper[4594]: I1129 06:39:41.729705 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad-test-operator-ephemeral-temporary\") pod \"ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad\" (UID: \"ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad\") "
Nov 29 06:39:41 crc kubenswrapper[4594]: I1129 06:39:41.729778 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad-ssh-key\") pod \"ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad\" (UID: \"ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad\") "
Nov 29 06:39:41 crc kubenswrapper[4594]: I1129 06:39:41.729869 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad-ca-certs\") pod \"ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad\" (UID: \"ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad\") "
Nov 29 06:39:41 crc kubenswrapper[4594]: I1129 06:39:41.729901 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad\" (UID: \"ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad\") "
Nov 29 06:39:41 crc kubenswrapper[4594]: I1129 06:39:41.729932 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad-test-operator-ephemeral-workdir\") pod \"ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad\" (UID: \"ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad\") "
Nov 29 06:39:41 crc kubenswrapper[4594]: I1129 06:39:41.730449 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad-config-data" (OuterVolumeSpecName: "config-data") pod "ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad" (UID: "ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 29 06:39:41 crc kubenswrapper[4594]: I1129 06:39:41.735511 4594 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad-config-data\") on node \"crc\" DevicePath \"\""
Nov 29 06:39:41 crc kubenswrapper[4594]: I1129 06:39:41.739080 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad" (UID: "ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 29 06:39:41 crc kubenswrapper[4594]: I1129 06:39:41.742472 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad" (UID: "ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 29 06:39:41 crc kubenswrapper[4594]: I1129 06:39:41.746755 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad-kube-api-access-zjmdh" (OuterVolumeSpecName: "kube-api-access-zjmdh") pod "ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad" (UID: "ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad"). InnerVolumeSpecName "kube-api-access-zjmdh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 06:39:41 crc kubenswrapper[4594]: I1129 06:39:41.768265 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "test-operator-logs") pod "ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad" (UID: "ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Nov 29 06:39:41 crc kubenswrapper[4594]: I1129 06:39:41.777136 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad" (UID: "ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 06:39:41 crc kubenswrapper[4594]: I1129 06:39:41.779240 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad" (UID: "ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad"). InnerVolumeSpecName "ca-certs".
PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 06:39:41 crc kubenswrapper[4594]: I1129 06:39:41.791854 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad" (UID: "ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 06:39:41 crc kubenswrapper[4594]: I1129 06:39:41.799304 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad" (UID: "ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 29 06:39:41 crc kubenswrapper[4594]: I1129 06:39:41.838521 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zjmdh\" (UniqueName: \"kubernetes.io/projected/ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad-kube-api-access-zjmdh\") on node \"crc\" DevicePath \"\""
Nov 29 06:39:41 crc kubenswrapper[4594]: I1129 06:39:41.838556 4594 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad-openstack-config\") on node \"crc\" DevicePath \"\""
Nov 29 06:39:41 crc kubenswrapper[4594]: I1129 06:39:41.838566 4594 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad-openstack-config-secret\") on node \"crc\" DevicePath \"\""
Nov 29 06:39:41 crc kubenswrapper[4594]: I1129 06:39:41.838580 4594 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\""
Nov 29 06:39:41 crc kubenswrapper[4594]: I1129 06:39:41.838594 4594 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad-ssh-key\") on node \"crc\" DevicePath \"\""
Nov 29 06:39:41 crc kubenswrapper[4594]: I1129 06:39:41.838604 4594 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad-ca-certs\") on node \"crc\" DevicePath \"\""
Nov 29 06:39:41 crc kubenswrapper[4594]: I1129 06:39:41.838647 4594 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" "
Nov 29 06:39:41 crc kubenswrapper[4594]: I1129 06:39:41.838662 4594 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\""
Nov 29 06:39:41 crc kubenswrapper[4594]: I1129 06:39:41.860230 4594 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc"
Nov 29 06:39:41 crc kubenswrapper[4594]: I1129 06:39:41.942210 4594 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\""
Nov 29 06:39:42 crc kubenswrapper[4594]: I1129 06:39:42.381154 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad","Type":"ContainerDied","Data":"0618685a982b09ad35417b449a632512cfac05da0caf927554bcc12abf3ab382"}
Nov 29 06:39:42 crc kubenswrapper[4594]: I1129 06:39:42.381198 4594 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0618685a982b09ad35417b449a632512cfac05da0caf927554bcc12abf3ab382"
Nov 29 06:39:42 crc kubenswrapper[4594]: I1129 06:39:42.381239 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest"
Nov 29 06:39:44 crc kubenswrapper[4594]: I1129 06:39:44.017017 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-nsf2g"
Nov 29 06:39:44 crc kubenswrapper[4594]: I1129 06:39:44.017365 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-nsf2g"
Nov 29 06:39:44 crc kubenswrapper[4594]: I1129 06:39:44.539720 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-nsf2g"
Nov 29 06:39:44 crc kubenswrapper[4594]: I1129 06:39:44.584201 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-nsf2g"
Nov 29 06:39:45 crc kubenswrapper[4594]: I1129 06:39:45.800436 4594 patch_prober.go:28] interesting pod/machine-config-daemon-ggz4n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 29 06:39:45 crc kubenswrapper[4594]: I1129 06:39:45.800519 4594 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 29 06:39:46 crc kubenswrapper[4594]: I1129 06:39:46.178385 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nsf2g"]
Nov 29 06:39:46 crc kubenswrapper[4594]: I1129 06:39:46.418743 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-nsf2g" podUID="715d426b-5a54-428a-9343-7cd18ce3ab78" containerName="registry-server" containerID="cri-o://79b05d9b092a8e004a2586242eabd6660a8ab6a460aa934ad5aea7487b6d58c1" gracePeriod=2
Nov 29 06:39:47 crc kubenswrapper[4594]: I1129 06:39:47.336558 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nsf2g"
Nov 29 06:39:47 crc kubenswrapper[4594]: I1129 06:39:47.361743 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/715d426b-5a54-428a-9343-7cd18ce3ab78-catalog-content\") pod \"715d426b-5a54-428a-9343-7cd18ce3ab78\" (UID: \"715d426b-5a54-428a-9343-7cd18ce3ab78\") "
Nov 29 06:39:47 crc kubenswrapper[4594]: I1129 06:39:47.361880 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rwtjd\" (UniqueName: \"kubernetes.io/projected/715d426b-5a54-428a-9343-7cd18ce3ab78-kube-api-access-rwtjd\") pod \"715d426b-5a54-428a-9343-7cd18ce3ab78\" (UID: \"715d426b-5a54-428a-9343-7cd18ce3ab78\") "
Nov 29 06:39:47 crc kubenswrapper[4594]: I1129 06:39:47.362211 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/715d426b-5a54-428a-9343-7cd18ce3ab78-utilities\") pod \"715d426b-5a54-428a-9343-7cd18ce3ab78\" (UID: \"715d426b-5a54-428a-9343-7cd18ce3ab78\") "
Nov 29 06:39:47 crc kubenswrapper[4594]: I1129 06:39:47.364133 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/715d426b-5a54-428a-9343-7cd18ce3ab78-utilities" (OuterVolumeSpecName: "utilities") pod "715d426b-5a54-428a-9343-7cd18ce3ab78"
(UID: "715d426b-5a54-428a-9343-7cd18ce3ab78"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 29 06:39:47 crc kubenswrapper[4594]: I1129 06:39:47.369953 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/715d426b-5a54-428a-9343-7cd18ce3ab78-kube-api-access-rwtjd" (OuterVolumeSpecName: "kube-api-access-rwtjd") pod "715d426b-5a54-428a-9343-7cd18ce3ab78" (UID: "715d426b-5a54-428a-9343-7cd18ce3ab78"). InnerVolumeSpecName "kube-api-access-rwtjd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 06:39:47 crc kubenswrapper[4594]: I1129 06:39:47.407979 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/715d426b-5a54-428a-9343-7cd18ce3ab78-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "715d426b-5a54-428a-9343-7cd18ce3ab78" (UID: "715d426b-5a54-428a-9343-7cd18ce3ab78"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 29 06:39:47 crc kubenswrapper[4594]: I1129 06:39:47.429122 4594 generic.go:334] "Generic (PLEG): container finished" podID="715d426b-5a54-428a-9343-7cd18ce3ab78" containerID="79b05d9b092a8e004a2586242eabd6660a8ab6a460aa934ad5aea7487b6d58c1" exitCode=0
Nov 29 06:39:47 crc kubenswrapper[4594]: I1129 06:39:47.429183 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nsf2g" event={"ID":"715d426b-5a54-428a-9343-7cd18ce3ab78","Type":"ContainerDied","Data":"79b05d9b092a8e004a2586242eabd6660a8ab6a460aa934ad5aea7487b6d58c1"}
Nov 29 06:39:47 crc kubenswrapper[4594]: I1129 06:39:47.429214 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nsf2g" event={"ID":"715d426b-5a54-428a-9343-7cd18ce3ab78","Type":"ContainerDied","Data":"72a8ee0062e642406587e32bdfdf6f080a4bdaa01e2b594efa54acdf7e1e9924"}
Nov 29 06:39:47 crc kubenswrapper[4594]: I1129 06:39:47.429221 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nsf2g"
Nov 29 06:39:47 crc kubenswrapper[4594]: I1129 06:39:47.429233 4594 scope.go:117] "RemoveContainer" containerID="79b05d9b092a8e004a2586242eabd6660a8ab6a460aa934ad5aea7487b6d58c1"
Nov 29 06:39:47 crc kubenswrapper[4594]: I1129 06:39:47.449499 4594 scope.go:117] "RemoveContainer" containerID="c824f70c89cc17409e3926a1943410bc915e9b0b37c9fd2a26a4c9473670baee"
Nov 29 06:39:47 crc kubenswrapper[4594]: I1129 06:39:47.458670 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nsf2g"]
Nov 29 06:39:47 crc kubenswrapper[4594]: I1129 06:39:47.464830 4594 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/715d426b-5a54-428a-9343-7cd18ce3ab78-utilities\") on node \"crc\" DevicePath \"\""
Nov 29 06:39:47 crc kubenswrapper[4594]: I1129 06:39:47.464862 4594 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/715d426b-5a54-428a-9343-7cd18ce3ab78-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 29 06:39:47 crc kubenswrapper[4594]: I1129 06:39:47.464874 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rwtjd\" (UniqueName: \"kubernetes.io/projected/715d426b-5a54-428a-9343-7cd18ce3ab78-kube-api-access-rwtjd\") on node \"crc\" DevicePath \"\""
Nov 29 06:39:47 crc kubenswrapper[4594]: I1129 06:39:47.466181 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-nsf2g"]
Nov 29 06:39:47 crc kubenswrapper[4594]: I1129 06:39:47.497702 4594 scope.go:117] "RemoveContainer" containerID="534741b5a7675fd97f438ab7af3d3a9eadd3ef8d7a5af2b8f074b47494a767f4"
Nov 29 06:39:47 crc kubenswrapper[4594]: I1129 06:39:47.527119 4594 scope.go:117] "RemoveContainer" containerID="79b05d9b092a8e004a2586242eabd6660a8ab6a460aa934ad5aea7487b6d58c1"
Nov 29 06:39:47 crc kubenswrapper[4594]: E1129 06:39:47.527744 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79b05d9b092a8e004a2586242eabd6660a8ab6a460aa934ad5aea7487b6d58c1\": container with ID starting with 79b05d9b092a8e004a2586242eabd6660a8ab6a460aa934ad5aea7487b6d58c1 not found: ID does not exist" containerID="79b05d9b092a8e004a2586242eabd6660a8ab6a460aa934ad5aea7487b6d58c1"
Nov 29 06:39:47 crc kubenswrapper[4594]: I1129 06:39:47.527848 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79b05d9b092a8e004a2586242eabd6660a8ab6a460aa934ad5aea7487b6d58c1"} err="failed to get container status \"79b05d9b092a8e004a2586242eabd6660a8ab6a460aa934ad5aea7487b6d58c1\": rpc error: code = NotFound desc = could not find container \"79b05d9b092a8e004a2586242eabd6660a8ab6a460aa934ad5aea7487b6d58c1\": container with ID starting with 79b05d9b092a8e004a2586242eabd6660a8ab6a460aa934ad5aea7487b6d58c1 not found: ID does not exist"
Nov 29 06:39:47 crc kubenswrapper[4594]: I1129 06:39:47.527930 4594 scope.go:117] "RemoveContainer" containerID="c824f70c89cc17409e3926a1943410bc915e9b0b37c9fd2a26a4c9473670baee"
Nov 29 06:39:47 crc kubenswrapper[4594]: E1129 06:39:47.528281 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c824f70c89cc17409e3926a1943410bc915e9b0b37c9fd2a26a4c9473670baee\": container with ID starting with c824f70c89cc17409e3926a1943410bc915e9b0b37c9fd2a26a4c9473670baee not found: ID does not exist" containerID="c824f70c89cc17409e3926a1943410bc915e9b0b37c9fd2a26a4c9473670baee"
Nov 29 06:39:47 crc kubenswrapper[4594]: I1129 06:39:47.528330 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c824f70c89cc17409e3926a1943410bc915e9b0b37c9fd2a26a4c9473670baee"} err="failed to get container status \"c824f70c89cc17409e3926a1943410bc915e9b0b37c9fd2a26a4c9473670baee\": rpc error: code = NotFound desc = could not find container \"c824f70c89cc17409e3926a1943410bc915e9b0b37c9fd2a26a4c9473670baee\": container with ID starting with c824f70c89cc17409e3926a1943410bc915e9b0b37c9fd2a26a4c9473670baee not found: ID does not exist"
Nov 29 06:39:47 crc kubenswrapper[4594]: I1129 06:39:47.528358 4594 scope.go:117] "RemoveContainer" containerID="534741b5a7675fd97f438ab7af3d3a9eadd3ef8d7a5af2b8f074b47494a767f4"
Nov 29 06:39:47 crc kubenswrapper[4594]: E1129 06:39:47.528973 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"534741b5a7675fd97f438ab7af3d3a9eadd3ef8d7a5af2b8f074b47494a767f4\": container with ID starting with 534741b5a7675fd97f438ab7af3d3a9eadd3ef8d7a5af2b8f074b47494a767f4 not found: ID does not exist" containerID="534741b5a7675fd97f438ab7af3d3a9eadd3ef8d7a5af2b8f074b47494a767f4"
Nov 29 06:39:47 crc kubenswrapper[4594]: I1129 06:39:47.529002 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"534741b5a7675fd97f438ab7af3d3a9eadd3ef8d7a5af2b8f074b47494a767f4"} err="failed to get container status \"534741b5a7675fd97f438ab7af3d3a9eadd3ef8d7a5af2b8f074b47494a767f4\": rpc error: code = NotFound desc = could not find container \"534741b5a7675fd97f438ab7af3d3a9eadd3ef8d7a5af2b8f074b47494a767f4\": container with ID starting with 534741b5a7675fd97f438ab7af3d3a9eadd3ef8d7a5af2b8f074b47494a767f4 not found: ID does not exist"
Nov 29 06:39:48 crc kubenswrapper[4594]: I1129 06:39:48.096537 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="715d426b-5a54-428a-9343-7cd18ce3ab78" path="/var/lib/kubelet/pods/715d426b-5a54-428a-9343-7cd18ce3ab78/volumes"
Nov 29 06:39:50 crc kubenswrapper[4594]: I1129 06:39:50.815157 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"]
Nov 29 06:39:50 crc kubenswrapper[4594]: E1129 06:39:50.815966 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="715d426b-5a54-428a-9343-7cd18ce3ab78" containerName="extract-utilities"
Nov 29 06:39:50 crc kubenswrapper[4594]: I1129 06:39:50.815982 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="715d426b-5a54-428a-9343-7cd18ce3ab78" containerName="extract-utilities"
Nov 29 06:39:50 crc kubenswrapper[4594]: E1129 06:39:50.816001 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad" containerName="tempest-tests-tempest-tests-runner"
Nov 29 06:39:50 crc kubenswrapper[4594]: I1129 06:39:50.816009 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad" containerName="tempest-tests-tempest-tests-runner"
Nov 29 06:39:50 crc kubenswrapper[4594]: E1129 06:39:50.816018 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="715d426b-5a54-428a-9343-7cd18ce3ab78" containerName="extract-content"
Nov 29 06:39:50 crc kubenswrapper[4594]: I1129 06:39:50.816024 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="715d426b-5a54-428a-9343-7cd18ce3ab78" containerName="extract-content"
Nov 29 06:39:50 crc kubenswrapper[4594]: E1129 06:39:50.816057 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="715d426b-5a54-428a-9343-7cd18ce3ab78" containerName="registry-server"
Nov 29 06:39:50 crc kubenswrapper[4594]: I1129 06:39:50.816062 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="715d426b-5a54-428a-9343-7cd18ce3ab78" containerName="registry-server"
Nov 29 06:39:50 crc kubenswrapper[4594]: I1129 06:39:50.816359 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad" containerName="tempest-tests-tempest-tests-runner"
Nov 29
06:39:50 crc kubenswrapper[4594]: I1129 06:39:50.816378 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="715d426b-5a54-428a-9343-7cd18ce3ab78" containerName="registry-server" Nov 29 06:39:50 crc kubenswrapper[4594]: I1129 06:39:50.817238 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 29 06:39:50 crc kubenswrapper[4594]: I1129 06:39:50.819588 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-lm66r" Nov 29 06:39:50 crc kubenswrapper[4594]: I1129 06:39:50.822271 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Nov 29 06:39:50 crc kubenswrapper[4594]: I1129 06:39:50.937165 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"ceac441b-e510-49a4-aefb-26fad57552d2\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 29 06:39:50 crc kubenswrapper[4594]: I1129 06:39:50.937440 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9g46\" (UniqueName: \"kubernetes.io/projected/ceac441b-e510-49a4-aefb-26fad57552d2-kube-api-access-z9g46\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"ceac441b-e510-49a4-aefb-26fad57552d2\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 29 06:39:51 crc kubenswrapper[4594]: I1129 06:39:51.040441 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: 
\"ceac441b-e510-49a4-aefb-26fad57552d2\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 29 06:39:51 crc kubenswrapper[4594]: I1129 06:39:51.040511 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9g46\" (UniqueName: \"kubernetes.io/projected/ceac441b-e510-49a4-aefb-26fad57552d2-kube-api-access-z9g46\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"ceac441b-e510-49a4-aefb-26fad57552d2\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 29 06:39:51 crc kubenswrapper[4594]: I1129 06:39:51.041040 4594 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"ceac441b-e510-49a4-aefb-26fad57552d2\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 29 06:39:51 crc kubenswrapper[4594]: I1129 06:39:51.059510 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9g46\" (UniqueName: \"kubernetes.io/projected/ceac441b-e510-49a4-aefb-26fad57552d2-kube-api-access-z9g46\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"ceac441b-e510-49a4-aefb-26fad57552d2\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 29 06:39:51 crc kubenswrapper[4594]: I1129 06:39:51.068930 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"ceac441b-e510-49a4-aefb-26fad57552d2\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 29 06:39:51 crc kubenswrapper[4594]: I1129 06:39:51.143395 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 29 06:39:51 crc kubenswrapper[4594]: I1129 06:39:51.627845 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Nov 29 06:39:52 crc kubenswrapper[4594]: I1129 06:39:52.477780 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"ceac441b-e510-49a4-aefb-26fad57552d2","Type":"ContainerStarted","Data":"1fddaef9cfb22bc84348bbc1507388278b25a9f9fc173d80a9e36742d0092447"} Nov 29 06:39:53 crc kubenswrapper[4594]: I1129 06:39:53.490741 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"ceac441b-e510-49a4-aefb-26fad57552d2","Type":"ContainerStarted","Data":"961a95fd792587d75540af5cf61aa0348a879eaf8fcee0bbadb2fd004bb29472"} Nov 29 06:39:53 crc kubenswrapper[4594]: I1129 06:39:53.507705 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.5336048030000002 podStartE2EDuration="3.507687626s" podCreationTimestamp="2025-11-29 06:39:50 +0000 UTC" firstStartedPulling="2025-11-29 06:39:51.6374238 +0000 UTC m=+4315.877933020" lastFinishedPulling="2025-11-29 06:39:52.611506623 +0000 UTC m=+4316.852015843" observedRunningTime="2025-11-29 06:39:53.502830646 +0000 UTC m=+4317.743339856" watchObservedRunningTime="2025-11-29 06:39:53.507687626 +0000 UTC m=+4317.748196846" Nov 29 06:40:12 crc kubenswrapper[4594]: I1129 06:40:12.387826 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-x47cb/must-gather-2dptg"] Nov 29 06:40:12 crc kubenswrapper[4594]: I1129 06:40:12.391853 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-x47cb/must-gather-2dptg" Nov 29 06:40:12 crc kubenswrapper[4594]: I1129 06:40:12.397592 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-x47cb"/"default-dockercfg-cwndc" Nov 29 06:40:12 crc kubenswrapper[4594]: I1129 06:40:12.397613 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-x47cb"/"openshift-service-ca.crt" Nov 29 06:40:12 crc kubenswrapper[4594]: I1129 06:40:12.397644 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-x47cb"/"kube-root-ca.crt" Nov 29 06:40:12 crc kubenswrapper[4594]: I1129 06:40:12.400223 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-x47cb/must-gather-2dptg"] Nov 29 06:40:12 crc kubenswrapper[4594]: I1129 06:40:12.413856 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhh6c\" (UniqueName: \"kubernetes.io/projected/70da90c5-e357-44f5-8f87-d2b63c1edc68-kube-api-access-rhh6c\") pod \"must-gather-2dptg\" (UID: \"70da90c5-e357-44f5-8f87-d2b63c1edc68\") " pod="openshift-must-gather-x47cb/must-gather-2dptg" Nov 29 06:40:12 crc kubenswrapper[4594]: I1129 06:40:12.413957 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/70da90c5-e357-44f5-8f87-d2b63c1edc68-must-gather-output\") pod \"must-gather-2dptg\" (UID: \"70da90c5-e357-44f5-8f87-d2b63c1edc68\") " pod="openshift-must-gather-x47cb/must-gather-2dptg" Nov 29 06:40:12 crc kubenswrapper[4594]: I1129 06:40:12.516058 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/70da90c5-e357-44f5-8f87-d2b63c1edc68-must-gather-output\") pod \"must-gather-2dptg\" (UID: \"70da90c5-e357-44f5-8f87-d2b63c1edc68\") " 
pod="openshift-must-gather-x47cb/must-gather-2dptg" Nov 29 06:40:12 crc kubenswrapper[4594]: I1129 06:40:12.516242 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhh6c\" (UniqueName: \"kubernetes.io/projected/70da90c5-e357-44f5-8f87-d2b63c1edc68-kube-api-access-rhh6c\") pod \"must-gather-2dptg\" (UID: \"70da90c5-e357-44f5-8f87-d2b63c1edc68\") " pod="openshift-must-gather-x47cb/must-gather-2dptg" Nov 29 06:40:12 crc kubenswrapper[4594]: I1129 06:40:12.517214 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/70da90c5-e357-44f5-8f87-d2b63c1edc68-must-gather-output\") pod \"must-gather-2dptg\" (UID: \"70da90c5-e357-44f5-8f87-d2b63c1edc68\") " pod="openshift-must-gather-x47cb/must-gather-2dptg" Nov 29 06:40:12 crc kubenswrapper[4594]: I1129 06:40:12.536126 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhh6c\" (UniqueName: \"kubernetes.io/projected/70da90c5-e357-44f5-8f87-d2b63c1edc68-kube-api-access-rhh6c\") pod \"must-gather-2dptg\" (UID: \"70da90c5-e357-44f5-8f87-d2b63c1edc68\") " pod="openshift-must-gather-x47cb/must-gather-2dptg" Nov 29 06:40:12 crc kubenswrapper[4594]: I1129 06:40:12.716068 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-x47cb/must-gather-2dptg" Nov 29 06:40:13 crc kubenswrapper[4594]: I1129 06:40:13.170558 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-x47cb/must-gather-2dptg"] Nov 29 06:40:13 crc kubenswrapper[4594]: W1129 06:40:13.171392 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod70da90c5_e357_44f5_8f87_d2b63c1edc68.slice/crio-a491e197c6edc6c7da6dec79192fd3aa7ad271817974b5321f50592c37641e31 WatchSource:0}: Error finding container a491e197c6edc6c7da6dec79192fd3aa7ad271817974b5321f50592c37641e31: Status 404 returned error can't find the container with id a491e197c6edc6c7da6dec79192fd3aa7ad271817974b5321f50592c37641e31 Nov 29 06:40:13 crc kubenswrapper[4594]: I1129 06:40:13.675215 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x47cb/must-gather-2dptg" event={"ID":"70da90c5-e357-44f5-8f87-d2b63c1edc68","Type":"ContainerStarted","Data":"a491e197c6edc6c7da6dec79192fd3aa7ad271817974b5321f50592c37641e31"} Nov 29 06:40:15 crc kubenswrapper[4594]: I1129 06:40:15.799980 4594 patch_prober.go:28] interesting pod/machine-config-daemon-ggz4n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 06:40:15 crc kubenswrapper[4594]: I1129 06:40:15.800411 4594 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 06:40:18 crc kubenswrapper[4594]: I1129 06:40:18.723084 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-must-gather-x47cb/must-gather-2dptg" event={"ID":"70da90c5-e357-44f5-8f87-d2b63c1edc68","Type":"ContainerStarted","Data":"9ff0d7c88661cf1adbb32216d56dfc0de7d8d85592b38f08981d7cef375f2860"} Nov 29 06:40:19 crc kubenswrapper[4594]: I1129 06:40:19.734863 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x47cb/must-gather-2dptg" event={"ID":"70da90c5-e357-44f5-8f87-d2b63c1edc68","Type":"ContainerStarted","Data":"7175a03672eec6b008bf44a0701c9edcb72c9c1be5c592d78b1714423a7d2c88"} Nov 29 06:40:19 crc kubenswrapper[4594]: I1129 06:40:19.756743 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-x47cb/must-gather-2dptg" podStartSLOduration=2.557379408 podStartE2EDuration="7.756725024s" podCreationTimestamp="2025-11-29 06:40:12 +0000 UTC" firstStartedPulling="2025-11-29 06:40:13.172568022 +0000 UTC m=+4337.413077241" lastFinishedPulling="2025-11-29 06:40:18.371913637 +0000 UTC m=+4342.612422857" observedRunningTime="2025-11-29 06:40:19.750882881 +0000 UTC m=+4343.991392101" watchObservedRunningTime="2025-11-29 06:40:19.756725024 +0000 UTC m=+4343.997234244" Nov 29 06:40:21 crc kubenswrapper[4594]: I1129 06:40:21.610536 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-x47cb/crc-debug-fd6km"] Nov 29 06:40:21 crc kubenswrapper[4594]: I1129 06:40:21.612847 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-x47cb/crc-debug-fd6km" Nov 29 06:40:21 crc kubenswrapper[4594]: I1129 06:40:21.633810 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxwkm\" (UniqueName: \"kubernetes.io/projected/fb7fd32e-aebd-4c49-86d1-fc118466a761-kube-api-access-vxwkm\") pod \"crc-debug-fd6km\" (UID: \"fb7fd32e-aebd-4c49-86d1-fc118466a761\") " pod="openshift-must-gather-x47cb/crc-debug-fd6km" Nov 29 06:40:21 crc kubenswrapper[4594]: I1129 06:40:21.634007 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fb7fd32e-aebd-4c49-86d1-fc118466a761-host\") pod \"crc-debug-fd6km\" (UID: \"fb7fd32e-aebd-4c49-86d1-fc118466a761\") " pod="openshift-must-gather-x47cb/crc-debug-fd6km" Nov 29 06:40:21 crc kubenswrapper[4594]: I1129 06:40:21.736940 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxwkm\" (UniqueName: \"kubernetes.io/projected/fb7fd32e-aebd-4c49-86d1-fc118466a761-kube-api-access-vxwkm\") pod \"crc-debug-fd6km\" (UID: \"fb7fd32e-aebd-4c49-86d1-fc118466a761\") " pod="openshift-must-gather-x47cb/crc-debug-fd6km" Nov 29 06:40:21 crc kubenswrapper[4594]: I1129 06:40:21.737033 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fb7fd32e-aebd-4c49-86d1-fc118466a761-host\") pod \"crc-debug-fd6km\" (UID: \"fb7fd32e-aebd-4c49-86d1-fc118466a761\") " pod="openshift-must-gather-x47cb/crc-debug-fd6km" Nov 29 06:40:21 crc kubenswrapper[4594]: I1129 06:40:21.737157 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fb7fd32e-aebd-4c49-86d1-fc118466a761-host\") pod \"crc-debug-fd6km\" (UID: \"fb7fd32e-aebd-4c49-86d1-fc118466a761\") " pod="openshift-must-gather-x47cb/crc-debug-fd6km" Nov 29 06:40:21 crc 
kubenswrapper[4594]: I1129 06:40:21.756151 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxwkm\" (UniqueName: \"kubernetes.io/projected/fb7fd32e-aebd-4c49-86d1-fc118466a761-kube-api-access-vxwkm\") pod \"crc-debug-fd6km\" (UID: \"fb7fd32e-aebd-4c49-86d1-fc118466a761\") " pod="openshift-must-gather-x47cb/crc-debug-fd6km" Nov 29 06:40:21 crc kubenswrapper[4594]: I1129 06:40:21.934357 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-x47cb/crc-debug-fd6km" Nov 29 06:40:22 crc kubenswrapper[4594]: I1129 06:40:22.761181 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x47cb/crc-debug-fd6km" event={"ID":"fb7fd32e-aebd-4c49-86d1-fc118466a761","Type":"ContainerStarted","Data":"54958a1dc455b100fa3eb2cf75858ad71ef1f651538e7d95d9a46d47d6c1bde9"} Nov 29 06:40:31 crc kubenswrapper[4594]: I1129 06:40:31.840002 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x47cb/crc-debug-fd6km" event={"ID":"fb7fd32e-aebd-4c49-86d1-fc118466a761","Type":"ContainerStarted","Data":"d12281137fdfebb1cd0a2f8a0ef86fca8db90a116c76c35fb119bd8816bce66d"} Nov 29 06:40:31 crc kubenswrapper[4594]: I1129 06:40:31.854695 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-x47cb/crc-debug-fd6km" podStartSLOduration=1.798020089 podStartE2EDuration="10.854678342s" podCreationTimestamp="2025-11-29 06:40:21 +0000 UTC" firstStartedPulling="2025-11-29 06:40:21.990838987 +0000 UTC m=+4346.231348207" lastFinishedPulling="2025-11-29 06:40:31.047497241 +0000 UTC m=+4355.288006460" observedRunningTime="2025-11-29 06:40:31.85076171 +0000 UTC m=+4356.091270929" watchObservedRunningTime="2025-11-29 06:40:31.854678342 +0000 UTC m=+4356.095187552" Nov 29 06:40:45 crc kubenswrapper[4594]: I1129 06:40:45.801039 4594 patch_prober.go:28] interesting pod/machine-config-daemon-ggz4n container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 06:40:45 crc kubenswrapper[4594]: I1129 06:40:45.801884 4594 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 06:40:45 crc kubenswrapper[4594]: I1129 06:40:45.801969 4594 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" Nov 29 06:40:45 crc kubenswrapper[4594]: I1129 06:40:45.802802 4594 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1357d3affb462aa55998dd14fc93fd9a1c6bafde00c8c879ab529e678190969a"} pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 29 06:40:45 crc kubenswrapper[4594]: I1129 06:40:45.802870 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" containerName="machine-config-daemon" containerID="cri-o://1357d3affb462aa55998dd14fc93fd9a1c6bafde00c8c879ab529e678190969a" gracePeriod=600 Nov 29 06:40:45 crc kubenswrapper[4594]: I1129 06:40:45.978158 4594 generic.go:334] "Generic (PLEG): container finished" podID="509844d4-3ee1-4059-84bb-6e90200f50c5" containerID="1357d3affb462aa55998dd14fc93fd9a1c6bafde00c8c879ab529e678190969a" exitCode=0 Nov 29 06:40:45 crc kubenswrapper[4594]: I1129 06:40:45.978457 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" event={"ID":"509844d4-3ee1-4059-84bb-6e90200f50c5","Type":"ContainerDied","Data":"1357d3affb462aa55998dd14fc93fd9a1c6bafde00c8c879ab529e678190969a"} Nov 29 06:40:45 crc kubenswrapper[4594]: I1129 06:40:45.978496 4594 scope.go:117] "RemoveContainer" containerID="10fd441ce81e44bfd8f98a7b676e40e142716adff562f75e1eca845e725191d3" Nov 29 06:40:46 crc kubenswrapper[4594]: I1129 06:40:46.989735 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" event={"ID":"509844d4-3ee1-4059-84bb-6e90200f50c5","Type":"ContainerStarted","Data":"ca7803bff64492152d5cd81c155f02681d925d0c3977555e80faed8cfe165f84"} Nov 29 06:41:10 crc kubenswrapper[4594]: E1129 06:41:10.653109 4594 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfb7fd32e_aebd_4c49_86d1_fc118466a761.slice/crio-d12281137fdfebb1cd0a2f8a0ef86fca8db90a116c76c35fb119bd8816bce66d.scope\": RecentStats: unable to find data in memory cache]" Nov 29 06:41:11 crc kubenswrapper[4594]: I1129 06:41:11.200183 4594 generic.go:334] "Generic (PLEG): container finished" podID="fb7fd32e-aebd-4c49-86d1-fc118466a761" containerID="d12281137fdfebb1cd0a2f8a0ef86fca8db90a116c76c35fb119bd8816bce66d" exitCode=0 Nov 29 06:41:11 crc kubenswrapper[4594]: I1129 06:41:11.200273 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x47cb/crc-debug-fd6km" event={"ID":"fb7fd32e-aebd-4c49-86d1-fc118466a761","Type":"ContainerDied","Data":"d12281137fdfebb1cd0a2f8a0ef86fca8db90a116c76c35fb119bd8816bce66d"} Nov 29 06:41:12 crc kubenswrapper[4594]: I1129 06:41:12.300048 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-x47cb/crc-debug-fd6km" Nov 29 06:41:12 crc kubenswrapper[4594]: I1129 06:41:12.337192 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-x47cb/crc-debug-fd6km"] Nov 29 06:41:12 crc kubenswrapper[4594]: I1129 06:41:12.344805 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-x47cb/crc-debug-fd6km"] Nov 29 06:41:12 crc kubenswrapper[4594]: I1129 06:41:12.479988 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fb7fd32e-aebd-4c49-86d1-fc118466a761-host\") pod \"fb7fd32e-aebd-4c49-86d1-fc118466a761\" (UID: \"fb7fd32e-aebd-4c49-86d1-fc118466a761\") " Nov 29 06:41:12 crc kubenswrapper[4594]: I1129 06:41:12.480107 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fb7fd32e-aebd-4c49-86d1-fc118466a761-host" (OuterVolumeSpecName: "host") pod "fb7fd32e-aebd-4c49-86d1-fc118466a761" (UID: "fb7fd32e-aebd-4c49-86d1-fc118466a761"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 06:41:12 crc kubenswrapper[4594]: I1129 06:41:12.480289 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vxwkm\" (UniqueName: \"kubernetes.io/projected/fb7fd32e-aebd-4c49-86d1-fc118466a761-kube-api-access-vxwkm\") pod \"fb7fd32e-aebd-4c49-86d1-fc118466a761\" (UID: \"fb7fd32e-aebd-4c49-86d1-fc118466a761\") " Nov 29 06:41:12 crc kubenswrapper[4594]: I1129 06:41:12.481373 4594 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fb7fd32e-aebd-4c49-86d1-fc118466a761-host\") on node \"crc\" DevicePath \"\"" Nov 29 06:41:12 crc kubenswrapper[4594]: I1129 06:41:12.488742 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb7fd32e-aebd-4c49-86d1-fc118466a761-kube-api-access-vxwkm" (OuterVolumeSpecName: "kube-api-access-vxwkm") pod "fb7fd32e-aebd-4c49-86d1-fc118466a761" (UID: "fb7fd32e-aebd-4c49-86d1-fc118466a761"). InnerVolumeSpecName "kube-api-access-vxwkm". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 06:41:12 crc kubenswrapper[4594]: I1129 06:41:12.584632 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vxwkm\" (UniqueName: \"kubernetes.io/projected/fb7fd32e-aebd-4c49-86d1-fc118466a761-kube-api-access-vxwkm\") on node \"crc\" DevicePath \"\"" Nov 29 06:41:13 crc kubenswrapper[4594]: I1129 06:41:13.218979 4594 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="54958a1dc455b100fa3eb2cf75858ad71ef1f651538e7d95d9a46d47d6c1bde9" Nov 29 06:41:13 crc kubenswrapper[4594]: I1129 06:41:13.219079 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-x47cb/crc-debug-fd6km" Nov 29 06:41:13 crc kubenswrapper[4594]: I1129 06:41:13.531550 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-x47cb/crc-debug-l8nnd"] Nov 29 06:41:13 crc kubenswrapper[4594]: E1129 06:41:13.531902 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb7fd32e-aebd-4c49-86d1-fc118466a761" containerName="container-00" Nov 29 06:41:13 crc kubenswrapper[4594]: I1129 06:41:13.531915 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb7fd32e-aebd-4c49-86d1-fc118466a761" containerName="container-00" Nov 29 06:41:13 crc kubenswrapper[4594]: I1129 06:41:13.532107 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb7fd32e-aebd-4c49-86d1-fc118466a761" containerName="container-00" Nov 29 06:41:13 crc kubenswrapper[4594]: I1129 06:41:13.532766 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-x47cb/crc-debug-l8nnd" Nov 29 06:41:13 crc kubenswrapper[4594]: I1129 06:41:13.706630 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-845m6\" (UniqueName: \"kubernetes.io/projected/0761959a-d227-457b-9b4e-b6874b69133b-kube-api-access-845m6\") pod \"crc-debug-l8nnd\" (UID: \"0761959a-d227-457b-9b4e-b6874b69133b\") " pod="openshift-must-gather-x47cb/crc-debug-l8nnd" Nov 29 06:41:13 crc kubenswrapper[4594]: I1129 06:41:13.706809 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0761959a-d227-457b-9b4e-b6874b69133b-host\") pod \"crc-debug-l8nnd\" (UID: \"0761959a-d227-457b-9b4e-b6874b69133b\") " pod="openshift-must-gather-x47cb/crc-debug-l8nnd" Nov 29 06:41:13 crc kubenswrapper[4594]: I1129 06:41:13.808466 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-845m6\" (UniqueName: 
\"kubernetes.io/projected/0761959a-d227-457b-9b4e-b6874b69133b-kube-api-access-845m6\") pod \"crc-debug-l8nnd\" (UID: \"0761959a-d227-457b-9b4e-b6874b69133b\") " pod="openshift-must-gather-x47cb/crc-debug-l8nnd" Nov 29 06:41:13 crc kubenswrapper[4594]: I1129 06:41:13.808571 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0761959a-d227-457b-9b4e-b6874b69133b-host\") pod \"crc-debug-l8nnd\" (UID: \"0761959a-d227-457b-9b4e-b6874b69133b\") " pod="openshift-must-gather-x47cb/crc-debug-l8nnd" Nov 29 06:41:13 crc kubenswrapper[4594]: I1129 06:41:13.808666 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0761959a-d227-457b-9b4e-b6874b69133b-host\") pod \"crc-debug-l8nnd\" (UID: \"0761959a-d227-457b-9b4e-b6874b69133b\") " pod="openshift-must-gather-x47cb/crc-debug-l8nnd" Nov 29 06:41:13 crc kubenswrapper[4594]: I1129 06:41:13.827202 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-845m6\" (UniqueName: \"kubernetes.io/projected/0761959a-d227-457b-9b4e-b6874b69133b-kube-api-access-845m6\") pod \"crc-debug-l8nnd\" (UID: \"0761959a-d227-457b-9b4e-b6874b69133b\") " pod="openshift-must-gather-x47cb/crc-debug-l8nnd" Nov 29 06:41:13 crc kubenswrapper[4594]: I1129 06:41:13.847658 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-x47cb/crc-debug-l8nnd" Nov 29 06:41:14 crc kubenswrapper[4594]: I1129 06:41:14.096732 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb7fd32e-aebd-4c49-86d1-fc118466a761" path="/var/lib/kubelet/pods/fb7fd32e-aebd-4c49-86d1-fc118466a761/volumes" Nov 29 06:41:14 crc kubenswrapper[4594]: I1129 06:41:14.230413 4594 generic.go:334] "Generic (PLEG): container finished" podID="0761959a-d227-457b-9b4e-b6874b69133b" containerID="fc1d4b6b40c74504d9bb87ad23c8ba1bbfca64e71d687766d7f89bb8dfa7a7d0" exitCode=0 Nov 29 06:41:14 crc kubenswrapper[4594]: I1129 06:41:14.230470 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x47cb/crc-debug-l8nnd" event={"ID":"0761959a-d227-457b-9b4e-b6874b69133b","Type":"ContainerDied","Data":"fc1d4b6b40c74504d9bb87ad23c8ba1bbfca64e71d687766d7f89bb8dfa7a7d0"} Nov 29 06:41:14 crc kubenswrapper[4594]: I1129 06:41:14.230507 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x47cb/crc-debug-l8nnd" event={"ID":"0761959a-d227-457b-9b4e-b6874b69133b","Type":"ContainerStarted","Data":"e552ed58023c95bd0e803b7247089d74cdb3422f2c25f37352589d7c25df73db"} Nov 29 06:41:15 crc kubenswrapper[4594]: I1129 06:41:15.327613 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-x47cb/crc-debug-l8nnd" Nov 29 06:41:15 crc kubenswrapper[4594]: I1129 06:41:15.443025 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0761959a-d227-457b-9b4e-b6874b69133b-host\") pod \"0761959a-d227-457b-9b4e-b6874b69133b\" (UID: \"0761959a-d227-457b-9b4e-b6874b69133b\") " Nov 29 06:41:15 crc kubenswrapper[4594]: I1129 06:41:15.443207 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0761959a-d227-457b-9b4e-b6874b69133b-host" (OuterVolumeSpecName: "host") pod "0761959a-d227-457b-9b4e-b6874b69133b" (UID: "0761959a-d227-457b-9b4e-b6874b69133b"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 06:41:15 crc kubenswrapper[4594]: I1129 06:41:15.443241 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-845m6\" (UniqueName: \"kubernetes.io/projected/0761959a-d227-457b-9b4e-b6874b69133b-kube-api-access-845m6\") pod \"0761959a-d227-457b-9b4e-b6874b69133b\" (UID: \"0761959a-d227-457b-9b4e-b6874b69133b\") " Nov 29 06:41:15 crc kubenswrapper[4594]: I1129 06:41:15.444092 4594 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0761959a-d227-457b-9b4e-b6874b69133b-host\") on node \"crc\" DevicePath \"\"" Nov 29 06:41:15 crc kubenswrapper[4594]: I1129 06:41:15.447470 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0761959a-d227-457b-9b4e-b6874b69133b-kube-api-access-845m6" (OuterVolumeSpecName: "kube-api-access-845m6") pod "0761959a-d227-457b-9b4e-b6874b69133b" (UID: "0761959a-d227-457b-9b4e-b6874b69133b"). InnerVolumeSpecName "kube-api-access-845m6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 06:41:15 crc kubenswrapper[4594]: I1129 06:41:15.545849 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-845m6\" (UniqueName: \"kubernetes.io/projected/0761959a-d227-457b-9b4e-b6874b69133b-kube-api-access-845m6\") on node \"crc\" DevicePath \"\"" Nov 29 06:41:16 crc kubenswrapper[4594]: I1129 06:41:16.248781 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x47cb/crc-debug-l8nnd" event={"ID":"0761959a-d227-457b-9b4e-b6874b69133b","Type":"ContainerDied","Data":"e552ed58023c95bd0e803b7247089d74cdb3422f2c25f37352589d7c25df73db"} Nov 29 06:41:16 crc kubenswrapper[4594]: I1129 06:41:16.248828 4594 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e552ed58023c95bd0e803b7247089d74cdb3422f2c25f37352589d7c25df73db" Nov 29 06:41:16 crc kubenswrapper[4594]: I1129 06:41:16.248825 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-x47cb/crc-debug-l8nnd" Nov 29 06:41:16 crc kubenswrapper[4594]: I1129 06:41:16.301335 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-x47cb/crc-debug-l8nnd"] Nov 29 06:41:16 crc kubenswrapper[4594]: I1129 06:41:16.309214 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-x47cb/crc-debug-l8nnd"] Nov 29 06:41:17 crc kubenswrapper[4594]: I1129 06:41:17.470320 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-x47cb/crc-debug-xg48k"] Nov 29 06:41:17 crc kubenswrapper[4594]: E1129 06:41:17.470809 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0761959a-d227-457b-9b4e-b6874b69133b" containerName="container-00" Nov 29 06:41:17 crc kubenswrapper[4594]: I1129 06:41:17.470823 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="0761959a-d227-457b-9b4e-b6874b69133b" containerName="container-00" Nov 29 06:41:17 crc 
kubenswrapper[4594]: I1129 06:41:17.471046 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="0761959a-d227-457b-9b4e-b6874b69133b" containerName="container-00" Nov 29 06:41:17 crc kubenswrapper[4594]: I1129 06:41:17.471856 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-x47cb/crc-debug-xg48k" Nov 29 06:41:17 crc kubenswrapper[4594]: I1129 06:41:17.589494 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swwgn\" (UniqueName: \"kubernetes.io/projected/f6f3f238-a3d5-498b-ab8f-c7d3fe6aadb2-kube-api-access-swwgn\") pod \"crc-debug-xg48k\" (UID: \"f6f3f238-a3d5-498b-ab8f-c7d3fe6aadb2\") " pod="openshift-must-gather-x47cb/crc-debug-xg48k" Nov 29 06:41:17 crc kubenswrapper[4594]: I1129 06:41:17.589612 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f6f3f238-a3d5-498b-ab8f-c7d3fe6aadb2-host\") pod \"crc-debug-xg48k\" (UID: \"f6f3f238-a3d5-498b-ab8f-c7d3fe6aadb2\") " pod="openshift-must-gather-x47cb/crc-debug-xg48k" Nov 29 06:41:17 crc kubenswrapper[4594]: I1129 06:41:17.694199 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f6f3f238-a3d5-498b-ab8f-c7d3fe6aadb2-host\") pod \"crc-debug-xg48k\" (UID: \"f6f3f238-a3d5-498b-ab8f-c7d3fe6aadb2\") " pod="openshift-must-gather-x47cb/crc-debug-xg48k" Nov 29 06:41:17 crc kubenswrapper[4594]: I1129 06:41:17.694364 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f6f3f238-a3d5-498b-ab8f-c7d3fe6aadb2-host\") pod \"crc-debug-xg48k\" (UID: \"f6f3f238-a3d5-498b-ab8f-c7d3fe6aadb2\") " pod="openshift-must-gather-x47cb/crc-debug-xg48k" Nov 29 06:41:17 crc kubenswrapper[4594]: I1129 06:41:17.694379 4594 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-swwgn\" (UniqueName: \"kubernetes.io/projected/f6f3f238-a3d5-498b-ab8f-c7d3fe6aadb2-kube-api-access-swwgn\") pod \"crc-debug-xg48k\" (UID: \"f6f3f238-a3d5-498b-ab8f-c7d3fe6aadb2\") " pod="openshift-must-gather-x47cb/crc-debug-xg48k" Nov 29 06:41:17 crc kubenswrapper[4594]: I1129 06:41:17.715441 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swwgn\" (UniqueName: \"kubernetes.io/projected/f6f3f238-a3d5-498b-ab8f-c7d3fe6aadb2-kube-api-access-swwgn\") pod \"crc-debug-xg48k\" (UID: \"f6f3f238-a3d5-498b-ab8f-c7d3fe6aadb2\") " pod="openshift-must-gather-x47cb/crc-debug-xg48k" Nov 29 06:41:17 crc kubenswrapper[4594]: I1129 06:41:17.790268 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-x47cb/crc-debug-xg48k" Nov 29 06:41:17 crc kubenswrapper[4594]: W1129 06:41:17.814969 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf6f3f238_a3d5_498b_ab8f_c7d3fe6aadb2.slice/crio-d630cff154dcb3f34bf683cadec9ab0104a2c68b0f598f5ebe6c7b38bf243f43 WatchSource:0}: Error finding container d630cff154dcb3f34bf683cadec9ab0104a2c68b0f598f5ebe6c7b38bf243f43: Status 404 returned error can't find the container with id d630cff154dcb3f34bf683cadec9ab0104a2c68b0f598f5ebe6c7b38bf243f43 Nov 29 06:41:18 crc kubenswrapper[4594]: I1129 06:41:18.092925 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0761959a-d227-457b-9b4e-b6874b69133b" path="/var/lib/kubelet/pods/0761959a-d227-457b-9b4e-b6874b69133b/volumes" Nov 29 06:41:18 crc kubenswrapper[4594]: I1129 06:41:18.265201 4594 generic.go:334] "Generic (PLEG): container finished" podID="f6f3f238-a3d5-498b-ab8f-c7d3fe6aadb2" containerID="caa8504275463770151df39d92eadf2334c64a234aaedb8a029aefb1f802a142" exitCode=0 Nov 29 06:41:18 crc kubenswrapper[4594]: I1129 06:41:18.265277 4594 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x47cb/crc-debug-xg48k" event={"ID":"f6f3f238-a3d5-498b-ab8f-c7d3fe6aadb2","Type":"ContainerDied","Data":"caa8504275463770151df39d92eadf2334c64a234aaedb8a029aefb1f802a142"} Nov 29 06:41:18 crc kubenswrapper[4594]: I1129 06:41:18.265334 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x47cb/crc-debug-xg48k" event={"ID":"f6f3f238-a3d5-498b-ab8f-c7d3fe6aadb2","Type":"ContainerStarted","Data":"d630cff154dcb3f34bf683cadec9ab0104a2c68b0f598f5ebe6c7b38bf243f43"} Nov 29 06:41:18 crc kubenswrapper[4594]: I1129 06:41:18.296481 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-x47cb/crc-debug-xg48k"] Nov 29 06:41:18 crc kubenswrapper[4594]: I1129 06:41:18.303950 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-x47cb/crc-debug-xg48k"] Nov 29 06:41:19 crc kubenswrapper[4594]: I1129 06:41:19.505741 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-x47cb/crc-debug-xg48k" Nov 29 06:41:19 crc kubenswrapper[4594]: I1129 06:41:19.635170 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-swwgn\" (UniqueName: \"kubernetes.io/projected/f6f3f238-a3d5-498b-ab8f-c7d3fe6aadb2-kube-api-access-swwgn\") pod \"f6f3f238-a3d5-498b-ab8f-c7d3fe6aadb2\" (UID: \"f6f3f238-a3d5-498b-ab8f-c7d3fe6aadb2\") " Nov 29 06:41:19 crc kubenswrapper[4594]: I1129 06:41:19.635636 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f6f3f238-a3d5-498b-ab8f-c7d3fe6aadb2-host\") pod \"f6f3f238-a3d5-498b-ab8f-c7d3fe6aadb2\" (UID: \"f6f3f238-a3d5-498b-ab8f-c7d3fe6aadb2\") " Nov 29 06:41:19 crc kubenswrapper[4594]: I1129 06:41:19.635776 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f6f3f238-a3d5-498b-ab8f-c7d3fe6aadb2-host" (OuterVolumeSpecName: "host") pod "f6f3f238-a3d5-498b-ab8f-c7d3fe6aadb2" (UID: "f6f3f238-a3d5-498b-ab8f-c7d3fe6aadb2"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 06:41:19 crc kubenswrapper[4594]: I1129 06:41:19.636569 4594 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f6f3f238-a3d5-498b-ab8f-c7d3fe6aadb2-host\") on node \"crc\" DevicePath \"\"" Nov 29 06:41:19 crc kubenswrapper[4594]: I1129 06:41:19.642662 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6f3f238-a3d5-498b-ab8f-c7d3fe6aadb2-kube-api-access-swwgn" (OuterVolumeSpecName: "kube-api-access-swwgn") pod "f6f3f238-a3d5-498b-ab8f-c7d3fe6aadb2" (UID: "f6f3f238-a3d5-498b-ab8f-c7d3fe6aadb2"). InnerVolumeSpecName "kube-api-access-swwgn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 06:41:19 crc kubenswrapper[4594]: I1129 06:41:19.738561 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-swwgn\" (UniqueName: \"kubernetes.io/projected/f6f3f238-a3d5-498b-ab8f-c7d3fe6aadb2-kube-api-access-swwgn\") on node \"crc\" DevicePath \"\"" Nov 29 06:41:20 crc kubenswrapper[4594]: I1129 06:41:20.093167 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6f3f238-a3d5-498b-ab8f-c7d3fe6aadb2" path="/var/lib/kubelet/pods/f6f3f238-a3d5-498b-ab8f-c7d3fe6aadb2/volumes" Nov 29 06:41:20 crc kubenswrapper[4594]: I1129 06:41:20.281431 4594 scope.go:117] "RemoveContainer" containerID="caa8504275463770151df39d92eadf2334c64a234aaedb8a029aefb1f802a142" Nov 29 06:41:20 crc kubenswrapper[4594]: I1129 06:41:20.281701 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-x47cb/crc-debug-xg48k" Nov 29 06:41:40 crc kubenswrapper[4594]: I1129 06:41:40.588817 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-d8fb9b558-k2gdh_81f68040-3d0b-4f18-85fb-3f29b28c8fbe/barbican-api/0.log" Nov 29 06:41:40 crc kubenswrapper[4594]: I1129 06:41:40.658008 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-d8fb9b558-k2gdh_81f68040-3d0b-4f18-85fb-3f29b28c8fbe/barbican-api-log/0.log" Nov 29 06:41:40 crc kubenswrapper[4594]: I1129 06:41:40.761440 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-c89ff55f4-zl5h6_744bbd71-1ab1-492d-9148-37be600ef9c8/barbican-keystone-listener/0.log" Nov 29 06:41:40 crc kubenswrapper[4594]: I1129 06:41:40.828745 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-c89ff55f4-zl5h6_744bbd71-1ab1-492d-9148-37be600ef9c8/barbican-keystone-listener-log/0.log" Nov 29 06:41:40 crc kubenswrapper[4594]: I1129 06:41:40.912928 4594 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-776755c9f7-9ghn5_9c5ddf8c-3bb2-4d2e-bf45-0535f59a2374/barbican-worker/0.log" Nov 29 06:41:40 crc kubenswrapper[4594]: I1129 06:41:40.966478 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-776755c9f7-9ghn5_9c5ddf8c-3bb2-4d2e-bf45-0535f59a2374/barbican-worker-log/0.log" Nov 29 06:41:41 crc kubenswrapper[4594]: I1129 06:41:41.071491 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-s5mf7_2a01c522-360e-4b2a-8b7e-4e5618fe1541/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Nov 29 06:41:41 crc kubenswrapper[4594]: I1129 06:41:41.184092 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_c118ff62-66e3-4359-9122-ebc78d1a1f3d/ceilometer-central-agent/0.log" Nov 29 06:41:41 crc kubenswrapper[4594]: I1129 06:41:41.264886 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_c118ff62-66e3-4359-9122-ebc78d1a1f3d/proxy-httpd/0.log" Nov 29 06:41:41 crc kubenswrapper[4594]: I1129 06:41:41.270031 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_c118ff62-66e3-4359-9122-ebc78d1a1f3d/ceilometer-notification-agent/0.log" Nov 29 06:41:41 crc kubenswrapper[4594]: I1129 06:41:41.350998 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_c118ff62-66e3-4359-9122-ebc78d1a1f3d/sg-core/0.log" Nov 29 06:41:41 crc kubenswrapper[4594]: I1129 06:41:41.469512 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_d704d9f4-1a8a-4cc8-af37-371bcc9b254b/cinder-api-log/0.log" Nov 29 06:41:41 crc kubenswrapper[4594]: I1129 06:41:41.648190 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_d704d9f4-1a8a-4cc8-af37-371bcc9b254b/cinder-api/0.log" Nov 29 06:41:41 crc kubenswrapper[4594]: I1129 06:41:41.651757 4594 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_a816de0b-732c-46f3-ba52-2a7630623d5b/cinder-scheduler/0.log" Nov 29 06:41:41 crc kubenswrapper[4594]: I1129 06:41:41.692850 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_a816de0b-732c-46f3-ba52-2a7630623d5b/probe/0.log" Nov 29 06:41:41 crc kubenswrapper[4594]: I1129 06:41:41.803695 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-rfc6c_e4ac87df-2d62-4571-a38a-a9cd25537685/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Nov 29 06:41:41 crc kubenswrapper[4594]: I1129 06:41:41.892394 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-srf6x_12960501-d688-4937-b0b3-048b780072d3/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Nov 29 06:41:41 crc kubenswrapper[4594]: I1129 06:41:41.988565 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-77b58f4b85-bgkdt_23dcced9-156e-4d68-82c3-43b9b2a0d9be/init/0.log" Nov 29 06:41:42 crc kubenswrapper[4594]: I1129 06:41:42.125411 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-77b58f4b85-bgkdt_23dcced9-156e-4d68-82c3-43b9b2a0d9be/init/0.log" Nov 29 06:41:42 crc kubenswrapper[4594]: I1129 06:41:42.256561 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-8fq52_6345164d-91bd-47df-b5a6-71f9940c0f15/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Nov 29 06:41:42 crc kubenswrapper[4594]: I1129 06:41:42.290057 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-77b58f4b85-bgkdt_23dcced9-156e-4d68-82c3-43b9b2a0d9be/dnsmasq-dns/0.log" Nov 29 06:41:42 crc kubenswrapper[4594]: I1129 06:41:42.422693 4594 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-external-api-0_bfda0b74-99d7-4176-89f9-71d8385ddc6f/glance-log/0.log" Nov 29 06:41:42 crc kubenswrapper[4594]: I1129 06:41:42.444160 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_bfda0b74-99d7-4176-89f9-71d8385ddc6f/glance-httpd/0.log" Nov 29 06:41:42 crc kubenswrapper[4594]: I1129 06:41:42.574557 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_9f4133eb-5349-4bad-a993-d4e880a2f1be/glance-httpd/0.log" Nov 29 06:41:42 crc kubenswrapper[4594]: I1129 06:41:42.609854 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_9f4133eb-5349-4bad-a993-d4e880a2f1be/glance-log/0.log" Nov 29 06:41:42 crc kubenswrapper[4594]: I1129 06:41:42.849015 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-558d4b85cb-k5j98_19dfde1f-d770-45ec-8735-78549b8fcb90/horizon/0.log" Nov 29 06:41:42 crc kubenswrapper[4594]: I1129 06:41:42.856950 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-tclxx_9dafdf20-2acb-46ad-adb3-d1421087ca5e/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Nov 29 06:41:43 crc kubenswrapper[4594]: I1129 06:41:43.135167 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-qthxb_692e39b7-fc9f-4770-847e-ff968ddf1ad8/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Nov 29 06:41:43 crc kubenswrapper[4594]: I1129 06:41:43.213666 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-558d4b85cb-k5j98_19dfde1f-d770-45ec-8735-78549b8fcb90/horizon-log/0.log" Nov 29 06:41:43 crc kubenswrapper[4594]: I1129 06:41:43.360079 4594 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_keystone-cron-29406601-5wlh9_45955c1f-8326-47e3-ba4b-3c6ea134e496/keystone-cron/0.log" Nov 29 06:41:43 crc kubenswrapper[4594]: I1129 06:41:43.567369 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_6a4026a4-228e-4aa5-be23-c9b7e203c011/kube-state-metrics/0.log" Nov 29 06:41:43 crc kubenswrapper[4594]: I1129 06:41:43.637574 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-87b885ff4-zwt2r_2e56e5f5-25c3-4bbb-a9ca-47aec5d22564/keystone-api/0.log" Nov 29 06:41:43 crc kubenswrapper[4594]: I1129 06:41:43.756817 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-wnzvc_ec93e11c-8754-4e8c-8e75-d563fb7cef1f/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Nov 29 06:41:44 crc kubenswrapper[4594]: I1129 06:41:44.095817 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5c7df66d69-hd8nh_23f6e7de-b25b-4522-8368-cd17f44dc109/neutron-httpd/0.log" Nov 29 06:41:44 crc kubenswrapper[4594]: I1129 06:41:44.121635 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5c7df66d69-hd8nh_23f6e7de-b25b-4522-8368-cd17f44dc109/neutron-api/0.log" Nov 29 06:41:44 crc kubenswrapper[4594]: I1129 06:41:44.275570 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-m7rxx_a7f3ca11-a3f9-4bd2-a31f-dba6fec2138b/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Nov 29 06:41:44 crc kubenswrapper[4594]: I1129 06:41:44.906677 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_81a5ed74-39d5-4b41-8083-04c1a6f6f119/nova-cell0-conductor-conductor/0.log" Nov 29 06:41:45 crc kubenswrapper[4594]: I1129 06:41:45.093571 4594 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell1-conductor-0_f69f0fb6-b307-4c01-a90e-edf23e3858e1/nova-cell1-conductor-conductor/0.log" Nov 29 06:41:45 crc kubenswrapper[4594]: I1129 06:41:45.449262 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_9cbd6039-37fe-4ad5-9149-441d6e5d1812/nova-cell1-novncproxy-novncproxy/0.log" Nov 29 06:41:45 crc kubenswrapper[4594]: I1129 06:41:45.641045 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-txzng_e5b69862-fd4c-4f01-977a-3d7f9bcce932/nova-edpm-deployment-openstack-edpm-ipam/0.log" Nov 29 06:41:45 crc kubenswrapper[4594]: I1129 06:41:45.799491 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_76860e6e-cd4c-4f9e-ad65-f1cec5dfe17e/nova-api-log/0.log" Nov 29 06:41:45 crc kubenswrapper[4594]: I1129 06:41:45.972469 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_7c168bec-2ad5-431a-ad8e-ef04de7635b4/nova-metadata-log/0.log" Nov 29 06:41:45 crc kubenswrapper[4594]: I1129 06:41:45.994913 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_76860e6e-cd4c-4f9e-ad65-f1cec5dfe17e/nova-api-api/0.log" Nov 29 06:41:46 crc kubenswrapper[4594]: I1129 06:41:46.220806 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_1ab0ecb8-fc35-4934-b62a-6912d56e9001/mysql-bootstrap/0.log" Nov 29 06:41:46 crc kubenswrapper[4594]: I1129 06:41:46.423509 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_ae32608f-19a0-4825-8fab-36c89e217b50/nova-scheduler-scheduler/0.log" Nov 29 06:41:46 crc kubenswrapper[4594]: I1129 06:41:46.859492 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_1ab0ecb8-fc35-4934-b62a-6912d56e9001/mysql-bootstrap/0.log" Nov 29 06:41:46 crc kubenswrapper[4594]: I1129 06:41:46.921421 4594 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_1ab0ecb8-fc35-4934-b62a-6912d56e9001/galera/0.log" Nov 29 06:41:47 crc kubenswrapper[4594]: I1129 06:41:47.050645 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_caf7dae9-7fe9-4cf5-a5d0-39122397592e/mysql-bootstrap/0.log" Nov 29 06:41:47 crc kubenswrapper[4594]: I1129 06:41:47.273997 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_caf7dae9-7fe9-4cf5-a5d0-39122397592e/mysql-bootstrap/0.log" Nov 29 06:41:47 crc kubenswrapper[4594]: I1129 06:41:47.289567 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_caf7dae9-7fe9-4cf5-a5d0-39122397592e/galera/0.log" Nov 29 06:41:47 crc kubenswrapper[4594]: I1129 06:41:47.490221 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_3629612b-cfc5-42bc-8584-4abc21ce4b3f/openstackclient/0.log" Nov 29 06:41:47 crc kubenswrapper[4594]: I1129 06:41:47.564410 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_7c168bec-2ad5-431a-ad8e-ef04de7635b4/nova-metadata-metadata/0.log" Nov 29 06:41:47 crc kubenswrapper[4594]: I1129 06:41:47.711924 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-b9h9k_a7baec31-60ce-4be4-8901-a8cbe7bf7ea9/ovn-controller/0.log" Nov 29 06:41:47 crc kubenswrapper[4594]: I1129 06:41:47.761555 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-qlxwj_a2d82ed9-466f-47e4-973d-0e88270f1021/openstack-network-exporter/0.log" Nov 29 06:41:47 crc kubenswrapper[4594]: I1129 06:41:47.943569 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-zvkw9_0b167600-fe30-4499-876c-57685a803c45/ovsdb-server-init/0.log" Nov 29 06:41:48 crc kubenswrapper[4594]: I1129 06:41:48.073405 4594 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_ovn-controller-ovs-zvkw9_0b167600-fe30-4499-876c-57685a803c45/ovsdb-server-init/0.log" Nov 29 06:41:48 crc kubenswrapper[4594]: I1129 06:41:48.076676 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-zvkw9_0b167600-fe30-4499-876c-57685a803c45/ovsdb-server/0.log" Nov 29 06:41:48 crc kubenswrapper[4594]: I1129 06:41:48.381857 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-zvkw9_0b167600-fe30-4499-876c-57685a803c45/ovs-vswitchd/0.log" Nov 29 06:41:48 crc kubenswrapper[4594]: I1129 06:41:48.802644 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-rgrwm_3cd5d33f-c80d-49d9-97b7-26dc98be7fa7/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Nov 29 06:41:48 crc kubenswrapper[4594]: I1129 06:41:48.880107 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_21284ba8-9492-4f6e-84c8-88d3844f386b/openstack-network-exporter/0.log" Nov 29 06:41:49 crc kubenswrapper[4594]: I1129 06:41:49.061584 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_21284ba8-9492-4f6e-84c8-88d3844f386b/ovn-northd/0.log" Nov 29 06:41:49 crc kubenswrapper[4594]: I1129 06:41:49.073838 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_31fad155-3970-4a5d-a357-e96fa27bbb54/openstack-network-exporter/0.log" Nov 29 06:41:49 crc kubenswrapper[4594]: I1129 06:41:49.099935 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_31fad155-3970-4a5d-a357-e96fa27bbb54/ovsdbserver-nb/0.log" Nov 29 06:41:49 crc kubenswrapper[4594]: I1129 06:41:49.247792 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_4f5075e0-210c-455f-8203-3dde7c7be5eb/openstack-network-exporter/0.log" Nov 29 06:41:49 crc kubenswrapper[4594]: I1129 06:41:49.278972 4594 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_4f5075e0-210c-455f-8203-3dde7c7be5eb/ovsdbserver-sb/0.log" Nov 29 06:41:49 crc kubenswrapper[4594]: I1129 06:41:49.556048 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_83de9c5a-c56c-4433-9a32-48972cdd1b46/init-config-reloader/0.log" Nov 29 06:41:49 crc kubenswrapper[4594]: I1129 06:41:49.603988 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-55c5775b88-9dvcb_95ecbb59-1c3f-4561-a175-ffbd99d0496f/placement-api/0.log" Nov 29 06:41:49 crc kubenswrapper[4594]: I1129 06:41:49.703349 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-55c5775b88-9dvcb_95ecbb59-1c3f-4561-a175-ffbd99d0496f/placement-log/0.log" Nov 29 06:41:49 crc kubenswrapper[4594]: I1129 06:41:49.706052 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_83de9c5a-c56c-4433-9a32-48972cdd1b46/init-config-reloader/0.log" Nov 29 06:41:49 crc kubenswrapper[4594]: I1129 06:41:49.745342 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_83de9c5a-c56c-4433-9a32-48972cdd1b46/config-reloader/0.log" Nov 29 06:41:49 crc kubenswrapper[4594]: I1129 06:41:49.796885 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_83de9c5a-c56c-4433-9a32-48972cdd1b46/prometheus/0.log" Nov 29 06:41:49 crc kubenswrapper[4594]: I1129 06:41:49.910672 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_83de9c5a-c56c-4433-9a32-48972cdd1b46/thanos-sidecar/0.log" Nov 29 06:41:49 crc kubenswrapper[4594]: I1129 06:41:49.963314 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_3dcf9a3e-9869-4630-a695-c180db93aca7/setup-container/0.log" Nov 29 06:41:50 crc kubenswrapper[4594]: I1129 06:41:50.149964 4594 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_3dcf9a3e-9869-4630-a695-c180db93aca7/rabbitmq/0.log" Nov 29 06:41:50 crc kubenswrapper[4594]: I1129 06:41:50.216278 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_3dcf9a3e-9869-4630-a695-c180db93aca7/setup-container/0.log" Nov 29 06:41:50 crc kubenswrapper[4594]: I1129 06:41:50.220638 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_9667e68c-f715-4663-bddb-53c53d3a593d/setup-container/0.log" Nov 29 06:41:50 crc kubenswrapper[4594]: I1129 06:41:50.371851 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_9667e68c-f715-4663-bddb-53c53d3a593d/setup-container/0.log" Nov 29 06:41:50 crc kubenswrapper[4594]: I1129 06:41:50.423682 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_9667e68c-f715-4663-bddb-53c53d3a593d/rabbitmq/0.log" Nov 29 06:41:50 crc kubenswrapper[4594]: I1129 06:41:50.457361 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_bb17ce90-d0e2-4a46-905b-e27bff2295fb/setup-container/0.log" Nov 29 06:41:50 crc kubenswrapper[4594]: I1129 06:41:50.639072 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_bb17ce90-d0e2-4a46-905b-e27bff2295fb/setup-container/0.log" Nov 29 06:41:50 crc kubenswrapper[4594]: I1129 06:41:50.687581 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-dknvp_41f4ce3b-5711-4b51-a22d-5fcbda6153ac/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Nov 29 06:41:50 crc kubenswrapper[4594]: I1129 06:41:50.706201 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_bb17ce90-d0e2-4a46-905b-e27bff2295fb/rabbitmq/0.log" Nov 29 06:41:50 crc 
kubenswrapper[4594]: I1129 06:41:50.880451 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-7cxvz_db8e580b-8fbe-4e91-bb94-023bf1b2903b/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Nov 29 06:41:50 crc kubenswrapper[4594]: I1129 06:41:50.908580 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-s7gxb_b16b2100-7eea-43dd-8b1c-f2c337bdb3bd/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Nov 29 06:41:51 crc kubenswrapper[4594]: I1129 06:41:51.119624 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-kxwqm_795787e2-6c07-4e76-98ae-38a13aae294a/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Nov 29 06:41:51 crc kubenswrapper[4594]: I1129 06:41:51.181937 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-6gzlk_fad852f8-f0f3-4fa6-9196-58c24259a3a6/ssh-known-hosts-edpm-deployment/0.log" Nov 29 06:41:51 crc kubenswrapper[4594]: I1129 06:41:51.354572 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7d689f55f9-c4bt7_bb181390-82bf-4bd9-9063-9272988db515/proxy-server/0.log" Nov 29 06:41:51 crc kubenswrapper[4594]: I1129 06:41:51.492013 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-wtghb_f63cf943-e9c0-4f70-9f7b-8ecf859c92ae/swift-ring-rebalance/0.log" Nov 29 06:41:51 crc kubenswrapper[4594]: I1129 06:41:51.495557 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7d689f55f9-c4bt7_bb181390-82bf-4bd9-9063-9272988db515/proxy-httpd/0.log" Nov 29 06:41:51 crc kubenswrapper[4594]: I1129 06:41:51.626740 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_15a27bc9-a74a-4123-b693-baf16a0ed04d/account-auditor/0.log" Nov 29 06:41:51 crc kubenswrapper[4594]: I1129 
06:41:51.710800 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_15a27bc9-a74a-4123-b693-baf16a0ed04d/account-reaper/0.log" Nov 29 06:41:51 crc kubenswrapper[4594]: I1129 06:41:51.759552 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_15a27bc9-a74a-4123-b693-baf16a0ed04d/account-replicator/0.log" Nov 29 06:41:51 crc kubenswrapper[4594]: I1129 06:41:51.795481 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_15a27bc9-a74a-4123-b693-baf16a0ed04d/account-server/0.log" Nov 29 06:41:51 crc kubenswrapper[4594]: I1129 06:41:51.842723 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_15a27bc9-a74a-4123-b693-baf16a0ed04d/container-auditor/0.log" Nov 29 06:41:51 crc kubenswrapper[4594]: I1129 06:41:51.975704 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_15a27bc9-a74a-4123-b693-baf16a0ed04d/container-replicator/0.log" Nov 29 06:41:51 crc kubenswrapper[4594]: I1129 06:41:51.980399 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_15a27bc9-a74a-4123-b693-baf16a0ed04d/container-server/0.log" Nov 29 06:41:52 crc kubenswrapper[4594]: I1129 06:41:52.009398 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_15a27bc9-a74a-4123-b693-baf16a0ed04d/container-updater/0.log" Nov 29 06:41:52 crc kubenswrapper[4594]: I1129 06:41:52.034112 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_15a27bc9-a74a-4123-b693-baf16a0ed04d/object-auditor/0.log" Nov 29 06:41:52 crc kubenswrapper[4594]: I1129 06:41:52.144885 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_15a27bc9-a74a-4123-b693-baf16a0ed04d/object-expirer/0.log" Nov 29 06:41:52 crc kubenswrapper[4594]: I1129 06:41:52.173602 4594 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_15a27bc9-a74a-4123-b693-baf16a0ed04d/object-replicator/0.log" Nov 29 06:41:52 crc kubenswrapper[4594]: I1129 06:41:52.196733 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_15a27bc9-a74a-4123-b693-baf16a0ed04d/object-server/0.log" Nov 29 06:41:52 crc kubenswrapper[4594]: I1129 06:41:52.266325 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_15a27bc9-a74a-4123-b693-baf16a0ed04d/object-updater/0.log" Nov 29 06:41:52 crc kubenswrapper[4594]: I1129 06:41:52.375650 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_15a27bc9-a74a-4123-b693-baf16a0ed04d/rsync/0.log" Nov 29 06:41:52 crc kubenswrapper[4594]: I1129 06:41:52.389067 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_15a27bc9-a74a-4123-b693-baf16a0ed04d/swift-recon-cron/0.log" Nov 29 06:41:52 crc kubenswrapper[4594]: I1129 06:41:52.581744 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-2b9zg_2e564f32-c761-4816-9715-7636294bd4c4/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Nov 29 06:41:52 crc kubenswrapper[4594]: I1129 06:41:52.671674 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad/tempest-tests-tempest-tests-runner/0.log" Nov 29 06:41:52 crc kubenswrapper[4594]: I1129 06:41:52.773850 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_ceac441b-e510-49a4-aefb-26fad57552d2/test-operator-logs-container/0.log" Nov 29 06:41:52 crc kubenswrapper[4594]: I1129 06:41:52.878232 4594 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-ntkjh_627ba28b-319a-4072-bd92-d9b1a9e77283/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Nov 29 06:41:53 crc kubenswrapper[4594]: I1129 06:41:53.712876 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-applier-0_cfbd6de3-fc8d-4d93-a76d-fd2b8a196167/watcher-applier/0.log" Nov 29 06:41:54 crc kubenswrapper[4594]: I1129 06:41:54.448171 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_4082da7a-cc96-4b67-a101-48600e49712b/watcher-api-log/0.log" Nov 29 06:41:56 crc kubenswrapper[4594]: I1129 06:41:56.749210 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-decision-engine-0_716cbd33-cb95-4be2-a9c9-98c742ee4e17/watcher-decision-engine/0.log" Nov 29 06:41:57 crc kubenswrapper[4594]: I1129 06:41:57.183850 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_9952b1d2-2cce-45f4-b370-5d0107f80260/memcached/0.log" Nov 29 06:41:57 crc kubenswrapper[4594]: I1129 06:41:57.298246 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_4082da7a-cc96-4b67-a101-48600e49712b/watcher-api/0.log" Nov 29 06:42:18 crc kubenswrapper[4594]: I1129 06:42:18.577833 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_34453e0ae93d07abc4f6e497f8998de77c1bdd8f20510be6b58912cf3b75mmm_605f6125-2bfd-43a1-b01b-4ea4f391b981/util/0.log" Nov 29 06:42:19 crc kubenswrapper[4594]: I1129 06:42:19.239439 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_34453e0ae93d07abc4f6e497f8998de77c1bdd8f20510be6b58912cf3b75mmm_605f6125-2bfd-43a1-b01b-4ea4f391b981/pull/0.log" Nov 29 06:42:19 crc kubenswrapper[4594]: I1129 06:42:19.248224 4594 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_34453e0ae93d07abc4f6e497f8998de77c1bdd8f20510be6b58912cf3b75mmm_605f6125-2bfd-43a1-b01b-4ea4f391b981/pull/0.log" Nov 29 06:42:19 crc kubenswrapper[4594]: I1129 06:42:19.273251 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_34453e0ae93d07abc4f6e497f8998de77c1bdd8f20510be6b58912cf3b75mmm_605f6125-2bfd-43a1-b01b-4ea4f391b981/util/0.log" Nov 29 06:42:19 crc kubenswrapper[4594]: I1129 06:42:19.428341 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_34453e0ae93d07abc4f6e497f8998de77c1bdd8f20510be6b58912cf3b75mmm_605f6125-2bfd-43a1-b01b-4ea4f391b981/pull/0.log" Nov 29 06:42:19 crc kubenswrapper[4594]: I1129 06:42:19.438050 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_34453e0ae93d07abc4f6e497f8998de77c1bdd8f20510be6b58912cf3b75mmm_605f6125-2bfd-43a1-b01b-4ea4f391b981/util/0.log" Nov 29 06:42:19 crc kubenswrapper[4594]: I1129 06:42:19.456842 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_34453e0ae93d07abc4f6e497f8998de77c1bdd8f20510be6b58912cf3b75mmm_605f6125-2bfd-43a1-b01b-4ea4f391b981/extract/0.log" Nov 29 06:42:19 crc kubenswrapper[4594]: I1129 06:42:19.645771 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-jc4r5_8c96e02a-d3bd-4904-8ade-baecb4c3a280/kube-rbac-proxy/0.log" Nov 29 06:42:19 crc kubenswrapper[4594]: I1129 06:42:19.655180 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-jc4r5_8c96e02a-d3bd-4904-8ade-baecb4c3a280/manager/0.log" Nov 29 06:42:19 crc kubenswrapper[4594]: I1129 06:42:19.658314 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-xlmc8_f50eb91c-7f6c-4d0f-b32a-c9ead1766b9c/kube-rbac-proxy/0.log" Nov 29 06:42:19 crc 
kubenswrapper[4594]: I1129 06:42:19.798276 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-xlmc8_f50eb91c-7f6c-4d0f-b32a-c9ead1766b9c/manager/0.log" Nov 29 06:42:19 crc kubenswrapper[4594]: I1129 06:42:19.823911 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-g6nl9_95648e23-46bf-4160-9527-7ad1c84f9883/manager/0.log" Nov 29 06:42:19 crc kubenswrapper[4594]: I1129 06:42:19.834445 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-g6nl9_95648e23-46bf-4160-9527-7ad1c84f9883/kube-rbac-proxy/0.log" Nov 29 06:42:20 crc kubenswrapper[4594]: I1129 06:42:20.009636 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-668d9c48b9-s254f_e564c6c7-7145-411f-b48f-d8e2594c34a5/kube-rbac-proxy/0.log" Nov 29 06:42:20 crc kubenswrapper[4594]: I1129 06:42:20.047158 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-668d9c48b9-s254f_e564c6c7-7145-411f-b48f-d8e2594c34a5/manager/0.log" Nov 29 06:42:20 crc kubenswrapper[4594]: I1129 06:42:20.145166 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-mcfvv_01841a60-a638-4a78-84d4-01ad474bf2fb/kube-rbac-proxy/0.log" Nov 29 06:42:20 crc kubenswrapper[4594]: I1129 06:42:20.176397 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-mcfvv_01841a60-a638-4a78-84d4-01ad474bf2fb/manager/0.log" Nov 29 06:42:20 crc kubenswrapper[4594]: I1129 06:42:20.229620 4594 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-z9q2v_09d8a0a7-cc55-4654-8e59-a769c806eecf/kube-rbac-proxy/0.log" Nov 29 06:42:20 crc kubenswrapper[4594]: I1129 06:42:20.315026 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-z9q2v_09d8a0a7-cc55-4654-8e59-a769c806eecf/manager/0.log" Nov 29 06:42:20 crc kubenswrapper[4594]: I1129 06:42:20.379128 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-6c7gr_0ee70d2e-b283-468b-8bd8-016a120b5ae8/kube-rbac-proxy/0.log" Nov 29 06:42:20 crc kubenswrapper[4594]: I1129 06:42:20.546797 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-qgzzn_fb42365c-18e1-4456-ae16-be77a16f102c/manager/0.log" Nov 29 06:42:20 crc kubenswrapper[4594]: I1129 06:42:20.557282 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-qgzzn_fb42365c-18e1-4456-ae16-be77a16f102c/kube-rbac-proxy/0.log" Nov 29 06:42:20 crc kubenswrapper[4594]: I1129 06:42:20.591580 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-6c7gr_0ee70d2e-b283-468b-8bd8-016a120b5ae8/manager/0.log" Nov 29 06:42:20 crc kubenswrapper[4594]: I1129 06:42:20.710385 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-546d4bdf48-ppdt9_fbdf482e-3aa0-4f5c-a698-949ad6cb6992/kube-rbac-proxy/0.log" Nov 29 06:42:20 crc kubenswrapper[4594]: I1129 06:42:20.812590 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-546d4bdf48-ppdt9_fbdf482e-3aa0-4f5c-a698-949ad6cb6992/manager/0.log" Nov 29 06:42:20 crc kubenswrapper[4594]: I1129 06:42:20.890217 
4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6546668bfd-5bng6_61fc53be-18c3-48bb-9a4a-2557df78afc7/manager/0.log" Nov 29 06:42:20 crc kubenswrapper[4594]: I1129 06:42:20.895292 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6546668bfd-5bng6_61fc53be-18c3-48bb-9a4a-2557df78afc7/kube-rbac-proxy/0.log" Nov 29 06:42:21 crc kubenswrapper[4594]: I1129 06:42:21.023012 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-48tfk_a1b0453c-c84d-45ea-90be-7f01a831f987/kube-rbac-proxy/0.log" Nov 29 06:42:21 crc kubenswrapper[4594]: I1129 06:42:21.116742 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-48tfk_a1b0453c-c84d-45ea-90be-7f01a831f987/manager/0.log" Nov 29 06:42:21 crc kubenswrapper[4594]: I1129 06:42:21.150490 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-d84hx_2de2c23f-39bd-4a9d-9965-7fe280b61707/kube-rbac-proxy/0.log" Nov 29 06:42:21 crc kubenswrapper[4594]: I1129 06:42:21.239064 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-d84hx_2de2c23f-39bd-4a9d-9965-7fe280b61707/manager/0.log" Nov 29 06:42:21 crc kubenswrapper[4594]: I1129 06:42:21.275276 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-7fmfc_7d8912e8-8f81-4ea6-94c2-7e56c7726e58/kube-rbac-proxy/0.log" Nov 29 06:42:21 crc kubenswrapper[4594]: I1129 06:42:21.368798 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-7fmfc_7d8912e8-8f81-4ea6-94c2-7e56c7726e58/manager/0.log" Nov 29 06:42:21 crc 
kubenswrapper[4594]: I1129 06:42:21.434830 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-245g8_f9ab855a-f938-4ad6-941a-52f4e5b7d4b2/kube-rbac-proxy/0.log" Nov 29 06:42:21 crc kubenswrapper[4594]: I1129 06:42:21.442941 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-245g8_f9ab855a-f938-4ad6-941a-52f4e5b7d4b2/manager/0.log" Nov 29 06:42:21 crc kubenswrapper[4594]: I1129 06:42:21.532176 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6698bcb446zrvcn_82d8e084-bca0-43f0-9d6f-63df84cd28a6/kube-rbac-proxy/0.log" Nov 29 06:42:21 crc kubenswrapper[4594]: I1129 06:42:21.600009 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6698bcb446zrvcn_82d8e084-bca0-43f0-9d6f-63df84cd28a6/manager/0.log" Nov 29 06:42:21 crc kubenswrapper[4594]: I1129 06:42:21.864493 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-6ddddd9d6f-7q4j2_880b9d6a-5dc6-448b-a63c-b098fcc54023/operator/0.log" Nov 29 06:42:21 crc kubenswrapper[4594]: I1129 06:42:21.950333 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-zmbtt_9d8ef423-1563-4fda-92d3-dcbd15f10b13/registry-server/0.log" Nov 29 06:42:22 crc kubenswrapper[4594]: I1129 06:42:22.020534 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-pclb6_460651bc-3f62-4bc6-ab53-e791ea16993e/kube-rbac-proxy/0.log" Nov 29 06:42:22 crc kubenswrapper[4594]: I1129 06:42:22.176663 4594 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-7n6mf_5cde207a-7c2e-46d9-809a-72b8749560a6/kube-rbac-proxy/0.log" Nov 29 06:42:22 crc kubenswrapper[4594]: I1129 06:42:22.186985 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-pclb6_460651bc-3f62-4bc6-ab53-e791ea16993e/manager/0.log" Nov 29 06:42:22 crc kubenswrapper[4594]: I1129 06:42:22.255751 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-7n6mf_5cde207a-7c2e-46d9-809a-72b8749560a6/manager/0.log" Nov 29 06:42:22 crc kubenswrapper[4594]: I1129 06:42:22.387022 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-4mk5r_3fdff03c-acc7-4274-bb06-83abd0f7b432/operator/0.log" Nov 29 06:42:22 crc kubenswrapper[4594]: I1129 06:42:22.485534 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-dd8x6_d8cfbdeb-cd3d-425e-a96c-d9a565c840c3/kube-rbac-proxy/0.log" Nov 29 06:42:22 crc kubenswrapper[4594]: I1129 06:42:22.600959 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-dd8x6_d8cfbdeb-cd3d-425e-a96c-d9a565c840c3/manager/0.log" Nov 29 06:42:22 crc kubenswrapper[4594]: I1129 06:42:22.675068 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-zccgx_b818ec3c-b3d3-4c0b-b2d0-dc75c1e0a4c0/kube-rbac-proxy/0.log" Nov 29 06:42:22 crc kubenswrapper[4594]: I1129 06:42:22.791542 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-r28h2_9e1ee1b6-8684-4fcb-a26c-fd85a950abcc/kube-rbac-proxy/0.log" Nov 29 06:42:22 crc kubenswrapper[4594]: I1129 06:42:22.893397 
4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-r28h2_9e1ee1b6-8684-4fcb-a26c-fd85a950abcc/manager/0.log" Nov 29 06:42:22 crc kubenswrapper[4594]: I1129 06:42:22.904484 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-zccgx_b818ec3c-b3d3-4c0b-b2d0-dc75c1e0a4c0/manager/0.log" Nov 29 06:42:23 crc kubenswrapper[4594]: I1129 06:42:23.033213 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-6gxvv_8a5e35f1-00df-4307-b197-f7800c641af7/kube-rbac-proxy/0.log" Nov 29 06:42:23 crc kubenswrapper[4594]: I1129 06:42:23.051421 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-656fd97d56-fcmzw_505c6e79-1776-4995-a6b5-5888f75c141c/manager/0.log" Nov 29 06:42:23 crc kubenswrapper[4594]: I1129 06:42:23.162556 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-6gxvv_8a5e35f1-00df-4307-b197-f7800c641af7/manager/0.log" Nov 29 06:42:41 crc kubenswrapper[4594]: I1129 06:42:41.030662 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-gspdt_1fea2129-7ad0-45d8-9447-315107ef1c0c/control-plane-machine-set-operator/0.log" Nov 29 06:42:41 crc kubenswrapper[4594]: I1129 06:42:41.180377 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-f9dqp_09980e91-b2e4-4a0e-bee7-dc101096f804/kube-rbac-proxy/0.log" Nov 29 06:42:41 crc kubenswrapper[4594]: I1129 06:42:41.233074 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-f9dqp_09980e91-b2e4-4a0e-bee7-dc101096f804/machine-api-operator/0.log" Nov 29 06:42:53 
crc kubenswrapper[4594]: I1129 06:42:53.023775 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-4z2w6_edf7ce03-4be9-42d1-8a58-e0f132c43299/cert-manager-controller/0.log" Nov 29 06:42:53 crc kubenswrapper[4594]: I1129 06:42:53.240633 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-4srms_f288eb1f-58a0-4e9b-9b63-ba15bedb38ec/cert-manager-webhook/0.log" Nov 29 06:42:53 crc kubenswrapper[4594]: I1129 06:42:53.243413 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-btrfm_f4838603-21ba-451a-a9d4-d5415bc4b52a/cert-manager-cainjector/0.log" Nov 29 06:43:05 crc kubenswrapper[4594]: I1129 06:43:05.528367 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-kj52s_060d894f-3918-4f8c-8b70-33d7e18b316d/nmstate-console-plugin/0.log" Nov 29 06:43:05 crc kubenswrapper[4594]: I1129 06:43:05.623382 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-szl2t_92357f02-ea29-48b6-b763-f6e1b8ca3457/nmstate-handler/0.log" Nov 29 06:43:05 crc kubenswrapper[4594]: I1129 06:43:05.683770 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-lp45r_4eb3a18a-b943-4996-8977-8c442eca7e9e/kube-rbac-proxy/0.log" Nov 29 06:43:05 crc kubenswrapper[4594]: I1129 06:43:05.701543 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-lp45r_4eb3a18a-b943-4996-8977-8c442eca7e9e/nmstate-metrics/0.log" Nov 29 06:43:06 crc kubenswrapper[4594]: I1129 06:43:06.058014 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-gp4lk_ae0a89d0-92c1-4884-a38f-34cf97da3de5/nmstate-operator/0.log" Nov 29 06:43:06 crc kubenswrapper[4594]: I1129 06:43:06.093792 4594 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-7g6km_75ccbb22-479b-4415-aa0d-c00853a463ee/nmstate-webhook/0.log" Nov 29 06:43:06 crc kubenswrapper[4594]: I1129 06:43:06.789479 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-mvr2p"] Nov 29 06:43:06 crc kubenswrapper[4594]: E1129 06:43:06.790282 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6f3f238-a3d5-498b-ab8f-c7d3fe6aadb2" containerName="container-00" Nov 29 06:43:06 crc kubenswrapper[4594]: I1129 06:43:06.790299 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6f3f238-a3d5-498b-ab8f-c7d3fe6aadb2" containerName="container-00" Nov 29 06:43:06 crc kubenswrapper[4594]: I1129 06:43:06.790509 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6f3f238-a3d5-498b-ab8f-c7d3fe6aadb2" containerName="container-00" Nov 29 06:43:06 crc kubenswrapper[4594]: I1129 06:43:06.791970 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mvr2p" Nov 29 06:43:06 crc kubenswrapper[4594]: I1129 06:43:06.801548 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mvr2p"] Nov 29 06:43:06 crc kubenswrapper[4594]: I1129 06:43:06.815892 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n67tk\" (UniqueName: \"kubernetes.io/projected/af2678ef-1558-488d-b708-027b67179781-kube-api-access-n67tk\") pod \"redhat-operators-mvr2p\" (UID: \"af2678ef-1558-488d-b708-027b67179781\") " pod="openshift-marketplace/redhat-operators-mvr2p" Nov 29 06:43:06 crc kubenswrapper[4594]: I1129 06:43:06.815927 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af2678ef-1558-488d-b708-027b67179781-utilities\") pod \"redhat-operators-mvr2p\" (UID: \"af2678ef-1558-488d-b708-027b67179781\") " pod="openshift-marketplace/redhat-operators-mvr2p" Nov 29 06:43:06 crc kubenswrapper[4594]: I1129 06:43:06.816100 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af2678ef-1558-488d-b708-027b67179781-catalog-content\") pod \"redhat-operators-mvr2p\" (UID: \"af2678ef-1558-488d-b708-027b67179781\") " pod="openshift-marketplace/redhat-operators-mvr2p" Nov 29 06:43:06 crc kubenswrapper[4594]: I1129 06:43:06.918360 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n67tk\" (UniqueName: \"kubernetes.io/projected/af2678ef-1558-488d-b708-027b67179781-kube-api-access-n67tk\") pod \"redhat-operators-mvr2p\" (UID: \"af2678ef-1558-488d-b708-027b67179781\") " pod="openshift-marketplace/redhat-operators-mvr2p" Nov 29 06:43:06 crc kubenswrapper[4594]: I1129 06:43:06.918406 4594 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af2678ef-1558-488d-b708-027b67179781-utilities\") pod \"redhat-operators-mvr2p\" (UID: \"af2678ef-1558-488d-b708-027b67179781\") " pod="openshift-marketplace/redhat-operators-mvr2p" Nov 29 06:43:06 crc kubenswrapper[4594]: I1129 06:43:06.918937 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af2678ef-1558-488d-b708-027b67179781-utilities\") pod \"redhat-operators-mvr2p\" (UID: \"af2678ef-1558-488d-b708-027b67179781\") " pod="openshift-marketplace/redhat-operators-mvr2p" Nov 29 06:43:06 crc kubenswrapper[4594]: I1129 06:43:06.918981 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af2678ef-1558-488d-b708-027b67179781-catalog-content\") pod \"redhat-operators-mvr2p\" (UID: \"af2678ef-1558-488d-b708-027b67179781\") " pod="openshift-marketplace/redhat-operators-mvr2p" Nov 29 06:43:06 crc kubenswrapper[4594]: I1129 06:43:06.918997 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af2678ef-1558-488d-b708-027b67179781-catalog-content\") pod \"redhat-operators-mvr2p\" (UID: \"af2678ef-1558-488d-b708-027b67179781\") " pod="openshift-marketplace/redhat-operators-mvr2p" Nov 29 06:43:06 crc kubenswrapper[4594]: I1129 06:43:06.935569 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n67tk\" (UniqueName: \"kubernetes.io/projected/af2678ef-1558-488d-b708-027b67179781-kube-api-access-n67tk\") pod \"redhat-operators-mvr2p\" (UID: \"af2678ef-1558-488d-b708-027b67179781\") " pod="openshift-marketplace/redhat-operators-mvr2p" Nov 29 06:43:07 crc kubenswrapper[4594]: I1129 06:43:07.116356 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mvr2p" Nov 29 06:43:07 crc kubenswrapper[4594]: I1129 06:43:07.552727 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mvr2p"] Nov 29 06:43:08 crc kubenswrapper[4594]: I1129 06:43:08.281010 4594 generic.go:334] "Generic (PLEG): container finished" podID="af2678ef-1558-488d-b708-027b67179781" containerID="68a0b63e7adbbad1ca8a375697b817c9187bd9857197b9f4f7939ff237eaf3ed" exitCode=0 Nov 29 06:43:08 crc kubenswrapper[4594]: I1129 06:43:08.281561 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mvr2p" event={"ID":"af2678ef-1558-488d-b708-027b67179781","Type":"ContainerDied","Data":"68a0b63e7adbbad1ca8a375697b817c9187bd9857197b9f4f7939ff237eaf3ed"} Nov 29 06:43:08 crc kubenswrapper[4594]: I1129 06:43:08.281611 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mvr2p" event={"ID":"af2678ef-1558-488d-b708-027b67179781","Type":"ContainerStarted","Data":"679eedaba3a3f008912224023c8f8a5ff350c5dfb0f317e0bcaaa56e51349b82"} Nov 29 06:43:10 crc kubenswrapper[4594]: I1129 06:43:10.302480 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mvr2p" event={"ID":"af2678ef-1558-488d-b708-027b67179781","Type":"ContainerStarted","Data":"faac9117db48b501c37a58d90d5a20537bf0a486601dba1faa5bf692645062ca"} Nov 29 06:43:13 crc kubenswrapper[4594]: I1129 06:43:13.329534 4594 generic.go:334] "Generic (PLEG): container finished" podID="af2678ef-1558-488d-b708-027b67179781" containerID="faac9117db48b501c37a58d90d5a20537bf0a486601dba1faa5bf692645062ca" exitCode=0 Nov 29 06:43:13 crc kubenswrapper[4594]: I1129 06:43:13.329630 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mvr2p" 
event={"ID":"af2678ef-1558-488d-b708-027b67179781","Type":"ContainerDied","Data":"faac9117db48b501c37a58d90d5a20537bf0a486601dba1faa5bf692645062ca"} Nov 29 06:43:14 crc kubenswrapper[4594]: I1129 06:43:14.341308 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mvr2p" event={"ID":"af2678ef-1558-488d-b708-027b67179781","Type":"ContainerStarted","Data":"65e27d051a15724e12dc2569f10768adfdb2fc04fc30ab289a3f5a127315171b"} Nov 29 06:43:14 crc kubenswrapper[4594]: I1129 06:43:14.359755 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-mvr2p" podStartSLOduration=2.771311944 podStartE2EDuration="8.359740833s" podCreationTimestamp="2025-11-29 06:43:06 +0000 UTC" firstStartedPulling="2025-11-29 06:43:08.284336291 +0000 UTC m=+4512.524845511" lastFinishedPulling="2025-11-29 06:43:13.87276518 +0000 UTC m=+4518.113274400" observedRunningTime="2025-11-29 06:43:14.354285177 +0000 UTC m=+4518.594794397" watchObservedRunningTime="2025-11-29 06:43:14.359740833 +0000 UTC m=+4518.600250053" Nov 29 06:43:15 crc kubenswrapper[4594]: I1129 06:43:15.800229 4594 patch_prober.go:28] interesting pod/machine-config-daemon-ggz4n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 06:43:15 crc kubenswrapper[4594]: I1129 06:43:15.800344 4594 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 06:43:17 crc kubenswrapper[4594]: I1129 06:43:17.116986 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-operators-mvr2p" Nov 29 06:43:17 crc kubenswrapper[4594]: I1129 06:43:17.117389 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-mvr2p" Nov 29 06:43:18 crc kubenswrapper[4594]: I1129 06:43:18.162065 4594 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-mvr2p" podUID="af2678ef-1558-488d-b708-027b67179781" containerName="registry-server" probeResult="failure" output=< Nov 29 06:43:18 crc kubenswrapper[4594]: timeout: failed to connect service ":50051" within 1s Nov 29 06:43:18 crc kubenswrapper[4594]: > Nov 29 06:43:20 crc kubenswrapper[4594]: I1129 06:43:20.645461 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-44rgs_c67ed7c7-a9cb-4068-80af-9356fd171e31/kube-rbac-proxy/0.log" Nov 29 06:43:20 crc kubenswrapper[4594]: I1129 06:43:20.754394 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-44rgs_c67ed7c7-a9cb-4068-80af-9356fd171e31/controller/0.log" Nov 29 06:43:20 crc kubenswrapper[4594]: I1129 06:43:20.851093 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cmsgj_dd301093-7d62-4edf-8811-4f7529bba358/cp-frr-files/0.log" Nov 29 06:43:21 crc kubenswrapper[4594]: I1129 06:43:21.173592 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cmsgj_dd301093-7d62-4edf-8811-4f7529bba358/cp-reloader/0.log" Nov 29 06:43:21 crc kubenswrapper[4594]: I1129 06:43:21.217311 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cmsgj_dd301093-7d62-4edf-8811-4f7529bba358/cp-frr-files/0.log" Nov 29 06:43:21 crc kubenswrapper[4594]: I1129 06:43:21.230417 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cmsgj_dd301093-7d62-4edf-8811-4f7529bba358/cp-reloader/0.log" Nov 29 06:43:21 crc 
kubenswrapper[4594]: I1129 06:43:21.266492 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cmsgj_dd301093-7d62-4edf-8811-4f7529bba358/cp-metrics/0.log" Nov 29 06:43:21 crc kubenswrapper[4594]: I1129 06:43:21.461082 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cmsgj_dd301093-7d62-4edf-8811-4f7529bba358/cp-frr-files/0.log" Nov 29 06:43:21 crc kubenswrapper[4594]: I1129 06:43:21.467492 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cmsgj_dd301093-7d62-4edf-8811-4f7529bba358/cp-metrics/0.log" Nov 29 06:43:21 crc kubenswrapper[4594]: I1129 06:43:21.477343 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cmsgj_dd301093-7d62-4edf-8811-4f7529bba358/cp-reloader/0.log" Nov 29 06:43:21 crc kubenswrapper[4594]: I1129 06:43:21.542848 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cmsgj_dd301093-7d62-4edf-8811-4f7529bba358/cp-metrics/0.log" Nov 29 06:43:21 crc kubenswrapper[4594]: I1129 06:43:21.686187 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cmsgj_dd301093-7d62-4edf-8811-4f7529bba358/cp-metrics/0.log" Nov 29 06:43:21 crc kubenswrapper[4594]: I1129 06:43:21.686284 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cmsgj_dd301093-7d62-4edf-8811-4f7529bba358/cp-reloader/0.log" Nov 29 06:43:21 crc kubenswrapper[4594]: I1129 06:43:21.689709 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cmsgj_dd301093-7d62-4edf-8811-4f7529bba358/cp-frr-files/0.log" Nov 29 06:43:21 crc kubenswrapper[4594]: I1129 06:43:21.744990 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cmsgj_dd301093-7d62-4edf-8811-4f7529bba358/controller/0.log" Nov 29 06:43:21 crc kubenswrapper[4594]: I1129 06:43:21.847330 4594 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-cmsgj_dd301093-7d62-4edf-8811-4f7529bba358/frr-metrics/0.log" Nov 29 06:43:21 crc kubenswrapper[4594]: I1129 06:43:21.882499 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cmsgj_dd301093-7d62-4edf-8811-4f7529bba358/kube-rbac-proxy/0.log" Nov 29 06:43:21 crc kubenswrapper[4594]: I1129 06:43:21.934717 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cmsgj_dd301093-7d62-4edf-8811-4f7529bba358/kube-rbac-proxy-frr/0.log" Nov 29 06:43:22 crc kubenswrapper[4594]: I1129 06:43:22.033946 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cmsgj_dd301093-7d62-4edf-8811-4f7529bba358/reloader/0.log" Nov 29 06:43:22 crc kubenswrapper[4594]: I1129 06:43:22.176444 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-9bwd7_69e15172-74bf-4295-a7a9-a7843b1da728/frr-k8s-webhook-server/0.log" Nov 29 06:43:22 crc kubenswrapper[4594]: I1129 06:43:22.394067 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-79b995c45-klk7s_a1369887-80c2-44ef-b566-f30184ea9607/manager/0.log" Nov 29 06:43:22 crc kubenswrapper[4594]: I1129 06:43:22.538966 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-7667cbc88-mqfqp_ce5a6997-d8a6-489f-9bcf-d77879d7ad46/webhook-server/0.log" Nov 29 06:43:22 crc kubenswrapper[4594]: I1129 06:43:22.700954 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-5fh4t_a2f948fa-edac-4ac6-9ffb-e5ee886f8164/kube-rbac-proxy/0.log" Nov 29 06:43:23 crc kubenswrapper[4594]: I1129 06:43:23.233051 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-5fh4t_a2f948fa-edac-4ac6-9ffb-e5ee886f8164/speaker/0.log" Nov 29 06:43:23 crc kubenswrapper[4594]: I1129 06:43:23.302453 4594 log.go:25] 
"Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cmsgj_dd301093-7d62-4edf-8811-4f7529bba358/frr/0.log" Nov 29 06:43:27 crc kubenswrapper[4594]: I1129 06:43:27.274719 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-j6kbf"] Nov 29 06:43:27 crc kubenswrapper[4594]: I1129 06:43:27.279198 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j6kbf" Nov 29 06:43:27 crc kubenswrapper[4594]: I1129 06:43:27.287886 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99c26209-ccf9-4539-a445-e8ee5dff3a2a-catalog-content\") pod \"redhat-marketplace-j6kbf\" (UID: \"99c26209-ccf9-4539-a445-e8ee5dff3a2a\") " pod="openshift-marketplace/redhat-marketplace-j6kbf" Nov 29 06:43:27 crc kubenswrapper[4594]: I1129 06:43:27.287961 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkfcw\" (UniqueName: \"kubernetes.io/projected/99c26209-ccf9-4539-a445-e8ee5dff3a2a-kube-api-access-wkfcw\") pod \"redhat-marketplace-j6kbf\" (UID: \"99c26209-ccf9-4539-a445-e8ee5dff3a2a\") " pod="openshift-marketplace/redhat-marketplace-j6kbf" Nov 29 06:43:27 crc kubenswrapper[4594]: I1129 06:43:27.288147 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99c26209-ccf9-4539-a445-e8ee5dff3a2a-utilities\") pod \"redhat-marketplace-j6kbf\" (UID: \"99c26209-ccf9-4539-a445-e8ee5dff3a2a\") " pod="openshift-marketplace/redhat-marketplace-j6kbf" Nov 29 06:43:27 crc kubenswrapper[4594]: I1129 06:43:27.292815 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j6kbf"] Nov 29 06:43:27 crc kubenswrapper[4594]: I1129 06:43:27.389858 4594 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99c26209-ccf9-4539-a445-e8ee5dff3a2a-utilities\") pod \"redhat-marketplace-j6kbf\" (UID: \"99c26209-ccf9-4539-a445-e8ee5dff3a2a\") " pod="openshift-marketplace/redhat-marketplace-j6kbf" Nov 29 06:43:27 crc kubenswrapper[4594]: I1129 06:43:27.390287 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99c26209-ccf9-4539-a445-e8ee5dff3a2a-catalog-content\") pod \"redhat-marketplace-j6kbf\" (UID: \"99c26209-ccf9-4539-a445-e8ee5dff3a2a\") " pod="openshift-marketplace/redhat-marketplace-j6kbf" Nov 29 06:43:27 crc kubenswrapper[4594]: I1129 06:43:27.390508 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkfcw\" (UniqueName: \"kubernetes.io/projected/99c26209-ccf9-4539-a445-e8ee5dff3a2a-kube-api-access-wkfcw\") pod \"redhat-marketplace-j6kbf\" (UID: \"99c26209-ccf9-4539-a445-e8ee5dff3a2a\") " pod="openshift-marketplace/redhat-marketplace-j6kbf" Nov 29 06:43:27 crc kubenswrapper[4594]: I1129 06:43:27.390351 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99c26209-ccf9-4539-a445-e8ee5dff3a2a-utilities\") pod \"redhat-marketplace-j6kbf\" (UID: \"99c26209-ccf9-4539-a445-e8ee5dff3a2a\") " pod="openshift-marketplace/redhat-marketplace-j6kbf" Nov 29 06:43:27 crc kubenswrapper[4594]: I1129 06:43:27.390774 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99c26209-ccf9-4539-a445-e8ee5dff3a2a-catalog-content\") pod \"redhat-marketplace-j6kbf\" (UID: \"99c26209-ccf9-4539-a445-e8ee5dff3a2a\") " pod="openshift-marketplace/redhat-marketplace-j6kbf" Nov 29 06:43:27 crc kubenswrapper[4594]: I1129 06:43:27.412682 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-wkfcw\" (UniqueName: \"kubernetes.io/projected/99c26209-ccf9-4539-a445-e8ee5dff3a2a-kube-api-access-wkfcw\") pod \"redhat-marketplace-j6kbf\" (UID: \"99c26209-ccf9-4539-a445-e8ee5dff3a2a\") " pod="openshift-marketplace/redhat-marketplace-j6kbf" Nov 29 06:43:27 crc kubenswrapper[4594]: I1129 06:43:27.442109 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-mvr2p" Nov 29 06:43:27 crc kubenswrapper[4594]: I1129 06:43:27.480329 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-mvr2p" Nov 29 06:43:27 crc kubenswrapper[4594]: I1129 06:43:27.596202 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j6kbf" Nov 29 06:43:28 crc kubenswrapper[4594]: I1129 06:43:28.040944 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j6kbf"] Nov 29 06:43:28 crc kubenswrapper[4594]: I1129 06:43:28.481028 4594 generic.go:334] "Generic (PLEG): container finished" podID="99c26209-ccf9-4539-a445-e8ee5dff3a2a" containerID="afac8a959f92f46548c93ae4bc91596020e642cff661122ef2dbe97d81186502" exitCode=0 Nov 29 06:43:28 crc kubenswrapper[4594]: I1129 06:43:28.481073 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j6kbf" event={"ID":"99c26209-ccf9-4539-a445-e8ee5dff3a2a","Type":"ContainerDied","Data":"afac8a959f92f46548c93ae4bc91596020e642cff661122ef2dbe97d81186502"} Nov 29 06:43:28 crc kubenswrapper[4594]: I1129 06:43:28.481355 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j6kbf" event={"ID":"99c26209-ccf9-4539-a445-e8ee5dff3a2a","Type":"ContainerStarted","Data":"89f9d07ab5bfb9db7d49c9d820d3ebff76e5a487fdf4e1cc1869534397484492"} Nov 29 06:43:29 crc kubenswrapper[4594]: I1129 06:43:29.856980 4594 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mvr2p"] Nov 29 06:43:29 crc kubenswrapper[4594]: I1129 06:43:29.857715 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-mvr2p" podUID="af2678ef-1558-488d-b708-027b67179781" containerName="registry-server" containerID="cri-o://65e27d051a15724e12dc2569f10768adfdb2fc04fc30ab289a3f5a127315171b" gracePeriod=2 Nov 29 06:43:30 crc kubenswrapper[4594]: I1129 06:43:30.260666 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mvr2p" Nov 29 06:43:30 crc kubenswrapper[4594]: I1129 06:43:30.359552 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n67tk\" (UniqueName: \"kubernetes.io/projected/af2678ef-1558-488d-b708-027b67179781-kube-api-access-n67tk\") pod \"af2678ef-1558-488d-b708-027b67179781\" (UID: \"af2678ef-1558-488d-b708-027b67179781\") " Nov 29 06:43:30 crc kubenswrapper[4594]: I1129 06:43:30.371756 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af2678ef-1558-488d-b708-027b67179781-kube-api-access-n67tk" (OuterVolumeSpecName: "kube-api-access-n67tk") pod "af2678ef-1558-488d-b708-027b67179781" (UID: "af2678ef-1558-488d-b708-027b67179781"). InnerVolumeSpecName "kube-api-access-n67tk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 06:43:30 crc kubenswrapper[4594]: I1129 06:43:30.462141 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af2678ef-1558-488d-b708-027b67179781-utilities\") pod \"af2678ef-1558-488d-b708-027b67179781\" (UID: \"af2678ef-1558-488d-b708-027b67179781\") " Nov 29 06:43:30 crc kubenswrapper[4594]: I1129 06:43:30.462333 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af2678ef-1558-488d-b708-027b67179781-catalog-content\") pod \"af2678ef-1558-488d-b708-027b67179781\" (UID: \"af2678ef-1558-488d-b708-027b67179781\") " Nov 29 06:43:30 crc kubenswrapper[4594]: I1129 06:43:30.463174 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af2678ef-1558-488d-b708-027b67179781-utilities" (OuterVolumeSpecName: "utilities") pod "af2678ef-1558-488d-b708-027b67179781" (UID: "af2678ef-1558-488d-b708-027b67179781"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 06:43:30 crc kubenswrapper[4594]: I1129 06:43:30.463592 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n67tk\" (UniqueName: \"kubernetes.io/projected/af2678ef-1558-488d-b708-027b67179781-kube-api-access-n67tk\") on node \"crc\" DevicePath \"\"" Nov 29 06:43:30 crc kubenswrapper[4594]: I1129 06:43:30.463616 4594 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af2678ef-1558-488d-b708-027b67179781-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 06:43:30 crc kubenswrapper[4594]: I1129 06:43:30.503682 4594 generic.go:334] "Generic (PLEG): container finished" podID="99c26209-ccf9-4539-a445-e8ee5dff3a2a" containerID="60a3dc53e2d8b16c349cfd659bc386dd755aef55509cde4a54ee7f79b2a6ba69" exitCode=0 Nov 29 06:43:30 crc kubenswrapper[4594]: I1129 06:43:30.503744 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j6kbf" event={"ID":"99c26209-ccf9-4539-a445-e8ee5dff3a2a","Type":"ContainerDied","Data":"60a3dc53e2d8b16c349cfd659bc386dd755aef55509cde4a54ee7f79b2a6ba69"} Nov 29 06:43:30 crc kubenswrapper[4594]: I1129 06:43:30.506380 4594 generic.go:334] "Generic (PLEG): container finished" podID="af2678ef-1558-488d-b708-027b67179781" containerID="65e27d051a15724e12dc2569f10768adfdb2fc04fc30ab289a3f5a127315171b" exitCode=0 Nov 29 06:43:30 crc kubenswrapper[4594]: I1129 06:43:30.506430 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mvr2p" event={"ID":"af2678ef-1558-488d-b708-027b67179781","Type":"ContainerDied","Data":"65e27d051a15724e12dc2569f10768adfdb2fc04fc30ab289a3f5a127315171b"} Nov 29 06:43:30 crc kubenswrapper[4594]: I1129 06:43:30.506453 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mvr2p" Nov 29 06:43:30 crc kubenswrapper[4594]: I1129 06:43:30.506474 4594 scope.go:117] "RemoveContainer" containerID="65e27d051a15724e12dc2569f10768adfdb2fc04fc30ab289a3f5a127315171b" Nov 29 06:43:30 crc kubenswrapper[4594]: I1129 06:43:30.506462 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mvr2p" event={"ID":"af2678ef-1558-488d-b708-027b67179781","Type":"ContainerDied","Data":"679eedaba3a3f008912224023c8f8a5ff350c5dfb0f317e0bcaaa56e51349b82"} Nov 29 06:43:30 crc kubenswrapper[4594]: I1129 06:43:30.530705 4594 scope.go:117] "RemoveContainer" containerID="faac9117db48b501c37a58d90d5a20537bf0a486601dba1faa5bf692645062ca" Nov 29 06:43:30 crc kubenswrapper[4594]: I1129 06:43:30.562511 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af2678ef-1558-488d-b708-027b67179781-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "af2678ef-1558-488d-b708-027b67179781" (UID: "af2678ef-1558-488d-b708-027b67179781"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 06:43:30 crc kubenswrapper[4594]: I1129 06:43:30.567156 4594 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af2678ef-1558-488d-b708-027b67179781-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 06:43:30 crc kubenswrapper[4594]: I1129 06:43:30.578415 4594 scope.go:117] "RemoveContainer" containerID="68a0b63e7adbbad1ca8a375697b817c9187bd9857197b9f4f7939ff237eaf3ed" Nov 29 06:43:30 crc kubenswrapper[4594]: I1129 06:43:30.630403 4594 scope.go:117] "RemoveContainer" containerID="65e27d051a15724e12dc2569f10768adfdb2fc04fc30ab289a3f5a127315171b" Nov 29 06:43:30 crc kubenswrapper[4594]: E1129 06:43:30.638055 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65e27d051a15724e12dc2569f10768adfdb2fc04fc30ab289a3f5a127315171b\": container with ID starting with 65e27d051a15724e12dc2569f10768adfdb2fc04fc30ab289a3f5a127315171b not found: ID does not exist" containerID="65e27d051a15724e12dc2569f10768adfdb2fc04fc30ab289a3f5a127315171b" Nov 29 06:43:30 crc kubenswrapper[4594]: I1129 06:43:30.638096 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65e27d051a15724e12dc2569f10768adfdb2fc04fc30ab289a3f5a127315171b"} err="failed to get container status \"65e27d051a15724e12dc2569f10768adfdb2fc04fc30ab289a3f5a127315171b\": rpc error: code = NotFound desc = could not find container \"65e27d051a15724e12dc2569f10768adfdb2fc04fc30ab289a3f5a127315171b\": container with ID starting with 65e27d051a15724e12dc2569f10768adfdb2fc04fc30ab289a3f5a127315171b not found: ID does not exist" Nov 29 06:43:30 crc kubenswrapper[4594]: I1129 06:43:30.638124 4594 scope.go:117] "RemoveContainer" containerID="faac9117db48b501c37a58d90d5a20537bf0a486601dba1faa5bf692645062ca" Nov 29 06:43:30 crc kubenswrapper[4594]: E1129 06:43:30.641340 4594 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"faac9117db48b501c37a58d90d5a20537bf0a486601dba1faa5bf692645062ca\": container with ID starting with faac9117db48b501c37a58d90d5a20537bf0a486601dba1faa5bf692645062ca not found: ID does not exist" containerID="faac9117db48b501c37a58d90d5a20537bf0a486601dba1faa5bf692645062ca" Nov 29 06:43:30 crc kubenswrapper[4594]: I1129 06:43:30.641378 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"faac9117db48b501c37a58d90d5a20537bf0a486601dba1faa5bf692645062ca"} err="failed to get container status \"faac9117db48b501c37a58d90d5a20537bf0a486601dba1faa5bf692645062ca\": rpc error: code = NotFound desc = could not find container \"faac9117db48b501c37a58d90d5a20537bf0a486601dba1faa5bf692645062ca\": container with ID starting with faac9117db48b501c37a58d90d5a20537bf0a486601dba1faa5bf692645062ca not found: ID does not exist" Nov 29 06:43:30 crc kubenswrapper[4594]: I1129 06:43:30.641403 4594 scope.go:117] "RemoveContainer" containerID="68a0b63e7adbbad1ca8a375697b817c9187bd9857197b9f4f7939ff237eaf3ed" Nov 29 06:43:30 crc kubenswrapper[4594]: E1129 06:43:30.645469 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68a0b63e7adbbad1ca8a375697b817c9187bd9857197b9f4f7939ff237eaf3ed\": container with ID starting with 68a0b63e7adbbad1ca8a375697b817c9187bd9857197b9f4f7939ff237eaf3ed not found: ID does not exist" containerID="68a0b63e7adbbad1ca8a375697b817c9187bd9857197b9f4f7939ff237eaf3ed" Nov 29 06:43:30 crc kubenswrapper[4594]: I1129 06:43:30.645522 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68a0b63e7adbbad1ca8a375697b817c9187bd9857197b9f4f7939ff237eaf3ed"} err="failed to get container status \"68a0b63e7adbbad1ca8a375697b817c9187bd9857197b9f4f7939ff237eaf3ed\": rpc error: code = NotFound desc = could 
not find container \"68a0b63e7adbbad1ca8a375697b817c9187bd9857197b9f4f7939ff237eaf3ed\": container with ID starting with 68a0b63e7adbbad1ca8a375697b817c9187bd9857197b9f4f7939ff237eaf3ed not found: ID does not exist" Nov 29 06:43:30 crc kubenswrapper[4594]: I1129 06:43:30.841976 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mvr2p"] Nov 29 06:43:30 crc kubenswrapper[4594]: I1129 06:43:30.856542 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-mvr2p"] Nov 29 06:43:31 crc kubenswrapper[4594]: I1129 06:43:31.532904 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j6kbf" event={"ID":"99c26209-ccf9-4539-a445-e8ee5dff3a2a","Type":"ContainerStarted","Data":"676b1e823f9736db671ba859f0488c5fe592a9178e2791a6ec551cc4035eb003"} Nov 29 06:43:31 crc kubenswrapper[4594]: I1129 06:43:31.555655 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-j6kbf" podStartSLOduration=1.92412486 podStartE2EDuration="4.555638581s" podCreationTimestamp="2025-11-29 06:43:27 +0000 UTC" firstStartedPulling="2025-11-29 06:43:28.482622113 +0000 UTC m=+4532.723131332" lastFinishedPulling="2025-11-29 06:43:31.114135833 +0000 UTC m=+4535.354645053" observedRunningTime="2025-11-29 06:43:31.547562469 +0000 UTC m=+4535.788071689" watchObservedRunningTime="2025-11-29 06:43:31.555638581 +0000 UTC m=+4535.796147802" Nov 29 06:43:32 crc kubenswrapper[4594]: I1129 06:43:32.100986 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af2678ef-1558-488d-b708-027b67179781" path="/var/lib/kubelet/pods/af2678ef-1558-488d-b708-027b67179781/volumes" Nov 29 06:43:36 crc kubenswrapper[4594]: I1129 06:43:36.405599 4594 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fsn6z4_837b6915-2ae6-4f64-af4f-029c8d1012d3/util/0.log" Nov 29 06:43:36 crc kubenswrapper[4594]: I1129 06:43:36.582099 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fsn6z4_837b6915-2ae6-4f64-af4f-029c8d1012d3/util/0.log" Nov 29 06:43:36 crc kubenswrapper[4594]: I1129 06:43:36.585987 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fsn6z4_837b6915-2ae6-4f64-af4f-029c8d1012d3/pull/0.log" Nov 29 06:43:36 crc kubenswrapper[4594]: I1129 06:43:36.589580 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fsn6z4_837b6915-2ae6-4f64-af4f-029c8d1012d3/pull/0.log" Nov 29 06:43:36 crc kubenswrapper[4594]: I1129 06:43:36.753169 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fsn6z4_837b6915-2ae6-4f64-af4f-029c8d1012d3/util/0.log" Nov 29 06:43:36 crc kubenswrapper[4594]: I1129 06:43:36.763035 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fsn6z4_837b6915-2ae6-4f64-af4f-029c8d1012d3/pull/0.log" Nov 29 06:43:36 crc kubenswrapper[4594]: I1129 06:43:36.808364 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fsn6z4_837b6915-2ae6-4f64-af4f-029c8d1012d3/extract/0.log" Nov 29 06:43:36 crc kubenswrapper[4594]: I1129 06:43:36.915734 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jzp72_3b9451fa-96ca-42c8-888f-beba143e0850/util/0.log" Nov 29 
06:43:37 crc kubenswrapper[4594]: I1129 06:43:37.058649 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jzp72_3b9451fa-96ca-42c8-888f-beba143e0850/util/0.log" Nov 29 06:43:37 crc kubenswrapper[4594]: I1129 06:43:37.062251 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jzp72_3b9451fa-96ca-42c8-888f-beba143e0850/pull/0.log" Nov 29 06:43:37 crc kubenswrapper[4594]: I1129 06:43:37.076678 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jzp72_3b9451fa-96ca-42c8-888f-beba143e0850/pull/0.log" Nov 29 06:43:37 crc kubenswrapper[4594]: I1129 06:43:37.596555 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-j6kbf" Nov 29 06:43:37 crc kubenswrapper[4594]: I1129 06:43:37.596867 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-j6kbf" Nov 29 06:43:37 crc kubenswrapper[4594]: I1129 06:43:37.642472 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-j6kbf" Nov 29 06:43:37 crc kubenswrapper[4594]: I1129 06:43:37.896789 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jzp72_3b9451fa-96ca-42c8-888f-beba143e0850/util/0.log" Nov 29 06:43:37 crc kubenswrapper[4594]: I1129 06:43:37.903317 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jzp72_3b9451fa-96ca-42c8-888f-beba143e0850/pull/0.log" Nov 29 06:43:37 crc kubenswrapper[4594]: I1129 06:43:37.903684 4594 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jzp72_3b9451fa-96ca-42c8-888f-beba143e0850/extract/0.log" Nov 29 06:43:38 crc kubenswrapper[4594]: I1129 06:43:38.045953 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tbj4s_7a1d137f-f3e9-4543-966c-f5cfe3b3360d/util/0.log" Nov 29 06:43:38 crc kubenswrapper[4594]: I1129 06:43:38.237476 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tbj4s_7a1d137f-f3e9-4543-966c-f5cfe3b3360d/util/0.log" Nov 29 06:43:38 crc kubenswrapper[4594]: I1129 06:43:38.239750 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tbj4s_7a1d137f-f3e9-4543-966c-f5cfe3b3360d/pull/0.log" Nov 29 06:43:38 crc kubenswrapper[4594]: I1129 06:43:38.268838 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tbj4s_7a1d137f-f3e9-4543-966c-f5cfe3b3360d/pull/0.log" Nov 29 06:43:38 crc kubenswrapper[4594]: I1129 06:43:38.417133 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tbj4s_7a1d137f-f3e9-4543-966c-f5cfe3b3360d/pull/0.log" Nov 29 06:43:38 crc kubenswrapper[4594]: I1129 06:43:38.435071 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tbj4s_7a1d137f-f3e9-4543-966c-f5cfe3b3360d/extract/0.log" Nov 29 06:43:38 crc kubenswrapper[4594]: I1129 06:43:38.439705 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tbj4s_7a1d137f-f3e9-4543-966c-f5cfe3b3360d/util/0.log" Nov 
29 06:43:38 crc kubenswrapper[4594]: I1129 06:43:38.581102 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-kfzdq_fa62a8c9-aa8c-42d9-a634-f3aea0992e00/extract-utilities/0.log" Nov 29 06:43:38 crc kubenswrapper[4594]: I1129 06:43:38.629515 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-j6kbf" Nov 29 06:43:38 crc kubenswrapper[4594]: I1129 06:43:38.676429 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-j6kbf"] Nov 29 06:43:38 crc kubenswrapper[4594]: I1129 06:43:38.716088 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-kfzdq_fa62a8c9-aa8c-42d9-a634-f3aea0992e00/extract-utilities/0.log" Nov 29 06:43:38 crc kubenswrapper[4594]: I1129 06:43:38.744439 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-kfzdq_fa62a8c9-aa8c-42d9-a634-f3aea0992e00/extract-content/0.log" Nov 29 06:43:38 crc kubenswrapper[4594]: I1129 06:43:38.770182 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-kfzdq_fa62a8c9-aa8c-42d9-a634-f3aea0992e00/extract-content/0.log" Nov 29 06:43:38 crc kubenswrapper[4594]: I1129 06:43:38.893489 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-kfzdq_fa62a8c9-aa8c-42d9-a634-f3aea0992e00/extract-utilities/0.log" Nov 29 06:43:38 crc kubenswrapper[4594]: I1129 06:43:38.911845 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-kfzdq_fa62a8c9-aa8c-42d9-a634-f3aea0992e00/extract-content/0.log" Nov 29 06:43:39 crc kubenswrapper[4594]: I1129 06:43:39.462213 4594 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-hqrxt_b9f1bdf8-188a-4fdb-bf69-41579c5827ce/extract-utilities/0.log" Nov 29 06:43:39 crc kubenswrapper[4594]: I1129 06:43:39.541113 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-kfzdq_fa62a8c9-aa8c-42d9-a634-f3aea0992e00/registry-server/0.log" Nov 29 06:43:39 crc kubenswrapper[4594]: I1129 06:43:39.727050 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hqrxt_b9f1bdf8-188a-4fdb-bf69-41579c5827ce/extract-content/0.log" Nov 29 06:43:39 crc kubenswrapper[4594]: I1129 06:43:39.740218 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hqrxt_b9f1bdf8-188a-4fdb-bf69-41579c5827ce/extract-utilities/0.log" Nov 29 06:43:39 crc kubenswrapper[4594]: I1129 06:43:39.742572 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hqrxt_b9f1bdf8-188a-4fdb-bf69-41579c5827ce/extract-content/0.log" Nov 29 06:43:40 crc kubenswrapper[4594]: I1129 06:43:40.092589 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hqrxt_b9f1bdf8-188a-4fdb-bf69-41579c5827ce/extract-utilities/0.log" Nov 29 06:43:40 crc kubenswrapper[4594]: I1129 06:43:40.096937 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-clxws_8b4bf1cb-440a-4e74-82c4-c122e9985bf3/marketplace-operator/0.log" Nov 29 06:43:40 crc kubenswrapper[4594]: I1129 06:43:40.100818 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hqrxt_b9f1bdf8-188a-4fdb-bf69-41579c5827ce/extract-content/0.log" Nov 29 06:43:40 crc kubenswrapper[4594]: I1129 06:43:40.290846 4594 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-bk4kp_9bf183a2-75c4-4800-a888-f41d978b1c1d/extract-utilities/0.log" Nov 29 06:43:40 crc kubenswrapper[4594]: I1129 06:43:40.334713 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hqrxt_b9f1bdf8-188a-4fdb-bf69-41579c5827ce/registry-server/0.log" Nov 29 06:43:40 crc kubenswrapper[4594]: I1129 06:43:40.448333 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bk4kp_9bf183a2-75c4-4800-a888-f41d978b1c1d/extract-utilities/0.log" Nov 29 06:43:40 crc kubenswrapper[4594]: I1129 06:43:40.484039 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bk4kp_9bf183a2-75c4-4800-a888-f41d978b1c1d/extract-content/0.log" Nov 29 06:43:40 crc kubenswrapper[4594]: I1129 06:43:40.485767 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bk4kp_9bf183a2-75c4-4800-a888-f41d978b1c1d/extract-content/0.log" Nov 29 06:43:40 crc kubenswrapper[4594]: I1129 06:43:40.605487 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-j6kbf" podUID="99c26209-ccf9-4539-a445-e8ee5dff3a2a" containerName="registry-server" containerID="cri-o://676b1e823f9736db671ba859f0488c5fe592a9178e2791a6ec551cc4035eb003" gracePeriod=2 Nov 29 06:43:40 crc kubenswrapper[4594]: I1129 06:43:40.613538 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bk4kp_9bf183a2-75c4-4800-a888-f41d978b1c1d/extract-utilities/0.log" Nov 29 06:43:40 crc kubenswrapper[4594]: I1129 06:43:40.614196 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bk4kp_9bf183a2-75c4-4800-a888-f41d978b1c1d/extract-content/0.log" Nov 29 06:43:40 crc kubenswrapper[4594]: I1129 06:43:40.669804 4594 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-j6kbf_99c26209-ccf9-4539-a445-e8ee5dff3a2a/extract-utilities/0.log" Nov 29 06:43:40 crc kubenswrapper[4594]: I1129 06:43:40.760566 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bk4kp_9bf183a2-75c4-4800-a888-f41d978b1c1d/registry-server/0.log" Nov 29 06:43:40 crc kubenswrapper[4594]: I1129 06:43:40.839161 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-j6kbf_99c26209-ccf9-4539-a445-e8ee5dff3a2a/extract-utilities/0.log" Nov 29 06:43:40 crc kubenswrapper[4594]: I1129 06:43:40.905208 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-j6kbf_99c26209-ccf9-4539-a445-e8ee5dff3a2a/extract-content/0.log" Nov 29 06:43:40 crc kubenswrapper[4594]: I1129 06:43:40.905214 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-j6kbf_99c26209-ccf9-4539-a445-e8ee5dff3a2a/extract-content/0.log" Nov 29 06:43:41 crc kubenswrapper[4594]: I1129 06:43:41.021608 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j6kbf" Nov 29 06:43:41 crc kubenswrapper[4594]: I1129 06:43:41.109377 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-j6kbf_99c26209-ccf9-4539-a445-e8ee5dff3a2a/extract-utilities/0.log" Nov 29 06:43:41 crc kubenswrapper[4594]: I1129 06:43:41.121278 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99c26209-ccf9-4539-a445-e8ee5dff3a2a-utilities\") pod \"99c26209-ccf9-4539-a445-e8ee5dff3a2a\" (UID: \"99c26209-ccf9-4539-a445-e8ee5dff3a2a\") " Nov 29 06:43:41 crc kubenswrapper[4594]: I1129 06:43:41.121347 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wkfcw\" (UniqueName: \"kubernetes.io/projected/99c26209-ccf9-4539-a445-e8ee5dff3a2a-kube-api-access-wkfcw\") pod \"99c26209-ccf9-4539-a445-e8ee5dff3a2a\" (UID: \"99c26209-ccf9-4539-a445-e8ee5dff3a2a\") " Nov 29 06:43:41 crc kubenswrapper[4594]: I1129 06:43:41.121372 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99c26209-ccf9-4539-a445-e8ee5dff3a2a-catalog-content\") pod \"99c26209-ccf9-4539-a445-e8ee5dff3a2a\" (UID: \"99c26209-ccf9-4539-a445-e8ee5dff3a2a\") " Nov 29 06:43:41 crc kubenswrapper[4594]: I1129 06:43:41.121995 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99c26209-ccf9-4539-a445-e8ee5dff3a2a-utilities" (OuterVolumeSpecName: "utilities") pod "99c26209-ccf9-4539-a445-e8ee5dff3a2a" (UID: "99c26209-ccf9-4539-a445-e8ee5dff3a2a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 06:43:41 crc kubenswrapper[4594]: I1129 06:43:41.122410 4594 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99c26209-ccf9-4539-a445-e8ee5dff3a2a-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 06:43:41 crc kubenswrapper[4594]: I1129 06:43:41.124520 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-j6kbf_99c26209-ccf9-4539-a445-e8ee5dff3a2a/extract-content/0.log" Nov 29 06:43:41 crc kubenswrapper[4594]: I1129 06:43:41.142417 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99c26209-ccf9-4539-a445-e8ee5dff3a2a-kube-api-access-wkfcw" (OuterVolumeSpecName: "kube-api-access-wkfcw") pod "99c26209-ccf9-4539-a445-e8ee5dff3a2a" (UID: "99c26209-ccf9-4539-a445-e8ee5dff3a2a"). InnerVolumeSpecName "kube-api-access-wkfcw". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 06:43:41 crc kubenswrapper[4594]: I1129 06:43:41.142855 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99c26209-ccf9-4539-a445-e8ee5dff3a2a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "99c26209-ccf9-4539-a445-e8ee5dff3a2a" (UID: "99c26209-ccf9-4539-a445-e8ee5dff3a2a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 06:43:41 crc kubenswrapper[4594]: I1129 06:43:41.157002 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9cznn_2f6c6bde-10f1-4dbf-863a-153a95f825b7/extract-utilities/0.log" Nov 29 06:43:41 crc kubenswrapper[4594]: I1129 06:43:41.157737 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-j6kbf_99c26209-ccf9-4539-a445-e8ee5dff3a2a/registry-server/0.log" Nov 29 06:43:41 crc kubenswrapper[4594]: I1129 06:43:41.223372 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wkfcw\" (UniqueName: \"kubernetes.io/projected/99c26209-ccf9-4539-a445-e8ee5dff3a2a-kube-api-access-wkfcw\") on node \"crc\" DevicePath \"\"" Nov 29 06:43:41 crc kubenswrapper[4594]: I1129 06:43:41.223402 4594 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99c26209-ccf9-4539-a445-e8ee5dff3a2a-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 06:43:41 crc kubenswrapper[4594]: I1129 06:43:41.319999 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9cznn_2f6c6bde-10f1-4dbf-863a-153a95f825b7/extract-content/0.log" Nov 29 06:43:41 crc kubenswrapper[4594]: I1129 06:43:41.339429 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9cznn_2f6c6bde-10f1-4dbf-863a-153a95f825b7/extract-content/0.log" Nov 29 06:43:41 crc kubenswrapper[4594]: I1129 06:43:41.345629 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9cznn_2f6c6bde-10f1-4dbf-863a-153a95f825b7/extract-utilities/0.log" Nov 29 06:43:41 crc kubenswrapper[4594]: I1129 06:43:41.468178 4594 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-9cznn_2f6c6bde-10f1-4dbf-863a-153a95f825b7/extract-utilities/0.log" Nov 29 06:43:41 crc kubenswrapper[4594]: I1129 06:43:41.532125 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9cznn_2f6c6bde-10f1-4dbf-863a-153a95f825b7/extract-content/0.log" Nov 29 06:43:41 crc kubenswrapper[4594]: I1129 06:43:41.616333 4594 generic.go:334] "Generic (PLEG): container finished" podID="99c26209-ccf9-4539-a445-e8ee5dff3a2a" containerID="676b1e823f9736db671ba859f0488c5fe592a9178e2791a6ec551cc4035eb003" exitCode=0 Nov 29 06:43:41 crc kubenswrapper[4594]: I1129 06:43:41.616381 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j6kbf" event={"ID":"99c26209-ccf9-4539-a445-e8ee5dff3a2a","Type":"ContainerDied","Data":"676b1e823f9736db671ba859f0488c5fe592a9178e2791a6ec551cc4035eb003"} Nov 29 06:43:41 crc kubenswrapper[4594]: I1129 06:43:41.616410 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j6kbf" event={"ID":"99c26209-ccf9-4539-a445-e8ee5dff3a2a","Type":"ContainerDied","Data":"89f9d07ab5bfb9db7d49c9d820d3ebff76e5a487fdf4e1cc1869534397484492"} Nov 29 06:43:41 crc kubenswrapper[4594]: I1129 06:43:41.616432 4594 scope.go:117] "RemoveContainer" containerID="676b1e823f9736db671ba859f0488c5fe592a9178e2791a6ec551cc4035eb003" Nov 29 06:43:41 crc kubenswrapper[4594]: I1129 06:43:41.616577 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j6kbf" Nov 29 06:43:41 crc kubenswrapper[4594]: I1129 06:43:41.635802 4594 scope.go:117] "RemoveContainer" containerID="60a3dc53e2d8b16c349cfd659bc386dd755aef55509cde4a54ee7f79b2a6ba69" Nov 29 06:43:41 crc kubenswrapper[4594]: I1129 06:43:41.699577 4594 scope.go:117] "RemoveContainer" containerID="afac8a959f92f46548c93ae4bc91596020e642cff661122ef2dbe97d81186502" Nov 29 06:43:41 crc kubenswrapper[4594]: I1129 06:43:41.719909 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-j6kbf"] Nov 29 06:43:41 crc kubenswrapper[4594]: I1129 06:43:41.722991 4594 scope.go:117] "RemoveContainer" containerID="676b1e823f9736db671ba859f0488c5fe592a9178e2791a6ec551cc4035eb003" Nov 29 06:43:41 crc kubenswrapper[4594]: E1129 06:43:41.723671 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"676b1e823f9736db671ba859f0488c5fe592a9178e2791a6ec551cc4035eb003\": container with ID starting with 676b1e823f9736db671ba859f0488c5fe592a9178e2791a6ec551cc4035eb003 not found: ID does not exist" containerID="676b1e823f9736db671ba859f0488c5fe592a9178e2791a6ec551cc4035eb003" Nov 29 06:43:41 crc kubenswrapper[4594]: I1129 06:43:41.723749 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"676b1e823f9736db671ba859f0488c5fe592a9178e2791a6ec551cc4035eb003"} err="failed to get container status \"676b1e823f9736db671ba859f0488c5fe592a9178e2791a6ec551cc4035eb003\": rpc error: code = NotFound desc = could not find container \"676b1e823f9736db671ba859f0488c5fe592a9178e2791a6ec551cc4035eb003\": container with ID starting with 676b1e823f9736db671ba859f0488c5fe592a9178e2791a6ec551cc4035eb003 not found: ID does not exist" Nov 29 06:43:41 crc kubenswrapper[4594]: I1129 06:43:41.723777 4594 scope.go:117] "RemoveContainer" 
containerID="60a3dc53e2d8b16c349cfd659bc386dd755aef55509cde4a54ee7f79b2a6ba69" Nov 29 06:43:41 crc kubenswrapper[4594]: E1129 06:43:41.724274 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60a3dc53e2d8b16c349cfd659bc386dd755aef55509cde4a54ee7f79b2a6ba69\": container with ID starting with 60a3dc53e2d8b16c349cfd659bc386dd755aef55509cde4a54ee7f79b2a6ba69 not found: ID does not exist" containerID="60a3dc53e2d8b16c349cfd659bc386dd755aef55509cde4a54ee7f79b2a6ba69" Nov 29 06:43:41 crc kubenswrapper[4594]: I1129 06:43:41.724295 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60a3dc53e2d8b16c349cfd659bc386dd755aef55509cde4a54ee7f79b2a6ba69"} err="failed to get container status \"60a3dc53e2d8b16c349cfd659bc386dd755aef55509cde4a54ee7f79b2a6ba69\": rpc error: code = NotFound desc = could not find container \"60a3dc53e2d8b16c349cfd659bc386dd755aef55509cde4a54ee7f79b2a6ba69\": container with ID starting with 60a3dc53e2d8b16c349cfd659bc386dd755aef55509cde4a54ee7f79b2a6ba69 not found: ID does not exist" Nov 29 06:43:41 crc kubenswrapper[4594]: I1129 06:43:41.724312 4594 scope.go:117] "RemoveContainer" containerID="afac8a959f92f46548c93ae4bc91596020e642cff661122ef2dbe97d81186502" Nov 29 06:43:41 crc kubenswrapper[4594]: E1129 06:43:41.724666 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"afac8a959f92f46548c93ae4bc91596020e642cff661122ef2dbe97d81186502\": container with ID starting with afac8a959f92f46548c93ae4bc91596020e642cff661122ef2dbe97d81186502 not found: ID does not exist" containerID="afac8a959f92f46548c93ae4bc91596020e642cff661122ef2dbe97d81186502" Nov 29 06:43:41 crc kubenswrapper[4594]: I1129 06:43:41.724683 4594 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"afac8a959f92f46548c93ae4bc91596020e642cff661122ef2dbe97d81186502"} err="failed to get container status \"afac8a959f92f46548c93ae4bc91596020e642cff661122ef2dbe97d81186502\": rpc error: code = NotFound desc = could not find container \"afac8a959f92f46548c93ae4bc91596020e642cff661122ef2dbe97d81186502\": container with ID starting with afac8a959f92f46548c93ae4bc91596020e642cff661122ef2dbe97d81186502 not found: ID does not exist" Nov 29 06:43:41 crc kubenswrapper[4594]: I1129 06:43:41.732941 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-j6kbf"] Nov 29 06:43:42 crc kubenswrapper[4594]: I1129 06:43:42.069472 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9cznn_2f6c6bde-10f1-4dbf-863a-153a95f825b7/registry-server/0.log" Nov 29 06:43:42 crc kubenswrapper[4594]: I1129 06:43:42.093073 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99c26209-ccf9-4539-a445-e8ee5dff3a2a" path="/var/lib/kubelet/pods/99c26209-ccf9-4539-a445-e8ee5dff3a2a/volumes" Nov 29 06:43:45 crc kubenswrapper[4594]: I1129 06:43:45.799847 4594 patch_prober.go:28] interesting pod/machine-config-daemon-ggz4n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 06:43:45 crc kubenswrapper[4594]: I1129 06:43:45.800537 4594 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 06:43:53 crc kubenswrapper[4594]: I1129 06:43:53.810680 4594 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_obo-prometheus-operator-668cf9dfbb-lj6hq_862c89f5-7442-4e20-8677-ef780f71545d/prometheus-operator/0.log" Nov 29 06:43:53 crc kubenswrapper[4594]: I1129 06:43:53.923511 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-5bd94b577b-2p9zv_8ec2d68d-f88d-411c-9790-4fc800a02905/prometheus-operator-admission-webhook/0.log" Nov 29 06:43:53 crc kubenswrapper[4594]: I1129 06:43:53.964367 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-5bd94b577b-8kh5q_0283828f-9a3f-4c00-8409-a49231f3b953/prometheus-operator-admission-webhook/0.log" Nov 29 06:43:54 crc kubenswrapper[4594]: I1129 06:43:54.117616 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-d8bb48f5d-l25w2_57d47f4c-b8cb-4c20-9adb-2e9190e48f82/operator/0.log" Nov 29 06:43:54 crc kubenswrapper[4594]: I1129 06:43:54.159892 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5446b9c989-rt8w5_ad950138-e417-48eb-a3e9-a5c575d4507f/perses-operator/0.log" Nov 29 06:44:15 crc kubenswrapper[4594]: I1129 06:44:15.800347 4594 patch_prober.go:28] interesting pod/machine-config-daemon-ggz4n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 06:44:15 crc kubenswrapper[4594]: I1129 06:44:15.800949 4594 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 06:44:15 crc kubenswrapper[4594]: I1129 
06:44:15.801004 4594 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" Nov 29 06:44:15 crc kubenswrapper[4594]: I1129 06:44:15.801675 4594 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ca7803bff64492152d5cd81c155f02681d925d0c3977555e80faed8cfe165f84"} pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 29 06:44:15 crc kubenswrapper[4594]: I1129 06:44:15.801724 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" containerName="machine-config-daemon" containerID="cri-o://ca7803bff64492152d5cd81c155f02681d925d0c3977555e80faed8cfe165f84" gracePeriod=600 Nov 29 06:44:15 crc kubenswrapper[4594]: E1129 06:44:15.922189 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:44:15 crc kubenswrapper[4594]: I1129 06:44:15.957478 4594 generic.go:334] "Generic (PLEG): container finished" podID="509844d4-3ee1-4059-84bb-6e90200f50c5" containerID="ca7803bff64492152d5cd81c155f02681d925d0c3977555e80faed8cfe165f84" exitCode=0 Nov 29 06:44:15 crc kubenswrapper[4594]: I1129 06:44:15.957526 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" 
event={"ID":"509844d4-3ee1-4059-84bb-6e90200f50c5","Type":"ContainerDied","Data":"ca7803bff64492152d5cd81c155f02681d925d0c3977555e80faed8cfe165f84"} Nov 29 06:44:15 crc kubenswrapper[4594]: I1129 06:44:15.957572 4594 scope.go:117] "RemoveContainer" containerID="1357d3affb462aa55998dd14fc93fd9a1c6bafde00c8c879ab529e678190969a" Nov 29 06:44:15 crc kubenswrapper[4594]: I1129 06:44:15.958890 4594 scope.go:117] "RemoveContainer" containerID="ca7803bff64492152d5cd81c155f02681d925d0c3977555e80faed8cfe165f84" Nov 29 06:44:15 crc kubenswrapper[4594]: E1129 06:44:15.959487 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:44:31 crc kubenswrapper[4594]: I1129 06:44:31.083154 4594 scope.go:117] "RemoveContainer" containerID="ca7803bff64492152d5cd81c155f02681d925d0c3977555e80faed8cfe165f84" Nov 29 06:44:31 crc kubenswrapper[4594]: E1129 06:44:31.084335 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:44:44 crc kubenswrapper[4594]: I1129 06:44:44.084593 4594 scope.go:117] "RemoveContainer" containerID="ca7803bff64492152d5cd81c155f02681d925d0c3977555e80faed8cfe165f84" Nov 29 06:44:44 crc kubenswrapper[4594]: E1129 06:44:44.085532 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:44:56 crc kubenswrapper[4594]: I1129 06:44:56.091020 4594 scope.go:117] "RemoveContainer" containerID="ca7803bff64492152d5cd81c155f02681d925d0c3977555e80faed8cfe165f84" Nov 29 06:44:56 crc kubenswrapper[4594]: E1129 06:44:56.091955 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:45:00 crc kubenswrapper[4594]: I1129 06:45:00.160609 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29406645-56pkf"] Nov 29 06:45:00 crc kubenswrapper[4594]: E1129 06:45:00.161736 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af2678ef-1558-488d-b708-027b67179781" containerName="extract-utilities" Nov 29 06:45:00 crc kubenswrapper[4594]: I1129 06:45:00.161755 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="af2678ef-1558-488d-b708-027b67179781" containerName="extract-utilities" Nov 29 06:45:00 crc kubenswrapper[4594]: E1129 06:45:00.161779 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99c26209-ccf9-4539-a445-e8ee5dff3a2a" containerName="extract-content" Nov 29 06:45:00 crc kubenswrapper[4594]: I1129 06:45:00.161785 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="99c26209-ccf9-4539-a445-e8ee5dff3a2a" 
containerName="extract-content" Nov 29 06:45:00 crc kubenswrapper[4594]: E1129 06:45:00.161801 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af2678ef-1558-488d-b708-027b67179781" containerName="registry-server" Nov 29 06:45:00 crc kubenswrapper[4594]: I1129 06:45:00.161806 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="af2678ef-1558-488d-b708-027b67179781" containerName="registry-server" Nov 29 06:45:00 crc kubenswrapper[4594]: E1129 06:45:00.161823 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99c26209-ccf9-4539-a445-e8ee5dff3a2a" containerName="extract-utilities" Nov 29 06:45:00 crc kubenswrapper[4594]: I1129 06:45:00.161830 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="99c26209-ccf9-4539-a445-e8ee5dff3a2a" containerName="extract-utilities" Nov 29 06:45:00 crc kubenswrapper[4594]: E1129 06:45:00.161847 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af2678ef-1558-488d-b708-027b67179781" containerName="extract-content" Nov 29 06:45:00 crc kubenswrapper[4594]: I1129 06:45:00.161853 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="af2678ef-1558-488d-b708-027b67179781" containerName="extract-content" Nov 29 06:45:00 crc kubenswrapper[4594]: E1129 06:45:00.161864 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99c26209-ccf9-4539-a445-e8ee5dff3a2a" containerName="registry-server" Nov 29 06:45:00 crc kubenswrapper[4594]: I1129 06:45:00.161871 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="99c26209-ccf9-4539-a445-e8ee5dff3a2a" containerName="registry-server" Nov 29 06:45:00 crc kubenswrapper[4594]: I1129 06:45:00.162141 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="99c26209-ccf9-4539-a445-e8ee5dff3a2a" containerName="registry-server" Nov 29 06:45:00 crc kubenswrapper[4594]: I1129 06:45:00.162157 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="af2678ef-1558-488d-b708-027b67179781" 
containerName="registry-server" Nov 29 06:45:00 crc kubenswrapper[4594]: I1129 06:45:00.163001 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29406645-56pkf" Nov 29 06:45:00 crc kubenswrapper[4594]: I1129 06:45:00.165175 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 29 06:45:00 crc kubenswrapper[4594]: I1129 06:45:00.165636 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 29 06:45:00 crc kubenswrapper[4594]: I1129 06:45:00.170454 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29406645-56pkf"] Nov 29 06:45:00 crc kubenswrapper[4594]: I1129 06:45:00.259101 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7af72438-75a5-4e7c-b918-feb6653dd2b8-secret-volume\") pod \"collect-profiles-29406645-56pkf\" (UID: \"7af72438-75a5-4e7c-b918-feb6653dd2b8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406645-56pkf" Nov 29 06:45:00 crc kubenswrapper[4594]: I1129 06:45:00.259427 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7af72438-75a5-4e7c-b918-feb6653dd2b8-config-volume\") pod \"collect-profiles-29406645-56pkf\" (UID: \"7af72438-75a5-4e7c-b918-feb6653dd2b8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406645-56pkf" Nov 29 06:45:00 crc kubenswrapper[4594]: I1129 06:45:00.259557 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8b9q5\" (UniqueName: 
\"kubernetes.io/projected/7af72438-75a5-4e7c-b918-feb6653dd2b8-kube-api-access-8b9q5\") pod \"collect-profiles-29406645-56pkf\" (UID: \"7af72438-75a5-4e7c-b918-feb6653dd2b8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406645-56pkf" Nov 29 06:45:00 crc kubenswrapper[4594]: I1129 06:45:00.361500 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7af72438-75a5-4e7c-b918-feb6653dd2b8-secret-volume\") pod \"collect-profiles-29406645-56pkf\" (UID: \"7af72438-75a5-4e7c-b918-feb6653dd2b8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406645-56pkf" Nov 29 06:45:00 crc kubenswrapper[4594]: I1129 06:45:00.361870 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7af72438-75a5-4e7c-b918-feb6653dd2b8-config-volume\") pod \"collect-profiles-29406645-56pkf\" (UID: \"7af72438-75a5-4e7c-b918-feb6653dd2b8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406645-56pkf" Nov 29 06:45:00 crc kubenswrapper[4594]: I1129 06:45:00.361911 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8b9q5\" (UniqueName: \"kubernetes.io/projected/7af72438-75a5-4e7c-b918-feb6653dd2b8-kube-api-access-8b9q5\") pod \"collect-profiles-29406645-56pkf\" (UID: \"7af72438-75a5-4e7c-b918-feb6653dd2b8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406645-56pkf" Nov 29 06:45:00 crc kubenswrapper[4594]: I1129 06:45:00.362927 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7af72438-75a5-4e7c-b918-feb6653dd2b8-config-volume\") pod \"collect-profiles-29406645-56pkf\" (UID: \"7af72438-75a5-4e7c-b918-feb6653dd2b8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406645-56pkf" Nov 29 06:45:00 crc kubenswrapper[4594]: I1129 
06:45:00.383136 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7af72438-75a5-4e7c-b918-feb6653dd2b8-secret-volume\") pod \"collect-profiles-29406645-56pkf\" (UID: \"7af72438-75a5-4e7c-b918-feb6653dd2b8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406645-56pkf" Nov 29 06:45:00 crc kubenswrapper[4594]: I1129 06:45:00.385209 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8b9q5\" (UniqueName: \"kubernetes.io/projected/7af72438-75a5-4e7c-b918-feb6653dd2b8-kube-api-access-8b9q5\") pod \"collect-profiles-29406645-56pkf\" (UID: \"7af72438-75a5-4e7c-b918-feb6653dd2b8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406645-56pkf" Nov 29 06:45:00 crc kubenswrapper[4594]: I1129 06:45:00.488113 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29406645-56pkf" Nov 29 06:45:00 crc kubenswrapper[4594]: I1129 06:45:00.898937 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29406645-56pkf"] Nov 29 06:45:01 crc kubenswrapper[4594]: I1129 06:45:01.417113 4594 generic.go:334] "Generic (PLEG): container finished" podID="7af72438-75a5-4e7c-b918-feb6653dd2b8" containerID="6b474836096e50fdb15f4654f11c8b6860bebceb84be8021dfa2253c65e1e813" exitCode=0 Nov 29 06:45:01 crc kubenswrapper[4594]: I1129 06:45:01.417173 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29406645-56pkf" event={"ID":"7af72438-75a5-4e7c-b918-feb6653dd2b8","Type":"ContainerDied","Data":"6b474836096e50fdb15f4654f11c8b6860bebceb84be8021dfa2253c65e1e813"} Nov 29 06:45:01 crc kubenswrapper[4594]: I1129 06:45:01.417229 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29406645-56pkf" 
event={"ID":"7af72438-75a5-4e7c-b918-feb6653dd2b8","Type":"ContainerStarted","Data":"5a58f7b44e117a5c32d67cb8f4f23578d3c60cd34a119a5df8a3f202d971c7c0"} Nov 29 06:45:02 crc kubenswrapper[4594]: I1129 06:45:02.737735 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29406645-56pkf" Nov 29 06:45:02 crc kubenswrapper[4594]: I1129 06:45:02.919309 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7af72438-75a5-4e7c-b918-feb6653dd2b8-config-volume\") pod \"7af72438-75a5-4e7c-b918-feb6653dd2b8\" (UID: \"7af72438-75a5-4e7c-b918-feb6653dd2b8\") " Nov 29 06:45:02 crc kubenswrapper[4594]: I1129 06:45:02.919409 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7af72438-75a5-4e7c-b918-feb6653dd2b8-secret-volume\") pod \"7af72438-75a5-4e7c-b918-feb6653dd2b8\" (UID: \"7af72438-75a5-4e7c-b918-feb6653dd2b8\") " Nov 29 06:45:02 crc kubenswrapper[4594]: I1129 06:45:02.919488 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8b9q5\" (UniqueName: \"kubernetes.io/projected/7af72438-75a5-4e7c-b918-feb6653dd2b8-kube-api-access-8b9q5\") pod \"7af72438-75a5-4e7c-b918-feb6653dd2b8\" (UID: \"7af72438-75a5-4e7c-b918-feb6653dd2b8\") " Nov 29 06:45:02 crc kubenswrapper[4594]: I1129 06:45:02.920194 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7af72438-75a5-4e7c-b918-feb6653dd2b8-config-volume" (OuterVolumeSpecName: "config-volume") pod "7af72438-75a5-4e7c-b918-feb6653dd2b8" (UID: "7af72438-75a5-4e7c-b918-feb6653dd2b8"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 06:45:02 crc kubenswrapper[4594]: I1129 06:45:02.922094 4594 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7af72438-75a5-4e7c-b918-feb6653dd2b8-config-volume\") on node \"crc\" DevicePath \"\"" Nov 29 06:45:02 crc kubenswrapper[4594]: I1129 06:45:02.925901 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7af72438-75a5-4e7c-b918-feb6653dd2b8-kube-api-access-8b9q5" (OuterVolumeSpecName: "kube-api-access-8b9q5") pod "7af72438-75a5-4e7c-b918-feb6653dd2b8" (UID: "7af72438-75a5-4e7c-b918-feb6653dd2b8"). InnerVolumeSpecName "kube-api-access-8b9q5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 06:45:02 crc kubenswrapper[4594]: I1129 06:45:02.926104 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7af72438-75a5-4e7c-b918-feb6653dd2b8-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7af72438-75a5-4e7c-b918-feb6653dd2b8" (UID: "7af72438-75a5-4e7c-b918-feb6653dd2b8"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 06:45:03 crc kubenswrapper[4594]: I1129 06:45:03.024229 4594 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7af72438-75a5-4e7c-b918-feb6653dd2b8-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 29 06:45:03 crc kubenswrapper[4594]: I1129 06:45:03.024285 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8b9q5\" (UniqueName: \"kubernetes.io/projected/7af72438-75a5-4e7c-b918-feb6653dd2b8-kube-api-access-8b9q5\") on node \"crc\" DevicePath \"\"" Nov 29 06:45:03 crc kubenswrapper[4594]: I1129 06:45:03.437546 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29406645-56pkf" event={"ID":"7af72438-75a5-4e7c-b918-feb6653dd2b8","Type":"ContainerDied","Data":"5a58f7b44e117a5c32d67cb8f4f23578d3c60cd34a119a5df8a3f202d971c7c0"} Nov 29 06:45:03 crc kubenswrapper[4594]: I1129 06:45:03.437601 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29406645-56pkf" Nov 29 06:45:03 crc kubenswrapper[4594]: I1129 06:45:03.437606 4594 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a58f7b44e117a5c32d67cb8f4f23578d3c60cd34a119a5df8a3f202d971c7c0" Nov 29 06:45:03 crc kubenswrapper[4594]: I1129 06:45:03.810268 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29406600-xdz2l"] Nov 29 06:45:03 crc kubenswrapper[4594]: I1129 06:45:03.820669 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29406600-xdz2l"] Nov 29 06:45:04 crc kubenswrapper[4594]: I1129 06:45:04.094631 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1cdeaa0f-a263-4df7-b6dc-7aa46a5cf888" path="/var/lib/kubelet/pods/1cdeaa0f-a263-4df7-b6dc-7aa46a5cf888/volumes" Nov 29 06:45:11 crc kubenswrapper[4594]: I1129 06:45:11.083329 4594 scope.go:117] "RemoveContainer" containerID="ca7803bff64492152d5cd81c155f02681d925d0c3977555e80faed8cfe165f84" Nov 29 06:45:11 crc kubenswrapper[4594]: E1129 06:45:11.083940 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:45:11 crc kubenswrapper[4594]: I1129 06:45:11.231032 4594 scope.go:117] "RemoveContainer" containerID="bf251573be3f1fab04ba9f80127e4646b742104bba129cdba0648bee4c952979" Nov 29 06:45:23 crc kubenswrapper[4594]: I1129 06:45:23.083610 4594 scope.go:117] "RemoveContainer" containerID="ca7803bff64492152d5cd81c155f02681d925d0c3977555e80faed8cfe165f84" Nov 29 
06:45:23 crc kubenswrapper[4594]: E1129 06:45:23.084700 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:45:32 crc kubenswrapper[4594]: I1129 06:45:32.747959 4594 generic.go:334] "Generic (PLEG): container finished" podID="70da90c5-e357-44f5-8f87-d2b63c1edc68" containerID="9ff0d7c88661cf1adbb32216d56dfc0de7d8d85592b38f08981d7cef375f2860" exitCode=0 Nov 29 06:45:32 crc kubenswrapper[4594]: I1129 06:45:32.748072 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x47cb/must-gather-2dptg" event={"ID":"70da90c5-e357-44f5-8f87-d2b63c1edc68","Type":"ContainerDied","Data":"9ff0d7c88661cf1adbb32216d56dfc0de7d8d85592b38f08981d7cef375f2860"} Nov 29 06:45:32 crc kubenswrapper[4594]: I1129 06:45:32.749499 4594 scope.go:117] "RemoveContainer" containerID="9ff0d7c88661cf1adbb32216d56dfc0de7d8d85592b38f08981d7cef375f2860" Nov 29 06:45:33 crc kubenswrapper[4594]: I1129 06:45:33.809951 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-x47cb_must-gather-2dptg_70da90c5-e357-44f5-8f87-d2b63c1edc68/gather/0.log" Nov 29 06:45:36 crc kubenswrapper[4594]: I1129 06:45:36.088510 4594 scope.go:117] "RemoveContainer" containerID="ca7803bff64492152d5cd81c155f02681d925d0c3977555e80faed8cfe165f84" Nov 29 06:45:36 crc kubenswrapper[4594]: E1129 06:45:36.089022 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:45:42 crc kubenswrapper[4594]: I1129 06:45:42.152304 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-x47cb/must-gather-2dptg"] Nov 29 06:45:42 crc kubenswrapper[4594]: I1129 06:45:42.153020 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-x47cb/must-gather-2dptg" podUID="70da90c5-e357-44f5-8f87-d2b63c1edc68" containerName="copy" containerID="cri-o://7175a03672eec6b008bf44a0701c9edcb72c9c1be5c592d78b1714423a7d2c88" gracePeriod=2 Nov 29 06:45:42 crc kubenswrapper[4594]: I1129 06:45:42.170273 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-x47cb/must-gather-2dptg"] Nov 29 06:45:42 crc kubenswrapper[4594]: I1129 06:45:42.534492 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-x47cb_must-gather-2dptg_70da90c5-e357-44f5-8f87-d2b63c1edc68/copy/0.log" Nov 29 06:45:42 crc kubenswrapper[4594]: I1129 06:45:42.535367 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-x47cb/must-gather-2dptg" Nov 29 06:45:42 crc kubenswrapper[4594]: I1129 06:45:42.619747 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/70da90c5-e357-44f5-8f87-d2b63c1edc68-must-gather-output\") pod \"70da90c5-e357-44f5-8f87-d2b63c1edc68\" (UID: \"70da90c5-e357-44f5-8f87-d2b63c1edc68\") " Nov 29 06:45:42 crc kubenswrapper[4594]: I1129 06:45:42.619949 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rhh6c\" (UniqueName: \"kubernetes.io/projected/70da90c5-e357-44f5-8f87-d2b63c1edc68-kube-api-access-rhh6c\") pod \"70da90c5-e357-44f5-8f87-d2b63c1edc68\" (UID: \"70da90c5-e357-44f5-8f87-d2b63c1edc68\") " Nov 29 06:45:42 crc kubenswrapper[4594]: I1129 06:45:42.628296 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70da90c5-e357-44f5-8f87-d2b63c1edc68-kube-api-access-rhh6c" (OuterVolumeSpecName: "kube-api-access-rhh6c") pod "70da90c5-e357-44f5-8f87-d2b63c1edc68" (UID: "70da90c5-e357-44f5-8f87-d2b63c1edc68"). InnerVolumeSpecName "kube-api-access-rhh6c". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 06:45:42 crc kubenswrapper[4594]: I1129 06:45:42.722779 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rhh6c\" (UniqueName: \"kubernetes.io/projected/70da90c5-e357-44f5-8f87-d2b63c1edc68-kube-api-access-rhh6c\") on node \"crc\" DevicePath \"\"" Nov 29 06:45:42 crc kubenswrapper[4594]: I1129 06:45:42.779165 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70da90c5-e357-44f5-8f87-d2b63c1edc68-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "70da90c5-e357-44f5-8f87-d2b63c1edc68" (UID: "70da90c5-e357-44f5-8f87-d2b63c1edc68"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 06:45:42 crc kubenswrapper[4594]: I1129 06:45:42.827381 4594 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/70da90c5-e357-44f5-8f87-d2b63c1edc68-must-gather-output\") on node \"crc\" DevicePath \"\"" Nov 29 06:45:42 crc kubenswrapper[4594]: I1129 06:45:42.891060 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-x47cb_must-gather-2dptg_70da90c5-e357-44f5-8f87-d2b63c1edc68/copy/0.log" Nov 29 06:45:42 crc kubenswrapper[4594]: I1129 06:45:42.893842 4594 generic.go:334] "Generic (PLEG): container finished" podID="70da90c5-e357-44f5-8f87-d2b63c1edc68" containerID="7175a03672eec6b008bf44a0701c9edcb72c9c1be5c592d78b1714423a7d2c88" exitCode=143 Nov 29 06:45:42 crc kubenswrapper[4594]: I1129 06:45:42.893905 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-x47cb/must-gather-2dptg" Nov 29 06:45:42 crc kubenswrapper[4594]: I1129 06:45:42.893919 4594 scope.go:117] "RemoveContainer" containerID="7175a03672eec6b008bf44a0701c9edcb72c9c1be5c592d78b1714423a7d2c88" Nov 29 06:45:42 crc kubenswrapper[4594]: I1129 06:45:42.943569 4594 scope.go:117] "RemoveContainer" containerID="9ff0d7c88661cf1adbb32216d56dfc0de7d8d85592b38f08981d7cef375f2860" Nov 29 06:45:43 crc kubenswrapper[4594]: I1129 06:45:43.016280 4594 scope.go:117] "RemoveContainer" containerID="7175a03672eec6b008bf44a0701c9edcb72c9c1be5c592d78b1714423a7d2c88" Nov 29 06:45:43 crc kubenswrapper[4594]: E1129 06:45:43.017127 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7175a03672eec6b008bf44a0701c9edcb72c9c1be5c592d78b1714423a7d2c88\": container with ID starting with 7175a03672eec6b008bf44a0701c9edcb72c9c1be5c592d78b1714423a7d2c88 not found: ID does not exist" 
containerID="7175a03672eec6b008bf44a0701c9edcb72c9c1be5c592d78b1714423a7d2c88" Nov 29 06:45:43 crc kubenswrapper[4594]: I1129 06:45:43.017160 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7175a03672eec6b008bf44a0701c9edcb72c9c1be5c592d78b1714423a7d2c88"} err="failed to get container status \"7175a03672eec6b008bf44a0701c9edcb72c9c1be5c592d78b1714423a7d2c88\": rpc error: code = NotFound desc = could not find container \"7175a03672eec6b008bf44a0701c9edcb72c9c1be5c592d78b1714423a7d2c88\": container with ID starting with 7175a03672eec6b008bf44a0701c9edcb72c9c1be5c592d78b1714423a7d2c88 not found: ID does not exist" Nov 29 06:45:43 crc kubenswrapper[4594]: I1129 06:45:43.017216 4594 scope.go:117] "RemoveContainer" containerID="9ff0d7c88661cf1adbb32216d56dfc0de7d8d85592b38f08981d7cef375f2860" Nov 29 06:45:43 crc kubenswrapper[4594]: E1129 06:45:43.017829 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ff0d7c88661cf1adbb32216d56dfc0de7d8d85592b38f08981d7cef375f2860\": container with ID starting with 9ff0d7c88661cf1adbb32216d56dfc0de7d8d85592b38f08981d7cef375f2860 not found: ID does not exist" containerID="9ff0d7c88661cf1adbb32216d56dfc0de7d8d85592b38f08981d7cef375f2860" Nov 29 06:45:43 crc kubenswrapper[4594]: I1129 06:45:43.017877 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ff0d7c88661cf1adbb32216d56dfc0de7d8d85592b38f08981d7cef375f2860"} err="failed to get container status \"9ff0d7c88661cf1adbb32216d56dfc0de7d8d85592b38f08981d7cef375f2860\": rpc error: code = NotFound desc = could not find container \"9ff0d7c88661cf1adbb32216d56dfc0de7d8d85592b38f08981d7cef375f2860\": container with ID starting with 9ff0d7c88661cf1adbb32216d56dfc0de7d8d85592b38f08981d7cef375f2860 not found: ID does not exist" Nov 29 06:45:44 crc kubenswrapper[4594]: I1129 06:45:44.105634 4594 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70da90c5-e357-44f5-8f87-d2b63c1edc68" path="/var/lib/kubelet/pods/70da90c5-e357-44f5-8f87-d2b63c1edc68/volumes" Nov 29 06:45:51 crc kubenswrapper[4594]: I1129 06:45:51.083363 4594 scope.go:117] "RemoveContainer" containerID="ca7803bff64492152d5cd81c155f02681d925d0c3977555e80faed8cfe165f84" Nov 29 06:45:51 crc kubenswrapper[4594]: E1129 06:45:51.084094 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:46:03 crc kubenswrapper[4594]: I1129 06:46:03.083356 4594 scope.go:117] "RemoveContainer" containerID="ca7803bff64492152d5cd81c155f02681d925d0c3977555e80faed8cfe165f84" Nov 29 06:46:03 crc kubenswrapper[4594]: E1129 06:46:03.084363 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:46:16 crc kubenswrapper[4594]: I1129 06:46:16.091155 4594 scope.go:117] "RemoveContainer" containerID="ca7803bff64492152d5cd81c155f02681d925d0c3977555e80faed8cfe165f84" Nov 29 06:46:16 crc kubenswrapper[4594]: E1129 06:46:16.092185 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:46:29 crc kubenswrapper[4594]: I1129 06:46:29.083476 4594 scope.go:117] "RemoveContainer" containerID="ca7803bff64492152d5cd81c155f02681d925d0c3977555e80faed8cfe165f84" Nov 29 06:46:29 crc kubenswrapper[4594]: E1129 06:46:29.084449 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:46:42 crc kubenswrapper[4594]: I1129 06:46:42.083786 4594 scope.go:117] "RemoveContainer" containerID="ca7803bff64492152d5cd81c155f02681d925d0c3977555e80faed8cfe165f84" Nov 29 06:46:42 crc kubenswrapper[4594]: E1129 06:46:42.084658 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:46:53 crc kubenswrapper[4594]: I1129 06:46:53.083230 4594 scope.go:117] "RemoveContainer" containerID="ca7803bff64492152d5cd81c155f02681d925d0c3977555e80faed8cfe165f84" Nov 29 06:46:53 crc kubenswrapper[4594]: E1129 06:46:53.085293 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:47:04 crc kubenswrapper[4594]: I1129 06:47:04.084683 4594 scope.go:117] "RemoveContainer" containerID="ca7803bff64492152d5cd81c155f02681d925d0c3977555e80faed8cfe165f84" Nov 29 06:47:04 crc kubenswrapper[4594]: E1129 06:47:04.086100 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:47:11 crc kubenswrapper[4594]: I1129 06:47:11.317051 4594 scope.go:117] "RemoveContainer" containerID="d12281137fdfebb1cd0a2f8a0ef86fca8db90a116c76c35fb119bd8816bce66d" Nov 29 06:47:16 crc kubenswrapper[4594]: I1129 06:47:16.088740 4594 scope.go:117] "RemoveContainer" containerID="ca7803bff64492152d5cd81c155f02681d925d0c3977555e80faed8cfe165f84" Nov 29 06:47:16 crc kubenswrapper[4594]: E1129 06:47:16.090677 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:47:27 crc kubenswrapper[4594]: I1129 06:47:27.084709 4594 scope.go:117] "RemoveContainer" containerID="ca7803bff64492152d5cd81c155f02681d925d0c3977555e80faed8cfe165f84" Nov 29 06:47:27 crc kubenswrapper[4594]: 
E1129 06:47:27.085793 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:47:28 crc kubenswrapper[4594]: I1129 06:47:28.933110 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9fz4d"] Nov 29 06:47:28 crc kubenswrapper[4594]: E1129 06:47:28.933883 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7af72438-75a5-4e7c-b918-feb6653dd2b8" containerName="collect-profiles" Nov 29 06:47:28 crc kubenswrapper[4594]: I1129 06:47:28.933898 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="7af72438-75a5-4e7c-b918-feb6653dd2b8" containerName="collect-profiles" Nov 29 06:47:28 crc kubenswrapper[4594]: E1129 06:47:28.933919 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70da90c5-e357-44f5-8f87-d2b63c1edc68" containerName="gather" Nov 29 06:47:28 crc kubenswrapper[4594]: I1129 06:47:28.933924 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="70da90c5-e357-44f5-8f87-d2b63c1edc68" containerName="gather" Nov 29 06:47:28 crc kubenswrapper[4594]: E1129 06:47:28.933949 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70da90c5-e357-44f5-8f87-d2b63c1edc68" containerName="copy" Nov 29 06:47:28 crc kubenswrapper[4594]: I1129 06:47:28.933955 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="70da90c5-e357-44f5-8f87-d2b63c1edc68" containerName="copy" Nov 29 06:47:28 crc kubenswrapper[4594]: I1129 06:47:28.934192 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="70da90c5-e357-44f5-8f87-d2b63c1edc68" containerName="copy" Nov 29 06:47:28 crc 
kubenswrapper[4594]: I1129 06:47:28.934206 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="7af72438-75a5-4e7c-b918-feb6653dd2b8" containerName="collect-profiles" Nov 29 06:47:28 crc kubenswrapper[4594]: I1129 06:47:28.934227 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="70da90c5-e357-44f5-8f87-d2b63c1edc68" containerName="gather" Nov 29 06:47:28 crc kubenswrapper[4594]: I1129 06:47:28.935634 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9fz4d" Nov 29 06:47:28 crc kubenswrapper[4594]: I1129 06:47:28.944780 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9fz4d"] Nov 29 06:47:29 crc kubenswrapper[4594]: I1129 06:47:29.037448 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tgs2\" (UniqueName: \"kubernetes.io/projected/c7d5f792-3996-4a26-b0cf-16dfe461f2a1-kube-api-access-2tgs2\") pod \"community-operators-9fz4d\" (UID: \"c7d5f792-3996-4a26-b0cf-16dfe461f2a1\") " pod="openshift-marketplace/community-operators-9fz4d" Nov 29 06:47:29 crc kubenswrapper[4594]: I1129 06:47:29.037627 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7d5f792-3996-4a26-b0cf-16dfe461f2a1-utilities\") pod \"community-operators-9fz4d\" (UID: \"c7d5f792-3996-4a26-b0cf-16dfe461f2a1\") " pod="openshift-marketplace/community-operators-9fz4d" Nov 29 06:47:29 crc kubenswrapper[4594]: I1129 06:47:29.037788 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7d5f792-3996-4a26-b0cf-16dfe461f2a1-catalog-content\") pod \"community-operators-9fz4d\" (UID: \"c7d5f792-3996-4a26-b0cf-16dfe461f2a1\") " pod="openshift-marketplace/community-operators-9fz4d" Nov 29 
06:47:29 crc kubenswrapper[4594]: I1129 06:47:29.139137 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tgs2\" (UniqueName: \"kubernetes.io/projected/c7d5f792-3996-4a26-b0cf-16dfe461f2a1-kube-api-access-2tgs2\") pod \"community-operators-9fz4d\" (UID: \"c7d5f792-3996-4a26-b0cf-16dfe461f2a1\") " pod="openshift-marketplace/community-operators-9fz4d" Nov 29 06:47:29 crc kubenswrapper[4594]: I1129 06:47:29.140343 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7d5f792-3996-4a26-b0cf-16dfe461f2a1-utilities\") pod \"community-operators-9fz4d\" (UID: \"c7d5f792-3996-4a26-b0cf-16dfe461f2a1\") " pod="openshift-marketplace/community-operators-9fz4d" Nov 29 06:47:29 crc kubenswrapper[4594]: I1129 06:47:29.140513 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7d5f792-3996-4a26-b0cf-16dfe461f2a1-catalog-content\") pod \"community-operators-9fz4d\" (UID: \"c7d5f792-3996-4a26-b0cf-16dfe461f2a1\") " pod="openshift-marketplace/community-operators-9fz4d" Nov 29 06:47:29 crc kubenswrapper[4594]: I1129 06:47:29.140978 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7d5f792-3996-4a26-b0cf-16dfe461f2a1-catalog-content\") pod \"community-operators-9fz4d\" (UID: \"c7d5f792-3996-4a26-b0cf-16dfe461f2a1\") " pod="openshift-marketplace/community-operators-9fz4d" Nov 29 06:47:29 crc kubenswrapper[4594]: I1129 06:47:29.141445 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7d5f792-3996-4a26-b0cf-16dfe461f2a1-utilities\") pod \"community-operators-9fz4d\" (UID: \"c7d5f792-3996-4a26-b0cf-16dfe461f2a1\") " pod="openshift-marketplace/community-operators-9fz4d" Nov 29 06:47:29 crc kubenswrapper[4594]: I1129 
06:47:29.158069 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tgs2\" (UniqueName: \"kubernetes.io/projected/c7d5f792-3996-4a26-b0cf-16dfe461f2a1-kube-api-access-2tgs2\") pod \"community-operators-9fz4d\" (UID: \"c7d5f792-3996-4a26-b0cf-16dfe461f2a1\") " pod="openshift-marketplace/community-operators-9fz4d" Nov 29 06:47:29 crc kubenswrapper[4594]: I1129 06:47:29.256088 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9fz4d" Nov 29 06:47:29 crc kubenswrapper[4594]: I1129 06:47:29.715801 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9fz4d"] Nov 29 06:47:29 crc kubenswrapper[4594]: I1129 06:47:29.955271 4594 generic.go:334] "Generic (PLEG): container finished" podID="c7d5f792-3996-4a26-b0cf-16dfe461f2a1" containerID="15a3b3e2c7327a29832a06ea8cc96a5a18a629bb96b3cbb55664a4dadede90eb" exitCode=0 Nov 29 06:47:29 crc kubenswrapper[4594]: I1129 06:47:29.955319 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9fz4d" event={"ID":"c7d5f792-3996-4a26-b0cf-16dfe461f2a1","Type":"ContainerDied","Data":"15a3b3e2c7327a29832a06ea8cc96a5a18a629bb96b3cbb55664a4dadede90eb"} Nov 29 06:47:29 crc kubenswrapper[4594]: I1129 06:47:29.955378 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9fz4d" event={"ID":"c7d5f792-3996-4a26-b0cf-16dfe461f2a1","Type":"ContainerStarted","Data":"529eeacef0e945ae6b2917e05d22c5e8b92767a6732f5b4bbd84d088054ed13b"} Nov 29 06:47:29 crc kubenswrapper[4594]: I1129 06:47:29.957933 4594 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 29 06:47:30 crc kubenswrapper[4594]: I1129 06:47:30.965930 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9fz4d" 
event={"ID":"c7d5f792-3996-4a26-b0cf-16dfe461f2a1","Type":"ContainerStarted","Data":"95fb6b0a6c4b35b7187d645d3eb6cd132e039f1c2fe53274c8c1d61974dc0e9e"} Nov 29 06:47:31 crc kubenswrapper[4594]: I1129 06:47:31.975878 4594 generic.go:334] "Generic (PLEG): container finished" podID="c7d5f792-3996-4a26-b0cf-16dfe461f2a1" containerID="95fb6b0a6c4b35b7187d645d3eb6cd132e039f1c2fe53274c8c1d61974dc0e9e" exitCode=0 Nov 29 06:47:31 crc kubenswrapper[4594]: I1129 06:47:31.975931 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9fz4d" event={"ID":"c7d5f792-3996-4a26-b0cf-16dfe461f2a1","Type":"ContainerDied","Data":"95fb6b0a6c4b35b7187d645d3eb6cd132e039f1c2fe53274c8c1d61974dc0e9e"} Nov 29 06:47:33 crc kubenswrapper[4594]: I1129 06:47:33.998377 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9fz4d" event={"ID":"c7d5f792-3996-4a26-b0cf-16dfe461f2a1","Type":"ContainerStarted","Data":"cfdd558680d471fdc9b355dd4498b72fb245661321f0d58afd8eaa1d48f128eb"} Nov 29 06:47:34 crc kubenswrapper[4594]: I1129 06:47:34.021291 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9fz4d" podStartSLOduration=3.4315045 podStartE2EDuration="6.021272101s" podCreationTimestamp="2025-11-29 06:47:28 +0000 UTC" firstStartedPulling="2025-11-29 06:47:29.957677922 +0000 UTC m=+4774.198187142" lastFinishedPulling="2025-11-29 06:47:32.547445524 +0000 UTC m=+4776.787954743" observedRunningTime="2025-11-29 06:47:34.015828408 +0000 UTC m=+4778.256337628" watchObservedRunningTime="2025-11-29 06:47:34.021272101 +0000 UTC m=+4778.261781321" Nov 29 06:47:39 crc kubenswrapper[4594]: I1129 06:47:39.256744 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9fz4d" Nov 29 06:47:39 crc kubenswrapper[4594]: I1129 06:47:39.257342 4594 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/community-operators-9fz4d" Nov 29 06:47:39 crc kubenswrapper[4594]: I1129 06:47:39.297170 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9fz4d" Nov 29 06:47:40 crc kubenswrapper[4594]: I1129 06:47:40.084454 4594 scope.go:117] "RemoveContainer" containerID="ca7803bff64492152d5cd81c155f02681d925d0c3977555e80faed8cfe165f84" Nov 29 06:47:40 crc kubenswrapper[4594]: E1129 06:47:40.085096 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:47:40 crc kubenswrapper[4594]: I1129 06:47:40.094677 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9fz4d" Nov 29 06:47:40 crc kubenswrapper[4594]: I1129 06:47:40.137702 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9fz4d"] Nov 29 06:47:42 crc kubenswrapper[4594]: I1129 06:47:42.072037 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9fz4d" podUID="c7d5f792-3996-4a26-b0cf-16dfe461f2a1" containerName="registry-server" containerID="cri-o://cfdd558680d471fdc9b355dd4498b72fb245661321f0d58afd8eaa1d48f128eb" gracePeriod=2 Nov 29 06:47:42 crc kubenswrapper[4594]: I1129 06:47:42.490243 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9fz4d" Nov 29 06:47:42 crc kubenswrapper[4594]: I1129 06:47:42.539573 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2tgs2\" (UniqueName: \"kubernetes.io/projected/c7d5f792-3996-4a26-b0cf-16dfe461f2a1-kube-api-access-2tgs2\") pod \"c7d5f792-3996-4a26-b0cf-16dfe461f2a1\" (UID: \"c7d5f792-3996-4a26-b0cf-16dfe461f2a1\") " Nov 29 06:47:42 crc kubenswrapper[4594]: I1129 06:47:42.539775 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7d5f792-3996-4a26-b0cf-16dfe461f2a1-utilities\") pod \"c7d5f792-3996-4a26-b0cf-16dfe461f2a1\" (UID: \"c7d5f792-3996-4a26-b0cf-16dfe461f2a1\") " Nov 29 06:47:42 crc kubenswrapper[4594]: I1129 06:47:42.539847 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7d5f792-3996-4a26-b0cf-16dfe461f2a1-catalog-content\") pod \"c7d5f792-3996-4a26-b0cf-16dfe461f2a1\" (UID: \"c7d5f792-3996-4a26-b0cf-16dfe461f2a1\") " Nov 29 06:47:42 crc kubenswrapper[4594]: I1129 06:47:42.540826 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7d5f792-3996-4a26-b0cf-16dfe461f2a1-utilities" (OuterVolumeSpecName: "utilities") pod "c7d5f792-3996-4a26-b0cf-16dfe461f2a1" (UID: "c7d5f792-3996-4a26-b0cf-16dfe461f2a1"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 06:47:42 crc kubenswrapper[4594]: I1129 06:47:42.541039 4594 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7d5f792-3996-4a26-b0cf-16dfe461f2a1-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 06:47:42 crc kubenswrapper[4594]: I1129 06:47:42.549030 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7d5f792-3996-4a26-b0cf-16dfe461f2a1-kube-api-access-2tgs2" (OuterVolumeSpecName: "kube-api-access-2tgs2") pod "c7d5f792-3996-4a26-b0cf-16dfe461f2a1" (UID: "c7d5f792-3996-4a26-b0cf-16dfe461f2a1"). InnerVolumeSpecName "kube-api-access-2tgs2". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 06:47:42 crc kubenswrapper[4594]: I1129 06:47:42.587191 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7d5f792-3996-4a26-b0cf-16dfe461f2a1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c7d5f792-3996-4a26-b0cf-16dfe461f2a1" (UID: "c7d5f792-3996-4a26-b0cf-16dfe461f2a1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 06:47:42 crc kubenswrapper[4594]: I1129 06:47:42.642420 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2tgs2\" (UniqueName: \"kubernetes.io/projected/c7d5f792-3996-4a26-b0cf-16dfe461f2a1-kube-api-access-2tgs2\") on node \"crc\" DevicePath \"\"" Nov 29 06:47:42 crc kubenswrapper[4594]: I1129 06:47:42.642581 4594 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7d5f792-3996-4a26-b0cf-16dfe461f2a1-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 06:47:43 crc kubenswrapper[4594]: I1129 06:47:43.087915 4594 generic.go:334] "Generic (PLEG): container finished" podID="c7d5f792-3996-4a26-b0cf-16dfe461f2a1" containerID="cfdd558680d471fdc9b355dd4498b72fb245661321f0d58afd8eaa1d48f128eb" exitCode=0 Nov 29 06:47:43 crc kubenswrapper[4594]: I1129 06:47:43.088000 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9fz4d" Nov 29 06:47:43 crc kubenswrapper[4594]: I1129 06:47:43.088017 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9fz4d" event={"ID":"c7d5f792-3996-4a26-b0cf-16dfe461f2a1","Type":"ContainerDied","Data":"cfdd558680d471fdc9b355dd4498b72fb245661321f0d58afd8eaa1d48f128eb"} Nov 29 06:47:43 crc kubenswrapper[4594]: I1129 06:47:43.088421 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9fz4d" event={"ID":"c7d5f792-3996-4a26-b0cf-16dfe461f2a1","Type":"ContainerDied","Data":"529eeacef0e945ae6b2917e05d22c5e8b92767a6732f5b4bbd84d088054ed13b"} Nov 29 06:47:43 crc kubenswrapper[4594]: I1129 06:47:43.088464 4594 scope.go:117] "RemoveContainer" containerID="cfdd558680d471fdc9b355dd4498b72fb245661321f0d58afd8eaa1d48f128eb" Nov 29 06:47:43 crc kubenswrapper[4594]: I1129 06:47:43.120328 4594 scope.go:117] "RemoveContainer" 
containerID="95fb6b0a6c4b35b7187d645d3eb6cd132e039f1c2fe53274c8c1d61974dc0e9e" Nov 29 06:47:43 crc kubenswrapper[4594]: I1129 06:47:43.128851 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9fz4d"] Nov 29 06:47:43 crc kubenswrapper[4594]: I1129 06:47:43.136719 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9fz4d"] Nov 29 06:47:43 crc kubenswrapper[4594]: I1129 06:47:43.138361 4594 scope.go:117] "RemoveContainer" containerID="15a3b3e2c7327a29832a06ea8cc96a5a18a629bb96b3cbb55664a4dadede90eb" Nov 29 06:47:43 crc kubenswrapper[4594]: I1129 06:47:43.174469 4594 scope.go:117] "RemoveContainer" containerID="cfdd558680d471fdc9b355dd4498b72fb245661321f0d58afd8eaa1d48f128eb" Nov 29 06:47:43 crc kubenswrapper[4594]: E1129 06:47:43.174761 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cfdd558680d471fdc9b355dd4498b72fb245661321f0d58afd8eaa1d48f128eb\": container with ID starting with cfdd558680d471fdc9b355dd4498b72fb245661321f0d58afd8eaa1d48f128eb not found: ID does not exist" containerID="cfdd558680d471fdc9b355dd4498b72fb245661321f0d58afd8eaa1d48f128eb" Nov 29 06:47:43 crc kubenswrapper[4594]: I1129 06:47:43.174796 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfdd558680d471fdc9b355dd4498b72fb245661321f0d58afd8eaa1d48f128eb"} err="failed to get container status \"cfdd558680d471fdc9b355dd4498b72fb245661321f0d58afd8eaa1d48f128eb\": rpc error: code = NotFound desc = could not find container \"cfdd558680d471fdc9b355dd4498b72fb245661321f0d58afd8eaa1d48f128eb\": container with ID starting with cfdd558680d471fdc9b355dd4498b72fb245661321f0d58afd8eaa1d48f128eb not found: ID does not exist" Nov 29 06:47:43 crc kubenswrapper[4594]: I1129 06:47:43.174818 4594 scope.go:117] "RemoveContainer" 
containerID="95fb6b0a6c4b35b7187d645d3eb6cd132e039f1c2fe53274c8c1d61974dc0e9e" Nov 29 06:47:43 crc kubenswrapper[4594]: E1129 06:47:43.175128 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95fb6b0a6c4b35b7187d645d3eb6cd132e039f1c2fe53274c8c1d61974dc0e9e\": container with ID starting with 95fb6b0a6c4b35b7187d645d3eb6cd132e039f1c2fe53274c8c1d61974dc0e9e not found: ID does not exist" containerID="95fb6b0a6c4b35b7187d645d3eb6cd132e039f1c2fe53274c8c1d61974dc0e9e" Nov 29 06:47:43 crc kubenswrapper[4594]: I1129 06:47:43.175150 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95fb6b0a6c4b35b7187d645d3eb6cd132e039f1c2fe53274c8c1d61974dc0e9e"} err="failed to get container status \"95fb6b0a6c4b35b7187d645d3eb6cd132e039f1c2fe53274c8c1d61974dc0e9e\": rpc error: code = NotFound desc = could not find container \"95fb6b0a6c4b35b7187d645d3eb6cd132e039f1c2fe53274c8c1d61974dc0e9e\": container with ID starting with 95fb6b0a6c4b35b7187d645d3eb6cd132e039f1c2fe53274c8c1d61974dc0e9e not found: ID does not exist" Nov 29 06:47:43 crc kubenswrapper[4594]: I1129 06:47:43.175168 4594 scope.go:117] "RemoveContainer" containerID="15a3b3e2c7327a29832a06ea8cc96a5a18a629bb96b3cbb55664a4dadede90eb" Nov 29 06:47:43 crc kubenswrapper[4594]: E1129 06:47:43.175544 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15a3b3e2c7327a29832a06ea8cc96a5a18a629bb96b3cbb55664a4dadede90eb\": container with ID starting with 15a3b3e2c7327a29832a06ea8cc96a5a18a629bb96b3cbb55664a4dadede90eb not found: ID does not exist" containerID="15a3b3e2c7327a29832a06ea8cc96a5a18a629bb96b3cbb55664a4dadede90eb" Nov 29 06:47:43 crc kubenswrapper[4594]: I1129 06:47:43.175569 4594 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"15a3b3e2c7327a29832a06ea8cc96a5a18a629bb96b3cbb55664a4dadede90eb"} err="failed to get container status \"15a3b3e2c7327a29832a06ea8cc96a5a18a629bb96b3cbb55664a4dadede90eb\": rpc error: code = NotFound desc = could not find container \"15a3b3e2c7327a29832a06ea8cc96a5a18a629bb96b3cbb55664a4dadede90eb\": container with ID starting with 15a3b3e2c7327a29832a06ea8cc96a5a18a629bb96b3cbb55664a4dadede90eb not found: ID does not exist" Nov 29 06:47:44 crc kubenswrapper[4594]: I1129 06:47:44.092099 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7d5f792-3996-4a26-b0cf-16dfe461f2a1" path="/var/lib/kubelet/pods/c7d5f792-3996-4a26-b0cf-16dfe461f2a1/volumes" Nov 29 06:47:54 crc kubenswrapper[4594]: I1129 06:47:54.084504 4594 scope.go:117] "RemoveContainer" containerID="ca7803bff64492152d5cd81c155f02681d925d0c3977555e80faed8cfe165f84" Nov 29 06:47:54 crc kubenswrapper[4594]: E1129 06:47:54.085214 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:48:04 crc kubenswrapper[4594]: I1129 06:48:04.163966 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-w4njb/must-gather-7grvm"] Nov 29 06:48:04 crc kubenswrapper[4594]: E1129 06:48:04.169594 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7d5f792-3996-4a26-b0cf-16dfe461f2a1" containerName="extract-content" Nov 29 06:48:04 crc kubenswrapper[4594]: I1129 06:48:04.169611 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7d5f792-3996-4a26-b0cf-16dfe461f2a1" containerName="extract-content" Nov 29 06:48:04 crc kubenswrapper[4594]: E1129 
06:48:04.169627 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7d5f792-3996-4a26-b0cf-16dfe461f2a1" containerName="registry-server" Nov 29 06:48:04 crc kubenswrapper[4594]: I1129 06:48:04.169633 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7d5f792-3996-4a26-b0cf-16dfe461f2a1" containerName="registry-server" Nov 29 06:48:04 crc kubenswrapper[4594]: E1129 06:48:04.169668 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7d5f792-3996-4a26-b0cf-16dfe461f2a1" containerName="extract-utilities" Nov 29 06:48:04 crc kubenswrapper[4594]: I1129 06:48:04.169675 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7d5f792-3996-4a26-b0cf-16dfe461f2a1" containerName="extract-utilities" Nov 29 06:48:04 crc kubenswrapper[4594]: I1129 06:48:04.169884 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7d5f792-3996-4a26-b0cf-16dfe461f2a1" containerName="registry-server" Nov 29 06:48:04 crc kubenswrapper[4594]: I1129 06:48:04.171045 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-w4njb/must-gather-7grvm" Nov 29 06:48:04 crc kubenswrapper[4594]: I1129 06:48:04.174110 4594 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-w4njb"/"default-dockercfg-fg94j" Nov 29 06:48:04 crc kubenswrapper[4594]: I1129 06:48:04.174386 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-w4njb"/"kube-root-ca.crt" Nov 29 06:48:04 crc kubenswrapper[4594]: I1129 06:48:04.174578 4594 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-w4njb"/"openshift-service-ca.crt" Nov 29 06:48:04 crc kubenswrapper[4594]: I1129 06:48:04.185910 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-w4njb/must-gather-7grvm"] Nov 29 06:48:04 crc kubenswrapper[4594]: I1129 06:48:04.210245 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f9ad12f5-f7c6-43ed-9192-e963371115d4-must-gather-output\") pod \"must-gather-7grvm\" (UID: \"f9ad12f5-f7c6-43ed-9192-e963371115d4\") " pod="openshift-must-gather-w4njb/must-gather-7grvm" Nov 29 06:48:04 crc kubenswrapper[4594]: I1129 06:48:04.210455 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52z45\" (UniqueName: \"kubernetes.io/projected/f9ad12f5-f7c6-43ed-9192-e963371115d4-kube-api-access-52z45\") pod \"must-gather-7grvm\" (UID: \"f9ad12f5-f7c6-43ed-9192-e963371115d4\") " pod="openshift-must-gather-w4njb/must-gather-7grvm" Nov 29 06:48:04 crc kubenswrapper[4594]: I1129 06:48:04.313463 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f9ad12f5-f7c6-43ed-9192-e963371115d4-must-gather-output\") pod \"must-gather-7grvm\" (UID: \"f9ad12f5-f7c6-43ed-9192-e963371115d4\") " 
pod="openshift-must-gather-w4njb/must-gather-7grvm" Nov 29 06:48:04 crc kubenswrapper[4594]: I1129 06:48:04.313840 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52z45\" (UniqueName: \"kubernetes.io/projected/f9ad12f5-f7c6-43ed-9192-e963371115d4-kube-api-access-52z45\") pod \"must-gather-7grvm\" (UID: \"f9ad12f5-f7c6-43ed-9192-e963371115d4\") " pod="openshift-must-gather-w4njb/must-gather-7grvm" Nov 29 06:48:04 crc kubenswrapper[4594]: I1129 06:48:04.313948 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f9ad12f5-f7c6-43ed-9192-e963371115d4-must-gather-output\") pod \"must-gather-7grvm\" (UID: \"f9ad12f5-f7c6-43ed-9192-e963371115d4\") " pod="openshift-must-gather-w4njb/must-gather-7grvm" Nov 29 06:48:04 crc kubenswrapper[4594]: I1129 06:48:04.331227 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52z45\" (UniqueName: \"kubernetes.io/projected/f9ad12f5-f7c6-43ed-9192-e963371115d4-kube-api-access-52z45\") pod \"must-gather-7grvm\" (UID: \"f9ad12f5-f7c6-43ed-9192-e963371115d4\") " pod="openshift-must-gather-w4njb/must-gather-7grvm" Nov 29 06:48:04 crc kubenswrapper[4594]: I1129 06:48:04.489907 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-w4njb/must-gather-7grvm" Nov 29 06:48:04 crc kubenswrapper[4594]: I1129 06:48:04.954862 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-w4njb/must-gather-7grvm"] Nov 29 06:48:05 crc kubenswrapper[4594]: I1129 06:48:05.084580 4594 scope.go:117] "RemoveContainer" containerID="ca7803bff64492152d5cd81c155f02681d925d0c3977555e80faed8cfe165f84" Nov 29 06:48:05 crc kubenswrapper[4594]: E1129 06:48:05.085397 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:48:05 crc kubenswrapper[4594]: I1129 06:48:05.303564 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-w4njb/must-gather-7grvm" event={"ID":"f9ad12f5-f7c6-43ed-9192-e963371115d4","Type":"ContainerStarted","Data":"404ba45c83a5184bc822c6899fa0a86965a5eab8a40e8e8cf91f5e87e1aaef69"} Nov 29 06:48:05 crc kubenswrapper[4594]: I1129 06:48:05.303618 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-w4njb/must-gather-7grvm" event={"ID":"f9ad12f5-f7c6-43ed-9192-e963371115d4","Type":"ContainerStarted","Data":"03a63c1ece077ebc1d2a328c60c8dc1b26819ee026cd72b994380b68b49a05d8"} Nov 29 06:48:06 crc kubenswrapper[4594]: I1129 06:48:06.326050 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-w4njb/must-gather-7grvm" event={"ID":"f9ad12f5-f7c6-43ed-9192-e963371115d4","Type":"ContainerStarted","Data":"14c472929807c1e5ba19924028f467cbe0fad82d928f1f198ba5d453e681e4e7"} Nov 29 06:48:06 crc kubenswrapper[4594]: I1129 06:48:06.345680 4594 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-must-gather-w4njb/must-gather-7grvm" podStartSLOduration=2.345661201 podStartE2EDuration="2.345661201s" podCreationTimestamp="2025-11-29 06:48:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 06:48:06.340642916 +0000 UTC m=+4810.581152137" watchObservedRunningTime="2025-11-29 06:48:06.345661201 +0000 UTC m=+4810.586170420" Nov 29 06:48:08 crc kubenswrapper[4594]: I1129 06:48:08.816872 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-w4njb/crc-debug-kk2j5"] Nov 29 06:48:08 crc kubenswrapper[4594]: I1129 06:48:08.821321 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-w4njb/crc-debug-kk2j5" Nov 29 06:48:08 crc kubenswrapper[4594]: I1129 06:48:08.924373 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxl6j\" (UniqueName: \"kubernetes.io/projected/dfcfb019-3151-40ee-8496-5770d1b5a451-kube-api-access-xxl6j\") pod \"crc-debug-kk2j5\" (UID: \"dfcfb019-3151-40ee-8496-5770d1b5a451\") " pod="openshift-must-gather-w4njb/crc-debug-kk2j5" Nov 29 06:48:08 crc kubenswrapper[4594]: I1129 06:48:08.924554 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dfcfb019-3151-40ee-8496-5770d1b5a451-host\") pod \"crc-debug-kk2j5\" (UID: \"dfcfb019-3151-40ee-8496-5770d1b5a451\") " pod="openshift-must-gather-w4njb/crc-debug-kk2j5" Nov 29 06:48:09 crc kubenswrapper[4594]: I1129 06:48:09.026281 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxl6j\" (UniqueName: \"kubernetes.io/projected/dfcfb019-3151-40ee-8496-5770d1b5a451-kube-api-access-xxl6j\") pod \"crc-debug-kk2j5\" (UID: \"dfcfb019-3151-40ee-8496-5770d1b5a451\") " 
pod="openshift-must-gather-w4njb/crc-debug-kk2j5" Nov 29 06:48:09 crc kubenswrapper[4594]: I1129 06:48:09.026378 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dfcfb019-3151-40ee-8496-5770d1b5a451-host\") pod \"crc-debug-kk2j5\" (UID: \"dfcfb019-3151-40ee-8496-5770d1b5a451\") " pod="openshift-must-gather-w4njb/crc-debug-kk2j5" Nov 29 06:48:09 crc kubenswrapper[4594]: I1129 06:48:09.026511 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dfcfb019-3151-40ee-8496-5770d1b5a451-host\") pod \"crc-debug-kk2j5\" (UID: \"dfcfb019-3151-40ee-8496-5770d1b5a451\") " pod="openshift-must-gather-w4njb/crc-debug-kk2j5" Nov 29 06:48:09 crc kubenswrapper[4594]: I1129 06:48:09.309248 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxl6j\" (UniqueName: \"kubernetes.io/projected/dfcfb019-3151-40ee-8496-5770d1b5a451-kube-api-access-xxl6j\") pod \"crc-debug-kk2j5\" (UID: \"dfcfb019-3151-40ee-8496-5770d1b5a451\") " pod="openshift-must-gather-w4njb/crc-debug-kk2j5" Nov 29 06:48:09 crc kubenswrapper[4594]: I1129 06:48:09.449447 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-w4njb/crc-debug-kk2j5" Nov 29 06:48:10 crc kubenswrapper[4594]: I1129 06:48:10.360035 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-w4njb/crc-debug-kk2j5" event={"ID":"dfcfb019-3151-40ee-8496-5770d1b5a451","Type":"ContainerStarted","Data":"fdbbcb2c54fda25f215343a7e9093fdad28783a094b22d0974ec56ea5c149d23"} Nov 29 06:48:10 crc kubenswrapper[4594]: I1129 06:48:10.361176 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-w4njb/crc-debug-kk2j5" event={"ID":"dfcfb019-3151-40ee-8496-5770d1b5a451","Type":"ContainerStarted","Data":"a83d219381cb0e2ceaa83182db428e6e2cbadbaac73d1569772626ac6a89897b"} Nov 29 06:48:10 crc kubenswrapper[4594]: I1129 06:48:10.383646 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-w4njb/crc-debug-kk2j5" podStartSLOduration=2.3836260989999998 podStartE2EDuration="2.383626099s" podCreationTimestamp="2025-11-29 06:48:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 06:48:10.374715479 +0000 UTC m=+4814.615224698" watchObservedRunningTime="2025-11-29 06:48:10.383626099 +0000 UTC m=+4814.624135319" Nov 29 06:48:11 crc kubenswrapper[4594]: I1129 06:48:11.953507 4594 scope.go:117] "RemoveContainer" containerID="fc1d4b6b40c74504d9bb87ad23c8ba1bbfca64e71d687766d7f89bb8dfa7a7d0" Nov 29 06:48:19 crc kubenswrapper[4594]: I1129 06:48:19.083693 4594 scope.go:117] "RemoveContainer" containerID="ca7803bff64492152d5cd81c155f02681d925d0c3977555e80faed8cfe165f84" Nov 29 06:48:19 crc kubenswrapper[4594]: E1129 06:48:19.084361 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:48:31 crc kubenswrapper[4594]: I1129 06:48:31.083846 4594 scope.go:117] "RemoveContainer" containerID="ca7803bff64492152d5cd81c155f02681d925d0c3977555e80faed8cfe165f84" Nov 29 06:48:31 crc kubenswrapper[4594]: E1129 06:48:31.084811 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:48:42 crc kubenswrapper[4594]: I1129 06:48:42.083074 4594 scope.go:117] "RemoveContainer" containerID="ca7803bff64492152d5cd81c155f02681d925d0c3977555e80faed8cfe165f84" Nov 29 06:48:42 crc kubenswrapper[4594]: E1129 06:48:42.084003 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:48:45 crc kubenswrapper[4594]: I1129 06:48:45.709559 4594 generic.go:334] "Generic (PLEG): container finished" podID="dfcfb019-3151-40ee-8496-5770d1b5a451" containerID="fdbbcb2c54fda25f215343a7e9093fdad28783a094b22d0974ec56ea5c149d23" exitCode=0 Nov 29 06:48:45 crc kubenswrapper[4594]: I1129 06:48:45.709666 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-w4njb/crc-debug-kk2j5" 
event={"ID":"dfcfb019-3151-40ee-8496-5770d1b5a451","Type":"ContainerDied","Data":"fdbbcb2c54fda25f215343a7e9093fdad28783a094b22d0974ec56ea5c149d23"} Nov 29 06:48:47 crc kubenswrapper[4594]: I1129 06:48:47.396969 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-w4njb/crc-debug-kk2j5" Nov 29 06:48:47 crc kubenswrapper[4594]: I1129 06:48:47.427426 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-w4njb/crc-debug-kk2j5"] Nov 29 06:48:47 crc kubenswrapper[4594]: I1129 06:48:47.433469 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-w4njb/crc-debug-kk2j5"] Nov 29 06:48:47 crc kubenswrapper[4594]: I1129 06:48:47.535810 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxl6j\" (UniqueName: \"kubernetes.io/projected/dfcfb019-3151-40ee-8496-5770d1b5a451-kube-api-access-xxl6j\") pod \"dfcfb019-3151-40ee-8496-5770d1b5a451\" (UID: \"dfcfb019-3151-40ee-8496-5770d1b5a451\") " Nov 29 06:48:47 crc kubenswrapper[4594]: I1129 06:48:47.535993 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dfcfb019-3151-40ee-8496-5770d1b5a451-host\") pod \"dfcfb019-3151-40ee-8496-5770d1b5a451\" (UID: \"dfcfb019-3151-40ee-8496-5770d1b5a451\") " Nov 29 06:48:47 crc kubenswrapper[4594]: I1129 06:48:47.536347 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dfcfb019-3151-40ee-8496-5770d1b5a451-host" (OuterVolumeSpecName: "host") pod "dfcfb019-3151-40ee-8496-5770d1b5a451" (UID: "dfcfb019-3151-40ee-8496-5770d1b5a451"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 06:48:47 crc kubenswrapper[4594]: I1129 06:48:47.537201 4594 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dfcfb019-3151-40ee-8496-5770d1b5a451-host\") on node \"crc\" DevicePath \"\"" Nov 29 06:48:47 crc kubenswrapper[4594]: I1129 06:48:47.542565 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfcfb019-3151-40ee-8496-5770d1b5a451-kube-api-access-xxl6j" (OuterVolumeSpecName: "kube-api-access-xxl6j") pod "dfcfb019-3151-40ee-8496-5770d1b5a451" (UID: "dfcfb019-3151-40ee-8496-5770d1b5a451"). InnerVolumeSpecName "kube-api-access-xxl6j". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 06:48:47 crc kubenswrapper[4594]: I1129 06:48:47.640100 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xxl6j\" (UniqueName: \"kubernetes.io/projected/dfcfb019-3151-40ee-8496-5770d1b5a451-kube-api-access-xxl6j\") on node \"crc\" DevicePath \"\"" Nov 29 06:48:47 crc kubenswrapper[4594]: I1129 06:48:47.729847 4594 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a83d219381cb0e2ceaa83182db428e6e2cbadbaac73d1569772626ac6a89897b" Nov 29 06:48:47 crc kubenswrapper[4594]: I1129 06:48:47.729913 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-w4njb/crc-debug-kk2j5" Nov 29 06:48:48 crc kubenswrapper[4594]: I1129 06:48:48.093614 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfcfb019-3151-40ee-8496-5770d1b5a451" path="/var/lib/kubelet/pods/dfcfb019-3151-40ee-8496-5770d1b5a451/volumes" Nov 29 06:48:48 crc kubenswrapper[4594]: I1129 06:48:48.646277 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-w4njb/crc-debug-zp9lg"] Nov 29 06:48:48 crc kubenswrapper[4594]: E1129 06:48:48.646647 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfcfb019-3151-40ee-8496-5770d1b5a451" containerName="container-00" Nov 29 06:48:48 crc kubenswrapper[4594]: I1129 06:48:48.646661 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfcfb019-3151-40ee-8496-5770d1b5a451" containerName="container-00" Nov 29 06:48:48 crc kubenswrapper[4594]: I1129 06:48:48.646850 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfcfb019-3151-40ee-8496-5770d1b5a451" containerName="container-00" Nov 29 06:48:48 crc kubenswrapper[4594]: I1129 06:48:48.647468 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-w4njb/crc-debug-zp9lg" Nov 29 06:48:48 crc kubenswrapper[4594]: I1129 06:48:48.762312 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d8f6f410-bfc3-4310-8547-a2d32ee42c93-host\") pod \"crc-debug-zp9lg\" (UID: \"d8f6f410-bfc3-4310-8547-a2d32ee42c93\") " pod="openshift-must-gather-w4njb/crc-debug-zp9lg" Nov 29 06:48:48 crc kubenswrapper[4594]: I1129 06:48:48.762780 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwlkk\" (UniqueName: \"kubernetes.io/projected/d8f6f410-bfc3-4310-8547-a2d32ee42c93-kube-api-access-rwlkk\") pod \"crc-debug-zp9lg\" (UID: \"d8f6f410-bfc3-4310-8547-a2d32ee42c93\") " pod="openshift-must-gather-w4njb/crc-debug-zp9lg" Nov 29 06:48:48 crc kubenswrapper[4594]: I1129 06:48:48.864841 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwlkk\" (UniqueName: \"kubernetes.io/projected/d8f6f410-bfc3-4310-8547-a2d32ee42c93-kube-api-access-rwlkk\") pod \"crc-debug-zp9lg\" (UID: \"d8f6f410-bfc3-4310-8547-a2d32ee42c93\") " pod="openshift-must-gather-w4njb/crc-debug-zp9lg" Nov 29 06:48:48 crc kubenswrapper[4594]: I1129 06:48:48.864970 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d8f6f410-bfc3-4310-8547-a2d32ee42c93-host\") pod \"crc-debug-zp9lg\" (UID: \"d8f6f410-bfc3-4310-8547-a2d32ee42c93\") " pod="openshift-must-gather-w4njb/crc-debug-zp9lg" Nov 29 06:48:48 crc kubenswrapper[4594]: I1129 06:48:48.865051 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d8f6f410-bfc3-4310-8547-a2d32ee42c93-host\") pod \"crc-debug-zp9lg\" (UID: \"d8f6f410-bfc3-4310-8547-a2d32ee42c93\") " pod="openshift-must-gather-w4njb/crc-debug-zp9lg" Nov 29 06:48:48 crc 
kubenswrapper[4594]: I1129 06:48:48.890639 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwlkk\" (UniqueName: \"kubernetes.io/projected/d8f6f410-bfc3-4310-8547-a2d32ee42c93-kube-api-access-rwlkk\") pod \"crc-debug-zp9lg\" (UID: \"d8f6f410-bfc3-4310-8547-a2d32ee42c93\") " pod="openshift-must-gather-w4njb/crc-debug-zp9lg" Nov 29 06:48:48 crc kubenswrapper[4594]: I1129 06:48:48.963459 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-w4njb/crc-debug-zp9lg" Nov 29 06:48:48 crc kubenswrapper[4594]: W1129 06:48:48.993180 4594 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8f6f410_bfc3_4310_8547_a2d32ee42c93.slice/crio-3c8c7b0a4b19f7f3e75d9872b53a82c6cebfeabcd5bc10c68339398d65e737df WatchSource:0}: Error finding container 3c8c7b0a4b19f7f3e75d9872b53a82c6cebfeabcd5bc10c68339398d65e737df: Status 404 returned error can't find the container with id 3c8c7b0a4b19f7f3e75d9872b53a82c6cebfeabcd5bc10c68339398d65e737df Nov 29 06:48:49 crc kubenswrapper[4594]: I1129 06:48:49.758044 4594 generic.go:334] "Generic (PLEG): container finished" podID="d8f6f410-bfc3-4310-8547-a2d32ee42c93" containerID="28e0ff54182c6f22c89f731b6b54bdc2364fef91618c7fbd173da822e4fee9c6" exitCode=0 Nov 29 06:48:49 crc kubenswrapper[4594]: I1129 06:48:49.758118 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-w4njb/crc-debug-zp9lg" event={"ID":"d8f6f410-bfc3-4310-8547-a2d32ee42c93","Type":"ContainerDied","Data":"28e0ff54182c6f22c89f731b6b54bdc2364fef91618c7fbd173da822e4fee9c6"} Nov 29 06:48:49 crc kubenswrapper[4594]: I1129 06:48:49.758169 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-w4njb/crc-debug-zp9lg" event={"ID":"d8f6f410-bfc3-4310-8547-a2d32ee42c93","Type":"ContainerStarted","Data":"3c8c7b0a4b19f7f3e75d9872b53a82c6cebfeabcd5bc10c68339398d65e737df"} Nov 29 
06:48:50 crc kubenswrapper[4594]: I1129 06:48:50.880088 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-w4njb/crc-debug-zp9lg" Nov 29 06:48:51 crc kubenswrapper[4594]: I1129 06:48:51.016578 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rwlkk\" (UniqueName: \"kubernetes.io/projected/d8f6f410-bfc3-4310-8547-a2d32ee42c93-kube-api-access-rwlkk\") pod \"d8f6f410-bfc3-4310-8547-a2d32ee42c93\" (UID: \"d8f6f410-bfc3-4310-8547-a2d32ee42c93\") " Nov 29 06:48:51 crc kubenswrapper[4594]: I1129 06:48:51.016689 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d8f6f410-bfc3-4310-8547-a2d32ee42c93-host\") pod \"d8f6f410-bfc3-4310-8547-a2d32ee42c93\" (UID: \"d8f6f410-bfc3-4310-8547-a2d32ee42c93\") " Nov 29 06:48:51 crc kubenswrapper[4594]: I1129 06:48:51.017299 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d8f6f410-bfc3-4310-8547-a2d32ee42c93-host" (OuterVolumeSpecName: "host") pod "d8f6f410-bfc3-4310-8547-a2d32ee42c93" (UID: "d8f6f410-bfc3-4310-8547-a2d32ee42c93"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 06:48:51 crc kubenswrapper[4594]: I1129 06:48:51.017569 4594 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d8f6f410-bfc3-4310-8547-a2d32ee42c93-host\") on node \"crc\" DevicePath \"\"" Nov 29 06:48:51 crc kubenswrapper[4594]: I1129 06:48:51.023170 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8f6f410-bfc3-4310-8547-a2d32ee42c93-kube-api-access-rwlkk" (OuterVolumeSpecName: "kube-api-access-rwlkk") pod "d8f6f410-bfc3-4310-8547-a2d32ee42c93" (UID: "d8f6f410-bfc3-4310-8547-a2d32ee42c93"). InnerVolumeSpecName "kube-api-access-rwlkk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 06:48:51 crc kubenswrapper[4594]: I1129 06:48:51.119124 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rwlkk\" (UniqueName: \"kubernetes.io/projected/d8f6f410-bfc3-4310-8547-a2d32ee42c93-kube-api-access-rwlkk\") on node \"crc\" DevicePath \"\"" Nov 29 06:48:51 crc kubenswrapper[4594]: I1129 06:48:51.785893 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-w4njb/crc-debug-zp9lg" event={"ID":"d8f6f410-bfc3-4310-8547-a2d32ee42c93","Type":"ContainerDied","Data":"3c8c7b0a4b19f7f3e75d9872b53a82c6cebfeabcd5bc10c68339398d65e737df"} Nov 29 06:48:51 crc kubenswrapper[4594]: I1129 06:48:51.786220 4594 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c8c7b0a4b19f7f3e75d9872b53a82c6cebfeabcd5bc10c68339398d65e737df" Nov 29 06:48:51 crc kubenswrapper[4594]: I1129 06:48:51.786034 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-w4njb/crc-debug-zp9lg" Nov 29 06:48:52 crc kubenswrapper[4594]: I1129 06:48:52.044941 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-w4njb/crc-debug-zp9lg"] Nov 29 06:48:52 crc kubenswrapper[4594]: I1129 06:48:52.056720 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-w4njb/crc-debug-zp9lg"] Nov 29 06:48:52 crc kubenswrapper[4594]: I1129 06:48:52.111474 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8f6f410-bfc3-4310-8547-a2d32ee42c93" path="/var/lib/kubelet/pods/d8f6f410-bfc3-4310-8547-a2d32ee42c93/volumes" Nov 29 06:48:53 crc kubenswrapper[4594]: I1129 06:48:53.083860 4594 scope.go:117] "RemoveContainer" containerID="ca7803bff64492152d5cd81c155f02681d925d0c3977555e80faed8cfe165f84" Nov 29 06:48:53 crc kubenswrapper[4594]: E1129 06:48:53.084390 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:48:53 crc kubenswrapper[4594]: I1129 06:48:53.219495 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-w4njb/crc-debug-6tccv"] Nov 29 06:48:53 crc kubenswrapper[4594]: E1129 06:48:53.219919 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8f6f410-bfc3-4310-8547-a2d32ee42c93" containerName="container-00" Nov 29 06:48:53 crc kubenswrapper[4594]: I1129 06:48:53.219939 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8f6f410-bfc3-4310-8547-a2d32ee42c93" containerName="container-00" Nov 29 06:48:53 crc kubenswrapper[4594]: I1129 06:48:53.220168 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8f6f410-bfc3-4310-8547-a2d32ee42c93" containerName="container-00" Nov 29 06:48:53 crc kubenswrapper[4594]: I1129 06:48:53.220940 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-w4njb/crc-debug-6tccv" Nov 29 06:48:53 crc kubenswrapper[4594]: I1129 06:48:53.373962 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dd49\" (UniqueName: \"kubernetes.io/projected/0270f6c1-c532-4af6-a55f-15dd8ed7ce08-kube-api-access-7dd49\") pod \"crc-debug-6tccv\" (UID: \"0270f6c1-c532-4af6-a55f-15dd8ed7ce08\") " pod="openshift-must-gather-w4njb/crc-debug-6tccv" Nov 29 06:48:53 crc kubenswrapper[4594]: I1129 06:48:53.374216 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0270f6c1-c532-4af6-a55f-15dd8ed7ce08-host\") pod \"crc-debug-6tccv\" (UID: \"0270f6c1-c532-4af6-a55f-15dd8ed7ce08\") " pod="openshift-must-gather-w4njb/crc-debug-6tccv" Nov 29 06:48:53 crc kubenswrapper[4594]: I1129 06:48:53.476185 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dd49\" (UniqueName: \"kubernetes.io/projected/0270f6c1-c532-4af6-a55f-15dd8ed7ce08-kube-api-access-7dd49\") pod \"crc-debug-6tccv\" (UID: \"0270f6c1-c532-4af6-a55f-15dd8ed7ce08\") " pod="openshift-must-gather-w4njb/crc-debug-6tccv" Nov 29 06:48:53 crc kubenswrapper[4594]: I1129 06:48:53.476387 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0270f6c1-c532-4af6-a55f-15dd8ed7ce08-host\") pod \"crc-debug-6tccv\" (UID: \"0270f6c1-c532-4af6-a55f-15dd8ed7ce08\") " pod="openshift-must-gather-w4njb/crc-debug-6tccv" Nov 29 06:48:53 crc kubenswrapper[4594]: I1129 06:48:53.476540 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0270f6c1-c532-4af6-a55f-15dd8ed7ce08-host\") pod \"crc-debug-6tccv\" (UID: \"0270f6c1-c532-4af6-a55f-15dd8ed7ce08\") " pod="openshift-must-gather-w4njb/crc-debug-6tccv" Nov 29 06:48:53 crc 
kubenswrapper[4594]: I1129 06:48:53.498063 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dd49\" (UniqueName: \"kubernetes.io/projected/0270f6c1-c532-4af6-a55f-15dd8ed7ce08-kube-api-access-7dd49\") pod \"crc-debug-6tccv\" (UID: \"0270f6c1-c532-4af6-a55f-15dd8ed7ce08\") " pod="openshift-must-gather-w4njb/crc-debug-6tccv" Nov 29 06:48:53 crc kubenswrapper[4594]: I1129 06:48:53.537174 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-w4njb/crc-debug-6tccv" Nov 29 06:48:53 crc kubenswrapper[4594]: I1129 06:48:53.809997 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-w4njb/crc-debug-6tccv" event={"ID":"0270f6c1-c532-4af6-a55f-15dd8ed7ce08","Type":"ContainerStarted","Data":"8e5e9db20714364b22ee0feb0d08d9bff99f3d052087ef1c0df97a06e2260b11"} Nov 29 06:48:54 crc kubenswrapper[4594]: I1129 06:48:54.824169 4594 generic.go:334] "Generic (PLEG): container finished" podID="0270f6c1-c532-4af6-a55f-15dd8ed7ce08" containerID="1da8da29efcf675e068efdb2fa2338b1b82daf55da2adc53c9744f40046ec9c2" exitCode=0 Nov 29 06:48:54 crc kubenswrapper[4594]: I1129 06:48:54.824312 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-w4njb/crc-debug-6tccv" event={"ID":"0270f6c1-c532-4af6-a55f-15dd8ed7ce08","Type":"ContainerDied","Data":"1da8da29efcf675e068efdb2fa2338b1b82daf55da2adc53c9744f40046ec9c2"} Nov 29 06:48:54 crc kubenswrapper[4594]: I1129 06:48:54.868777 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-w4njb/crc-debug-6tccv"] Nov 29 06:48:54 crc kubenswrapper[4594]: I1129 06:48:54.877994 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-w4njb/crc-debug-6tccv"] Nov 29 06:48:55 crc kubenswrapper[4594]: I1129 06:48:55.926696 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-w4njb/crc-debug-6tccv" Nov 29 06:48:56 crc kubenswrapper[4594]: I1129 06:48:56.037621 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0270f6c1-c532-4af6-a55f-15dd8ed7ce08-host\") pod \"0270f6c1-c532-4af6-a55f-15dd8ed7ce08\" (UID: \"0270f6c1-c532-4af6-a55f-15dd8ed7ce08\") " Nov 29 06:48:56 crc kubenswrapper[4594]: I1129 06:48:56.037762 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0270f6c1-c532-4af6-a55f-15dd8ed7ce08-host" (OuterVolumeSpecName: "host") pod "0270f6c1-c532-4af6-a55f-15dd8ed7ce08" (UID: "0270f6c1-c532-4af6-a55f-15dd8ed7ce08"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 06:48:56 crc kubenswrapper[4594]: I1129 06:48:56.037905 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7dd49\" (UniqueName: \"kubernetes.io/projected/0270f6c1-c532-4af6-a55f-15dd8ed7ce08-kube-api-access-7dd49\") pod \"0270f6c1-c532-4af6-a55f-15dd8ed7ce08\" (UID: \"0270f6c1-c532-4af6-a55f-15dd8ed7ce08\") " Nov 29 06:48:56 crc kubenswrapper[4594]: I1129 06:48:56.038825 4594 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0270f6c1-c532-4af6-a55f-15dd8ed7ce08-host\") on node \"crc\" DevicePath \"\"" Nov 29 06:48:56 crc kubenswrapper[4594]: I1129 06:48:56.044206 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0270f6c1-c532-4af6-a55f-15dd8ed7ce08-kube-api-access-7dd49" (OuterVolumeSpecName: "kube-api-access-7dd49") pod "0270f6c1-c532-4af6-a55f-15dd8ed7ce08" (UID: "0270f6c1-c532-4af6-a55f-15dd8ed7ce08"). InnerVolumeSpecName "kube-api-access-7dd49". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 06:48:56 crc kubenswrapper[4594]: I1129 06:48:56.094741 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0270f6c1-c532-4af6-a55f-15dd8ed7ce08" path="/var/lib/kubelet/pods/0270f6c1-c532-4af6-a55f-15dd8ed7ce08/volumes" Nov 29 06:48:56 crc kubenswrapper[4594]: I1129 06:48:56.146328 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7dd49\" (UniqueName: \"kubernetes.io/projected/0270f6c1-c532-4af6-a55f-15dd8ed7ce08-kube-api-access-7dd49\") on node \"crc\" DevicePath \"\"" Nov 29 06:48:56 crc kubenswrapper[4594]: I1129 06:48:56.843084 4594 scope.go:117] "RemoveContainer" containerID="1da8da29efcf675e068efdb2fa2338b1b82daf55da2adc53c9744f40046ec9c2" Nov 29 06:48:56 crc kubenswrapper[4594]: I1129 06:48:56.843274 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-w4njb/crc-debug-6tccv" Nov 29 06:49:08 crc kubenswrapper[4594]: I1129 06:49:08.084340 4594 scope.go:117] "RemoveContainer" containerID="ca7803bff64492152d5cd81c155f02681d925d0c3977555e80faed8cfe165f84" Nov 29 06:49:08 crc kubenswrapper[4594]: E1129 06:49:08.085102 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:49:22 crc kubenswrapper[4594]: I1129 06:49:22.085432 4594 scope.go:117] "RemoveContainer" containerID="ca7803bff64492152d5cd81c155f02681d925d0c3977555e80faed8cfe165f84" Nov 29 06:49:23 crc kubenswrapper[4594]: I1129 06:49:23.092004 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" 
event={"ID":"509844d4-3ee1-4059-84bb-6e90200f50c5","Type":"ContainerStarted","Data":"dc8dbe12cdc8fe8bc0472ec5e2f11705190226ebe04481175e6560c74d7fe05c"} Nov 29 06:49:26 crc kubenswrapper[4594]: I1129 06:49:26.175761 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-d8fb9b558-k2gdh_81f68040-3d0b-4f18-85fb-3f29b28c8fbe/barbican-api/0.log" Nov 29 06:49:26 crc kubenswrapper[4594]: I1129 06:49:26.226045 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-d8fb9b558-k2gdh_81f68040-3d0b-4f18-85fb-3f29b28c8fbe/barbican-api-log/0.log" Nov 29 06:49:26 crc kubenswrapper[4594]: I1129 06:49:26.368567 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-c89ff55f4-zl5h6_744bbd71-1ab1-492d-9148-37be600ef9c8/barbican-keystone-listener/0.log" Nov 29 06:49:26 crc kubenswrapper[4594]: I1129 06:49:26.432483 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-c89ff55f4-zl5h6_744bbd71-1ab1-492d-9148-37be600ef9c8/barbican-keystone-listener-log/0.log" Nov 29 06:49:26 crc kubenswrapper[4594]: I1129 06:49:26.553629 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-776755c9f7-9ghn5_9c5ddf8c-3bb2-4d2e-bf45-0535f59a2374/barbican-worker-log/0.log" Nov 29 06:49:26 crc kubenswrapper[4594]: I1129 06:49:26.577341 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-776755c9f7-9ghn5_9c5ddf8c-3bb2-4d2e-bf45-0535f59a2374/barbican-worker/0.log" Nov 29 06:49:26 crc kubenswrapper[4594]: I1129 06:49:26.676573 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-s5mf7_2a01c522-360e-4b2a-8b7e-4e5618fe1541/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Nov 29 06:49:26 crc kubenswrapper[4594]: I1129 06:49:26.840696 4594 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceilometer-0_c118ff62-66e3-4359-9122-ebc78d1a1f3d/ceilometer-central-agent/0.log" Nov 29 06:49:26 crc kubenswrapper[4594]: I1129 06:49:26.871038 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_c118ff62-66e3-4359-9122-ebc78d1a1f3d/proxy-httpd/0.log" Nov 29 06:49:26 crc kubenswrapper[4594]: I1129 06:49:26.884523 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_c118ff62-66e3-4359-9122-ebc78d1a1f3d/ceilometer-notification-agent/0.log" Nov 29 06:49:26 crc kubenswrapper[4594]: I1129 06:49:26.942926 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_c118ff62-66e3-4359-9122-ebc78d1a1f3d/sg-core/0.log" Nov 29 06:49:27 crc kubenswrapper[4594]: I1129 06:49:27.110643 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_d704d9f4-1a8a-4cc8-af37-371bcc9b254b/cinder-api-log/0.log" Nov 29 06:49:27 crc kubenswrapper[4594]: I1129 06:49:27.334849 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_d704d9f4-1a8a-4cc8-af37-371bcc9b254b/cinder-api/0.log" Nov 29 06:49:27 crc kubenswrapper[4594]: I1129 06:49:27.372466 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_a816de0b-732c-46f3-ba52-2a7630623d5b/cinder-scheduler/0.log" Nov 29 06:49:27 crc kubenswrapper[4594]: I1129 06:49:27.374900 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_a816de0b-732c-46f3-ba52-2a7630623d5b/probe/0.log" Nov 29 06:49:27 crc kubenswrapper[4594]: I1129 06:49:27.557201 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-srf6x_12960501-d688-4937-b0b3-048b780072d3/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Nov 29 06:49:27 crc kubenswrapper[4594]: I1129 06:49:27.584534 4594 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-rfc6c_e4ac87df-2d62-4571-a38a-a9cd25537685/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Nov 29 06:49:27 crc kubenswrapper[4594]: I1129 06:49:27.734166 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-77b58f4b85-bgkdt_23dcced9-156e-4d68-82c3-43b9b2a0d9be/init/0.log" Nov 29 06:49:27 crc kubenswrapper[4594]: I1129 06:49:27.937158 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-77b58f4b85-bgkdt_23dcced9-156e-4d68-82c3-43b9b2a0d9be/init/0.log" Nov 29 06:49:27 crc kubenswrapper[4594]: I1129 06:49:27.940802 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-8fq52_6345164d-91bd-47df-b5a6-71f9940c0f15/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Nov 29 06:49:28 crc kubenswrapper[4594]: I1129 06:49:28.035742 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-77b58f4b85-bgkdt_23dcced9-156e-4d68-82c3-43b9b2a0d9be/dnsmasq-dns/0.log" Nov 29 06:49:28 crc kubenswrapper[4594]: I1129 06:49:28.159866 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_bfda0b74-99d7-4176-89f9-71d8385ddc6f/glance-httpd/0.log" Nov 29 06:49:28 crc kubenswrapper[4594]: I1129 06:49:28.183244 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_bfda0b74-99d7-4176-89f9-71d8385ddc6f/glance-log/0.log" Nov 29 06:49:28 crc kubenswrapper[4594]: I1129 06:49:28.782853 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_9f4133eb-5349-4bad-a993-d4e880a2f1be/glance-httpd/0.log" Nov 29 06:49:28 crc kubenswrapper[4594]: I1129 06:49:28.789868 4594 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-internal-api-0_9f4133eb-5349-4bad-a993-d4e880a2f1be/glance-log/0.log" Nov 29 06:49:28 crc kubenswrapper[4594]: I1129 06:49:28.893015 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-558d4b85cb-k5j98_19dfde1f-d770-45ec-8735-78549b8fcb90/horizon/0.log" Nov 29 06:49:29 crc kubenswrapper[4594]: I1129 06:49:29.060330 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-tclxx_9dafdf20-2acb-46ad-adb3-d1421087ca5e/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Nov 29 06:49:29 crc kubenswrapper[4594]: I1129 06:49:29.310006 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-qthxb_692e39b7-fc9f-4770-847e-ff968ddf1ad8/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Nov 29 06:49:29 crc kubenswrapper[4594]: I1129 06:49:29.446927 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-558d4b85cb-k5j98_19dfde1f-d770-45ec-8735-78549b8fcb90/horizon-log/0.log" Nov 29 06:49:29 crc kubenswrapper[4594]: I1129 06:49:29.653016 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_6a4026a4-228e-4aa5-be23-c9b7e203c011/kube-state-metrics/0.log" Nov 29 06:49:29 crc kubenswrapper[4594]: I1129 06:49:29.665872 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29406601-5wlh9_45955c1f-8326-47e3-ba4b-3c6ea134e496/keystone-cron/0.log" Nov 29 06:49:29 crc kubenswrapper[4594]: I1129 06:49:29.736541 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-87b885ff4-zwt2r_2e56e5f5-25c3-4bbb-a9ca-47aec5d22564/keystone-api/0.log" Nov 29 06:49:29 crc kubenswrapper[4594]: I1129 06:49:29.760981 4594 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-wnzvc_ec93e11c-8754-4e8c-8e75-d563fb7cef1f/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Nov 29 06:49:30 crc kubenswrapper[4594]: I1129 06:49:30.571373 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5c7df66d69-hd8nh_23f6e7de-b25b-4522-8368-cd17f44dc109/neutron-httpd/0.log" Nov 29 06:49:30 crc kubenswrapper[4594]: I1129 06:49:30.645874 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-m7rxx_a7f3ca11-a3f9-4bd2-a31f-dba6fec2138b/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Nov 29 06:49:30 crc kubenswrapper[4594]: I1129 06:49:30.675424 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5c7df66d69-hd8nh_23f6e7de-b25b-4522-8368-cd17f44dc109/neutron-api/0.log" Nov 29 06:49:31 crc kubenswrapper[4594]: I1129 06:49:31.288416 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_81a5ed74-39d5-4b41-8083-04c1a6f6f119/nova-cell0-conductor-conductor/0.log" Nov 29 06:49:31 crc kubenswrapper[4594]: I1129 06:49:31.506767 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_f69f0fb6-b307-4c01-a90e-edf23e3858e1/nova-cell1-conductor-conductor/0.log" Nov 29 06:49:31 crc kubenswrapper[4594]: I1129 06:49:31.852720 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_9cbd6039-37fe-4ad5-9149-441d6e5d1812/nova-cell1-novncproxy-novncproxy/0.log" Nov 29 06:49:32 crc kubenswrapper[4594]: I1129 06:49:32.011313 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-txzng_e5b69862-fd4c-4f01-977a-3d7f9bcce932/nova-edpm-deployment-openstack-edpm-ipam/0.log" Nov 29 06:49:32 crc kubenswrapper[4594]: I1129 06:49:32.261026 4594 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-api-0_76860e6e-cd4c-4f9e-ad65-f1cec5dfe17e/nova-api-log/0.log" Nov 29 06:49:32 crc kubenswrapper[4594]: I1129 06:49:32.326132 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_7c168bec-2ad5-431a-ad8e-ef04de7635b4/nova-metadata-log/0.log" Nov 29 06:49:32 crc kubenswrapper[4594]: I1129 06:49:32.680649 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_76860e6e-cd4c-4f9e-ad65-f1cec5dfe17e/nova-api-api/0.log" Nov 29 06:49:32 crc kubenswrapper[4594]: I1129 06:49:32.858192 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_1ab0ecb8-fc35-4934-b62a-6912d56e9001/mysql-bootstrap/0.log" Nov 29 06:49:32 crc kubenswrapper[4594]: I1129 06:49:32.929926 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_ae32608f-19a0-4825-8fab-36c89e217b50/nova-scheduler-scheduler/0.log" Nov 29 06:49:33 crc kubenswrapper[4594]: I1129 06:49:33.051608 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_1ab0ecb8-fc35-4934-b62a-6912d56e9001/galera/0.log" Nov 29 06:49:33 crc kubenswrapper[4594]: I1129 06:49:33.091918 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_1ab0ecb8-fc35-4934-b62a-6912d56e9001/mysql-bootstrap/0.log" Nov 29 06:49:33 crc kubenswrapper[4594]: I1129 06:49:33.220034 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_caf7dae9-7fe9-4cf5-a5d0-39122397592e/mysql-bootstrap/0.log" Nov 29 06:49:33 crc kubenswrapper[4594]: I1129 06:49:33.392849 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_caf7dae9-7fe9-4cf5-a5d0-39122397592e/mysql-bootstrap/0.log" Nov 29 06:49:33 crc kubenswrapper[4594]: I1129 06:49:33.696937 4594 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-galera-0_caf7dae9-7fe9-4cf5-a5d0-39122397592e/galera/0.log" Nov 29 06:49:33 crc kubenswrapper[4594]: I1129 06:49:33.790513 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_3629612b-cfc5-42bc-8584-4abc21ce4b3f/openstackclient/0.log" Nov 29 06:49:33 crc kubenswrapper[4594]: I1129 06:49:33.903824 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-b9h9k_a7baec31-60ce-4be4-8901-a8cbe7bf7ea9/ovn-controller/0.log" Nov 29 06:49:34 crc kubenswrapper[4594]: I1129 06:49:34.107389 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-qlxwj_a2d82ed9-466f-47e4-973d-0e88270f1021/openstack-network-exporter/0.log" Nov 29 06:49:34 crc kubenswrapper[4594]: I1129 06:49:34.163402 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_7c168bec-2ad5-431a-ad8e-ef04de7635b4/nova-metadata-metadata/0.log" Nov 29 06:49:34 crc kubenswrapper[4594]: I1129 06:49:34.255723 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-zvkw9_0b167600-fe30-4499-876c-57685a803c45/ovsdb-server-init/0.log" Nov 29 06:49:34 crc kubenswrapper[4594]: I1129 06:49:34.464331 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-zvkw9_0b167600-fe30-4499-876c-57685a803c45/ovsdb-server-init/0.log" Nov 29 06:49:34 crc kubenswrapper[4594]: I1129 06:49:34.478677 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-zvkw9_0b167600-fe30-4499-876c-57685a803c45/ovsdb-server/0.log" Nov 29 06:49:34 crc kubenswrapper[4594]: I1129 06:49:34.709131 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_21284ba8-9492-4f6e-84c8-88d3844f386b/openstack-network-exporter/0.log" Nov 29 06:49:34 crc kubenswrapper[4594]: I1129 06:49:34.709377 4594 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-rgrwm_3cd5d33f-c80d-49d9-97b7-26dc98be7fa7/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Nov 29 06:49:34 crc kubenswrapper[4594]: I1129 06:49:34.756832 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-zvkw9_0b167600-fe30-4499-876c-57685a803c45/ovs-vswitchd/0.log" Nov 29 06:49:34 crc kubenswrapper[4594]: I1129 06:49:34.863142 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_31fad155-3970-4a5d-a357-e96fa27bbb54/openstack-network-exporter/0.log" Nov 29 06:49:34 crc kubenswrapper[4594]: I1129 06:49:34.916076 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_21284ba8-9492-4f6e-84c8-88d3844f386b/ovn-northd/0.log" Nov 29 06:49:34 crc kubenswrapper[4594]: I1129 06:49:34.986796 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_31fad155-3970-4a5d-a357-e96fa27bbb54/ovsdbserver-nb/0.log" Nov 29 06:49:35 crc kubenswrapper[4594]: I1129 06:49:35.084214 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_4f5075e0-210c-455f-8203-3dde7c7be5eb/openstack-network-exporter/0.log" Nov 29 06:49:35 crc kubenswrapper[4594]: I1129 06:49:35.170360 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_4f5075e0-210c-455f-8203-3dde7c7be5eb/ovsdbserver-sb/0.log" Nov 29 06:49:35 crc kubenswrapper[4594]: I1129 06:49:35.448140 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-55c5775b88-9dvcb_95ecbb59-1c3f-4561-a175-ffbd99d0496f/placement-api/0.log" Nov 29 06:49:35 crc kubenswrapper[4594]: I1129 06:49:35.483493 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_83de9c5a-c56c-4433-9a32-48972cdd1b46/init-config-reloader/0.log" Nov 29 06:49:35 crc kubenswrapper[4594]: I1129 06:49:35.549558 4594 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_placement-55c5775b88-9dvcb_95ecbb59-1c3f-4561-a175-ffbd99d0496f/placement-log/0.log" Nov 29 06:49:35 crc kubenswrapper[4594]: I1129 06:49:35.701493 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_83de9c5a-c56c-4433-9a32-48972cdd1b46/init-config-reloader/0.log" Nov 29 06:49:35 crc kubenswrapper[4594]: I1129 06:49:35.773238 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_83de9c5a-c56c-4433-9a32-48972cdd1b46/prometheus/0.log" Nov 29 06:49:35 crc kubenswrapper[4594]: I1129 06:49:35.791386 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_83de9c5a-c56c-4433-9a32-48972cdd1b46/config-reloader/0.log" Nov 29 06:49:35 crc kubenswrapper[4594]: I1129 06:49:35.833078 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_83de9c5a-c56c-4433-9a32-48972cdd1b46/thanos-sidecar/0.log" Nov 29 06:49:35 crc kubenswrapper[4594]: I1129 06:49:35.989771 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_3dcf9a3e-9869-4630-a695-c180db93aca7/setup-container/0.log" Nov 29 06:49:36 crc kubenswrapper[4594]: I1129 06:49:36.174914 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_3dcf9a3e-9869-4630-a695-c180db93aca7/setup-container/0.log" Nov 29 06:49:36 crc kubenswrapper[4594]: I1129 06:49:36.228244 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_9667e68c-f715-4663-bddb-53c53d3a593d/setup-container/0.log" Nov 29 06:49:36 crc kubenswrapper[4594]: I1129 06:49:36.241926 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_3dcf9a3e-9869-4630-a695-c180db93aca7/rabbitmq/0.log" Nov 29 06:49:36 crc kubenswrapper[4594]: I1129 06:49:36.394616 4594 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_9667e68c-f715-4663-bddb-53c53d3a593d/setup-container/0.log" Nov 29 06:49:36 crc kubenswrapper[4594]: I1129 06:49:36.404962 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_9667e68c-f715-4663-bddb-53c53d3a593d/rabbitmq/0.log" Nov 29 06:49:36 crc kubenswrapper[4594]: I1129 06:49:36.521747 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_bb17ce90-d0e2-4a46-905b-e27bff2295fb/setup-container/0.log" Nov 29 06:49:36 crc kubenswrapper[4594]: I1129 06:49:36.673694 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_bb17ce90-d0e2-4a46-905b-e27bff2295fb/setup-container/0.log" Nov 29 06:49:36 crc kubenswrapper[4594]: I1129 06:49:36.721421 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_bb17ce90-d0e2-4a46-905b-e27bff2295fb/rabbitmq/0.log" Nov 29 06:49:36 crc kubenswrapper[4594]: I1129 06:49:36.757073 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-dknvp_41f4ce3b-5711-4b51-a22d-5fcbda6153ac/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Nov 29 06:49:36 crc kubenswrapper[4594]: I1129 06:49:36.951786 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-s7gxb_b16b2100-7eea-43dd-8b1c-f2c337bdb3bd/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Nov 29 06:49:36 crc kubenswrapper[4594]: I1129 06:49:36.984894 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-7cxvz_db8e580b-8fbe-4e91-bb94-023bf1b2903b/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Nov 29 06:49:37 crc kubenswrapper[4594]: I1129 06:49:37.171822 4594 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-6gzlk_fad852f8-f0f3-4fa6-9196-58c24259a3a6/ssh-known-hosts-edpm-deployment/0.log" Nov 29 06:49:37 crc kubenswrapper[4594]: I1129 06:49:37.180706 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-kxwqm_795787e2-6c07-4e76-98ae-38a13aae294a/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Nov 29 06:49:37 crc kubenswrapper[4594]: I1129 06:49:37.821424 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7d689f55f9-c4bt7_bb181390-82bf-4bd9-9063-9272988db515/proxy-server/0.log" Nov 29 06:49:38 crc kubenswrapper[4594]: I1129 06:49:38.004999 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7d689f55f9-c4bt7_bb181390-82bf-4bd9-9063-9272988db515/proxy-httpd/0.log" Nov 29 06:49:38 crc kubenswrapper[4594]: I1129 06:49:38.024151 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-wtghb_f63cf943-e9c0-4f70-9f7b-8ecf859c92ae/swift-ring-rebalance/0.log" Nov 29 06:49:38 crc kubenswrapper[4594]: I1129 06:49:38.108212 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_15a27bc9-a74a-4123-b693-baf16a0ed04d/account-auditor/0.log" Nov 29 06:49:38 crc kubenswrapper[4594]: I1129 06:49:38.208981 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_15a27bc9-a74a-4123-b693-baf16a0ed04d/account-reaper/0.log" Nov 29 06:49:38 crc kubenswrapper[4594]: I1129 06:49:38.270767 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_15a27bc9-a74a-4123-b693-baf16a0ed04d/account-server/0.log" Nov 29 06:49:38 crc kubenswrapper[4594]: I1129 06:49:38.283494 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_15a27bc9-a74a-4123-b693-baf16a0ed04d/account-replicator/0.log" Nov 29 06:49:38 crc kubenswrapper[4594]: I1129 
06:49:38.332517 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_15a27bc9-a74a-4123-b693-baf16a0ed04d/container-auditor/0.log" Nov 29 06:49:38 crc kubenswrapper[4594]: I1129 06:49:38.430366 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_15a27bc9-a74a-4123-b693-baf16a0ed04d/container-replicator/0.log" Nov 29 06:49:38 crc kubenswrapper[4594]: I1129 06:49:38.492466 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_15a27bc9-a74a-4123-b693-baf16a0ed04d/container-server/0.log" Nov 29 06:49:38 crc kubenswrapper[4594]: I1129 06:49:38.553241 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_15a27bc9-a74a-4123-b693-baf16a0ed04d/container-updater/0.log" Nov 29 06:49:38 crc kubenswrapper[4594]: I1129 06:49:38.628184 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_15a27bc9-a74a-4123-b693-baf16a0ed04d/object-auditor/0.log" Nov 29 06:49:38 crc kubenswrapper[4594]: I1129 06:49:38.632031 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_15a27bc9-a74a-4123-b693-baf16a0ed04d/object-expirer/0.log" Nov 29 06:49:38 crc kubenswrapper[4594]: I1129 06:49:38.727037 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_15a27bc9-a74a-4123-b693-baf16a0ed04d/object-replicator/0.log" Nov 29 06:49:38 crc kubenswrapper[4594]: I1129 06:49:38.747575 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_15a27bc9-a74a-4123-b693-baf16a0ed04d/object-server/0.log" Nov 29 06:49:39 crc kubenswrapper[4594]: I1129 06:49:39.408992 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_15a27bc9-a74a-4123-b693-baf16a0ed04d/rsync/0.log" Nov 29 06:49:39 crc kubenswrapper[4594]: I1129 06:49:39.434432 4594 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_15a27bc9-a74a-4123-b693-baf16a0ed04d/object-updater/0.log" Nov 29 06:49:39 crc kubenswrapper[4594]: I1129 06:49:39.490133 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_15a27bc9-a74a-4123-b693-baf16a0ed04d/swift-recon-cron/0.log" Nov 29 06:49:39 crc kubenswrapper[4594]: I1129 06:49:39.638776 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-2b9zg_2e564f32-c761-4816-9715-7636294bd4c4/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Nov 29 06:49:39 crc kubenswrapper[4594]: I1129 06:49:39.710735 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_ffaebcb6-c9b1-4dcf-a887-1c1d4e140bad/tempest-tests-tempest-tests-runner/0.log" Nov 29 06:49:39 crc kubenswrapper[4594]: I1129 06:49:39.808968 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_ceac441b-e510-49a4-aefb-26fad57552d2/test-operator-logs-container/0.log" Nov 29 06:49:39 crc kubenswrapper[4594]: I1129 06:49:39.917467 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-ntkjh_627ba28b-319a-4072-bd92-d9b1a9e77283/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Nov 29 06:49:40 crc kubenswrapper[4594]: I1129 06:49:40.735510 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-applier-0_cfbd6de3-fc8d-4d93-a76d-fd2b8a196167/watcher-applier/0.log" Nov 29 06:49:41 crc kubenswrapper[4594]: I1129 06:49:41.293095 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_4082da7a-cc96-4b67-a101-48600e49712b/watcher-api-log/0.log" Nov 29 06:49:43 crc kubenswrapper[4594]: I1129 06:49:43.637420 4594 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_watcher-decision-engine-0_716cbd33-cb95-4be2-a9c9-98c742ee4e17/watcher-decision-engine/0.log" Nov 29 06:49:44 crc kubenswrapper[4594]: I1129 06:49:44.455762 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_4082da7a-cc96-4b67-a101-48600e49712b/watcher-api/0.log" Nov 29 06:49:46 crc kubenswrapper[4594]: I1129 06:49:46.142462 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_9952b1d2-2cce-45f4-b370-5d0107f80260/memcached/0.log" Nov 29 06:49:55 crc kubenswrapper[4594]: I1129 06:49:55.088750 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2stlq"] Nov 29 06:49:55 crc kubenswrapper[4594]: E1129 06:49:55.089669 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0270f6c1-c532-4af6-a55f-15dd8ed7ce08" containerName="container-00" Nov 29 06:49:55 crc kubenswrapper[4594]: I1129 06:49:55.089685 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="0270f6c1-c532-4af6-a55f-15dd8ed7ce08" containerName="container-00" Nov 29 06:49:55 crc kubenswrapper[4594]: I1129 06:49:55.089898 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="0270f6c1-c532-4af6-a55f-15dd8ed7ce08" containerName="container-00" Nov 29 06:49:55 crc kubenswrapper[4594]: I1129 06:49:55.091217 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2stlq" Nov 29 06:49:55 crc kubenswrapper[4594]: I1129 06:49:55.097163 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2stlq"] Nov 29 06:49:55 crc kubenswrapper[4594]: I1129 06:49:55.174587 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpc4v\" (UniqueName: \"kubernetes.io/projected/22fa56b1-bbe8-458c-a33a-d5fbbc14f9e3-kube-api-access-tpc4v\") pod \"certified-operators-2stlq\" (UID: \"22fa56b1-bbe8-458c-a33a-d5fbbc14f9e3\") " pod="openshift-marketplace/certified-operators-2stlq" Nov 29 06:49:55 crc kubenswrapper[4594]: I1129 06:49:55.174756 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22fa56b1-bbe8-458c-a33a-d5fbbc14f9e3-catalog-content\") pod \"certified-operators-2stlq\" (UID: \"22fa56b1-bbe8-458c-a33a-d5fbbc14f9e3\") " pod="openshift-marketplace/certified-operators-2stlq" Nov 29 06:49:55 crc kubenswrapper[4594]: I1129 06:49:55.174818 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22fa56b1-bbe8-458c-a33a-d5fbbc14f9e3-utilities\") pod \"certified-operators-2stlq\" (UID: \"22fa56b1-bbe8-458c-a33a-d5fbbc14f9e3\") " pod="openshift-marketplace/certified-operators-2stlq" Nov 29 06:49:55 crc kubenswrapper[4594]: I1129 06:49:55.276863 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpc4v\" (UniqueName: \"kubernetes.io/projected/22fa56b1-bbe8-458c-a33a-d5fbbc14f9e3-kube-api-access-tpc4v\") pod \"certified-operators-2stlq\" (UID: \"22fa56b1-bbe8-458c-a33a-d5fbbc14f9e3\") " pod="openshift-marketplace/certified-operators-2stlq" Nov 29 06:49:55 crc kubenswrapper[4594]: I1129 06:49:55.277054 4594 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22fa56b1-bbe8-458c-a33a-d5fbbc14f9e3-catalog-content\") pod \"certified-operators-2stlq\" (UID: \"22fa56b1-bbe8-458c-a33a-d5fbbc14f9e3\") " pod="openshift-marketplace/certified-operators-2stlq" Nov 29 06:49:55 crc kubenswrapper[4594]: I1129 06:49:55.277137 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22fa56b1-bbe8-458c-a33a-d5fbbc14f9e3-utilities\") pod \"certified-operators-2stlq\" (UID: \"22fa56b1-bbe8-458c-a33a-d5fbbc14f9e3\") " pod="openshift-marketplace/certified-operators-2stlq" Nov 29 06:49:55 crc kubenswrapper[4594]: I1129 06:49:55.277518 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22fa56b1-bbe8-458c-a33a-d5fbbc14f9e3-catalog-content\") pod \"certified-operators-2stlq\" (UID: \"22fa56b1-bbe8-458c-a33a-d5fbbc14f9e3\") " pod="openshift-marketplace/certified-operators-2stlq" Nov 29 06:49:55 crc kubenswrapper[4594]: I1129 06:49:55.277636 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22fa56b1-bbe8-458c-a33a-d5fbbc14f9e3-utilities\") pod \"certified-operators-2stlq\" (UID: \"22fa56b1-bbe8-458c-a33a-d5fbbc14f9e3\") " pod="openshift-marketplace/certified-operators-2stlq" Nov 29 06:49:55 crc kubenswrapper[4594]: I1129 06:49:55.310654 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpc4v\" (UniqueName: \"kubernetes.io/projected/22fa56b1-bbe8-458c-a33a-d5fbbc14f9e3-kube-api-access-tpc4v\") pod \"certified-operators-2stlq\" (UID: \"22fa56b1-bbe8-458c-a33a-d5fbbc14f9e3\") " pod="openshift-marketplace/certified-operators-2stlq" Nov 29 06:49:55 crc kubenswrapper[4594]: I1129 06:49:55.421415 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2stlq" Nov 29 06:49:55 crc kubenswrapper[4594]: I1129 06:49:55.876771 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2stlq"] Nov 29 06:49:56 crc kubenswrapper[4594]: I1129 06:49:56.394144 4594 generic.go:334] "Generic (PLEG): container finished" podID="22fa56b1-bbe8-458c-a33a-d5fbbc14f9e3" containerID="c6b308878b35a6dc8f4512a4da02ebb48fba792b6569261c7cb9de0550abbdf3" exitCode=0 Nov 29 06:49:56 crc kubenswrapper[4594]: I1129 06:49:56.394216 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2stlq" event={"ID":"22fa56b1-bbe8-458c-a33a-d5fbbc14f9e3","Type":"ContainerDied","Data":"c6b308878b35a6dc8f4512a4da02ebb48fba792b6569261c7cb9de0550abbdf3"} Nov 29 06:49:56 crc kubenswrapper[4594]: I1129 06:49:56.394270 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2stlq" event={"ID":"22fa56b1-bbe8-458c-a33a-d5fbbc14f9e3","Type":"ContainerStarted","Data":"e26512fc5061d9cf048ef6f8ac774d92e1fe1f2b0b8824b29ed98d18de87bc71"} Nov 29 06:49:57 crc kubenswrapper[4594]: I1129 06:49:57.403832 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2stlq" event={"ID":"22fa56b1-bbe8-458c-a33a-d5fbbc14f9e3","Type":"ContainerStarted","Data":"8dcdc46fc7245ded0f8a0e3b1e8b3bf8ec1d82bc7e10a832c225818509717422"} Nov 29 06:49:58 crc kubenswrapper[4594]: I1129 06:49:58.415760 4594 generic.go:334] "Generic (PLEG): container finished" podID="22fa56b1-bbe8-458c-a33a-d5fbbc14f9e3" containerID="8dcdc46fc7245ded0f8a0e3b1e8b3bf8ec1d82bc7e10a832c225818509717422" exitCode=0 Nov 29 06:49:58 crc kubenswrapper[4594]: I1129 06:49:58.415874 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2stlq" 
event={"ID":"22fa56b1-bbe8-458c-a33a-d5fbbc14f9e3","Type":"ContainerDied","Data":"8dcdc46fc7245ded0f8a0e3b1e8b3bf8ec1d82bc7e10a832c225818509717422"} Nov 29 06:49:59 crc kubenswrapper[4594]: I1129 06:49:59.427636 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2stlq" event={"ID":"22fa56b1-bbe8-458c-a33a-d5fbbc14f9e3","Type":"ContainerStarted","Data":"9dd59777ffde0089d93f3fea10c9198dce8d187dfde32de245acb40a416ff63a"} Nov 29 06:49:59 crc kubenswrapper[4594]: I1129 06:49:59.450144 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2stlq" podStartSLOduration=1.904764778 podStartE2EDuration="4.450127321s" podCreationTimestamp="2025-11-29 06:49:55 +0000 UTC" firstStartedPulling="2025-11-29 06:49:56.397102586 +0000 UTC m=+4920.637611806" lastFinishedPulling="2025-11-29 06:49:58.942465129 +0000 UTC m=+4923.182974349" observedRunningTime="2025-11-29 06:49:59.442134194 +0000 UTC m=+4923.682643415" watchObservedRunningTime="2025-11-29 06:49:59.450127321 +0000 UTC m=+4923.690636541" Nov 29 06:50:04 crc kubenswrapper[4594]: I1129 06:50:04.533319 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_34453e0ae93d07abc4f6e497f8998de77c1bdd8f20510be6b58912cf3b75mmm_605f6125-2bfd-43a1-b01b-4ea4f391b981/util/0.log" Nov 29 06:50:04 crc kubenswrapper[4594]: I1129 06:50:04.639923 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_34453e0ae93d07abc4f6e497f8998de77c1bdd8f20510be6b58912cf3b75mmm_605f6125-2bfd-43a1-b01b-4ea4f391b981/util/0.log" Nov 29 06:50:04 crc kubenswrapper[4594]: I1129 06:50:04.649463 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_34453e0ae93d07abc4f6e497f8998de77c1bdd8f20510be6b58912cf3b75mmm_605f6125-2bfd-43a1-b01b-4ea4f391b981/pull/0.log" Nov 29 06:50:04 crc kubenswrapper[4594]: I1129 06:50:04.694124 4594 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_34453e0ae93d07abc4f6e497f8998de77c1bdd8f20510be6b58912cf3b75mmm_605f6125-2bfd-43a1-b01b-4ea4f391b981/pull/0.log" Nov 29 06:50:04 crc kubenswrapper[4594]: I1129 06:50:04.830025 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_34453e0ae93d07abc4f6e497f8998de77c1bdd8f20510be6b58912cf3b75mmm_605f6125-2bfd-43a1-b01b-4ea4f391b981/pull/0.log" Nov 29 06:50:04 crc kubenswrapper[4594]: I1129 06:50:04.834629 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_34453e0ae93d07abc4f6e497f8998de77c1bdd8f20510be6b58912cf3b75mmm_605f6125-2bfd-43a1-b01b-4ea4f391b981/util/0.log" Nov 29 06:50:04 crc kubenswrapper[4594]: I1129 06:50:04.842391 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_34453e0ae93d07abc4f6e497f8998de77c1bdd8f20510be6b58912cf3b75mmm_605f6125-2bfd-43a1-b01b-4ea4f391b981/extract/0.log" Nov 29 06:50:04 crc kubenswrapper[4594]: I1129 06:50:04.986837 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-jc4r5_8c96e02a-d3bd-4904-8ade-baecb4c3a280/kube-rbac-proxy/0.log" Nov 29 06:50:05 crc kubenswrapper[4594]: I1129 06:50:05.014527 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-jc4r5_8c96e02a-d3bd-4904-8ade-baecb4c3a280/manager/0.log" Nov 29 06:50:05 crc kubenswrapper[4594]: I1129 06:50:05.040944 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-xlmc8_f50eb91c-7f6c-4d0f-b32a-c9ead1766b9c/kube-rbac-proxy/0.log" Nov 29 06:50:05 crc kubenswrapper[4594]: I1129 06:50:05.171443 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-g6nl9_95648e23-46bf-4160-9527-7ad1c84f9883/kube-rbac-proxy/0.log" Nov 29 06:50:05 crc 
kubenswrapper[4594]: I1129 06:50:05.185693 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-xlmc8_f50eb91c-7f6c-4d0f-b32a-c9ead1766b9c/manager/0.log" Nov 29 06:50:05 crc kubenswrapper[4594]: I1129 06:50:05.232712 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-g6nl9_95648e23-46bf-4160-9527-7ad1c84f9883/manager/0.log" Nov 29 06:50:05 crc kubenswrapper[4594]: I1129 06:50:05.349864 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-668d9c48b9-s254f_e564c6c7-7145-411f-b48f-d8e2594c34a5/kube-rbac-proxy/0.log" Nov 29 06:50:05 crc kubenswrapper[4594]: I1129 06:50:05.404911 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-668d9c48b9-s254f_e564c6c7-7145-411f-b48f-d8e2594c34a5/manager/0.log" Nov 29 06:50:05 crc kubenswrapper[4594]: I1129 06:50:05.421555 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2stlq" Nov 29 06:50:05 crc kubenswrapper[4594]: I1129 06:50:05.422437 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2stlq" Nov 29 06:50:05 crc kubenswrapper[4594]: I1129 06:50:05.464011 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2stlq" Nov 29 06:50:05 crc kubenswrapper[4594]: I1129 06:50:05.513890 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2stlq" Nov 29 06:50:05 crc kubenswrapper[4594]: I1129 06:50:05.523827 4594 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-mcfvv_01841a60-a638-4a78-84d4-01ad474bf2fb/manager/0.log" Nov 29 06:50:05 crc kubenswrapper[4594]: I1129 06:50:05.525652 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-mcfvv_01841a60-a638-4a78-84d4-01ad474bf2fb/kube-rbac-proxy/0.log" Nov 29 06:50:05 crc kubenswrapper[4594]: I1129 06:50:05.633199 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-z9q2v_09d8a0a7-cc55-4654-8e59-a769c806eecf/kube-rbac-proxy/0.log" Nov 29 06:50:05 crc kubenswrapper[4594]: I1129 06:50:05.700763 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2stlq"] Nov 29 06:50:05 crc kubenswrapper[4594]: I1129 06:50:05.706195 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-z9q2v_09d8a0a7-cc55-4654-8e59-a769c806eecf/manager/0.log" Nov 29 06:50:05 crc kubenswrapper[4594]: I1129 06:50:05.768912 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-6c7gr_0ee70d2e-b283-468b-8bd8-016a120b5ae8/kube-rbac-proxy/0.log" Nov 29 06:50:05 crc kubenswrapper[4594]: I1129 06:50:05.894274 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-qgzzn_fb42365c-18e1-4456-ae16-be77a16f102c/kube-rbac-proxy/0.log" Nov 29 06:50:05 crc kubenswrapper[4594]: I1129 06:50:05.923887 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-qgzzn_fb42365c-18e1-4456-ae16-be77a16f102c/manager/0.log" Nov 29 06:50:06 crc kubenswrapper[4594]: I1129 06:50:06.013646 4594 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-6c7gr_0ee70d2e-b283-468b-8bd8-016a120b5ae8/manager/0.log" Nov 29 06:50:06 crc kubenswrapper[4594]: I1129 06:50:06.100220 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-546d4bdf48-ppdt9_fbdf482e-3aa0-4f5c-a698-949ad6cb6992/kube-rbac-proxy/0.log" Nov 29 06:50:06 crc kubenswrapper[4594]: I1129 06:50:06.172353 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-546d4bdf48-ppdt9_fbdf482e-3aa0-4f5c-a698-949ad6cb6992/manager/0.log" Nov 29 06:50:06 crc kubenswrapper[4594]: I1129 06:50:06.248276 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6546668bfd-5bng6_61fc53be-18c3-48bb-9a4a-2557df78afc7/kube-rbac-proxy/0.log" Nov 29 06:50:06 crc kubenswrapper[4594]: I1129 06:50:06.249868 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6546668bfd-5bng6_61fc53be-18c3-48bb-9a4a-2557df78afc7/manager/0.log" Nov 29 06:50:06 crc kubenswrapper[4594]: I1129 06:50:06.398159 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-48tfk_a1b0453c-c84d-45ea-90be-7f01a831f987/kube-rbac-proxy/0.log" Nov 29 06:50:06 crc kubenswrapper[4594]: I1129 06:50:06.438764 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-48tfk_a1b0453c-c84d-45ea-90be-7f01a831f987/manager/0.log" Nov 29 06:50:06 crc kubenswrapper[4594]: I1129 06:50:06.504783 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-d84hx_2de2c23f-39bd-4a9d-9965-7fe280b61707/kube-rbac-proxy/0.log" Nov 29 06:50:06 crc kubenswrapper[4594]: I1129 
06:50:06.589666 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-d84hx_2de2c23f-39bd-4a9d-9965-7fe280b61707/manager/0.log" Nov 29 06:50:06 crc kubenswrapper[4594]: I1129 06:50:06.633962 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-7fmfc_7d8912e8-8f81-4ea6-94c2-7e56c7726e58/kube-rbac-proxy/0.log" Nov 29 06:50:06 crc kubenswrapper[4594]: I1129 06:50:06.730501 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-7fmfc_7d8912e8-8f81-4ea6-94c2-7e56c7726e58/manager/0.log" Nov 29 06:50:06 crc kubenswrapper[4594]: I1129 06:50:06.760164 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-245g8_f9ab855a-f938-4ad6-941a-52f4e5b7d4b2/kube-rbac-proxy/0.log" Nov 29 06:50:06 crc kubenswrapper[4594]: I1129 06:50:06.804305 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-245g8_f9ab855a-f938-4ad6-941a-52f4e5b7d4b2/manager/0.log" Nov 29 06:50:06 crc kubenswrapper[4594]: I1129 06:50:06.939636 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6698bcb446zrvcn_82d8e084-bca0-43f0-9d6f-63df84cd28a6/manager/0.log" Nov 29 06:50:06 crc kubenswrapper[4594]: I1129 06:50:06.950370 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6698bcb446zrvcn_82d8e084-bca0-43f0-9d6f-63df84cd28a6/kube-rbac-proxy/0.log" Nov 29 06:50:07 crc kubenswrapper[4594]: I1129 06:50:07.365690 4594 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-6ddddd9d6f-7q4j2_880b9d6a-5dc6-448b-a63c-b098fcc54023/operator/0.log" Nov 29 06:50:07 crc kubenswrapper[4594]: I1129 06:50:07.447218 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-zmbtt_9d8ef423-1563-4fda-92d3-dcbd15f10b13/registry-server/0.log" Nov 29 06:50:07 crc kubenswrapper[4594]: I1129 06:50:07.492573 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2stlq" podUID="22fa56b1-bbe8-458c-a33a-d5fbbc14f9e3" containerName="registry-server" containerID="cri-o://9dd59777ffde0089d93f3fea10c9198dce8d187dfde32de245acb40a416ff63a" gracePeriod=2 Nov 29 06:50:07 crc kubenswrapper[4594]: I1129 06:50:07.567580 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-pclb6_460651bc-3f62-4bc6-ab53-e791ea16993e/kube-rbac-proxy/0.log" Nov 29 06:50:07 crc kubenswrapper[4594]: I1129 06:50:07.803075 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-pclb6_460651bc-3f62-4bc6-ab53-e791ea16993e/manager/0.log" Nov 29 06:50:07 crc kubenswrapper[4594]: I1129 06:50:07.847293 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-7n6mf_5cde207a-7c2e-46d9-809a-72b8749560a6/kube-rbac-proxy/0.log" Nov 29 06:50:07 crc kubenswrapper[4594]: I1129 06:50:07.968189 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-7n6mf_5cde207a-7c2e-46d9-809a-72b8749560a6/manager/0.log" Nov 29 06:50:07 crc kubenswrapper[4594]: I1129 06:50:07.990517 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2stlq" Nov 29 06:50:07 crc kubenswrapper[4594]: I1129 06:50:07.998276 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22fa56b1-bbe8-458c-a33a-d5fbbc14f9e3-utilities\") pod \"22fa56b1-bbe8-458c-a33a-d5fbbc14f9e3\" (UID: \"22fa56b1-bbe8-458c-a33a-d5fbbc14f9e3\") " Nov 29 06:50:07 crc kubenswrapper[4594]: I1129 06:50:07.998314 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22fa56b1-bbe8-458c-a33a-d5fbbc14f9e3-catalog-content\") pod \"22fa56b1-bbe8-458c-a33a-d5fbbc14f9e3\" (UID: \"22fa56b1-bbe8-458c-a33a-d5fbbc14f9e3\") " Nov 29 06:50:07 crc kubenswrapper[4594]: I1129 06:50:07.998609 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tpc4v\" (UniqueName: \"kubernetes.io/projected/22fa56b1-bbe8-458c-a33a-d5fbbc14f9e3-kube-api-access-tpc4v\") pod \"22fa56b1-bbe8-458c-a33a-d5fbbc14f9e3\" (UID: \"22fa56b1-bbe8-458c-a33a-d5fbbc14f9e3\") " Nov 29 06:50:07 crc kubenswrapper[4594]: I1129 06:50:07.998942 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22fa56b1-bbe8-458c-a33a-d5fbbc14f9e3-utilities" (OuterVolumeSpecName: "utilities") pod "22fa56b1-bbe8-458c-a33a-d5fbbc14f9e3" (UID: "22fa56b1-bbe8-458c-a33a-d5fbbc14f9e3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 06:50:08 crc kubenswrapper[4594]: I1129 06:50:08.022881 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22fa56b1-bbe8-458c-a33a-d5fbbc14f9e3-kube-api-access-tpc4v" (OuterVolumeSpecName: "kube-api-access-tpc4v") pod "22fa56b1-bbe8-458c-a33a-d5fbbc14f9e3" (UID: "22fa56b1-bbe8-458c-a33a-d5fbbc14f9e3"). InnerVolumeSpecName "kube-api-access-tpc4v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 06:50:08 crc kubenswrapper[4594]: I1129 06:50:08.047779 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22fa56b1-bbe8-458c-a33a-d5fbbc14f9e3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "22fa56b1-bbe8-458c-a33a-d5fbbc14f9e3" (UID: "22fa56b1-bbe8-458c-a33a-d5fbbc14f9e3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 06:50:08 crc kubenswrapper[4594]: I1129 06:50:08.102293 4594 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22fa56b1-bbe8-458c-a33a-d5fbbc14f9e3-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 06:50:08 crc kubenswrapper[4594]: I1129 06:50:08.102326 4594 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22fa56b1-bbe8-458c-a33a-d5fbbc14f9e3-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 06:50:08 crc kubenswrapper[4594]: I1129 06:50:08.102336 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tpc4v\" (UniqueName: \"kubernetes.io/projected/22fa56b1-bbe8-458c-a33a-d5fbbc14f9e3-kube-api-access-tpc4v\") on node \"crc\" DevicePath \"\"" Nov 29 06:50:08 crc kubenswrapper[4594]: I1129 06:50:08.144671 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-4mk5r_3fdff03c-acc7-4274-bb06-83abd0f7b432/operator/0.log" Nov 29 06:50:08 crc kubenswrapper[4594]: I1129 06:50:08.242857 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-dd8x6_d8cfbdeb-cd3d-425e-a96c-d9a565c840c3/kube-rbac-proxy/0.log" Nov 29 06:50:08 crc kubenswrapper[4594]: I1129 06:50:08.278819 4594 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-dd8x6_d8cfbdeb-cd3d-425e-a96c-d9a565c840c3/manager/0.log" Nov 29 06:50:08 crc kubenswrapper[4594]: I1129 06:50:08.357458 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-656fd97d56-fcmzw_505c6e79-1776-4995-a6b5-5888f75c141c/manager/0.log" Nov 29 06:50:08 crc kubenswrapper[4594]: I1129 06:50:08.396176 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-zccgx_b818ec3c-b3d3-4c0b-b2d0-dc75c1e0a4c0/kube-rbac-proxy/0.log" Nov 29 06:50:08 crc kubenswrapper[4594]: I1129 06:50:08.502879 4594 generic.go:334] "Generic (PLEG): container finished" podID="22fa56b1-bbe8-458c-a33a-d5fbbc14f9e3" containerID="9dd59777ffde0089d93f3fea10c9198dce8d187dfde32de245acb40a416ff63a" exitCode=0 Nov 29 06:50:08 crc kubenswrapper[4594]: I1129 06:50:08.502921 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2stlq" event={"ID":"22fa56b1-bbe8-458c-a33a-d5fbbc14f9e3","Type":"ContainerDied","Data":"9dd59777ffde0089d93f3fea10c9198dce8d187dfde32de245acb40a416ff63a"} Nov 29 06:50:08 crc kubenswrapper[4594]: I1129 06:50:08.502954 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2stlq" event={"ID":"22fa56b1-bbe8-458c-a33a-d5fbbc14f9e3","Type":"ContainerDied","Data":"e26512fc5061d9cf048ef6f8ac774d92e1fe1f2b0b8824b29ed98d18de87bc71"} Nov 29 06:50:08 crc kubenswrapper[4594]: I1129 06:50:08.502975 4594 scope.go:117] "RemoveContainer" containerID="9dd59777ffde0089d93f3fea10c9198dce8d187dfde32de245acb40a416ff63a" Nov 29 06:50:08 crc kubenswrapper[4594]: I1129 06:50:08.503097 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2stlq" Nov 29 06:50:08 crc kubenswrapper[4594]: I1129 06:50:08.539482 4594 scope.go:117] "RemoveContainer" containerID="8dcdc46fc7245ded0f8a0e3b1e8b3bf8ec1d82bc7e10a832c225818509717422" Nov 29 06:50:08 crc kubenswrapper[4594]: I1129 06:50:08.542322 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2stlq"] Nov 29 06:50:08 crc kubenswrapper[4594]: I1129 06:50:08.558577 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2stlq"] Nov 29 06:50:08 crc kubenswrapper[4594]: I1129 06:50:08.570385 4594 scope.go:117] "RemoveContainer" containerID="c6b308878b35a6dc8f4512a4da02ebb48fba792b6569261c7cb9de0550abbdf3" Nov 29 06:50:08 crc kubenswrapper[4594]: I1129 06:50:08.623227 4594 scope.go:117] "RemoveContainer" containerID="9dd59777ffde0089d93f3fea10c9198dce8d187dfde32de245acb40a416ff63a" Nov 29 06:50:08 crc kubenswrapper[4594]: E1129 06:50:08.626653 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9dd59777ffde0089d93f3fea10c9198dce8d187dfde32de245acb40a416ff63a\": container with ID starting with 9dd59777ffde0089d93f3fea10c9198dce8d187dfde32de245acb40a416ff63a not found: ID does not exist" containerID="9dd59777ffde0089d93f3fea10c9198dce8d187dfde32de245acb40a416ff63a" Nov 29 06:50:08 crc kubenswrapper[4594]: I1129 06:50:08.626708 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9dd59777ffde0089d93f3fea10c9198dce8d187dfde32de245acb40a416ff63a"} err="failed to get container status \"9dd59777ffde0089d93f3fea10c9198dce8d187dfde32de245acb40a416ff63a\": rpc error: code = NotFound desc = could not find container \"9dd59777ffde0089d93f3fea10c9198dce8d187dfde32de245acb40a416ff63a\": container with ID starting with 9dd59777ffde0089d93f3fea10c9198dce8d187dfde32de245acb40a416ff63a not 
found: ID does not exist" Nov 29 06:50:08 crc kubenswrapper[4594]: I1129 06:50:08.626742 4594 scope.go:117] "RemoveContainer" containerID="8dcdc46fc7245ded0f8a0e3b1e8b3bf8ec1d82bc7e10a832c225818509717422" Nov 29 06:50:08 crc kubenswrapper[4594]: E1129 06:50:08.627103 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8dcdc46fc7245ded0f8a0e3b1e8b3bf8ec1d82bc7e10a832c225818509717422\": container with ID starting with 8dcdc46fc7245ded0f8a0e3b1e8b3bf8ec1d82bc7e10a832c225818509717422 not found: ID does not exist" containerID="8dcdc46fc7245ded0f8a0e3b1e8b3bf8ec1d82bc7e10a832c225818509717422" Nov 29 06:50:08 crc kubenswrapper[4594]: I1129 06:50:08.627159 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8dcdc46fc7245ded0f8a0e3b1e8b3bf8ec1d82bc7e10a832c225818509717422"} err="failed to get container status \"8dcdc46fc7245ded0f8a0e3b1e8b3bf8ec1d82bc7e10a832c225818509717422\": rpc error: code = NotFound desc = could not find container \"8dcdc46fc7245ded0f8a0e3b1e8b3bf8ec1d82bc7e10a832c225818509717422\": container with ID starting with 8dcdc46fc7245ded0f8a0e3b1e8b3bf8ec1d82bc7e10a832c225818509717422 not found: ID does not exist" Nov 29 06:50:08 crc kubenswrapper[4594]: I1129 06:50:08.627190 4594 scope.go:117] "RemoveContainer" containerID="c6b308878b35a6dc8f4512a4da02ebb48fba792b6569261c7cb9de0550abbdf3" Nov 29 06:50:08 crc kubenswrapper[4594]: E1129 06:50:08.627604 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6b308878b35a6dc8f4512a4da02ebb48fba792b6569261c7cb9de0550abbdf3\": container with ID starting with c6b308878b35a6dc8f4512a4da02ebb48fba792b6569261c7cb9de0550abbdf3 not found: ID does not exist" containerID="c6b308878b35a6dc8f4512a4da02ebb48fba792b6569261c7cb9de0550abbdf3" Nov 29 06:50:08 crc kubenswrapper[4594]: I1129 06:50:08.627653 4594 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6b308878b35a6dc8f4512a4da02ebb48fba792b6569261c7cb9de0550abbdf3"} err="failed to get container status \"c6b308878b35a6dc8f4512a4da02ebb48fba792b6569261c7cb9de0550abbdf3\": rpc error: code = NotFound desc = could not find container \"c6b308878b35a6dc8f4512a4da02ebb48fba792b6569261c7cb9de0550abbdf3\": container with ID starting with c6b308878b35a6dc8f4512a4da02ebb48fba792b6569261c7cb9de0550abbdf3 not found: ID does not exist" Nov 29 06:50:08 crc kubenswrapper[4594]: I1129 06:50:08.777343 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-r28h2_9e1ee1b6-8684-4fcb-a26c-fd85a950abcc/kube-rbac-proxy/0.log" Nov 29 06:50:08 crc kubenswrapper[4594]: I1129 06:50:08.795268 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-r28h2_9e1ee1b6-8684-4fcb-a26c-fd85a950abcc/manager/0.log" Nov 29 06:50:08 crc kubenswrapper[4594]: I1129 06:50:08.803368 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-zccgx_b818ec3c-b3d3-4c0b-b2d0-dc75c1e0a4c0/manager/0.log" Nov 29 06:50:08 crc kubenswrapper[4594]: I1129 06:50:08.936199 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-6gxvv_8a5e35f1-00df-4307-b197-f7800c641af7/kube-rbac-proxy/0.log" Nov 29 06:50:08 crc kubenswrapper[4594]: I1129 06:50:08.982837 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-6gxvv_8a5e35f1-00df-4307-b197-f7800c641af7/manager/0.log" Nov 29 06:50:10 crc kubenswrapper[4594]: I1129 06:50:10.096488 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22fa56b1-bbe8-458c-a33a-d5fbbc14f9e3" 
path="/var/lib/kubelet/pods/22fa56b1-bbe8-458c-a33a-d5fbbc14f9e3/volumes" Nov 29 06:50:25 crc kubenswrapper[4594]: I1129 06:50:25.173794 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-gspdt_1fea2129-7ad0-45d8-9447-315107ef1c0c/control-plane-machine-set-operator/0.log" Nov 29 06:50:25 crc kubenswrapper[4594]: I1129 06:50:25.295118 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-f9dqp_09980e91-b2e4-4a0e-bee7-dc101096f804/kube-rbac-proxy/0.log" Nov 29 06:50:25 crc kubenswrapper[4594]: I1129 06:50:25.346203 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-f9dqp_09980e91-b2e4-4a0e-bee7-dc101096f804/machine-api-operator/0.log" Nov 29 06:50:35 crc kubenswrapper[4594]: I1129 06:50:35.372091 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-4z2w6_edf7ce03-4be9-42d1-8a58-e0f132c43299/cert-manager-controller/0.log" Nov 29 06:50:35 crc kubenswrapper[4594]: I1129 06:50:35.538561 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-btrfm_f4838603-21ba-451a-a9d4-d5415bc4b52a/cert-manager-cainjector/0.log" Nov 29 06:50:35 crc kubenswrapper[4594]: I1129 06:50:35.547666 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-4srms_f288eb1f-58a0-4e9b-9b63-ba15bedb38ec/cert-manager-webhook/0.log" Nov 29 06:50:46 crc kubenswrapper[4594]: I1129 06:50:46.683662 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-kj52s_060d894f-3918-4f8c-8b70-33d7e18b316d/nmstate-console-plugin/0.log" Nov 29 06:50:46 crc kubenswrapper[4594]: I1129 06:50:46.852682 4594 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-lp45r_4eb3a18a-b943-4996-8977-8c442eca7e9e/kube-rbac-proxy/0.log" Nov 29 06:50:46 crc kubenswrapper[4594]: I1129 06:50:46.859146 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-lp45r_4eb3a18a-b943-4996-8977-8c442eca7e9e/nmstate-metrics/0.log" Nov 29 06:50:46 crc kubenswrapper[4594]: I1129 06:50:46.867878 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-szl2t_92357f02-ea29-48b6-b763-f6e1b8ca3457/nmstate-handler/0.log" Nov 29 06:50:47 crc kubenswrapper[4594]: I1129 06:50:47.005566 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-gp4lk_ae0a89d0-92c1-4884-a38f-34cf97da3de5/nmstate-operator/0.log" Nov 29 06:50:47 crc kubenswrapper[4594]: I1129 06:50:47.060032 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-7g6km_75ccbb22-479b-4415-aa0d-c00853a463ee/nmstate-webhook/0.log" Nov 29 06:51:00 crc kubenswrapper[4594]: I1129 06:51:00.760484 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-44rgs_c67ed7c7-a9cb-4068-80af-9356fd171e31/kube-rbac-proxy/0.log" Nov 29 06:51:00 crc kubenswrapper[4594]: I1129 06:51:00.895709 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-44rgs_c67ed7c7-a9cb-4068-80af-9356fd171e31/controller/0.log" Nov 29 06:51:00 crc kubenswrapper[4594]: I1129 06:51:00.941334 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cmsgj_dd301093-7d62-4edf-8811-4f7529bba358/cp-frr-files/0.log" Nov 29 06:51:01 crc kubenswrapper[4594]: I1129 06:51:01.199744 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cmsgj_dd301093-7d62-4edf-8811-4f7529bba358/cp-reloader/0.log" Nov 29 06:51:01 crc kubenswrapper[4594]: I1129 
06:51:01.200901 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cmsgj_dd301093-7d62-4edf-8811-4f7529bba358/cp-frr-files/0.log" Nov 29 06:51:01 crc kubenswrapper[4594]: I1129 06:51:01.235788 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cmsgj_dd301093-7d62-4edf-8811-4f7529bba358/cp-metrics/0.log" Nov 29 06:51:01 crc kubenswrapper[4594]: I1129 06:51:01.244055 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cmsgj_dd301093-7d62-4edf-8811-4f7529bba358/cp-reloader/0.log" Nov 29 06:51:01 crc kubenswrapper[4594]: I1129 06:51:01.437726 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cmsgj_dd301093-7d62-4edf-8811-4f7529bba358/cp-reloader/0.log" Nov 29 06:51:01 crc kubenswrapper[4594]: I1129 06:51:01.472991 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cmsgj_dd301093-7d62-4edf-8811-4f7529bba358/cp-frr-files/0.log" Nov 29 06:51:01 crc kubenswrapper[4594]: I1129 06:51:01.479681 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cmsgj_dd301093-7d62-4edf-8811-4f7529bba358/cp-metrics/0.log" Nov 29 06:51:01 crc kubenswrapper[4594]: I1129 06:51:01.493130 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cmsgj_dd301093-7d62-4edf-8811-4f7529bba358/cp-metrics/0.log" Nov 29 06:51:01 crc kubenswrapper[4594]: I1129 06:51:01.956565 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cmsgj_dd301093-7d62-4edf-8811-4f7529bba358/controller/0.log" Nov 29 06:51:01 crc kubenswrapper[4594]: I1129 06:51:01.962847 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cmsgj_dd301093-7d62-4edf-8811-4f7529bba358/cp-frr-files/0.log" Nov 29 06:51:01 crc kubenswrapper[4594]: I1129 06:51:01.989300 4594 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-cmsgj_dd301093-7d62-4edf-8811-4f7529bba358/cp-reloader/0.log" Nov 29 06:51:01 crc kubenswrapper[4594]: I1129 06:51:01.998402 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cmsgj_dd301093-7d62-4edf-8811-4f7529bba358/cp-metrics/0.log" Nov 29 06:51:02 crc kubenswrapper[4594]: I1129 06:51:02.136904 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cmsgj_dd301093-7d62-4edf-8811-4f7529bba358/frr-metrics/0.log" Nov 29 06:51:02 crc kubenswrapper[4594]: I1129 06:51:02.179235 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cmsgj_dd301093-7d62-4edf-8811-4f7529bba358/kube-rbac-proxy-frr/0.log" Nov 29 06:51:02 crc kubenswrapper[4594]: I1129 06:51:02.201994 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cmsgj_dd301093-7d62-4edf-8811-4f7529bba358/kube-rbac-proxy/0.log" Nov 29 06:51:02 crc kubenswrapper[4594]: I1129 06:51:02.379951 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cmsgj_dd301093-7d62-4edf-8811-4f7529bba358/reloader/0.log" Nov 29 06:51:02 crc kubenswrapper[4594]: I1129 06:51:02.464982 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-9bwd7_69e15172-74bf-4295-a7a9-a7843b1da728/frr-k8s-webhook-server/0.log" Nov 29 06:51:02 crc kubenswrapper[4594]: I1129 06:51:02.562359 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-79b995c45-klk7s_a1369887-80c2-44ef-b566-f30184ea9607/manager/0.log" Nov 29 06:51:02 crc kubenswrapper[4594]: I1129 06:51:02.739633 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-7667cbc88-mqfqp_ce5a6997-d8a6-489f-9bcf-d77879d7ad46/webhook-server/0.log" Nov 29 06:51:03 crc kubenswrapper[4594]: I1129 06:51:03.068143 4594 log.go:25] 
"Finished parsing log file" path="/var/log/pods/metallb-system_speaker-5fh4t_a2f948fa-edac-4ac6-9ffb-e5ee886f8164/kube-rbac-proxy/0.log" Nov 29 06:51:03 crc kubenswrapper[4594]: I1129 06:51:03.578620 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-5fh4t_a2f948fa-edac-4ac6-9ffb-e5ee886f8164/speaker/0.log" Nov 29 06:51:03 crc kubenswrapper[4594]: I1129 06:51:03.623052 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cmsgj_dd301093-7d62-4edf-8811-4f7529bba358/frr/0.log" Nov 29 06:51:13 crc kubenswrapper[4594]: I1129 06:51:13.736484 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fsn6z4_837b6915-2ae6-4f64-af4f-029c8d1012d3/util/0.log" Nov 29 06:51:13 crc kubenswrapper[4594]: I1129 06:51:13.914002 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fsn6z4_837b6915-2ae6-4f64-af4f-029c8d1012d3/util/0.log" Nov 29 06:51:13 crc kubenswrapper[4594]: I1129 06:51:13.950276 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fsn6z4_837b6915-2ae6-4f64-af4f-029c8d1012d3/pull/0.log" Nov 29 06:51:13 crc kubenswrapper[4594]: I1129 06:51:13.981297 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fsn6z4_837b6915-2ae6-4f64-af4f-029c8d1012d3/pull/0.log" Nov 29 06:51:14 crc kubenswrapper[4594]: I1129 06:51:14.101945 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fsn6z4_837b6915-2ae6-4f64-af4f-029c8d1012d3/util/0.log" Nov 29 06:51:14 crc kubenswrapper[4594]: I1129 06:51:14.125357 4594 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fsn6z4_837b6915-2ae6-4f64-af4f-029c8d1012d3/extract/0.log" Nov 29 06:51:14 crc kubenswrapper[4594]: I1129 06:51:14.167738 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fsn6z4_837b6915-2ae6-4f64-af4f-029c8d1012d3/pull/0.log" Nov 29 06:51:14 crc kubenswrapper[4594]: I1129 06:51:14.241765 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jzp72_3b9451fa-96ca-42c8-888f-beba143e0850/util/0.log" Nov 29 06:51:14 crc kubenswrapper[4594]: I1129 06:51:14.386752 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jzp72_3b9451fa-96ca-42c8-888f-beba143e0850/util/0.log" Nov 29 06:51:14 crc kubenswrapper[4594]: I1129 06:51:14.423464 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jzp72_3b9451fa-96ca-42c8-888f-beba143e0850/pull/0.log" Nov 29 06:51:14 crc kubenswrapper[4594]: I1129 06:51:14.437892 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jzp72_3b9451fa-96ca-42c8-888f-beba143e0850/pull/0.log" Nov 29 06:51:14 crc kubenswrapper[4594]: I1129 06:51:14.585578 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jzp72_3b9451fa-96ca-42c8-888f-beba143e0850/util/0.log" Nov 29 06:51:14 crc kubenswrapper[4594]: I1129 06:51:14.600408 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jzp72_3b9451fa-96ca-42c8-888f-beba143e0850/pull/0.log" Nov 29 
06:51:14 crc kubenswrapper[4594]: I1129 06:51:14.609459 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210jzp72_3b9451fa-96ca-42c8-888f-beba143e0850/extract/0.log" Nov 29 06:51:14 crc kubenswrapper[4594]: I1129 06:51:14.749926 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tbj4s_7a1d137f-f3e9-4543-966c-f5cfe3b3360d/util/0.log" Nov 29 06:51:14 crc kubenswrapper[4594]: I1129 06:51:14.899492 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tbj4s_7a1d137f-f3e9-4543-966c-f5cfe3b3360d/pull/0.log" Nov 29 06:51:14 crc kubenswrapper[4594]: I1129 06:51:14.929316 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tbj4s_7a1d137f-f3e9-4543-966c-f5cfe3b3360d/util/0.log" Nov 29 06:51:14 crc kubenswrapper[4594]: I1129 06:51:14.943015 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tbj4s_7a1d137f-f3e9-4543-966c-f5cfe3b3360d/pull/0.log" Nov 29 06:51:15 crc kubenswrapper[4594]: I1129 06:51:15.090289 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tbj4s_7a1d137f-f3e9-4543-966c-f5cfe3b3360d/pull/0.log" Nov 29 06:51:15 crc kubenswrapper[4594]: I1129 06:51:15.099696 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tbj4s_7a1d137f-f3e9-4543-966c-f5cfe3b3360d/util/0.log" Nov 29 06:51:15 crc kubenswrapper[4594]: I1129 06:51:15.112674 4594 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tbj4s_7a1d137f-f3e9-4543-966c-f5cfe3b3360d/extract/0.log" Nov 29 06:51:15 crc kubenswrapper[4594]: I1129 06:51:15.305708 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-kfzdq_fa62a8c9-aa8c-42d9-a634-f3aea0992e00/extract-utilities/0.log" Nov 29 06:51:15 crc kubenswrapper[4594]: I1129 06:51:15.434964 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-kfzdq_fa62a8c9-aa8c-42d9-a634-f3aea0992e00/extract-content/0.log" Nov 29 06:51:15 crc kubenswrapper[4594]: I1129 06:51:15.439074 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-kfzdq_fa62a8c9-aa8c-42d9-a634-f3aea0992e00/extract-utilities/0.log" Nov 29 06:51:15 crc kubenswrapper[4594]: I1129 06:51:15.439680 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-kfzdq_fa62a8c9-aa8c-42d9-a634-f3aea0992e00/extract-content/0.log" Nov 29 06:51:15 crc kubenswrapper[4594]: I1129 06:51:15.587439 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-kfzdq_fa62a8c9-aa8c-42d9-a634-f3aea0992e00/extract-content/0.log" Nov 29 06:51:15 crc kubenswrapper[4594]: I1129 06:51:15.626424 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-kfzdq_fa62a8c9-aa8c-42d9-a634-f3aea0992e00/extract-utilities/0.log" Nov 29 06:51:15 crc kubenswrapper[4594]: I1129 06:51:15.792240 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hqrxt_b9f1bdf8-188a-4fdb-bf69-41579c5827ce/extract-utilities/0.log" Nov 29 06:51:16 crc kubenswrapper[4594]: I1129 06:51:16.013421 4594 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-hqrxt_b9f1bdf8-188a-4fdb-bf69-41579c5827ce/extract-utilities/0.log" Nov 29 06:51:16 crc kubenswrapper[4594]: I1129 06:51:16.093810 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hqrxt_b9f1bdf8-188a-4fdb-bf69-41579c5827ce/extract-content/0.log" Nov 29 06:51:16 crc kubenswrapper[4594]: I1129 06:51:16.155955 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hqrxt_b9f1bdf8-188a-4fdb-bf69-41579c5827ce/extract-content/0.log" Nov 29 06:51:16 crc kubenswrapper[4594]: I1129 06:51:16.183243 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-kfzdq_fa62a8c9-aa8c-42d9-a634-f3aea0992e00/registry-server/0.log" Nov 29 06:51:16 crc kubenswrapper[4594]: I1129 06:51:16.256650 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hqrxt_b9f1bdf8-188a-4fdb-bf69-41579c5827ce/extract-utilities/0.log" Nov 29 06:51:16 crc kubenswrapper[4594]: I1129 06:51:16.341318 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hqrxt_b9f1bdf8-188a-4fdb-bf69-41579c5827ce/extract-content/0.log" Nov 29 06:51:16 crc kubenswrapper[4594]: I1129 06:51:16.618554 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hqrxt_b9f1bdf8-188a-4fdb-bf69-41579c5827ce/registry-server/0.log" Nov 29 06:51:16 crc kubenswrapper[4594]: I1129 06:51:16.664343 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-clxws_8b4bf1cb-440a-4e74-82c4-c122e9985bf3/marketplace-operator/0.log" Nov 29 06:51:16 crc kubenswrapper[4594]: I1129 06:51:16.771706 4594 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-bk4kp_9bf183a2-75c4-4800-a888-f41d978b1c1d/extract-utilities/0.log" Nov 29 06:51:16 crc kubenswrapper[4594]: I1129 06:51:16.917067 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bk4kp_9bf183a2-75c4-4800-a888-f41d978b1c1d/extract-utilities/0.log" Nov 29 06:51:16 crc kubenswrapper[4594]: I1129 06:51:16.927729 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bk4kp_9bf183a2-75c4-4800-a888-f41d978b1c1d/extract-content/0.log" Nov 29 06:51:16 crc kubenswrapper[4594]: I1129 06:51:16.940598 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bk4kp_9bf183a2-75c4-4800-a888-f41d978b1c1d/extract-content/0.log" Nov 29 06:51:17 crc kubenswrapper[4594]: I1129 06:51:17.055739 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bk4kp_9bf183a2-75c4-4800-a888-f41d978b1c1d/extract-utilities/0.log" Nov 29 06:51:17 crc kubenswrapper[4594]: I1129 06:51:17.098279 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bk4kp_9bf183a2-75c4-4800-a888-f41d978b1c1d/extract-content/0.log" Nov 29 06:51:17 crc kubenswrapper[4594]: I1129 06:51:17.188892 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9cznn_2f6c6bde-10f1-4dbf-863a-153a95f825b7/extract-utilities/0.log" Nov 29 06:51:17 crc kubenswrapper[4594]: I1129 06:51:17.236678 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bk4kp_9bf183a2-75c4-4800-a888-f41d978b1c1d/registry-server/0.log" Nov 29 06:51:17 crc kubenswrapper[4594]: I1129 06:51:17.335836 4594 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-9cznn_2f6c6bde-10f1-4dbf-863a-153a95f825b7/extract-content/0.log" Nov 29 06:51:17 crc kubenswrapper[4594]: I1129 06:51:17.352266 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9cznn_2f6c6bde-10f1-4dbf-863a-153a95f825b7/extract-content/0.log" Nov 29 06:51:17 crc kubenswrapper[4594]: I1129 06:51:17.352310 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9cznn_2f6c6bde-10f1-4dbf-863a-153a95f825b7/extract-utilities/0.log" Nov 29 06:51:17 crc kubenswrapper[4594]: I1129 06:51:17.503073 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9cznn_2f6c6bde-10f1-4dbf-863a-153a95f825b7/extract-content/0.log" Nov 29 06:51:17 crc kubenswrapper[4594]: I1129 06:51:17.549697 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9cznn_2f6c6bde-10f1-4dbf-863a-153a95f825b7/extract-utilities/0.log" Nov 29 06:51:18 crc kubenswrapper[4594]: I1129 06:51:18.058514 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9cznn_2f6c6bde-10f1-4dbf-863a-153a95f825b7/registry-server/0.log" Nov 29 06:51:29 crc kubenswrapper[4594]: I1129 06:51:29.881600 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-668cf9dfbb-lj6hq_862c89f5-7442-4e20-8677-ef780f71545d/prometheus-operator/0.log" Nov 29 06:51:30 crc kubenswrapper[4594]: I1129 06:51:30.006491 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-5bd94b577b-2p9zv_8ec2d68d-f88d-411c-9790-4fc800a02905/prometheus-operator-admission-webhook/0.log" Nov 29 06:51:30 crc kubenswrapper[4594]: I1129 06:51:30.051219 4594 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-5bd94b577b-8kh5q_0283828f-9a3f-4c00-8409-a49231f3b953/prometheus-operator-admission-webhook/0.log" Nov 29 06:51:30 crc kubenswrapper[4594]: I1129 06:51:30.186978 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-d8bb48f5d-l25w2_57d47f4c-b8cb-4c20-9adb-2e9190e48f82/operator/0.log" Nov 29 06:51:30 crc kubenswrapper[4594]: I1129 06:51:30.241497 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5446b9c989-rt8w5_ad950138-e417-48eb-a3e9-a5c575d4507f/perses-operator/0.log" Nov 29 06:51:45 crc kubenswrapper[4594]: I1129 06:51:45.799762 4594 patch_prober.go:28] interesting pod/machine-config-daemon-ggz4n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 06:51:45 crc kubenswrapper[4594]: I1129 06:51:45.800321 4594 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 06:51:48 crc kubenswrapper[4594]: E1129 06:51:48.438536 4594 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 192.168.25.120:42398->192.168.25.120:45015: write tcp 192.168.25.120:42398->192.168.25.120:45015: write: connection reset by peer Nov 29 06:52:15 crc kubenswrapper[4594]: I1129 06:52:15.800125 4594 patch_prober.go:28] interesting pod/machine-config-daemon-ggz4n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" start-of-body= Nov 29 06:52:15 crc kubenswrapper[4594]: I1129 06:52:15.800719 4594 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 06:52:45 crc kubenswrapper[4594]: I1129 06:52:45.801007 4594 patch_prober.go:28] interesting pod/machine-config-daemon-ggz4n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 06:52:45 crc kubenswrapper[4594]: I1129 06:52:45.801733 4594 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 06:52:45 crc kubenswrapper[4594]: I1129 06:52:45.801805 4594 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" Nov 29 06:52:45 crc kubenswrapper[4594]: I1129 06:52:45.803050 4594 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"dc8dbe12cdc8fe8bc0472ec5e2f11705190226ebe04481175e6560c74d7fe05c"} pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 29 06:52:45 crc kubenswrapper[4594]: I1129 06:52:45.803124 4594 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" containerName="machine-config-daemon" containerID="cri-o://dc8dbe12cdc8fe8bc0472ec5e2f11705190226ebe04481175e6560c74d7fe05c" gracePeriod=600 Nov 29 06:52:46 crc kubenswrapper[4594]: I1129 06:52:46.029108 4594 generic.go:334] "Generic (PLEG): container finished" podID="509844d4-3ee1-4059-84bb-6e90200f50c5" containerID="dc8dbe12cdc8fe8bc0472ec5e2f11705190226ebe04481175e6560c74d7fe05c" exitCode=0 Nov 29 06:52:46 crc kubenswrapper[4594]: I1129 06:52:46.029465 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" event={"ID":"509844d4-3ee1-4059-84bb-6e90200f50c5","Type":"ContainerDied","Data":"dc8dbe12cdc8fe8bc0472ec5e2f11705190226ebe04481175e6560c74d7fe05c"} Nov 29 06:52:46 crc kubenswrapper[4594]: I1129 06:52:46.030023 4594 scope.go:117] "RemoveContainer" containerID="ca7803bff64492152d5cd81c155f02681d925d0c3977555e80faed8cfe165f84" Nov 29 06:52:47 crc kubenswrapper[4594]: I1129 06:52:47.042572 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" event={"ID":"509844d4-3ee1-4059-84bb-6e90200f50c5","Type":"ContainerStarted","Data":"5ffe7072cba89740daf4243dca1380060d3742cb8a549792f7039027015a7e5a"} Nov 29 06:53:08 crc kubenswrapper[4594]: I1129 06:53:08.254229 4594 generic.go:334] "Generic (PLEG): container finished" podID="f9ad12f5-f7c6-43ed-9192-e963371115d4" containerID="404ba45c83a5184bc822c6899fa0a86965a5eab8a40e8e8cf91f5e87e1aaef69" exitCode=0 Nov 29 06:53:08 crc kubenswrapper[4594]: I1129 06:53:08.254448 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-w4njb/must-gather-7grvm" event={"ID":"f9ad12f5-f7c6-43ed-9192-e963371115d4","Type":"ContainerDied","Data":"404ba45c83a5184bc822c6899fa0a86965a5eab8a40e8e8cf91f5e87e1aaef69"} Nov 29 06:53:08 crc kubenswrapper[4594]: I1129 
06:53:08.256843 4594 scope.go:117] "RemoveContainer" containerID="404ba45c83a5184bc822c6899fa0a86965a5eab8a40e8e8cf91f5e87e1aaef69" Nov 29 06:53:08 crc kubenswrapper[4594]: I1129 06:53:08.319814 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-w4njb_must-gather-7grvm_f9ad12f5-f7c6-43ed-9192-e963371115d4/gather/0.log" Nov 29 06:53:18 crc kubenswrapper[4594]: I1129 06:53:18.326386 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-w4njb/must-gather-7grvm"] Nov 29 06:53:18 crc kubenswrapper[4594]: I1129 06:53:18.328222 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-w4njb/must-gather-7grvm" podUID="f9ad12f5-f7c6-43ed-9192-e963371115d4" containerName="copy" containerID="cri-o://14c472929807c1e5ba19924028f467cbe0fad82d928f1f198ba5d453e681e4e7" gracePeriod=2 Nov 29 06:53:18 crc kubenswrapper[4594]: I1129 06:53:18.335155 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-w4njb/must-gather-7grvm"] Nov 29 06:53:18 crc kubenswrapper[4594]: E1129 06:53:18.443031 4594 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9ad12f5_f7c6_43ed_9192_e963371115d4.slice/crio-conmon-14c472929807c1e5ba19924028f467cbe0fad82d928f1f198ba5d453e681e4e7.scope\": RecentStats: unable to find data in memory cache]" Nov 29 06:53:18 crc kubenswrapper[4594]: I1129 06:53:18.708823 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-w4njb_must-gather-7grvm_f9ad12f5-f7c6-43ed-9192-e963371115d4/copy/0.log" Nov 29 06:53:18 crc kubenswrapper[4594]: I1129 06:53:18.709596 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-w4njb/must-gather-7grvm" Nov 29 06:53:18 crc kubenswrapper[4594]: I1129 06:53:18.718671 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-52z45\" (UniqueName: \"kubernetes.io/projected/f9ad12f5-f7c6-43ed-9192-e963371115d4-kube-api-access-52z45\") pod \"f9ad12f5-f7c6-43ed-9192-e963371115d4\" (UID: \"f9ad12f5-f7c6-43ed-9192-e963371115d4\") " Nov 29 06:53:18 crc kubenswrapper[4594]: I1129 06:53:18.725434 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9ad12f5-f7c6-43ed-9192-e963371115d4-kube-api-access-52z45" (OuterVolumeSpecName: "kube-api-access-52z45") pod "f9ad12f5-f7c6-43ed-9192-e963371115d4" (UID: "f9ad12f5-f7c6-43ed-9192-e963371115d4"). InnerVolumeSpecName "kube-api-access-52z45". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 06:53:18 crc kubenswrapper[4594]: I1129 06:53:18.819750 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f9ad12f5-f7c6-43ed-9192-e963371115d4-must-gather-output\") pod \"f9ad12f5-f7c6-43ed-9192-e963371115d4\" (UID: \"f9ad12f5-f7c6-43ed-9192-e963371115d4\") " Nov 29 06:53:18 crc kubenswrapper[4594]: I1129 06:53:18.820207 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-52z45\" (UniqueName: \"kubernetes.io/projected/f9ad12f5-f7c6-43ed-9192-e963371115d4-kube-api-access-52z45\") on node \"crc\" DevicePath \"\"" Nov 29 06:53:18 crc kubenswrapper[4594]: I1129 06:53:18.971912 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9ad12f5-f7c6-43ed-9192-e963371115d4-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "f9ad12f5-f7c6-43ed-9192-e963371115d4" (UID: "f9ad12f5-f7c6-43ed-9192-e963371115d4"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 06:53:19 crc kubenswrapper[4594]: I1129 06:53:19.038010 4594 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f9ad12f5-f7c6-43ed-9192-e963371115d4-must-gather-output\") on node \"crc\" DevicePath \"\"" Nov 29 06:53:19 crc kubenswrapper[4594]: I1129 06:53:19.374628 4594 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-w4njb_must-gather-7grvm_f9ad12f5-f7c6-43ed-9192-e963371115d4/copy/0.log" Nov 29 06:53:19 crc kubenswrapper[4594]: I1129 06:53:19.375435 4594 generic.go:334] "Generic (PLEG): container finished" podID="f9ad12f5-f7c6-43ed-9192-e963371115d4" containerID="14c472929807c1e5ba19924028f467cbe0fad82d928f1f198ba5d453e681e4e7" exitCode=143 Nov 29 06:53:19 crc kubenswrapper[4594]: I1129 06:53:19.375506 4594 scope.go:117] "RemoveContainer" containerID="14c472929807c1e5ba19924028f467cbe0fad82d928f1f198ba5d453e681e4e7" Nov 29 06:53:19 crc kubenswrapper[4594]: I1129 06:53:19.375537 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-w4njb/must-gather-7grvm" Nov 29 06:53:19 crc kubenswrapper[4594]: I1129 06:53:19.395006 4594 scope.go:117] "RemoveContainer" containerID="404ba45c83a5184bc822c6899fa0a86965a5eab8a40e8e8cf91f5e87e1aaef69" Nov 29 06:53:19 crc kubenswrapper[4594]: I1129 06:53:19.439187 4594 scope.go:117] "RemoveContainer" containerID="14c472929807c1e5ba19924028f467cbe0fad82d928f1f198ba5d453e681e4e7" Nov 29 06:53:19 crc kubenswrapper[4594]: E1129 06:53:19.439785 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14c472929807c1e5ba19924028f467cbe0fad82d928f1f198ba5d453e681e4e7\": container with ID starting with 14c472929807c1e5ba19924028f467cbe0fad82d928f1f198ba5d453e681e4e7 not found: ID does not exist" containerID="14c472929807c1e5ba19924028f467cbe0fad82d928f1f198ba5d453e681e4e7" Nov 29 06:53:19 crc kubenswrapper[4594]: I1129 06:53:19.439834 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14c472929807c1e5ba19924028f467cbe0fad82d928f1f198ba5d453e681e4e7"} err="failed to get container status \"14c472929807c1e5ba19924028f467cbe0fad82d928f1f198ba5d453e681e4e7\": rpc error: code = NotFound desc = could not find container \"14c472929807c1e5ba19924028f467cbe0fad82d928f1f198ba5d453e681e4e7\": container with ID starting with 14c472929807c1e5ba19924028f467cbe0fad82d928f1f198ba5d453e681e4e7 not found: ID does not exist" Nov 29 06:53:19 crc kubenswrapper[4594]: I1129 06:53:19.439865 4594 scope.go:117] "RemoveContainer" containerID="404ba45c83a5184bc822c6899fa0a86965a5eab8a40e8e8cf91f5e87e1aaef69" Nov 29 06:53:19 crc kubenswrapper[4594]: E1129 06:53:19.440233 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"404ba45c83a5184bc822c6899fa0a86965a5eab8a40e8e8cf91f5e87e1aaef69\": container with ID starting with 
404ba45c83a5184bc822c6899fa0a86965a5eab8a40e8e8cf91f5e87e1aaef69 not found: ID does not exist" containerID="404ba45c83a5184bc822c6899fa0a86965a5eab8a40e8e8cf91f5e87e1aaef69" Nov 29 06:53:19 crc kubenswrapper[4594]: I1129 06:53:19.440290 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"404ba45c83a5184bc822c6899fa0a86965a5eab8a40e8e8cf91f5e87e1aaef69"} err="failed to get container status \"404ba45c83a5184bc822c6899fa0a86965a5eab8a40e8e8cf91f5e87e1aaef69\": rpc error: code = NotFound desc = could not find container \"404ba45c83a5184bc822c6899fa0a86965a5eab8a40e8e8cf91f5e87e1aaef69\": container with ID starting with 404ba45c83a5184bc822c6899fa0a86965a5eab8a40e8e8cf91f5e87e1aaef69 not found: ID does not exist" Nov 29 06:53:20 crc kubenswrapper[4594]: I1129 06:53:20.093886 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9ad12f5-f7c6-43ed-9192-e963371115d4" path="/var/lib/kubelet/pods/f9ad12f5-f7c6-43ed-9192-e963371115d4/volumes" Nov 29 06:54:12 crc kubenswrapper[4594]: I1129 06:54:12.151945 4594 scope.go:117] "RemoveContainer" containerID="fdbbcb2c54fda25f215343a7e9093fdad28783a094b22d0974ec56ea5c149d23" Nov 29 06:54:37 crc kubenswrapper[4594]: I1129 06:54:37.234243 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-kqnk4"] Nov 29 06:54:37 crc kubenswrapper[4594]: E1129 06:54:37.235960 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9ad12f5-f7c6-43ed-9192-e963371115d4" containerName="copy" Nov 29 06:54:37 crc kubenswrapper[4594]: I1129 06:54:37.236068 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9ad12f5-f7c6-43ed-9192-e963371115d4" containerName="copy" Nov 29 06:54:37 crc kubenswrapper[4594]: E1129 06:54:37.236161 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22fa56b1-bbe8-458c-a33a-d5fbbc14f9e3" containerName="extract-utilities" Nov 29 06:54:37 crc kubenswrapper[4594]: I1129 
06:54:37.236218 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="22fa56b1-bbe8-458c-a33a-d5fbbc14f9e3" containerName="extract-utilities" Nov 29 06:54:37 crc kubenswrapper[4594]: E1129 06:54:37.236289 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9ad12f5-f7c6-43ed-9192-e963371115d4" containerName="gather" Nov 29 06:54:37 crc kubenswrapper[4594]: I1129 06:54:37.236351 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9ad12f5-f7c6-43ed-9192-e963371115d4" containerName="gather" Nov 29 06:54:37 crc kubenswrapper[4594]: E1129 06:54:37.236419 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22fa56b1-bbe8-458c-a33a-d5fbbc14f9e3" containerName="registry-server" Nov 29 06:54:37 crc kubenswrapper[4594]: I1129 06:54:37.236474 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="22fa56b1-bbe8-458c-a33a-d5fbbc14f9e3" containerName="registry-server" Nov 29 06:54:37 crc kubenswrapper[4594]: E1129 06:54:37.236564 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22fa56b1-bbe8-458c-a33a-d5fbbc14f9e3" containerName="extract-content" Nov 29 06:54:37 crc kubenswrapper[4594]: I1129 06:54:37.236619 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="22fa56b1-bbe8-458c-a33a-d5fbbc14f9e3" containerName="extract-content" Nov 29 06:54:37 crc kubenswrapper[4594]: I1129 06:54:37.236890 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="22fa56b1-bbe8-458c-a33a-d5fbbc14f9e3" containerName="registry-server" Nov 29 06:54:37 crc kubenswrapper[4594]: I1129 06:54:37.236958 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9ad12f5-f7c6-43ed-9192-e963371115d4" containerName="gather" Nov 29 06:54:37 crc kubenswrapper[4594]: I1129 06:54:37.237015 4594 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9ad12f5-f7c6-43ed-9192-e963371115d4" containerName="copy" Nov 29 06:54:37 crc kubenswrapper[4594]: I1129 06:54:37.238564 4594 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kqnk4" Nov 29 06:54:37 crc kubenswrapper[4594]: I1129 06:54:37.245551 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kqnk4"] Nov 29 06:54:37 crc kubenswrapper[4594]: I1129 06:54:37.322912 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f385229-f7f9-4b2c-b786-f40a6f2b72b0-utilities\") pod \"redhat-marketplace-kqnk4\" (UID: \"2f385229-f7f9-4b2c-b786-f40a6f2b72b0\") " pod="openshift-marketplace/redhat-marketplace-kqnk4" Nov 29 06:54:37 crc kubenswrapper[4594]: I1129 06:54:37.323009 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76qgp\" (UniqueName: \"kubernetes.io/projected/2f385229-f7f9-4b2c-b786-f40a6f2b72b0-kube-api-access-76qgp\") pod \"redhat-marketplace-kqnk4\" (UID: \"2f385229-f7f9-4b2c-b786-f40a6f2b72b0\") " pod="openshift-marketplace/redhat-marketplace-kqnk4" Nov 29 06:54:37 crc kubenswrapper[4594]: I1129 06:54:37.323344 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f385229-f7f9-4b2c-b786-f40a6f2b72b0-catalog-content\") pod \"redhat-marketplace-kqnk4\" (UID: \"2f385229-f7f9-4b2c-b786-f40a6f2b72b0\") " pod="openshift-marketplace/redhat-marketplace-kqnk4" Nov 29 06:54:37 crc kubenswrapper[4594]: I1129 06:54:37.425223 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f385229-f7f9-4b2c-b786-f40a6f2b72b0-catalog-content\") pod \"redhat-marketplace-kqnk4\" (UID: \"2f385229-f7f9-4b2c-b786-f40a6f2b72b0\") " pod="openshift-marketplace/redhat-marketplace-kqnk4" Nov 29 06:54:37 crc kubenswrapper[4594]: I1129 06:54:37.425335 4594 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f385229-f7f9-4b2c-b786-f40a6f2b72b0-utilities\") pod \"redhat-marketplace-kqnk4\" (UID: \"2f385229-f7f9-4b2c-b786-f40a6f2b72b0\") " pod="openshift-marketplace/redhat-marketplace-kqnk4" Nov 29 06:54:37 crc kubenswrapper[4594]: I1129 06:54:37.425356 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76qgp\" (UniqueName: \"kubernetes.io/projected/2f385229-f7f9-4b2c-b786-f40a6f2b72b0-kube-api-access-76qgp\") pod \"redhat-marketplace-kqnk4\" (UID: \"2f385229-f7f9-4b2c-b786-f40a6f2b72b0\") " pod="openshift-marketplace/redhat-marketplace-kqnk4" Nov 29 06:54:37 crc kubenswrapper[4594]: I1129 06:54:37.425829 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f385229-f7f9-4b2c-b786-f40a6f2b72b0-catalog-content\") pod \"redhat-marketplace-kqnk4\" (UID: \"2f385229-f7f9-4b2c-b786-f40a6f2b72b0\") " pod="openshift-marketplace/redhat-marketplace-kqnk4" Nov 29 06:54:37 crc kubenswrapper[4594]: I1129 06:54:37.425876 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f385229-f7f9-4b2c-b786-f40a6f2b72b0-utilities\") pod \"redhat-marketplace-kqnk4\" (UID: \"2f385229-f7f9-4b2c-b786-f40a6f2b72b0\") " pod="openshift-marketplace/redhat-marketplace-kqnk4" Nov 29 06:54:37 crc kubenswrapper[4594]: I1129 06:54:37.441909 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76qgp\" (UniqueName: \"kubernetes.io/projected/2f385229-f7f9-4b2c-b786-f40a6f2b72b0-kube-api-access-76qgp\") pod \"redhat-marketplace-kqnk4\" (UID: \"2f385229-f7f9-4b2c-b786-f40a6f2b72b0\") " pod="openshift-marketplace/redhat-marketplace-kqnk4" Nov 29 06:54:37 crc kubenswrapper[4594]: I1129 06:54:37.554813 4594 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kqnk4" Nov 29 06:54:37 crc kubenswrapper[4594]: I1129 06:54:37.965664 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kqnk4"] Nov 29 06:54:38 crc kubenswrapper[4594]: I1129 06:54:38.072093 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kqnk4" event={"ID":"2f385229-f7f9-4b2c-b786-f40a6f2b72b0","Type":"ContainerStarted","Data":"7e914713addbab162dd2e2aa331d05b599b4991fda29532caed0882b70fdde11"} Nov 29 06:54:39 crc kubenswrapper[4594]: I1129 06:54:39.082364 4594 generic.go:334] "Generic (PLEG): container finished" podID="2f385229-f7f9-4b2c-b786-f40a6f2b72b0" containerID="9c17941c8b40dc857f40f34b5c5f7ddcccb7dfc06adbc3ba8a86b20a4c6e445d" exitCode=0 Nov 29 06:54:39 crc kubenswrapper[4594]: I1129 06:54:39.082574 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kqnk4" event={"ID":"2f385229-f7f9-4b2c-b786-f40a6f2b72b0","Type":"ContainerDied","Data":"9c17941c8b40dc857f40f34b5c5f7ddcccb7dfc06adbc3ba8a86b20a4c6e445d"} Nov 29 06:54:39 crc kubenswrapper[4594]: I1129 06:54:39.084850 4594 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 29 06:54:40 crc kubenswrapper[4594]: I1129 06:54:40.094218 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kqnk4" event={"ID":"2f385229-f7f9-4b2c-b786-f40a6f2b72b0","Type":"ContainerStarted","Data":"e3bf646c4f7484fd7a25afc5bbc106e4d755ebe3ea43d47ecdd7e16bd63ced1b"} Nov 29 06:54:41 crc kubenswrapper[4594]: I1129 06:54:41.102472 4594 generic.go:334] "Generic (PLEG): container finished" podID="2f385229-f7f9-4b2c-b786-f40a6f2b72b0" containerID="e3bf646c4f7484fd7a25afc5bbc106e4d755ebe3ea43d47ecdd7e16bd63ced1b" exitCode=0 Nov 29 06:54:41 crc kubenswrapper[4594]: I1129 06:54:41.102546 4594 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-marketplace-kqnk4" event={"ID":"2f385229-f7f9-4b2c-b786-f40a6f2b72b0","Type":"ContainerDied","Data":"e3bf646c4f7484fd7a25afc5bbc106e4d755ebe3ea43d47ecdd7e16bd63ced1b"} Nov 29 06:54:42 crc kubenswrapper[4594]: I1129 06:54:42.128972 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kqnk4" event={"ID":"2f385229-f7f9-4b2c-b786-f40a6f2b72b0","Type":"ContainerStarted","Data":"e784475b60569dc18bee9d6a028fe8c57aadbea4d561a56bdd17158e18a799eb"} Nov 29 06:54:42 crc kubenswrapper[4594]: I1129 06:54:42.148599 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-kqnk4" podStartSLOduration=2.374107466 podStartE2EDuration="5.148582761s" podCreationTimestamp="2025-11-29 06:54:37 +0000 UTC" firstStartedPulling="2025-11-29 06:54:39.084603697 +0000 UTC m=+5203.325112917" lastFinishedPulling="2025-11-29 06:54:41.859078992 +0000 UTC m=+5206.099588212" observedRunningTime="2025-11-29 06:54:42.14107736 +0000 UTC m=+5206.381586580" watchObservedRunningTime="2025-11-29 06:54:42.148582761 +0000 UTC m=+5206.389091982" Nov 29 06:54:47 crc kubenswrapper[4594]: I1129 06:54:47.555937 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-kqnk4" Nov 29 06:54:47 crc kubenswrapper[4594]: I1129 06:54:47.556485 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-kqnk4" Nov 29 06:54:47 crc kubenswrapper[4594]: I1129 06:54:47.590571 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-kqnk4" Nov 29 06:54:48 crc kubenswrapper[4594]: I1129 06:54:48.211495 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-kqnk4" Nov 29 06:54:48 crc kubenswrapper[4594]: I1129 06:54:48.251816 4594 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kqnk4"] Nov 29 06:54:50 crc kubenswrapper[4594]: I1129 06:54:50.190433 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-kqnk4" podUID="2f385229-f7f9-4b2c-b786-f40a6f2b72b0" containerName="registry-server" containerID="cri-o://e784475b60569dc18bee9d6a028fe8c57aadbea4d561a56bdd17158e18a799eb" gracePeriod=2 Nov 29 06:54:50 crc kubenswrapper[4594]: I1129 06:54:50.886008 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kqnk4" Nov 29 06:54:51 crc kubenswrapper[4594]: I1129 06:54:51.081667 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f385229-f7f9-4b2c-b786-f40a6f2b72b0-catalog-content\") pod \"2f385229-f7f9-4b2c-b786-f40a6f2b72b0\" (UID: \"2f385229-f7f9-4b2c-b786-f40a6f2b72b0\") " Nov 29 06:54:51 crc kubenswrapper[4594]: I1129 06:54:51.081719 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76qgp\" (UniqueName: \"kubernetes.io/projected/2f385229-f7f9-4b2c-b786-f40a6f2b72b0-kube-api-access-76qgp\") pod \"2f385229-f7f9-4b2c-b786-f40a6f2b72b0\" (UID: \"2f385229-f7f9-4b2c-b786-f40a6f2b72b0\") " Nov 29 06:54:51 crc kubenswrapper[4594]: I1129 06:54:51.081757 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f385229-f7f9-4b2c-b786-f40a6f2b72b0-utilities\") pod \"2f385229-f7f9-4b2c-b786-f40a6f2b72b0\" (UID: \"2f385229-f7f9-4b2c-b786-f40a6f2b72b0\") " Nov 29 06:54:51 crc kubenswrapper[4594]: I1129 06:54:51.082817 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f385229-f7f9-4b2c-b786-f40a6f2b72b0-utilities" (OuterVolumeSpecName: "utilities") pod 
"2f385229-f7f9-4b2c-b786-f40a6f2b72b0" (UID: "2f385229-f7f9-4b2c-b786-f40a6f2b72b0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 06:54:51 crc kubenswrapper[4594]: I1129 06:54:51.087115 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f385229-f7f9-4b2c-b786-f40a6f2b72b0-kube-api-access-76qgp" (OuterVolumeSpecName: "kube-api-access-76qgp") pod "2f385229-f7f9-4b2c-b786-f40a6f2b72b0" (UID: "2f385229-f7f9-4b2c-b786-f40a6f2b72b0"). InnerVolumeSpecName "kube-api-access-76qgp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 06:54:51 crc kubenswrapper[4594]: I1129 06:54:51.099157 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f385229-f7f9-4b2c-b786-f40a6f2b72b0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2f385229-f7f9-4b2c-b786-f40a6f2b72b0" (UID: "2f385229-f7f9-4b2c-b786-f40a6f2b72b0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 06:54:51 crc kubenswrapper[4594]: I1129 06:54:51.185575 4594 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f385229-f7f9-4b2c-b786-f40a6f2b72b0-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 06:54:51 crc kubenswrapper[4594]: I1129 06:54:51.185606 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-76qgp\" (UniqueName: \"kubernetes.io/projected/2f385229-f7f9-4b2c-b786-f40a6f2b72b0-kube-api-access-76qgp\") on node \"crc\" DevicePath \"\"" Nov 29 06:54:51 crc kubenswrapper[4594]: I1129 06:54:51.185617 4594 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f385229-f7f9-4b2c-b786-f40a6f2b72b0-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 06:54:51 crc kubenswrapper[4594]: I1129 06:54:51.198560 4594 generic.go:334] "Generic (PLEG): container finished" podID="2f385229-f7f9-4b2c-b786-f40a6f2b72b0" containerID="e784475b60569dc18bee9d6a028fe8c57aadbea4d561a56bdd17158e18a799eb" exitCode=0 Nov 29 06:54:51 crc kubenswrapper[4594]: I1129 06:54:51.198599 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kqnk4" event={"ID":"2f385229-f7f9-4b2c-b786-f40a6f2b72b0","Type":"ContainerDied","Data":"e784475b60569dc18bee9d6a028fe8c57aadbea4d561a56bdd17158e18a799eb"} Nov 29 06:54:51 crc kubenswrapper[4594]: I1129 06:54:51.198622 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kqnk4" event={"ID":"2f385229-f7f9-4b2c-b786-f40a6f2b72b0","Type":"ContainerDied","Data":"7e914713addbab162dd2e2aa331d05b599b4991fda29532caed0882b70fdde11"} Nov 29 06:54:51 crc kubenswrapper[4594]: I1129 06:54:51.198641 4594 scope.go:117] "RemoveContainer" containerID="e784475b60569dc18bee9d6a028fe8c57aadbea4d561a56bdd17158e18a799eb" Nov 29 06:54:51 crc kubenswrapper[4594]: I1129 
06:54:51.198752 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kqnk4" Nov 29 06:54:51 crc kubenswrapper[4594]: I1129 06:54:51.214192 4594 scope.go:117] "RemoveContainer" containerID="e3bf646c4f7484fd7a25afc5bbc106e4d755ebe3ea43d47ecdd7e16bd63ced1b" Nov 29 06:54:51 crc kubenswrapper[4594]: I1129 06:54:51.227018 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kqnk4"] Nov 29 06:54:51 crc kubenswrapper[4594]: I1129 06:54:51.233768 4594 scope.go:117] "RemoveContainer" containerID="9c17941c8b40dc857f40f34b5c5f7ddcccb7dfc06adbc3ba8a86b20a4c6e445d" Nov 29 06:54:51 crc kubenswrapper[4594]: I1129 06:54:51.234466 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-kqnk4"] Nov 29 06:54:51 crc kubenswrapper[4594]: I1129 06:54:51.270437 4594 scope.go:117] "RemoveContainer" containerID="e784475b60569dc18bee9d6a028fe8c57aadbea4d561a56bdd17158e18a799eb" Nov 29 06:54:51 crc kubenswrapper[4594]: E1129 06:54:51.270969 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e784475b60569dc18bee9d6a028fe8c57aadbea4d561a56bdd17158e18a799eb\": container with ID starting with e784475b60569dc18bee9d6a028fe8c57aadbea4d561a56bdd17158e18a799eb not found: ID does not exist" containerID="e784475b60569dc18bee9d6a028fe8c57aadbea4d561a56bdd17158e18a799eb" Nov 29 06:54:51 crc kubenswrapper[4594]: I1129 06:54:51.271011 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e784475b60569dc18bee9d6a028fe8c57aadbea4d561a56bdd17158e18a799eb"} err="failed to get container status \"e784475b60569dc18bee9d6a028fe8c57aadbea4d561a56bdd17158e18a799eb\": rpc error: code = NotFound desc = could not find container \"e784475b60569dc18bee9d6a028fe8c57aadbea4d561a56bdd17158e18a799eb\": container with ID starting with 
e784475b60569dc18bee9d6a028fe8c57aadbea4d561a56bdd17158e18a799eb not found: ID does not exist" Nov 29 06:54:51 crc kubenswrapper[4594]: I1129 06:54:51.271046 4594 scope.go:117] "RemoveContainer" containerID="e3bf646c4f7484fd7a25afc5bbc106e4d755ebe3ea43d47ecdd7e16bd63ced1b" Nov 29 06:54:51 crc kubenswrapper[4594]: E1129 06:54:51.271331 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3bf646c4f7484fd7a25afc5bbc106e4d755ebe3ea43d47ecdd7e16bd63ced1b\": container with ID starting with e3bf646c4f7484fd7a25afc5bbc106e4d755ebe3ea43d47ecdd7e16bd63ced1b not found: ID does not exist" containerID="e3bf646c4f7484fd7a25afc5bbc106e4d755ebe3ea43d47ecdd7e16bd63ced1b" Nov 29 06:54:51 crc kubenswrapper[4594]: I1129 06:54:51.271366 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3bf646c4f7484fd7a25afc5bbc106e4d755ebe3ea43d47ecdd7e16bd63ced1b"} err="failed to get container status \"e3bf646c4f7484fd7a25afc5bbc106e4d755ebe3ea43d47ecdd7e16bd63ced1b\": rpc error: code = NotFound desc = could not find container \"e3bf646c4f7484fd7a25afc5bbc106e4d755ebe3ea43d47ecdd7e16bd63ced1b\": container with ID starting with e3bf646c4f7484fd7a25afc5bbc106e4d755ebe3ea43d47ecdd7e16bd63ced1b not found: ID does not exist" Nov 29 06:54:51 crc kubenswrapper[4594]: I1129 06:54:51.271405 4594 scope.go:117] "RemoveContainer" containerID="9c17941c8b40dc857f40f34b5c5f7ddcccb7dfc06adbc3ba8a86b20a4c6e445d" Nov 29 06:54:51 crc kubenswrapper[4594]: E1129 06:54:51.271619 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c17941c8b40dc857f40f34b5c5f7ddcccb7dfc06adbc3ba8a86b20a4c6e445d\": container with ID starting with 9c17941c8b40dc857f40f34b5c5f7ddcccb7dfc06adbc3ba8a86b20a4c6e445d not found: ID does not exist" containerID="9c17941c8b40dc857f40f34b5c5f7ddcccb7dfc06adbc3ba8a86b20a4c6e445d" Nov 29 06:54:51 crc 
kubenswrapper[4594]: I1129 06:54:51.271641 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c17941c8b40dc857f40f34b5c5f7ddcccb7dfc06adbc3ba8a86b20a4c6e445d"} err="failed to get container status \"9c17941c8b40dc857f40f34b5c5f7ddcccb7dfc06adbc3ba8a86b20a4c6e445d\": rpc error: code = NotFound desc = could not find container \"9c17941c8b40dc857f40f34b5c5f7ddcccb7dfc06adbc3ba8a86b20a4c6e445d\": container with ID starting with 9c17941c8b40dc857f40f34b5c5f7ddcccb7dfc06adbc3ba8a86b20a4c6e445d not found: ID does not exist" Nov 29 06:54:52 crc kubenswrapper[4594]: I1129 06:54:52.092637 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f385229-f7f9-4b2c-b786-f40a6f2b72b0" path="/var/lib/kubelet/pods/2f385229-f7f9-4b2c-b786-f40a6f2b72b0/volumes" Nov 29 06:55:12 crc kubenswrapper[4594]: I1129 06:55:12.234818 4594 scope.go:117] "RemoveContainer" containerID="28e0ff54182c6f22c89f731b6b54bdc2364fef91618c7fbd173da822e4fee9c6" Nov 29 06:55:15 crc kubenswrapper[4594]: I1129 06:55:15.800236 4594 patch_prober.go:28] interesting pod/machine-config-daemon-ggz4n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 06:55:15 crc kubenswrapper[4594]: I1129 06:55:15.800978 4594 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 06:55:45 crc kubenswrapper[4594]: I1129 06:55:45.800318 4594 patch_prober.go:28] interesting pod/machine-config-daemon-ggz4n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 06:55:45 crc kubenswrapper[4594]: I1129 06:55:45.800897 4594 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 06:56:04 crc kubenswrapper[4594]: I1129 06:56:04.493576 4594 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-m5jpx"] Nov 29 06:56:04 crc kubenswrapper[4594]: E1129 06:56:04.494485 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f385229-f7f9-4b2c-b786-f40a6f2b72b0" containerName="extract-utilities" Nov 29 06:56:04 crc kubenswrapper[4594]: I1129 06:56:04.494500 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f385229-f7f9-4b2c-b786-f40a6f2b72b0" containerName="extract-utilities" Nov 29 06:56:04 crc kubenswrapper[4594]: E1129 06:56:04.494520 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f385229-f7f9-4b2c-b786-f40a6f2b72b0" containerName="extract-content" Nov 29 06:56:04 crc kubenswrapper[4594]: I1129 06:56:04.494525 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f385229-f7f9-4b2c-b786-f40a6f2b72b0" containerName="extract-content" Nov 29 06:56:04 crc kubenswrapper[4594]: E1129 06:56:04.494542 4594 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f385229-f7f9-4b2c-b786-f40a6f2b72b0" containerName="registry-server" Nov 29 06:56:04 crc kubenswrapper[4594]: I1129 06:56:04.494547 4594 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f385229-f7f9-4b2c-b786-f40a6f2b72b0" containerName="registry-server" Nov 29 06:56:04 crc kubenswrapper[4594]: I1129 06:56:04.494779 4594 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="2f385229-f7f9-4b2c-b786-f40a6f2b72b0" containerName="registry-server" Nov 29 06:56:04 crc kubenswrapper[4594]: I1129 06:56:04.496216 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m5jpx" Nov 29 06:56:04 crc kubenswrapper[4594]: I1129 06:56:04.506540 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-m5jpx"] Nov 29 06:56:04 crc kubenswrapper[4594]: I1129 06:56:04.560778 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c374c29c-b8dd-473f-b004-910bee33ea02-utilities\") pod \"redhat-operators-m5jpx\" (UID: \"c374c29c-b8dd-473f-b004-910bee33ea02\") " pod="openshift-marketplace/redhat-operators-m5jpx" Nov 29 06:56:04 crc kubenswrapper[4594]: I1129 06:56:04.560936 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c374c29c-b8dd-473f-b004-910bee33ea02-catalog-content\") pod \"redhat-operators-m5jpx\" (UID: \"c374c29c-b8dd-473f-b004-910bee33ea02\") " pod="openshift-marketplace/redhat-operators-m5jpx" Nov 29 06:56:04 crc kubenswrapper[4594]: I1129 06:56:04.561187 4594 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftn59\" (UniqueName: \"kubernetes.io/projected/c374c29c-b8dd-473f-b004-910bee33ea02-kube-api-access-ftn59\") pod \"redhat-operators-m5jpx\" (UID: \"c374c29c-b8dd-473f-b004-910bee33ea02\") " pod="openshift-marketplace/redhat-operators-m5jpx" Nov 29 06:56:04 crc kubenswrapper[4594]: I1129 06:56:04.662888 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftn59\" (UniqueName: \"kubernetes.io/projected/c374c29c-b8dd-473f-b004-910bee33ea02-kube-api-access-ftn59\") pod \"redhat-operators-m5jpx\" (UID: 
\"c374c29c-b8dd-473f-b004-910bee33ea02\") " pod="openshift-marketplace/redhat-operators-m5jpx" Nov 29 06:56:04 crc kubenswrapper[4594]: I1129 06:56:04.663016 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c374c29c-b8dd-473f-b004-910bee33ea02-utilities\") pod \"redhat-operators-m5jpx\" (UID: \"c374c29c-b8dd-473f-b004-910bee33ea02\") " pod="openshift-marketplace/redhat-operators-m5jpx" Nov 29 06:56:04 crc kubenswrapper[4594]: I1129 06:56:04.663205 4594 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c374c29c-b8dd-473f-b004-910bee33ea02-catalog-content\") pod \"redhat-operators-m5jpx\" (UID: \"c374c29c-b8dd-473f-b004-910bee33ea02\") " pod="openshift-marketplace/redhat-operators-m5jpx" Nov 29 06:56:04 crc kubenswrapper[4594]: I1129 06:56:04.663411 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c374c29c-b8dd-473f-b004-910bee33ea02-utilities\") pod \"redhat-operators-m5jpx\" (UID: \"c374c29c-b8dd-473f-b004-910bee33ea02\") " pod="openshift-marketplace/redhat-operators-m5jpx" Nov 29 06:56:04 crc kubenswrapper[4594]: I1129 06:56:04.663505 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c374c29c-b8dd-473f-b004-910bee33ea02-catalog-content\") pod \"redhat-operators-m5jpx\" (UID: \"c374c29c-b8dd-473f-b004-910bee33ea02\") " pod="openshift-marketplace/redhat-operators-m5jpx" Nov 29 06:56:04 crc kubenswrapper[4594]: I1129 06:56:04.686886 4594 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftn59\" (UniqueName: \"kubernetes.io/projected/c374c29c-b8dd-473f-b004-910bee33ea02-kube-api-access-ftn59\") pod \"redhat-operators-m5jpx\" (UID: \"c374c29c-b8dd-473f-b004-910bee33ea02\") " 
pod="openshift-marketplace/redhat-operators-m5jpx" Nov 29 06:56:04 crc kubenswrapper[4594]: I1129 06:56:04.827061 4594 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m5jpx" Nov 29 06:56:05 crc kubenswrapper[4594]: I1129 06:56:05.245790 4594 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-m5jpx"] Nov 29 06:56:05 crc kubenswrapper[4594]: I1129 06:56:05.835513 4594 generic.go:334] "Generic (PLEG): container finished" podID="c374c29c-b8dd-473f-b004-910bee33ea02" containerID="6ea3d5470b0c2f871be5c34ed1e67db70aa15f836b45fd501b0ce0dfe554d712" exitCode=0 Nov 29 06:56:05 crc kubenswrapper[4594]: I1129 06:56:05.835589 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m5jpx" event={"ID":"c374c29c-b8dd-473f-b004-910bee33ea02","Type":"ContainerDied","Data":"6ea3d5470b0c2f871be5c34ed1e67db70aa15f836b45fd501b0ce0dfe554d712"} Nov 29 06:56:05 crc kubenswrapper[4594]: I1129 06:56:05.835794 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m5jpx" event={"ID":"c374c29c-b8dd-473f-b004-910bee33ea02","Type":"ContainerStarted","Data":"eaab74a2410569c79136b9617d07a5c57dcad3d76e89340090e8f9faa6703815"} Nov 29 06:56:07 crc kubenswrapper[4594]: I1129 06:56:07.853709 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m5jpx" event={"ID":"c374c29c-b8dd-473f-b004-910bee33ea02","Type":"ContainerStarted","Data":"446397dabd2a10eebc3239b3cfd79bd4855f0f7ff85224813449566193a9d253"} Nov 29 06:56:08 crc kubenswrapper[4594]: I1129 06:56:08.863541 4594 generic.go:334] "Generic (PLEG): container finished" podID="c374c29c-b8dd-473f-b004-910bee33ea02" containerID="446397dabd2a10eebc3239b3cfd79bd4855f0f7ff85224813449566193a9d253" exitCode=0 Nov 29 06:56:08 crc kubenswrapper[4594]: I1129 06:56:08.863591 4594 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-m5jpx" event={"ID":"c374c29c-b8dd-473f-b004-910bee33ea02","Type":"ContainerDied","Data":"446397dabd2a10eebc3239b3cfd79bd4855f0f7ff85224813449566193a9d253"} Nov 29 06:56:09 crc kubenswrapper[4594]: I1129 06:56:09.877948 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m5jpx" event={"ID":"c374c29c-b8dd-473f-b004-910bee33ea02","Type":"ContainerStarted","Data":"2077b94ef0a9ed8e1ff8ae4524b1e2ab8d60f76ef6ac14537361b23fad1e2fce"} Nov 29 06:56:09 crc kubenswrapper[4594]: I1129 06:56:09.904450 4594 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-m5jpx" podStartSLOduration=2.32614736 podStartE2EDuration="5.904431036s" podCreationTimestamp="2025-11-29 06:56:04 +0000 UTC" firstStartedPulling="2025-11-29 06:56:05.837637217 +0000 UTC m=+5290.078146437" lastFinishedPulling="2025-11-29 06:56:09.415920893 +0000 UTC m=+5293.656430113" observedRunningTime="2025-11-29 06:56:09.899793276 +0000 UTC m=+5294.140302495" watchObservedRunningTime="2025-11-29 06:56:09.904431036 +0000 UTC m=+5294.144940255" Nov 29 06:56:14 crc kubenswrapper[4594]: I1129 06:56:14.827999 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-m5jpx" Nov 29 06:56:14 crc kubenswrapper[4594]: I1129 06:56:14.830987 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-m5jpx" Nov 29 06:56:14 crc kubenswrapper[4594]: I1129 06:56:14.870900 4594 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-m5jpx" Nov 29 06:56:14 crc kubenswrapper[4594]: I1129 06:56:14.958860 4594 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-m5jpx" Nov 29 06:56:15 crc kubenswrapper[4594]: I1129 06:56:15.800504 4594 patch_prober.go:28] 
interesting pod/machine-config-daemon-ggz4n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 06:56:15 crc kubenswrapper[4594]: I1129 06:56:15.800558 4594 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 06:56:15 crc kubenswrapper[4594]: I1129 06:56:15.800604 4594 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" Nov 29 06:56:15 crc kubenswrapper[4594]: I1129 06:56:15.801342 4594 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5ffe7072cba89740daf4243dca1380060d3742cb8a549792f7039027015a7e5a"} pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 29 06:56:15 crc kubenswrapper[4594]: I1129 06:56:15.801410 4594 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" containerName="machine-config-daemon" containerID="cri-o://5ffe7072cba89740daf4243dca1380060d3742cb8a549792f7039027015a7e5a" gracePeriod=600 Nov 29 06:56:15 crc kubenswrapper[4594]: E1129 06:56:15.930753 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:56:15 crc kubenswrapper[4594]: I1129 06:56:15.933741 4594 generic.go:334] "Generic (PLEG): container finished" podID="509844d4-3ee1-4059-84bb-6e90200f50c5" containerID="5ffe7072cba89740daf4243dca1380060d3742cb8a549792f7039027015a7e5a" exitCode=0 Nov 29 06:56:15 crc kubenswrapper[4594]: I1129 06:56:15.933818 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" event={"ID":"509844d4-3ee1-4059-84bb-6e90200f50c5","Type":"ContainerDied","Data":"5ffe7072cba89740daf4243dca1380060d3742cb8a549792f7039027015a7e5a"} Nov 29 06:56:15 crc kubenswrapper[4594]: I1129 06:56:15.933881 4594 scope.go:117] "RemoveContainer" containerID="dc8dbe12cdc8fe8bc0472ec5e2f11705190226ebe04481175e6560c74d7fe05c" Nov 29 06:56:16 crc kubenswrapper[4594]: I1129 06:56:16.280353 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-m5jpx"] Nov 29 06:56:16 crc kubenswrapper[4594]: I1129 06:56:16.944435 4594 scope.go:117] "RemoveContainer" containerID="5ffe7072cba89740daf4243dca1380060d3742cb8a549792f7039027015a7e5a" Nov 29 06:56:16 crc kubenswrapper[4594]: E1129 06:56:16.944660 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:56:17 crc kubenswrapper[4594]: I1129 06:56:17.953152 4594 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-operators-m5jpx" podUID="c374c29c-b8dd-473f-b004-910bee33ea02" containerName="registry-server" containerID="cri-o://2077b94ef0a9ed8e1ff8ae4524b1e2ab8d60f76ef6ac14537361b23fad1e2fce" gracePeriod=2 Nov 29 06:56:18 crc kubenswrapper[4594]: I1129 06:56:18.832583 4594 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m5jpx" Nov 29 06:56:18 crc kubenswrapper[4594]: I1129 06:56:18.951587 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c374c29c-b8dd-473f-b004-910bee33ea02-utilities\") pod \"c374c29c-b8dd-473f-b004-910bee33ea02\" (UID: \"c374c29c-b8dd-473f-b004-910bee33ea02\") " Nov 29 06:56:18 crc kubenswrapper[4594]: I1129 06:56:18.951881 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ftn59\" (UniqueName: \"kubernetes.io/projected/c374c29c-b8dd-473f-b004-910bee33ea02-kube-api-access-ftn59\") pod \"c374c29c-b8dd-473f-b004-910bee33ea02\" (UID: \"c374c29c-b8dd-473f-b004-910bee33ea02\") " Nov 29 06:56:18 crc kubenswrapper[4594]: I1129 06:56:18.951938 4594 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c374c29c-b8dd-473f-b004-910bee33ea02-catalog-content\") pod \"c374c29c-b8dd-473f-b004-910bee33ea02\" (UID: \"c374c29c-b8dd-473f-b004-910bee33ea02\") " Nov 29 06:56:18 crc kubenswrapper[4594]: I1129 06:56:18.952363 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c374c29c-b8dd-473f-b004-910bee33ea02-utilities" (OuterVolumeSpecName: "utilities") pod "c374c29c-b8dd-473f-b004-910bee33ea02" (UID: "c374c29c-b8dd-473f-b004-910bee33ea02"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 06:56:18 crc kubenswrapper[4594]: I1129 06:56:18.956873 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c374c29c-b8dd-473f-b004-910bee33ea02-kube-api-access-ftn59" (OuterVolumeSpecName: "kube-api-access-ftn59") pod "c374c29c-b8dd-473f-b004-910bee33ea02" (UID: "c374c29c-b8dd-473f-b004-910bee33ea02"). InnerVolumeSpecName "kube-api-access-ftn59". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 06:56:18 crc kubenswrapper[4594]: I1129 06:56:18.963426 4594 generic.go:334] "Generic (PLEG): container finished" podID="c374c29c-b8dd-473f-b004-910bee33ea02" containerID="2077b94ef0a9ed8e1ff8ae4524b1e2ab8d60f76ef6ac14537361b23fad1e2fce" exitCode=0 Nov 29 06:56:18 crc kubenswrapper[4594]: I1129 06:56:18.963467 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m5jpx" event={"ID":"c374c29c-b8dd-473f-b004-910bee33ea02","Type":"ContainerDied","Data":"2077b94ef0a9ed8e1ff8ae4524b1e2ab8d60f76ef6ac14537361b23fad1e2fce"} Nov 29 06:56:18 crc kubenswrapper[4594]: I1129 06:56:18.963492 4594 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-m5jpx" Nov 29 06:56:18 crc kubenswrapper[4594]: I1129 06:56:18.963515 4594 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m5jpx" event={"ID":"c374c29c-b8dd-473f-b004-910bee33ea02","Type":"ContainerDied","Data":"eaab74a2410569c79136b9617d07a5c57dcad3d76e89340090e8f9faa6703815"} Nov 29 06:56:18 crc kubenswrapper[4594]: I1129 06:56:18.963537 4594 scope.go:117] "RemoveContainer" containerID="2077b94ef0a9ed8e1ff8ae4524b1e2ab8d60f76ef6ac14537361b23fad1e2fce" Nov 29 06:56:19 crc kubenswrapper[4594]: I1129 06:56:19.002103 4594 scope.go:117] "RemoveContainer" containerID="446397dabd2a10eebc3239b3cfd79bd4855f0f7ff85224813449566193a9d253" Nov 29 06:56:19 crc kubenswrapper[4594]: I1129 06:56:19.019741 4594 scope.go:117] "RemoveContainer" containerID="6ea3d5470b0c2f871be5c34ed1e67db70aa15f836b45fd501b0ce0dfe554d712" Nov 29 06:56:19 crc kubenswrapper[4594]: I1129 06:56:19.035143 4594 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c374c29c-b8dd-473f-b004-910bee33ea02-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c374c29c-b8dd-473f-b004-910bee33ea02" (UID: "c374c29c-b8dd-473f-b004-910bee33ea02"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 06:56:19 crc kubenswrapper[4594]: I1129 06:56:19.056412 4594 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c374c29c-b8dd-473f-b004-910bee33ea02-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 06:56:19 crc kubenswrapper[4594]: I1129 06:56:19.056444 4594 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c374c29c-b8dd-473f-b004-910bee33ea02-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 06:56:19 crc kubenswrapper[4594]: I1129 06:56:19.056455 4594 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ftn59\" (UniqueName: \"kubernetes.io/projected/c374c29c-b8dd-473f-b004-910bee33ea02-kube-api-access-ftn59\") on node \"crc\" DevicePath \"\"" Nov 29 06:56:19 crc kubenswrapper[4594]: I1129 06:56:19.057757 4594 scope.go:117] "RemoveContainer" containerID="2077b94ef0a9ed8e1ff8ae4524b1e2ab8d60f76ef6ac14537361b23fad1e2fce" Nov 29 06:56:19 crc kubenswrapper[4594]: E1129 06:56:19.058092 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2077b94ef0a9ed8e1ff8ae4524b1e2ab8d60f76ef6ac14537361b23fad1e2fce\": container with ID starting with 2077b94ef0a9ed8e1ff8ae4524b1e2ab8d60f76ef6ac14537361b23fad1e2fce not found: ID does not exist" containerID="2077b94ef0a9ed8e1ff8ae4524b1e2ab8d60f76ef6ac14537361b23fad1e2fce" Nov 29 06:56:19 crc kubenswrapper[4594]: I1129 06:56:19.058143 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2077b94ef0a9ed8e1ff8ae4524b1e2ab8d60f76ef6ac14537361b23fad1e2fce"} err="failed to get container status \"2077b94ef0a9ed8e1ff8ae4524b1e2ab8d60f76ef6ac14537361b23fad1e2fce\": rpc error: code = NotFound desc = could not find container \"2077b94ef0a9ed8e1ff8ae4524b1e2ab8d60f76ef6ac14537361b23fad1e2fce\": container with ID 
starting with 2077b94ef0a9ed8e1ff8ae4524b1e2ab8d60f76ef6ac14537361b23fad1e2fce not found: ID does not exist" Nov 29 06:56:19 crc kubenswrapper[4594]: I1129 06:56:19.058171 4594 scope.go:117] "RemoveContainer" containerID="446397dabd2a10eebc3239b3cfd79bd4855f0f7ff85224813449566193a9d253" Nov 29 06:56:19 crc kubenswrapper[4594]: E1129 06:56:19.058518 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"446397dabd2a10eebc3239b3cfd79bd4855f0f7ff85224813449566193a9d253\": container with ID starting with 446397dabd2a10eebc3239b3cfd79bd4855f0f7ff85224813449566193a9d253 not found: ID does not exist" containerID="446397dabd2a10eebc3239b3cfd79bd4855f0f7ff85224813449566193a9d253" Nov 29 06:56:19 crc kubenswrapper[4594]: I1129 06:56:19.058583 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"446397dabd2a10eebc3239b3cfd79bd4855f0f7ff85224813449566193a9d253"} err="failed to get container status \"446397dabd2a10eebc3239b3cfd79bd4855f0f7ff85224813449566193a9d253\": rpc error: code = NotFound desc = could not find container \"446397dabd2a10eebc3239b3cfd79bd4855f0f7ff85224813449566193a9d253\": container with ID starting with 446397dabd2a10eebc3239b3cfd79bd4855f0f7ff85224813449566193a9d253 not found: ID does not exist" Nov 29 06:56:19 crc kubenswrapper[4594]: I1129 06:56:19.058613 4594 scope.go:117] "RemoveContainer" containerID="6ea3d5470b0c2f871be5c34ed1e67db70aa15f836b45fd501b0ce0dfe554d712" Nov 29 06:56:19 crc kubenswrapper[4594]: E1129 06:56:19.058918 4594 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ea3d5470b0c2f871be5c34ed1e67db70aa15f836b45fd501b0ce0dfe554d712\": container with ID starting with 6ea3d5470b0c2f871be5c34ed1e67db70aa15f836b45fd501b0ce0dfe554d712 not found: ID does not exist" containerID="6ea3d5470b0c2f871be5c34ed1e67db70aa15f836b45fd501b0ce0dfe554d712" Nov 29 
06:56:19 crc kubenswrapper[4594]: I1129 06:56:19.058962 4594 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ea3d5470b0c2f871be5c34ed1e67db70aa15f836b45fd501b0ce0dfe554d712"} err="failed to get container status \"6ea3d5470b0c2f871be5c34ed1e67db70aa15f836b45fd501b0ce0dfe554d712\": rpc error: code = NotFound desc = could not find container \"6ea3d5470b0c2f871be5c34ed1e67db70aa15f836b45fd501b0ce0dfe554d712\": container with ID starting with 6ea3d5470b0c2f871be5c34ed1e67db70aa15f836b45fd501b0ce0dfe554d712 not found: ID does not exist" Nov 29 06:56:19 crc kubenswrapper[4594]: I1129 06:56:19.307399 4594 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-m5jpx"] Nov 29 06:56:19 crc kubenswrapper[4594]: I1129 06:56:19.316927 4594 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-m5jpx"] Nov 29 06:56:20 crc kubenswrapper[4594]: I1129 06:56:20.091588 4594 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c374c29c-b8dd-473f-b004-910bee33ea02" path="/var/lib/kubelet/pods/c374c29c-b8dd-473f-b004-910bee33ea02/volumes" Nov 29 06:56:29 crc kubenswrapper[4594]: I1129 06:56:29.083472 4594 scope.go:117] "RemoveContainer" containerID="5ffe7072cba89740daf4243dca1380060d3742cb8a549792f7039027015a7e5a" Nov 29 06:56:29 crc kubenswrapper[4594]: E1129 06:56:29.085220 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:56:41 crc kubenswrapper[4594]: I1129 06:56:41.083402 4594 scope.go:117] "RemoveContainer" 
containerID="5ffe7072cba89740daf4243dca1380060d3742cb8a549792f7039027015a7e5a" Nov 29 06:56:41 crc kubenswrapper[4594]: E1129 06:56:41.084295 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:56:54 crc kubenswrapper[4594]: I1129 06:56:54.083954 4594 scope.go:117] "RemoveContainer" containerID="5ffe7072cba89740daf4243dca1380060d3742cb8a549792f7039027015a7e5a" Nov 29 06:56:54 crc kubenswrapper[4594]: E1129 06:56:54.084706 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5" Nov 29 06:57:07 crc kubenswrapper[4594]: I1129 06:57:07.083545 4594 scope.go:117] "RemoveContainer" containerID="5ffe7072cba89740daf4243dca1380060d3742cb8a549792f7039027015a7e5a" Nov 29 06:57:07 crc kubenswrapper[4594]: E1129 06:57:07.084189 4594 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ggz4n_openshift-machine-config-operator(509844d4-3ee1-4059-84bb-6e90200f50c5)\"" pod="openshift-machine-config-operator/machine-config-daemon-ggz4n" podUID="509844d4-3ee1-4059-84bb-6e90200f50c5"